Dallas, TX
$100,000 per hour
Java and Hadoop
Location: Dallas, TX
Open to contractors and FTEs (any visa status is acceptable)
Must have
1. Must be a Kafka technologist who has practiced Kafka and implemented large-scale Kafka (Confluent Kafka) environments processing terabytes of data per day
2. Instrumental in driving Kafka implementation and engineering Kafka sizing, security, replication, and monitoring
3. Has implemented Kafka as a firm-wide, scalable, enterprise-grade service, similar to LinkedIn's deployment
4. Has worked with onsite-offshore teams
5. Strong in Java and real-time event stream processing (Spark, Scala, Kafka Streams)
6. Has integrated with Hadoop and is familiar with big data tools (Cloudera, Hive, HBase, ZooKeeper, Impala, Flume)
7. Can implement and develop a Cloudera Hadoop data-driven platform with advanced capabilities to meet business and infrastructure needs
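For context on the sizing, security, and replication responsibilities listed above, the sketch below shows the kind of broker settings such an engagement typically tunes. All values are illustrative assumptions for a multi-TB/day cluster, not this client's actual configuration.

```properties
# server.properties -- illustrative values only; tune per cluster

# Replication and durability: survive a broker failure without data loss
default.replication.factor=3
min.insync.replicas=2
unclean.leader.election.enable=false

# Sizing: retention and thread counts for high-throughput workloads
log.retention.hours=72
num.partitions=12
num.network.threads=6
num.io.threads=8

# Security: TLS listener (keystore path/password are placeholder assumptions)
listeners=SSL://0.0.0.0:9093
ssl.keystore.location=/var/private/ssl/kafka.server.keystore.jks
ssl.keystore.password=changeit
```

With `default.replication.factor=3` and `min.insync.replicas=2`, producers using `acks=all` keep writing through a single broker outage while still guaranteeing each write lands on at least two replicas.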
Good to have
1. Log aggregation tools (Fluentd, syslog-ng, etc.)
2. Elasticsearch, Kibana, Logstash
3. Knowledge of technology infrastructure stacks is a plus, including: Windows and Linux operating systems, networking (TCP/IP), storage, virtualization, DNS/DHCP, Active Directory/LDAP, cloud, source control/Git, ALM tools (Confluence, Jira), APIs (Swagger, Gateway), automation (Ansible/Puppet)
Requirements
1. Bachelor's degree in Computer Science, Information Systems, Math, or equivalent training and relevant experience
2. 10+ years of work experience within one or more IT organizations; prior work experience in technology engineering and development is a plus
3. 5+ years of advanced Java/Python development experience (Spring Boot/Python, server-side components preferred)
4. 2+ years of Hadoop ecosystem (HDFS, HBase, Spark, ZooKeeper, Impala, Flume, Parquet, Avro) experience on high-volume platforms and scalable distributed systems
5. Experience working with data models, frameworks, and open-source software; RESTful API design and development; and software design patterns
6. Capable of full-lifecycle development: user requirements, user stories, development (both with a team and individually), testing, and implementation
7. Production implementation experience on projects of considerable data size (petabytes) and complexity
8. Strong verbal and written communication skills, with the ability to be highly effective with both technical and business partners; able to operate effectively and independently in a dynamic, fluid environment