Top "Must Have" Skills for This Job:
1. Bachelor's degree and 5 years of Information Technology experience, OR technical certification and/or college courses and 7 years of Information Technology experience, OR 9 years of Information Technology experience
2. Must have extensive hands-on experience designing, developing, and maintaining software solutions on a Hadoop cluster
3. Must have strong UNIX shell scripting skills and experience with Sqoop, Eclipse, and HCatalog
4. Must have experience with NoSQL databases such as HBase, MongoDB, or Cassandra
5. Must have experience developing Pig scripts, HiveQL, and UDFs for analyzing structured, semi-structured, and unstructured data flows
6. Must have working experience developing MapReduce programs in Java or Python that run on a Hadoop cluster.
7. Must have working experience with Spark and Scala.
8. Must have knowledge of cloud computing infrastructure (e.g., Amazon Web Services EC2) and considerations for scalable, distributed systems
9. Demonstrates broad knowledge of technical solutions, design patterns, and code for medium-to-complex applications deployed in Hadoop production.
10. Must have working experience with data warehousing and Business Intelligence systems
11. SDLC Methodology (Agile / Scrum / Iterative Development)
12. Candidates must hold a Green Card or be US citizens

Top Reasons to Love This Job:
1. Work for a company that encourages and fosters resourcefulness and strategic thought, and empowers you to make a difference in the lives of our members and their communities.
2. Comprehensive suite of financial and medical benefits, including annual performance incentives, 401(k) contributions, and 28 PTO days
3. Live, work, and play in Chicago, IL