Building APIs for the energy sector that enable utilities and consumers to do advanced energy management
- B2B
- Early Stage: startup in initial stages
- Website
- Location
- Company size: 11-50 people
- Total raised: $900K
- Company type: SaaS, Energy Service Company (ESCO), Smart Home Technology
- Markets
Jobs at Ramanujan
- MS or BS in Computer Science or a related field, OR equivalent practical experience in data engineering.
- 4+ years of industry experience working with distributed data technologies (e.g., Hadoop, MapReduce, Spark, Flink, Kafka) to build efficient, large-scale data pipelines.
- Software engineering proficiency in at least one high-level programming language (Java, Scala, Python, or equivalent).
- Experience required in building batch data processing pipelines that curate data for data science consumers.
- Experience strongly preferred in building stream-processing applications using Apache Flink, Spark Streaming, Apache Storm, Kafka Streams, or others.
- Ability to work to tight deadlines and to write code with over 90% code coverage.
- Experience with AWS and/or Azure is preferred.
- Experience with CI/CD tools and technologies is a plus.
- Experience working in sprints and as part of a distributed team.