- B2B
- Growth Stage: Expanding market presence
Blockchain Data Engineer
- $150k – $200k
- Remote
- 5+ years of experience
- Full Time
Onsite or remote
About the job
About Us
OpenBlock is a data-driven platform dedicated to fostering sustainable growth within decentralized protocols. Our product powers incentive recommendations for leading protocols in the space, including EigenLayer, Lido, Arbitrum, Solana, Sui, Uniswap, Scroll, Fuel, Linea, Mode, and many others.
OpenBlock is backed by notable investors, including Foundation Capital, Electric Capital, Circle Ventures, AlleyCorp, and others. Our team of 40+ has backgrounds from Stanford, a16z, Carnegie Mellon, Meta, Palantir, and other top-tier institutions.
We invite you to explore our work at www.openblocklabs.com.
As a Blockchain Data Engineer, your responsibilities will center on collecting data from disparate data sources.
This includes:
- Develop and maintain automated ETL (Extract, Transform, Load) processes for ingesting and transforming raw blockchain data into usable formats.
- Work across multiple L1s, L2s, and other blockchain ecosystems. Connect to RPC endpoints, index smart contracts, and quickly make sense of blockchain event data.
- Evaluate and implement tools and technologies for blockchain data processing, such as blockchain explorers, node APIs, and data streaming platforms.
- Integrate and manage these pipelines within Dagster for streamlined data processing with robust monitoring and alerting.
- Write clean, robust, and maintainable code that adheres to industry standards and best practices.
- Manage and optimize databases for storing blockchain data, including schema design, indexing, and query optimization.
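To make the transform step above concrete, here is a minimal, illustrative sketch of turning a raw ERC-20 Transfer event log (as returned by an `eth_getLogs` RPC call) into a flat, queryable row. The sample log and field names are hypothetical, not taken from any OpenBlock pipeline.

```python
# Illustrative sketch: decode a raw ERC-20 Transfer event log into a flat row.
# The sample log below is fabricated for demonstration; real data would come
# from an eth_getLogs RPC response.

# keccak256("Transfer(address,address,uint256)") -- the event signature topic.
TRANSFER_TOPIC = "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"

def decode_transfer(log: dict) -> dict:
    """Transform an eth_getLogs-style entry into a usable record."""
    if log["topics"][0] != TRANSFER_TOPIC:
        raise ValueError("not a Transfer event")
    return {
        "token": log["address"],
        # Indexed address arguments are left-padded to 32 bytes in topics,
        # so the address itself is the last 40 hex characters.
        "from": "0x" + log["topics"][1][-40:],
        "to": "0x" + log["topics"][2][-40:],
        # The unindexed uint256 amount lives in the data field as hex.
        "value": int(log["data"], 16),
        "block": int(log["blockNumber"], 16),
    }

# Fabricated sample log for demonstration.
sample = {
    "address": "0x" + "aa" * 20,
    "topics": [
        TRANSFER_TOPIC,
        "0x" + "00" * 12 + "11" * 20,
        "0x" + "00" * 12 + "22" * 20,
    ],
    "data": "0x" + hex(10**18)[2:].zfill(64),  # 1 token with 18 decimals
    "blockNumber": "0x10",
}

row = decode_transfer(sample)
```

In a real pipeline, a function like this would sit inside a scheduled asset (e.g. a Dagster `@asset`) that pages through logs from an RPC endpoint and loads the decoded rows into the warehouse.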
Qualifications:
- 5+ years of experience as a data engineer.
- Understanding of blockchain event data, familiarity with smart contracts and blockchain data structures.
- Proficiency in programming languages such as Python and SQL.
- Experience with distributed storage systems (e.g. S3).
- Expertise with an ETL orchestrator such as Apache Airflow, Dagster, Prefect, or a similar framework.
- Excellent problem-solving skills and the ability to work independently as well as part of a team.
- Strong communication skills to effectively collaborate with researchers, engineers, and data teams.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
The salary range provided is for US-based candidates.