Powered by technology and compassionate design, Parachute is reimagining the plasma donation experience.

Data Engineer

  • 5 years of experience
  • Full Time
Posted: today • Recruiter recently active
Visa Sponsorship

Not Available

Remote Work Policy

In office

Relocation

Allowed

About the job

Location: Austin, TX (Onsite)
Sponsorship: We will not be able to provide work sponsorship now or in the future.

Who Are We?
Powered by technology and compassionate design, Parachute has reimagined the plasma donation experience into one that is easier and friendlier. Using a simple app, our members can book donations and track earnings from the palm of their hand. In using a tech-forward approach, we’re able to offer each member a highly personable and best-in-class experience that’s consistent at each and every visit.

Our vision is to introduce an elevated plasma donation experience to markets with smaller populations that’s grounded in convenience. This model allows us to positively impact the industry supply chain and help patients gain access to the medication they need.

We have grown from 2 to 28 operations in less than three years and plan to continue our rapid expansion. We are looking for people who share in our passion for helping others and are invigorated by the speed at which our startup moves. Come join us as we help the world gain access to more plasma - one donation at a time.

What You'll Do:

The Data Engineer will develop Parachute’s next-generation data platform and scale critical systems. This role partners with Data, Operations, and Product teams to transform raw data into actionable insights that improve decision-making and optimize center operations.

  • Develop data architecture and validation logic to ensure collected plasma is ready to be shipped to customers
  • Migrate existing data pipelines from Python to dbt, ensuring scalability and future-proofing data models.
  • Build and maintain robust data models and pipelines using dbt.
  • Translate customer and product requirements into functional, operational code that meets business needs.
  • Manage project timelines, deadlines, and the development roadmap to ensure timely delivery.
  • Oversee smooth deployments to all users and endpoints.
  • Deploy data solutions to production environments using dbt, ensuring minimal disruption.
  • Integrate downstream reports and applications using dbt, Python, Azure, and Pipedream.
  • Update all documentation, decision trees, and algorithm diagrams with new features, ensuring references are current and accurate.
  • Serve as the technical leader for comprehensive validation processes.
  • Design mock data templates for the validation team during each release cycle.
  • Implement new data technologies to streamline and enhance the validation process.
  • Collaborate cross-functionally to align validation efforts with company standards and regulatory requirements.
  • Collaborate across teams to implement modular solutions and to monitor and maintain data systems.
  • Monitor and troubleshoot data pipelines to prevent disruption to shipping and casing activities at local centers.
  • Triage, investigate, and resolve data issues reported by stakeholders, ensuring quick resolution.
  • Respond to executive and team inquiries regarding logic, workflows, and validation requirements for customers and products.
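As a rough sketch of the validation work described above, readiness checks on collected units might look like the following in Python. The record fields, thresholds, and function names here are hypothetical illustrations, not Parachute's actual rules or code:

```python
from dataclasses import dataclass


@dataclass
class DonationRecord:
    """One collected unit. Fields are illustrative placeholders."""
    donation_id: str
    volume_ml: float
    temperature_ok: bool


def ship_ready(record: DonationRecord) -> bool:
    """Return True if a unit passes basic readiness checks.

    The volume bounds are made-up examples, not real regulatory limits.
    """
    return 600.0 <= record.volume_ml <= 900.0 and record.temperature_ok


def validate_batch(records: list[DonationRecord]) -> list[str]:
    """Return the IDs of records that fail validation, for triage."""
    return [r.donation_id for r in records if not ship_ready(r)]
```

In a dbt-centric stack, logic of this shape would typically move out of Python and into model SQL and schema tests, which is the kind of migration the bullets above describe.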

Who You Are:

  • Advanced proficiency writing SQL and Python.
  • Deep knowledge of databases, particularly Snowflake; BigQuery, Databricks, and/or Redshift are also relevant.
  • At least two (2) years of experience using dbt for data transformations and architecture development.
  • At least two (2) years of experience using cloud platforms (e.g., Azure, AWS, GCP).
  • Familiarity with ETL/ELT concepts and best practices as well as data testing and validation strategies.
  • Extensive experience in managing development, testing, and deployment within cloud-based environments.
  • Ability to interpret complex customer requirements and translate them into functional code.
  • Comfortable with agile or lean development practices.
  • Willingness to learn new data technologies.
  • Proactive and adaptable, demonstrating initiative in driving projects forward.
  • Ability to manage cross-functional stakeholders and priorities.
  • Experience with orchestration tools (e.g., Airflow, Dagster) and with Streamlit for app development is a plus.
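Where the dbt testing and validation experience above shows up day to day, it is often as a schema file of roughly this shape. The model and column names below are illustrative only:

```yaml
# models/staging/schema.yml -- illustrative dbt schema tests
version: 2

models:
  - name: stg_donations        # hypothetical staging model
    columns:
      - name: donation_id
        tests:
          - unique             # no duplicate donation records
          - not_null
      - name: volume_ml
        tests:
          - not_null
```

Tests like these run with `dbt test` and are the usual mechanism for the "data testing and validation strategies" the requirements mention.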

Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or a related technical field is required.
  • Minimum five (5) years’ experience in data engineering or software development in data-driven environments.

About the company

Powered by technology and compassionate design, Parachute is reimagining the plasma donation experience.

501-1000 employees
