Electric Capital

Data Engineer

Toronto, Canada 🇨🇦 · Full-time

Benefits

  • Competitive pay
  • Desk setup
  • Unlimited PTO
  • 401K match
  • Gym stipend
  • Learning stipend
  • Paid parental leave
  • MacBook Pro + accessories
  • Health coverage

About company

Electric Capital is an early-stage venture firm investing in the next generation of iconic crypto founders. We leverage deep technical expertise and an understanding of community governance, token economics, cryptography, and distributed systems to partner with top crypto founders. We are often the first investors in a company and help founders from idea through launch and scale.

We have backed projects and companies such as Anchorage, Bitwise, Celo, Coda, Dfinity, Lightning Labs, MobileCoin, NEAR, Oasis, and many more. Prior to starting Electric Capital we were investors in a variety of technology companies including Notion, Newfront Insurance, Optimizely, Cruise, Boom Supersonic, Color Genomics, and Threads. The founders of Electric Capital were previously successful serial entrepreneurs with executive experience at Facebook and Google.

About the role

Responsibilities

  • Build and optimize our data pipelines and architectures for ingesting, processing, and storing large volumes of structured and unstructured data from GitHub.
  • Develop and maintain our robust ETL (Extract, Transform, Load) processes to ensure data integrity and consistency.
  • Collaborate with cross-functional teams to understand data requirements and translate them into efficient data models and schemas.
  • Maintain our data warehousing solutions, utilizing both traditional and emerging database technologies (e.g., Postgres and ClickHouse).
  • Develop and maintain data processing workflows using programming languages such as Python and Rust.
  • Continuously monitor and optimize data systems for performance, scalability, and reliability.
  • Contribute to the development of data quality assurance processes and tools.

Requirements

  • 4+ years of experience as a Data Engineer or in a similar role, working with large-scale data pipelines and architectures.
  • Bachelor's degree in Computer Science or a related field.
  • Strong proficiency in Python and Rust.
  • Extensive experience with relational databases (e.g., PostgreSQL, MySQL).
  • Familiarity with emerging big data technologies and frameworks (e.g., Apache Spark, DuckDB, ClickHouse, Hadoop).
  • Experience with data warehousing solutions (e.g., Snowflake, AWS Redshift, Google BigQuery).
  • Knowledge of data modeling techniques and best practices.
  • Strong problem-solving and analytical skills.
  • Excellent communication and collaboration abilities.
  • Familiarity with Docker and Kubernetes.

We are even more interested in chatting if you…

  • Have knowledge of cryptocurrency, blockchain technology, and the crypto ecosystem (e.g., DeFi, NFTs, L1s, L2s).
  • Have experience in web scraping (e.g., Puppeteer, Selenium).
  • Have experience with React and front-end development.
  • Have hands-on experience with cryptocurrency transactions and interacting with blockchain networks.