

Data Engineer (Python)

Location: 🇺🇸 USA
Job Type: Full-time
Experience: N/A
Salary: N/A
Role: All Other Remote

Job Details

Remote, US, Office of CTO/Operations Teams, Full Time


Our mission is to unlock the collaborative power of communities by making Web3 universally easy to use, access, and build on.

Working with ConsenSys puts you at the forefront of an evolving paradigm, transforming our society for the better. We fundamentally believe blockchain is the next generation of technology that can lay the foundation for a more just and equitable society.

Blockchain tech is just over 10 years old. Ethereum itself is still a toddler and we're far from reaching our full potential. You'll get to work on the tools, infrastructure, and apps that scale these platforms to billions of users.

You'll be constantly exposed to new concepts, ideas, and frameworks from your peers and from the different projects you work on, challenging you to stay at the top of your game. You'll join a network of entrepreneurs and technologists that reaches the edge of our ecosystem. ConsenSys alumni have moved on to become tech entrepreneurs, CEOs, and team leads at tech companies.


About ConsenSys's Company Data Team (Office of CTO & Operations)


ConsenSys Data sits within ConsenSys Software Inc. to help address all the varieties of data we work with, break down silos, enable best practices, provide first-rate resources, and accelerate our mission of becoming a cutting-edge, data-driven organization. We provide some centralized data engineering functions as a shared service while enabling our business units to make great data decisions with their own data functions.

ConsenSys Software Inc. is a broad organization, and each business unit faces unique data challenges. Infura needs to provide real-time analytics on top of a data pipeline processing billions of events per day. MetaMask Swaps needs to provide financial accounting for a purely on-chain data source. Truffle needs to track developer engagement across the open-source ecosystem.


What you'll do


We are looking for a data engineer to join our shared data engineering team and help build, maintain, and evolve the data warehouse that supports the organization.

You will work with the business to ensure we have a first-class data warehouse supporting our business units and our decision-making. Key areas you will help us get right include:

  • data quality
  • data governance
  • master data management


Responsibilities:


  • Understand the business and data strategy
  • Contribute to the collection, storage, management, quality, and protection of data
  • Implement data privacy policies and comply with data protection regulations
  • Effectively communicate the status, value, and importance of data collection to executive members and staff
  • Maintain knowledge of relevant applications, big data solutions, and tools
  • Maintain knowledge of real-time streaming data pipelines
  • Governance: advise on, monitor, and govern enterprise data
  • Operations: enable data usability, availability, and efficiency
  • Innovation: drive enterprise digital transformation, innovation, cost reduction, and revenue generation
  • Analytics: support analytics and reporting on products, customers, operations, and markets


Who we're looking for:


  • Personal and/or professional involvement in web3 (crypto, tokens, NFTs, dev tooling, dapps, DAOs, blockchain, courses, etc.)
  • 5+ years of overall working experience in an enterprise engineering domain
  • Tech stack you'll work within: Python, SQL, LookML/Looker, BigQuery
  • Preferably focused on a company's operational data (pulling, analyzing, ETL, consolidating, integrating, building out pipelines, etc.)
  • Experience collecting external on-chain market data from the web3 environment
  • Proficiency in Python and SQL (building scripts from scratch)
  • Well-versed in popular frameworks like pandas, Flask, Airflow, Apache Beam, etc.
  • Experience with at least one cloud platform, e.g. Google Cloud (BigQuery), AWS, or Azure (GCP preferred)
  • Hands-on experience with Cloud Functions, BigQuery, Dataflow, Cloud Run, Airflow, etc.
  • Strong Linux command-line and shell-scripting skills
  • Experience with Docker and Kubernetes
  • Experience building CI/CD pipelines
  • Knowledge of at least one SCM tool, such as Git or Bitbucket
  • Strong knowledge of Terraform
  • Enthusiasm for shipping high-quality code and helping peers do the same
  • Understanding of web development practices and terminology

Bonus points:


  • Hands-on experience with Kafka
  • Hands-on experience with Apache Spark (PySpark preferred)
  • You're a MetaMask user!


Don't meet all the requirements? Don't sweat it. We're passionate about building a diverse team of humans and as such, if you think you've got what it takes for our chaotic-but-fun, remote-friendly, start-up environment—apply anyway, detailing your relevant transferable skills in your cover letter. While we have a pretty good idea of what we need, we're ready for you to challenge our thinking on who needs to be in this role.

ConsenSys is an equal-opportunity employer. We encourage people from all backgrounds to apply. We are committed to ensuring that our technology is made available and accessible to everyone. All employment decisions are made without regard to race, color, national origin, ancestry, sex, gender, gender identity or expression, sexual orientation, age, genetic information, religion, disability, medical condition, pregnancy, marital status, family status, veteran status, or any other characteristic protected by law. ConsenSys is aware of fraudulent recruitment practices and we encourage all applicants to review our best practices to protect themselves which can be found at https://consensys.net/careers/best-practices-to-avoid-recruitment-fraud/.


#LI-Remote #LI-EC1