Medium’s mission is to help people deepen their understanding of the world and discover ideas that matter. We are building a place where ideas are judged on the value they provide to readers, not the fleeting attention they can attract for advertisers. We are creating the best place for reading and writing on the internet—a place where today’s smartest writers, thinkers, experts, and storytellers can share big, interesting ideas.
We are looking for a Data Analytics Engineer who will help build, maintain, and scale our business-critical data warehouse and BI platform. In this role, you will lead the development of both transactional and data warehouse designs, mentoring our team of cross-functional engineers and data scientists. You’ll gain a deep understanding of how we use data in our business and help make self-serve data a reality at Medium.
At Medium, we are proud of our product, our team, and our culture. Medium’s website and mobile apps are accessed by millions of users every day. Our mission is to move thinking forward by providing a place where individuals and publishers can share their stories and perspectives. Behind this beautifully crafted platform is our engineering team, which works seamlessly together. From frontend to API, from data collection to product science, Medium engineers work cross-functionally with open communication and feedback.
What you will do

- Work on high-impact projects that improve data availability and quality, and provide reliable access to data for the rest of the business.
- Be the go-to Looker expert at the company, bridging the gap between understanding business needs and designing efficient, usable data models.
- Work with engineers, product managers, and data scientists to understand data needs and implement data exploration tools and dashboards.
- Build data expertise and own data quality for allocated areas of ownership.
- Help define the self-serve data strategy at Medium, advocate for best practices, lead trainings, and investigate new technologies.
- Design, architect, and support new and existing ETL pipelines and Looker data models, and recommend improvements and modifications.
- Analyze, debug, and maintain critical data pipelines. Tune SQL queries and Snowflake data warehouse configurations to improve performance while keeping costs in mind.
- Identify and help triage issues with our ETL infrastructure.
Who you are

- You have 2+ years of software engineering and/or data analytics experience.
- You have experience with schema design and dimensional data modeling.
- You have experience writing and optimizing large, complex SQL queries and ETL processes, particularly on column-oriented databases and event-based data structures.
- You have designed and built data models in Looker, and you know how to balance trade-offs between performance and usability.
- You have a BS in Computer Science / Software Engineering or equivalent experience.
Nice to have

- Knowledge of and experience using Spark
- Proficiency with Python
- Experience with Snowflake