Truelogic Software

Principal Engineer: Data & Analytics - Art Marketplace

Technology & Programming Full-Time Latin America
522 days ago

Description

Truelogic is a leading provider of nearshore staff augmentation services, headquartered in New York. Our team of 500 technology professionals drives digital innovation from Latin America on top projects for U.S. companies. Truelogic has helped companies of all sizes achieve their digital transformation goals.

Would you like to make innovation happen? Have you ever dreamed of building products that impact millions of users? Great! Then we have a seat for you on our team.

 

What are you going to do?

Artists shape culture. They spark conversation, create connection, and bring beauty into the world. Minted is where they come together to reach further. Our client's marketplace empowers a thriving community of independent artists to sell and scale their work. They nurture self-expression, cultivate community, and bring the best in visual art to a global audience.

Their marketplace brings the best in independent design to consumers everywhere. They recognize the challenges independent artists face, and leverage their resources to level the playing field and create a platform that gives artists the freedom to develop their craft and grow their business.

This role is accountable for all aspects of the technical strategy, data design, and data quality that give Minted a reliable and powerful analytics platform serving all of the company's analytics needs, including advanced customer modeling, proprietary voting algorithms, marketing effectiveness models, and business reporting. Users of the system include business users, executives, statisticians, and analysts.

The core of our analytics environment is a multi-terabyte Snowflake database connected by a wide variety of inbound as well as outbound data pipelines. We work with raw log and transactional data from Minted's upstream environments, as well as an ever-expanding list of third-party data sources (e.g., website telemetry and error logs, MarTech vendors and publishers, CRM systems, email service providers, and data enrichment platforms). On the outbound side, the analytics environment systematically provisions data to our web and email personalized marketing engines, to our BI and advanced analytics tools, and through an API layer that can be called for specific metrics. We are also pioneering the use of advanced data visualizations to help the general public understand the power and reach of our community crowdsourcing model.

  • Deliver on a roadmap for the analytics ecosystem that matches the needs of the company

  • Establish scalable data architectures, infrastructure, data ingestion pipelines, tools, and controls to help the analytics team grow

  • Design and implement scalable streaming data pipelines that process hundreds of millions of transactions a day

  • Ensure that the data model scales and enables high performance

  • Lead the implementation of data warehouse solutions, providing near real-time data to a variety of client systems

  • Collaborate with other engineers within and outside of the Analytics group

  • Guide other team members on best practices in data warehousing and software engineering

  • Work in a dynamic, agile startup environment, managing multiple high-impact, high-visibility priorities simultaneously

  • Set up and manage a monitoring and support process that ensures the analytics system is ready and available for business stakeholders

  • Ensure the security of the analytics system

  • Set up and manage a data quality monitoring process that ensures the data in use is correct
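To make the last responsibility concrete, a data quality monitoring process often starts with simple rule-based batch checks. The sketch below is illustrative only — field names, thresholds, and rules are assumptions, not Minted's actual implementation:

```python
# Minimal data-quality check sketch: validate a batch of row dicts against
# simple rules (required fields present, null rate under a threshold).
# All field names and thresholds here are illustrative assumptions.

def check_batch(rows, required_fields, max_null_rate=0.01):
    """Return a list of human-readable violations for a batch of dicts."""
    violations = []
    if not rows:
        return ["batch is empty"]
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        rate = nulls / len(rows)
        if rate > max_null_rate:
            violations.append(
                f"{field}: null rate {rate:.1%} exceeds {max_null_rate:.1%}"
            )
    return violations
```

In practice a check like this would run on each pipeline load and page the on-call engineer (or halt downstream jobs) when violations come back non-empty.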

 

Why do we need your skills?

  • 6+ years of data warehouse development (Snowflake, Redshift, or Presto preferred)

  • 8+ years of SQL scripting experience using an RDBMS

  • 2+ years of experience with distributed systems, distributed data stores, data pipelines, and other tools in the AWS ecosystem

  • 2+ years implementing ETL using open-source solutions (Airflow preferred)

  • 4+ years of Python ETL development (scripting is OK)

  • Experience with streaming pipelines at scale (Kinesis, Kafka, Spark, or Beam preferred)

  • Hands-on experience with Unix scripting using Bash

  • Experience with the design, implementation, and enhancement of BI tools is a plus (e.g., Looker, Tableau, Power BI)
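The Python ETL scripting this role calls for typically follows an extract–transform–load shape. A minimal sketch, with an in-memory list standing in for the warehouse table and all field names assumed for illustration:

```python
# Minimal Python ETL sketch: parse raw log lines, normalize the records,
# and "load" them into a target store (a list standing in for a warehouse
# table). Field names ("user_id", "event") are illustrative assumptions.
import json

def extract(raw_lines):
    """Parse newline-delimited JSON log lines, skipping malformed ones."""
    records = []
    for line in raw_lines:
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # production pipelines would route these to a dead-letter store
    return records

def transform(records):
    """Keep only events that carry a user_id; normalize the event name."""
    return [
        {"user_id": r["user_id"], "event": r["event"].strip().lower()}
        for r in records
        if r.get("user_id") is not None and "event" in r
    ]

def load(rows, table):
    """Append rows to the target; a Snowflake or Redshift client would go here."""
    table.extend(rows)
    return len(rows)
```

In an Airflow deployment, each of these three functions would become its own task so failures can be retried per stage rather than rerunning the whole job.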

