Google Cloud Data Heroes Series: Meet Antonio, a Data Engineer from Lima, Peru
Google Cloud Data Heroes is a series where we share stories of the everyday heroes who use our data analytics tools to do incredible things. Like any good superhero tale, we explore our Google Cloud Data Heroes’ origin stories, how they moved from data chaos to a data-driven environment, what projects and challenges they are overcoming now, and how they give back to the community.
In this month’s edition, we’re pleased to introduce Antonio! Antonio is from Lima, Peru, and works full time as a Lead Data Engineer at Intercorp Retail and is a co-founder of Datapath. He’s also a part-time data teacher, data writer, and all-around data enthusiast. Outside of his allegiance to data, Antonio is a big fan of the Marvel universe and will take any chance to read original comic books and collect Marvel souvenirs. He’s also an avid traveler and enjoys reliving family memories through travel. Antonio is proudly pictured here atop a mountain in Cayna, Peru, where all of his grandparents lived.
When were you introduced to Google Cloud and how did it impact your career?
In 2016, I applied for a Big Data diploma at the Universidad Complutense de Madrid, where I had my first experience with the cloud. That diploma opened my eyes to a new world of technology and allowed me to get my first job as a Data Engineer at Banco de Crédito del Perú (BCP), the largest bank and supplier of integrated financial services in Peru, and the first company in the country to use Big Data technologies. At BCP, I developed pipelines using Apache Hadoop, Apache Spark, and Apache Hive on an on-premises platform.
In 2018, while I was teaching Big Data classes at the Universidad Nacional de Ingeniería, I realized that topics like deploying a cluster on a traditional PC were difficult for my students to learn without their own hands-on experience. At the time, only Google Cloud offered free credits, which was fantastic for my students because they could start learning and using cloud tools without worrying about costs.
In 2019, I wanted a change in my career and left on-prem technologies to specialize in cloud technologies. After many hours of study and practice, I earned the Associate Cloud Engineer certification at almost the same time I applied for a Data Engineer position at Intercorp, where I would need to use GCP data products. This new job pushed me to build my knowledge and skills on GCP and matched what I was looking for. Months later, I obtained the Professional Data Engineer certification. That certification, combined with good performance at work, allowed me to get promoted to Data Architect in 2021. In 2022, I started in the role of Lead Data Engineer.
How have you given back to your community with your Google Cloud learnings?
To give back to the community, once a year I organize a day-long conference called Data Day at the Universidad Nacional Mayor de San Marcos, where I talk about data trends, give advice to college students, and encourage more people to pursue careers in the cloud. I welcome anyone willing to learn, and I have received positive comments from people in India and Latin America. Another way I give back is by writing articles about my work experiences and publishing them on sites like Towards Data Science, the Airflow Community, and the Google Cloud Community Blog.
Can you highlight one of your favorite projects you’ve done with GCP’s data products?
At Intercorp Retail, the digital marketing team wanted to increase online sales by giving recommendations to users. This required the Data & Analytics team to build a solution to publish product recommendations related to an item a customer is viewing on a web page. To achieve this, we built an architecture that looks like the following diagram.
We had several challenges. The first was finding a backend that could support millions of requests per month. After some research, we decided to go with Cloud Run because of its ease of development and deployment.
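As a rough illustration, a minimal Cloud Run service for this kind of use case might look like the sketch below. The /recommendations endpoint, the product IDs, and the in-memory placeholder store are illustrative assumptions, not the production implementation (which, as described next, reads from Firestore).

```python
# Minimal sketch of a recommendation API that could be deployed on Cloud Run.
# Endpoint name and data are placeholders; the real service looks up
# precomputed recommendations in a database (see the Firestore sketch below).
import os

from flask import Flask, jsonify, request

app = Flask(__name__)

# Placeholder in-memory store of precomputed recommendations per product.
PRECOMPUTED = {"sku-123": ["sku-456", "sku-789"]}


@app.route("/recommendations")
def recommendations():
    item_id = request.args.get("item_id", "")
    return jsonify({
        "item_id": item_id,
        "recommendations": PRECOMPUTED.get(item_id, []),
    })


if __name__ == "__main__":
    # Cloud Run injects the PORT environment variable into the container.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", "8080")))
```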
The second decision was choosing a database for the backend. Since we needed a database that responds in milliseconds, we chose Firestore.
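A hedged sketch of what such a low-latency lookup could look like with the Firestore client library is shown below; the "recommendations" collection and "related_items" field are assumed names for illustration only.

```python
# Sketch of a millisecond-scale lookup of precomputed recommendations in Firestore.
# The "recommendations" collection and "related_items" field are assumed names.
from google.cloud import firestore

db = firestore.Client()


def get_related_items(item_id: str) -> list[str]:
    """Return the list of related product IDs stored for item_id, or [] if none."""
    doc = db.collection("recommendations").document(item_id).get()
    if not doc.exists:
        return []
    return doc.to_dict().get("related_items", [])
```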
Finally, we needed to record all the requests made to our API to identify errors or bad responses. In this scenario, Pub/Sub and Dataflow allowed us to do this in a simple way without worrying about scaling.
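To make the idea concrete, here is a hedged sketch of publishing one record per API call to Pub/Sub and draining the topic with a streaming Apache Beam pipeline (run on Dataflow). The project, topic, and BigQuery table names are placeholders, and the actual sink used for the logs isn’t specified here.

```python
# Sketch of request logging: the API publishes a record per call to Pub/Sub,
# and a streaming Beam pipeline (run on Dataflow) processes the topic.
# Project, topic, and table names below are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "api-request-logs")


def log_request(item_id: str, status_code: int, latency_ms: float) -> None:
    """Publish one request record; publish() is asynchronous and non-blocking."""
    payload = json.dumps(
        {"item_id": item_id, "status_code": status_code, "latency_ms": latency_ms}
    ).encode("utf-8")
    publisher.publish(topic_path, payload)


def run_logging_pipeline() -> None:
    """Streaming pipeline that parses log messages and appends them to BigQuery."""
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadLogs" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/api-request-logs")
            | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.api_request_logs",
                schema="item_id:STRING,status_code:INTEGER,latency_ms:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )
```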
After two months, we were ready to see it on a real website (see below).
For future technical improvements, we’re considering using Apigee as our API proxy to gather all requests and route them to the correct endpoint, and Cloud Build as an alternative for our deployment process.
What’s next for you and what do you hope people will take away from your data hero story?
Thanks to the savings I’ve accumulated over the past five years of work, I recently bought a house in Alabama. This was a big challenge for me because I have only ever lived and worked outside of the United States. In the future, I hope to combine my data knowledge with the real estate world and build a startup that makes the home-buying process easier for Latin American people.
I’ll also focus on gaining more hands-on experience with data products and on giving back to my community through articles and, soon, videos. I dream of one day presenting a successful case from my work at a big conference like Google Cloud Next.
If you are reading this and are interested in the world of data and cloud, all you need is an internet connection and some invested effort to kickstart your career. Even if you are starting from scratch and come from a developing country like me, believe that it is possible to be successful. Enjoy the journey, and you’ll meet fantastic people along the way. Keep learning, just as you have to keep exercising to stay in shape. Finally, if there is anything I can help you with, just send me a message and I’ll be happy to give you advice.
Begin your own Data Hero journey
Ready to embark on your Google Cloud data adventure? Begin your own hero’s journey with GCP’s recommended learning path, where you can earn badges and certifications along the way. Join the Cloud Innovators program today to stay up to date on more data practitioner tips, tricks, and events.
If you think you have a good Data Hero story worth sharing, please let us know! We’d love to feature you in our series as well.