Azure – Generally available: Improved Try Azure Cosmos DB for free experience
Get started using Azure Cosmos DB for free.
Read More for the details.
Query your Azure Cosmos DB containers even more efficiently with new query engine optimizations.
Read More for the details.
Azure Cosmos DB accounts can now take advantage of continuous backup with seven-day data retention and point-in-time restore capabilities.
Read More for the details.
Use new Go SDK features in your Azure Cosmos DB SQL API account including authentication with Azure Active Directory, the ability to execute single partition queries, and transactional batch support.
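As a rough illustration of those three features, here is a minimal sketch using the azcosmos and azidentity Go packages. The account endpoint, database, container, and item payloads are hypothetical placeholders, and the exact SDK surface may differ by version, so treat this as a sketch rather than a definitive implementation.

```go
package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
	"github.com/Azure/azure-sdk-for-go/sdk/data/azcosmos"
)

func main() {
	ctx := context.Background()

	// Authenticate with Azure Active Directory instead of account keys.
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatal(err)
	}
	client, err := azcosmos.NewClient("https://myaccount.documents.azure.com:443/", cred, nil)
	if err != nil {
		log.Fatal(err)
	}
	container, err := client.NewContainer("mydb", "mycontainer")
	if err != nil {
		log.Fatal(err)
	}
	pk := azcosmos.NewPartitionKeyString("tenant-1")

	// Single-partition query: the pager is scoped to one partition key.
	pager := container.NewQueryItemsPager("SELECT * FROM c WHERE c.status = 'active'", pk, nil)
	for pager.More() {
		page, err := pager.NextPage(ctx)
		if err != nil {
			log.Fatal(err)
		}
		log.Printf("got %d items", len(page.Items))
	}

	// Transactional batch: all operations commit or roll back together.
	batch := container.NewTransactionalBatch(pk)
	batch.CreateItem([]byte(`{"id":"1","status":"active"}`), nil)
	batch.CreateItem([]byte(`{"id":"2","status":"active"}`), nil)
	if _, err := container.ExecuteTransactionalBatch(ctx, batch, nil); err != nil {
		log.Fatal(err)
	}
}
```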
Read More for the details.
Use the new Azure Data Studio MongoDB extension for Azure Cosmos DB to manage all your MongoDB resources.
Read More for the details.
Get personalized recommendations based on your Azure Database for MySQL Flexible Server usage.
Read More for the details.
SQL Server 2016 customers can now take advantage of Azure without migrating, by offloading analytics or read-only workloads to Azure SQL Managed Instance (MI).
Read More for the details.
New features include MLflow enhancements and the ability to train and deploy models in Azure hybrid and multi-cloud environments.
Read More for the details.
Amazon Relational Database Service (Amazon RDS) Custom for Oracle now supports Oracle Database versions 12.2 and 18c. Amazon RDS Custom is a managed database service for applications that require customization of the underlying operating system and database environment. With support added for 12.2 and 18c, you can now run your legacy, packaged, and customized applications that depend on these database versions on Amazon RDS Custom for Oracle.
Read More for the details.
Is it September yet? Hardly! School is barely out for the summer. But according to Google and Quantum Metric research, the back-to-school and off-to-college shopping season – which in the U.S. is second only to the holidays in terms of purchasing volume¹ – has already begun. For retailers, that means planning for this peak season has kicked off as well.
We’d like to share four key trends that emerged from Google research and Quantum Metric’s Back-to-School Retail Benchmarks study of U.S. retail data, explore the reasons behind them, and outline the key takeaways.
1. Out-of-stock and inflation concerns are changing the way consumers shop. Back-to-school shoppers are starting earlier every year, with 41% beginning even before school is out – even more so when buying for college¹. Why? The behavior is driven in large part by consumers’ concerns that they won’t be able to get what they need if they wait too long. 29% of shoppers start looking a full month before they need something¹.
Back-to-school purchasing volume is quite high, with the majority spending up to $500 and 21% spending more than $1,000¹. In fact, looking at year-over-year data, we see that average cart values have not only doubled since November 2021, but increased since the holidays¹. And keep in mind that back-to-school spending is a key indicator leading into the holiday season.
That said, as people are reacting to inflation, they are comparing prices, hunting for bargains, and generally taking more time to plan. This is borne out by the fact that 76% of online shoppers are adding items to their carts and waiting to see if they go on sale before making the purchase¹. And, to help stay on budget and reduce shipping costs, 74% plan to make multiple purchases in one checkout¹. That carries over to in-store shopping, where consumers are buying more in one visit to reduce trips and save on gas.
2. The omnichannel theme continues. Consumers continue to use multiple channels in their shopping experience. As the pandemic has abated, some 82% expect that their back-to-school buying will be in-store, and 60% plan to purchase online. In any case, 45% of consumers report that they will use both channels, and more than 50% research online first before ever setting foot in a store². Some 54% of consumers use as many as five channels, including video and social media, and they spend 1.5 times more than those who use only two channels⁴.
And mobile is a big part of the journey. Shoppers are using their phones to make purchases, especially for deadline-driven, last-minute needs, and often check prices on other retailers’ websites while shopping in-store. Anecdotally, mobile is a big part of how we ourselves shop with our children, who like to swipe on the phone through different options for colors and styles. We use our desktops when shopping on our own, especially for items that require research and represent a larger investment – and our study shows that’s quite common.
3. Consumers are making frequent use of wish lists. One trend we have observed is a higher abandonment rate, especially for apparel and general home and school supplies, compared to bigger-ticket items that require more research. But that can be attributed in part to the increasing use of wish lists. Online shoppers are picking a few things that look appealing or items on sale, saving them in wish lists, and then choosing just a few to purchase. Our research shows that 39% of consumers build one or two wish lists per month, while 28% said they build one or two each week, often using their lists to help with budgeting¹.
4. Frustration rates have dropped significantly. Abandonment rates aside, shopper annoyance rates are down by 41%, year over year¹. This is despite out-of-stock concerns and higher prices. But one key finding showed that both cart abandonment and “rage clicks” are more frequent on desktops, possibly because people investing time on search also have more time to complain to customer service.
And frustration does still exist. Some $300 billion is lost each year in the U.S. from bad search experiences⁵. Data collected internationally shows that 80% of consumers view a brand differently after experiencing search difficulties, and 97% favor websites where they can quickly find what they are looking for⁵.
What are the key takeaways for retailers? In general, consider the sources of customer pain points and find ways to erase friction. Improve search and personalization. And focus on improving the customer experience and building loyalty. Specifically:
80% of shoppers want personalization⁶. Think about how you can deliver personalized promotions or experiences that will drive higher engagement with your brand.
46% of consumers want more time to research¹. Provide more robust research and product information, like comparison charts, images, and specific product details.
43% of consumers want a discount¹, but given current economic trends, retailers may not be offering discounts. To appease budget-conscious shoppers, retailers can consider other retention strategies, such as driving loyalty with points, rewards, or faster-shipping perks.
Be sure to keep returns as simple as possible so consumers feel confident when making a purchase, and reduce possible friction points if a consumer decides to make a return. 43% of shoppers return at least a quarter of the products they buy and do not want to pay for shipping or jump through hoops¹.
Google-sponsored research shows that price, deals, and promotions are important to 68% of back-to-school shoppers⁷. In addition, shoppers want certainty that they will get what they want. Google Cloud can make it easier for retailers to enable customers to find the right products with discovery solutions. These solutions provide Google-quality search and recommendations on a retailer’s own digital properties, helping to increase conversions and reduce search abandonment. In addition, Quantum Metric solutions, available on the Google Cloud Marketplace, are built with BigQuery, which helps retailers consolidate and unlock the power of their raw data to identify areas of friction and deliver improved digital shopping experiences.
We invite you to watch the Total Retail webinar “4 ways retailers can get ready for back-to-school, off-to-college” on demand and to view the full Back-to-School Retail Benchmarks report from Quantum Metric.
Sources:
1. Back-to-School Retail Benchmarks report from Quantum Metric
2. Google/Ipsos, Moments 2021, Jun 2021, Online survey, US, n=335 Back to School shoppers
3. Google/Ipsos, Moments 2021, Jun 2021, Online survey, US, n=2,006 American general population 18+
4. Google/Ipsos, Holiday Shopping Study, Oct 2021 – Jan 2022, Online survey, US, n=7,253, Americans 18+ who conducted holiday shopping activities in past two days
5. Google Cloud Blog, Nov 2021, “Research: Search abandonment has a lasting impact on brand loyalty”
6. McKinsey & Company, “Personalizing the customer experience: Driving differentiation in retail”
7. Think with Google, July 2021, “What to expect from shoppers this back-to-school season”
Read More for the details.
Google Cloud Data Heroes is a series where we share stories of the everyday heroes who use our data analytics tools to do incredible things. Like any good superhero tale, we explore our Google Cloud Data Heroes’ origin stories, how they moved from data chaos to a data-driven environment, what projects and challenges they are overcoming now, and how they give back to the community.
In this month’s edition, we’re pleased to introduce Francisco! He is based out of Austin, Texas, but you’ll often find him in Miami, Mexico City, or Bogotá, Colombia. Francisco is the founder of Direcly, a Google Marketing Platform and Google Cloud Consulting/Sales Partner with a presence in the US and Latin America.
Francisco was born in Quito, Ecuador, and at age 13 came to the US to live with his father in Miami, Florida. He studied Marketing at Saint Thomas University, and his skills in math landed him a job as a Teaching Assistant for Statistics & Calculus. After graduation, his professional career began at some of the nation’s leading ad agencies before he eventually transitioned into the ad tech space. In 2016, he ventured into the entrepreneurial world and founded Direcly, a Google Marketing Platform, Google Cloud, and Looker Sales/Consulting partner obsessed with using innovative technological solutions to solve business challenges. Against many odds and with no external funding since its inception, Direcly became part of a select group of Google Cloud and Google Marketing Platform partners. Francisco’s story was even featured in a Forbes Ecuador article!
Outside of the office, Francisco is an avid comic book reader and collector, a golfer, and a fantasy adventure reader. His favorite comic book is The Amazing Spider-Man #252, and his favorite book is The Hobbit. He says he isn’t the best golfer, but he can ride the cart like a pro.
When were you introduced to the cloud, tech, or data field? What made you pursue this in your career?
I began my career in marketing/advertising, and I was quickly drawn to the tech/data space, seeing the critical role it played. I’ve always been fascinated by technology and how fast it evolves. My skills in math and tech ended up being a good combination.
I began learning some open source solutions like Hadoop, Spark, and MySQL for fun and started to apply them in roles I had throughout my career. After my time in the ad agency world, I transitioned into the ad tech industry, where I was introduced to how cloud solutions were powering ad tech solutions like demand side, data management, and supply side platforms.
I’m the type of person that can get easily bored doing the same thing day in and day out, so I pursued a career in data/tech because it’s always evolving. As a result, it forces you to evolve with it. I love the feeling of starting something from scratch and slowly mastering a skill.
What courses, studies, degrees, or certifications were instrumental to your progression and success in the field? In your opinion, what data skills or competencies should data practitioners be focusing on acquiring to be successful in 2022 and why?
My foundation in math, calculus, and statistics was instrumental for me. Learning at my own pace and getting to know the open source solutions was a plus. What I love about Google is that it provides you with an abundance of resources and information to get started, become proficient, and master skills. Coursera is a great place to get familiar with Google Cloud and prepare for certifications. Quests in Qwiklabs are probably one of my favorite ways of learning because you actually have to put in the work and experience first hand what it’s like to use Google Cloud solutions. Lastly, I would also say that just going to the Google Cloud internal documentation and spending some time reading and getting familiar with all the use cases can make a huge difference.
For those who want to acquire the right skills, I would suggest starting with the fundamentals. Before jumping into Google Cloud, make sure you have a good understanding of Python, SQL, data, and some popular open source tools. From there, start mastering Google Cloud by first learning the fundamentals and then putting things into practice with labs. Obtain a professional certification; it can be quite challenging, but it is rewarding once you’ve earned it. If possible, add more dimension to your data expertise by studying real-life applications in an industry that you are passionate about.
I am fortunate to be a Google Cloud Certified Professional Data Engineer and hold certifications in Looker, Google Analytics, Tag Manager, Display and Video 360, Campaign Manager 360, Search Ads 360, and Google Ads. I am also currently working to obtain my Google Cloud Machine Learning Engineer Certification. Combining data applications with analytics and marketing has proven instrumental throughout my career. The ultimate skill is not knowledge or competency in a specific topic, but the ability to have a varied range of abilities and views in order to solve complicated challenges.
You’re no doubt a thought leader in the field. What drew you to Google Cloud? How have you given back to your community with your Google Cloud learnings?
Google Cloud solutions are highly distributed, allowing companies to use the same resources an organization like Google uses internally, but for their own business needs. With Google being a clear leader in the analytics/marketing space, the possibilities and applications are endless. As a Google Marketing Platform Partner and having worked with the various ad tech stacks Google has to offer, merging Google Cloud and GMP for disruptive outcomes and solutions is really exciting.
I consider myself to be a very fortunate person, who came from a developing country, and was given amazing opportunities from both an educational and career standpoint. I have always wanted to give back in the form of teaching and creating opportunities, especially for Latinos / US Hispanics. Since 2018, I’ve partnered with Florida International University Honors College and Google to create industry relevant courses. I’ve had the privilege to co-create the curriculum and teach on quite a variety of topics. We introduced a class called Marketing for the 21st Century, which had a heavy emphasis on the Google Marketing Platform. Given its success, in 2020, we introduced Analytics for the 21st Century, where we incorporated key components of Google Cloud into the curriculum. Students were even fortunate enough to learn from Googlers like Rob Milks (Data Analytics Specialist) and Carlos Augusto (Customer Engineer).
What are 1-2 of your favorite projects you’ve done with Google Cloud’s data products?
My favorite project to date is the work we have done with Royal Caribbean International (RCI) and Roar Media. Back in 2018, we were able to transition RCI efforts from a fragmented ad tech stack into a consolidated one within the Google Marketing Platform. Moreover, we were able to centralize attribution across all the paid marketing channels. With the vast amount of data we were capturing (17+ markets), it was only logical to leverage Google Cloud solutions in the next step of our journey. We centralized all data sources in the warehouse and deployed business intelligence across business units.
The biggest challenge from the start was designing an architecture that would meet both business and technical requirements. We had to consider the best way to ingest data from several different sources, unify them, have the ability to transform data as needed, visualize it for decision makers, and set the foundations to apply machine learning. Having a deep expertise in marketing/analytics platforms combined with an understanding of data engineering helped me tremendously in leading the process, designing/implementing the ideal architecture, and being able to present end users with information that makes a difference in their daily jobs.
We utilized BigQuery as a centralized data warehouse to integrate all marketing sources (paid, organic, and research) through custom-built pipelines. From there, we created data-driven dashboards within Looker, decentralizing data and giving end users the ability to explore and answer key questions and make real-time, data-driven business decisions. An evolution of this initiative has been to go beyond marketing data and apply machine learning. We have created dashboards that look into COVID trends, competitive pricing, SEO optimizations, and data feeds for dynamic ads. On the ML side, we have created predictive models on the revenue side, built marketing mix models, and applied machine learning to translate English-language ads into over 17 languages leveraging historical data.
What are your favorite Google Cloud data products within the data analytics, databases, and/or AI/ML categories? What use case(s) do you most focus on in your work? What stands out about Google Cloud’s offerings?
I am a big fan of BigQuery (BQ) and Looker. Traditional data warehouses are no match for the cloud – they’re not built to accommodate the exponential growth of today’s data and the sophisticated analytics required. BQ offers a fast, highly scalable, cost-effective, and fully managed cloud data warehouse with integrated machine learning for analytics and AI.
Looker, on the other hand, is truly next-generation BI. We all love Structured Query Language (SQL), but I think many of us have been in the position of writing dense queries and forgetting how some aspects of the code work, experiencing the limited collaboration options, knowing that people write queries in different ways, and finding how difficult it can be to track changes in a query if you changed your mind on a measure. I love how LookML solves all those challenges, and how it helps one reuse, control, and separate SQL into building blocks. Not to mention how easy it is to give end users with limited technical knowledge the ability to look at data on their terms.
What’s next for you?
I am really excited about everything we are doing at Direcly. We have come a long way, and I’m optimistic that we can go even further. Next for me is just to keep on working with a group of incredibly bright people who are obsessed with using innovative technological solutions to solve business challenges faced by other incredibly bright people.
From this story, I would like to tell those that are pursuing a dream, that are looking to provide a better life for themselves and their loved ones: do it, take risks, never stop learning, and put in the work. Things may or may not go your way, but keep persevering; you’ll be surprised with how it becomes more about the journey than the destination. And whether things don’t go as planned or you have a lot of success, you will remember everything you’ve been through and how far you’ve come from where you started.
Want to join the Data Engineer Community?
Register for the Data Engineer Spotlight, where attendees have the chance to learn from four technical how-to sessions and hear from Google Cloud Experts on the latest product innovations that can help you manage your growing data.
Read More for the details.
The following solution brief discusses a GCVE + Traffic Director implementation aimed at providing customers an easy way to scale out web services, while enabling application migrations to Google Cloud. The solution is built on top of a flexible and open architecture that exemplifies the unique capabilities of Google Cloud Platform. Let’s elaborate:
Easy: The full configuration takes minutes to implement and can be scripted or defined with Infrastructure-as-Code (IaC) for rapid consumption and minimal errors.
Flexible and open: The solution relies on Envoy, an open source platform that enjoys tremendous popularity with the network and application communities.
The availability of Google Cloud VMware Engine (GCVE) has given GCP customers the ability to deploy Cloud applications on a certified VMware stack that is managed, supported and maintained by Google. Many of these customers also demand seamless integration between their applications running on GCVE, and the various infrastructure services that are provided natively by our platform such as Google Kubernetes Engine (GKE), or serverless frameworks like Cloud Functions, App Engine or Cloud Run. Networking services are at the top of that list.
In this blog, we discuss how Traffic Director, a fully managed control plane for service mesh, can be combined with our portfolio of load balancers and with hybrid network endpoint groups (hybrid NEGs) to provide a high-performance front end for web services hosted in VMware Engine.
Traffic Director also serves as the glue that links the native GCP load balancers and the GCVE backends, with the objective of enabling these technical benefits:
Certificate Authority integration, for full lifecycle management of SSL certificates.
DDoS protection with Cloud Armor, to help protect your applications and websites against denial-of-service and web attacks.
Cloud CDN, for cached content delivery.
Intelligent anycast with a Single IP and Global Reach, for improved failover, resiliency and availability.
Bring Your Own IP (BYOIP), to provision and use your own public IP addresses for Google Cloud resources.
Diverse backend types integration in addition to GCVE, such as GCE, GKE, Cloud Storage and serverless.
The following diagram provides a summary of the GCP components involved in this architecture:
This scenario shows an external HTTP(S) load balancer used to forward traffic to the Traffic Director dataplane component, implemented as a fleet of Envoy proxies. Users can create routable NSX segments and centralize the definition of all traffic policies in Traffic Director. The GCVE VM IP and port pairs are specified directly in the hybrid NEG, meaning all network operations are fully managed by a Google Cloud control plane.
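As a rough sketch of that last step, the following uses the Compute Engine API Go client to create a hybrid NEG and register a GCVE VM’s IP:port as an endpoint. The project, zone, network, and addresses are hypothetical placeholders, and production code should wait for the insert operation to finish before attaching endpoints.

```go
package main

import (
	"context"
	"log"

	compute "google.golang.org/api/compute/v1"
)

func main() {
	ctx := context.Background()
	svc, err := compute.NewService(ctx)
	if err != nil {
		log.Fatal(err)
	}
	project, zone := "my-project", "us-central1-a"

	// Hybrid NEGs use the NON_GCP_PRIVATE_IP_PORT endpoint type.
	neg := &compute.NetworkEndpointGroup{
		Name:                "gcve-web-neg",
		NetworkEndpointType: "NON_GCP_PRIVATE_IP_PORT",
		Network:             "projects/my-project/global/networks/my-vpc",
		DefaultPort:         443,
	}
	if _, err := svc.NetworkEndpointGroups.Insert(project, zone, neg).Do(); err != nil {
		log.Fatal(err)
	}

	// Register a GCVE VM's IP and port directly in the NEG
	// (in real code, poll the insert operation to completion first).
	attach := &compute.NetworkEndpointGroupsAttachEndpointsRequest{
		NetworkEndpoints: []*compute.NetworkEndpoint{
			{IpAddress: "192.168.10.11", Port: 443},
		},
	}
	if _, err := svc.NetworkEndpointGroups.AttachNetworkEndpoints(
		project, zone, neg.Name, attach).Do(); err != nil {
		log.Fatal(err)
	}
}
```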
Alternatively, GCVE VMs can be deployed to a non-routable NSX segment behind an NSX L4 load balancer configured at the Tier-1 level, and the Load Balancer VIP can be exported to the customer VPC via the import and export of routes in the VPC Peering connection. It is important to note that in GCVE, it is highly recommended that NSX-T load balancers be associated with Tier-1 gateways, and not the Tier-0 gateway.
The steps to configure load balancers in NSX-T, including server pools, health checks, virtual servers and distribution algorithms are documented by VMware and not covered in this document.
Fronting the web applications with an NSX load balancer would allow for the following:
Only VIP routes are announced, allowing the use of private IP addresses in the web tier, as well as overlapping IP addresses in case of multi-tenant deployments.
Internal clients (applications inside of GCP or GCVE) can point to the VIP of the NSX Load Balancer, while external clients can point to the public VIP in front of a native, GCP external load balancer.
A L7 NSX load balancer can also be used (not discussed in this example), for advanced application-layer services, such as cookie session persistence, URL mapping, and more.
To recap, the implementation discussed in this scenario shows an external HTTP(S) load balancer, but please note that an external TCP/UDP network load balancer or TCP Proxy could also be used for supporting protocols other than HTTP(S). There are certain restrictions when using Traffic Director in L4 mode, such as a single backend service per target proxy, which need to be accounted for when implementing your architecture.
In a second scenario, the only change is the load balancing platform used to route requests to Traffic Director-managed Envoy proxies. This use case may be appropriate in certain situations, for instance whenever users want to take advantage of advanced traffic management capabilities not supported without Traffic Director, as documented here.
The Traffic Director-managed Envoy proxies can send traffic directly to GCVE workloads:
Alternatively, and similar to what was discussed in Scenario #1, an NSX LB VIP can be used instead of the explicit GCVE VM IPs, which introduces an extra load balancing layer:
To recap, this scenario shows a possible configuration with L7 Internal Load Balancer, but an L4 Internal Load Balancer can also be used for supporting protocols other than HTTP(S). Please note there are certain considerations when leveraging L4 vs. L7 load balancers in combination with Traffic Director, which are all documented here.
With the combination of multiple GCP products, customers can take advantage of the various distributed network services offered by Google, such as global load balancing, while hosting their applications on a Google Cloud VMware Engine environment that provides continuity for their operations, without sacrificing availability, reliability or performance.
Go ahead and review the GCVE networking whitepaper today. For additional information about VMware Engine, please visit the VMware Engine landing page, and explore our interactive tutorials. And be on the lookout for future articles, where we will discuss how VMware Engine integrates with other core GCP infrastructure and data services.
Read More for the details.
With access to ever more data, compute capacity, and storage capabilities, the cloud is revolutionizing scientific research across the globe. By migrating their data and processing to the cloud, researchers can accelerate breakthroughs and solve real-world problems faster. Now in its second year, the Google Cloud Research Innovators Program promotes collaboration among a global cohort of scientists and provides them with professional opportunities and technical expertise to make the best use of the cloud in their research. We are proud to announce the new cohort, which doubles the number of Research Innovators from 31 to 62. The incoming group represents 7 disciplines, 10 countries, and 45 institutions.
This year’s cohort has been expanded and reorganized into five tracks: health sciences, life sciences, physical sciences, social sciences, and computer sciences. The life sciences track includes nine climate scientists working to address urgent concerns about our planet’s resources.
For example, Richard Fernandes, Research Scientist at Natural Resources Canada (NRCan), is developing the LEAF toolbox to map and assess vegetation with satellite data from Google Earth Engine. Tian Guo, Assistant Professor of Computer Science, works with a team at Worcester Polytechnic Institute to make distributed deep learning more efficient and responsive with Google’s preemptible Virtual Machine Instances (VMs). Amiyaal Ilany, Senior Lecturer in Life Sciences at Bar-Ilan University in Israel, has been introducing new methods and technologies to the study of animal behavior in the wild. Vijay Ramdin Singh, Postdoctoral Fellow in the Chemical Engineering department at the University of Illinois at Chicago, studies high-throughput materials design using machine learning. Thea Sommerschield, Marie Skłodowska-Curie Postdoctoral Fellow in the Humanities at Ca’ Foscari University of Venice, used the Natural Language Processing capabilities of DeepMind and Google Cloud to design Ithaca, an open source deep neural network that can restore and attribute ancient Greek inscriptions. These are just a few examples of the innovative projects now underway by the new cohort of Google Research Innovators.
Leveraging the cloud for breakthroughs is not just about speed and power, and not just for research in science and technology. Sommerschield explains how machine learning (ML) and artificial intelligence (AI) can also revolutionize research methods across disciplines: “under Google’s aegis, we can organize the ancient world’s information using ML not simply as an assistive tool, but as a technique to unlock hidden patterns and insights in the textual data: the qualitative leap consists in no longer seeking to do faster and better what we were doing before, but rather in developing and using new techniques to do what we could not do before.”
Last year’s cohort of Research Innovators is already advancing these tools and techniques. Tapio Schneider, Theodore Y. Wu Professor of Environmental Science and Engineering at Caltech and Senior Research Scientist at JPL, developed a Climate Machine to improve simulations of future environmental conditions. Another Research Innovator, Teodora Szasz, Senior Computational Scientist at the University of Chicago’s MiiE (Messages, Identity, and Inclusion in Education) Lab, used Machine Learning to visually categorize illustrations to better measure representation in children’s literature.
If you’re a researcher interested in exploring the benefits of the cloud for your projects, apply here for access to the Google Cloud research credits program in 60 eligible countries.
Read More for the details.
Amazon Aurora PostgreSQL-Compatible Edition now supports PostgreSQL major version 14 (14.3). PostgreSQL 14 includes performance improvements for parallel queries, heavily-concurrent workloads, partitioned tables, logical replication, and vacuuming. PostgreSQL 14 also improves functionality with new capabilities. For example, you can cancel long-running queries if a client disconnects and you can close idle sessions if they time out. Range types now support multiranges, allowing representation of non-contiguous data ranges, and stored procedures can now return data via OUT parameters. This release includes new features for Babelfish for Aurora PostgreSQL version 2.1. Please refer to Amazon Aurora PostgreSQL updates for more information.
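To make those capabilities concrete, here is a small, hedged Go sketch using database/sql with the lib/pq driver against a PostgreSQL 14 endpoint; it exercises the idle-session timeout, a multirange value, and a procedure returning data via an OUT parameter. The DSN is a placeholder, and driver behavior around CALL can vary, so verify against your setup.

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/lib/pq"
)

func main() {
	db, err := sql.Open("postgres",
		"host=mycluster.cluster-example.us-east-1.rds.amazonaws.com user=app dbname=appdb sslmode=require")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// New in PostgreSQL 14: sessions idle beyond this limit are closed.
	if _, err := db.Exec("SET idle_session_timeout = '10min'"); err != nil {
		log.Fatal(err)
	}

	// Multiranges represent non-contiguous ranges as a single value.
	var mr string
	if err := db.QueryRow(
		"SELECT int4multirange(int4range(1,5), int4range(8,10))::text").Scan(&mr); err != nil {
		log.Fatal(err)
	}
	fmt.Println(mr) // {[1,5),[8,10)}

	// Stored procedures can now return data via OUT parameters.
	if _, err := db.Exec(`CREATE OR REPLACE PROCEDURE row_count(OUT n bigint)
		LANGUAGE sql AS $$ SELECT count(*) FROM pg_class $$`); err != nil {
		log.Fatal(err)
	}
	var n int64
	if err := db.QueryRow("CALL row_count(NULL)").Scan(&n); err != nil {
		log.Fatal(err)
	}
	fmt.Println("pg_class rows:", n)
}
```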
Read More for the details.
Starting today, AWS Site-to-Site VPN supports the ability to deploy IPSec VPN connections over Direct Connect using private IP addresses. With this change, customers can encrypt DX traffic between their on-premises network and AWS without the need for public IP addresses, thus enabling enhanced security and network privacy at the same time.
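Here is a hedged sketch with the AWS SDK for Go v2: creating a private-IP VPN connection that rides a Direct Connect transit gateway attachment. The resource IDs are placeholders, and the OutsideIpAddressType and TransportTransitGatewayAttachmentId options reflect my reading of the new capability; verify the exact field names against the EC2 API reference.

```go
package main

import (
	"context"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/ec2"
	"github.com/aws/aws-sdk-go-v2/service/ec2/types"
)

func main() {
	ctx := context.Background()
	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		log.Fatal(err)
	}
	client := ec2.NewFromConfig(cfg)

	out, err := client.CreateVpnConnection(ctx, &ec2.CreateVpnConnectionInput{
		CustomerGatewayId: aws.String("cgw-0123456789abcdef0"),
		TransitGatewayId:  aws.String("tgw-0123456789abcdef0"),
		Type:              aws.String("ipsec.1"),
		Options: &types.VpnConnectionOptionsSpecification{
			// Use private IPs for the tunnel outside addresses...
			OutsideIpAddressType: aws.String("PrivateIpv4"),
			// ...and carry the tunnels over this Direct Connect attachment.
			TransportTransitGatewayAttachmentId: aws.String("tgw-attach-0123456789abcdef0"),
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	log.Println("created VPN connection:", aws.ToString(out.VpnConnection.VpnConnectionId))
}
```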
Read More for the details.
Durable Functions are now supported when building Java applications in Azure Functions.
Read More for the details.
When you think about cloud security, there are many areas of responsibility: securing infrastructure, network, data, and applications, and managing identities and access. There are also ongoing processes for security operations and for governance, risk, and compliance management. But the best part of building your application on the cloud is that you share the security responsibility with the cloud provider.
Cloud security requires collaboration and is usually operated on a shared responsibility model where the cloud provider is responsible for the security of the underlying cloud infrastructure and you are responsible for securing the applications you deploy on the cloud. This gives you the flexibility and control you need to implement the required security controls for your application and business. Depending on your use case you can restrict access to the sensitive data and projects or selectively deploy public applications.
At Google Cloud, we strive to go beyond the basics of this shared responsibility model to an operating model based on shared fate. Shared fate is about preparing a secure landing zone for a customer, guiding them while there, being clear and transparent about the security controls they can configure, offering guardrails, and helping them with cyber-insurance. This is to help ensure that customers have a robust security posture with secure-by-design services, secure defaults, and a rich set of best practices, templates, blueprints, documentation, and professional services that we make available.
Now, let’s see how Google Cloud provides capabilities across the different layers of security:
Cloud providers are responsible for providing infrastructure security, which includes security through the entire information processing life cycle including hardware infrastructure, service deployment, storage services, user identity, internet communications and operational and device security.
Google’s stack builds security through progressive layers that deliver true defense in depth at scale. Google’s hardware infrastructure is custom-designed “from chip to chiller” to precisely meet specific requirements. Its software and OS are stripped-down, hardened versions of Linux. Titan purpose-built chips help establish a hardware root of trust. This end-to-end provenance and attestation helps Google greatly reduce the “vendor in the middle problem”.
Network security is partly the cloud provider’s responsibility and partly yours. Providers work to make sure that traffic is secure and encrypted and that communication with other services on the public internet is secure. They also offer strong baseline protection against network attacks.
You are responsible for defining and enforcing your application perimeter, segmentation of your projects between teams and organizations, managing remote access for your employees and implementing additional DoS defense.
Google Cloud Virtual Private Cloud (VPC) offers private connectivity between multiple regions without communicating across the public internet. You can use a single VPC for an entire organization, isolated within projects. VPC Flow Logs capture information about IP traffic to and from network interfaces and help with network monitoring, forensics, real-time security analysis, and expense optimization. Shared VPC lets you configure a VPC network to be shared across several projects in your organization, with connectivity routes and firewalls managed centrally. You can also segment your networks with a globally distributed firewall to restrict access to instances; firewall rules policies let you set rules for access, and logging lets you audit, verify, and analyze the effects of your firewall rules. VPC Service Controls extend perimeter security to Google Cloud services by preventing access from unauthorized networks. For application-layer threats, Cloud IDS (Cloud Intrusion Detection System) provides managed, cloud-native network threat detection for malware, spyware, and command-and-control attacks.
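For example, here is a minimal sketch using the Compute Engine API Go client to define a firewall rule with logging enabled; the project, network name, and IP ranges are hypothetical placeholders.

```go
package main

import (
	"context"
	"log"

	compute "google.golang.org/api/compute/v1"
)

func main() {
	ctx := context.Background()
	svc, err := compute.NewService(ctx)
	if err != nil {
		log.Fatal(err)
	}

	// Allow HTTPS from internal ranges only, and log rule hits for auditing.
	fw := &compute.Firewall{
		Name:         "allow-internal-https",
		Network:      "projects/my-project/global/networks/shared-vpc",
		Direction:    "INGRESS",
		SourceRanges: []string{"10.0.0.0/8"},
		Allowed: []*compute.FirewallAllowed{
			{IPProtocol: "tcp", Ports: []string{"443"}},
		},
		LogConfig: &compute.FirewallLogConfig{Enable: true},
	}
	if _, err := svc.Firewalls.Insert("my-project", fw).Do(); err != nil {
		log.Fatal(err)
	}
	log.Println("firewall rule created")
}
```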
When building an application or API on the cloud, you are responsible for application security, including scanning and testing. Adopt practices such as:
Allow and deny traffic based on authentication and authorization of the user.
Use or implement services to block bots and fraudulent users from your website.
You can protect your internet-facing applications against attacks by using Web App and API Protection (WAAP) solutions. This solution is a combination of:
Cloud Load Balancing: Provides automatic defense against Layer 3 and Layer 4 DDoS attacks
Cloud Armor: Filters incoming web requests by geography or a host of L7 parameters like request headers, cookies, or query strings (a configuration sketch follows this list)
reCAPTCHA Enterprise: Provides protection against bots and fraudulent users.
Apigee API Gateway: Protects API backends by throttling API traffic against DDoS attacks and controls access to APIs with OAuth, API key validation, and other threat protection capabilities.
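As a hedged sketch of the Cloud Armor piece referenced above, the following uses the Compute Engine API Go client to create a security policy and add an L7 filter rule. The project, policy name, and CEL expression are hypothetical, and production code should wait for each operation to complete before the next call.

```go
package main

import (
	"context"
	"log"

	compute "google.golang.org/api/compute/v1"
)

func main() {
	ctx := context.Background()
	svc, err := compute.NewService(ctx)
	if err != nil {
		log.Fatal(err)
	}
	project := "my-project"

	// Create an empty Cloud Armor security policy (a default rule at the
	// lowest priority is created automatically by the service).
	policy := &compute.SecurityPolicy{Name: "edge-waap-policy"}
	if _, err := svc.SecurityPolicies.Insert(project, policy).Do(); err != nil {
		log.Fatal(err)
	}

	// Add an L7 rule: deny requests matching a CEL expression
	// (geography-based filtering in this example).
	rule := &compute.SecurityPolicyRule{
		Priority: 1000,
		Action:   "deny(403)",
		Match: &compute.SecurityPolicyRuleMatcher{
			Expr: &compute.Expr{Expression: "origin.region_code == 'ZZ'"},
		},
	}
	if _, err := svc.SecurityPolicies.AddRule(project, policy.Name, rule).Do(); err != nil {
		log.Fatal(err)
	}
}
```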
Securing your software requires establishing, verifying, and maintaining a chain of trust that establishes the provenance or origin trail of your code, via attestations generated and checked throughout your software development and deployment process. The open source SLSA (Supply chain Levels for Software Artifacts) framework is an end-to-end framework for supply chain integrity that you can adopt incrementally to increase your security posture.
In Google Cloud, the Binary Authorization service establishes, verifies, and maintains a chain of trust via attestations and policy checks across the different steps of the SDLC process.
Code: Use Open Source Insights to identify dependencies, security advisories, and licenses across open source code.
Build: Cloud Build captures another set of attestations (tests run, build tools used, etc.) that add to your chain of trust.
Test & scan: Completed builds stored in Artifact Registry are automatically scanned for vulnerabilities.
Deploy & run: Binary Authorization verifies authenticity and deploys only when attestations meet organization policy. It even continuously validates conformance to the policy after deployment.
Data security is a shared responsibility between you and the cloud provider. The cloud provider offers some capabilities built into the infrastructure such as data encryption at rest and in transit while you are responsible for your applications’ data security. This includes features such as key and secret management, finding sensitive data, enforcing controls, preventing exfiltration and data loss.
Google Cloud offers data encryption at rest and in transit, with the option to encrypt data in use using Confidential Computing. If you need data encrypted with your own keys, you can bring your own key (CSEK), use Google’s managed Key Management Service (KMS), use a hardware security module (HSM), or use an external key manager (EKM). Data Loss Prevention (Cloud DLP) helps discover, classify, and protect sensitive data.
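To ground the key-management options, here is a short, hedged sketch with the Cloud KMS Go client encrypting a payload under a customer-managed key; the key resource name is a placeholder.

```go
package main

import (
	"context"
	"fmt"
	"log"

	kms "cloud.google.com/go/kms/apiv1"
	"cloud.google.com/go/kms/apiv1/kmspb"
)

func main() {
	ctx := context.Background()
	client, err := kms.NewKeyManagementClient(ctx)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Encrypt application data under a customer-managed KMS key.
	resp, err := client.Encrypt(ctx, &kmspb.EncryptRequest{
		Name:      "projects/my-project/locations/global/keyRings/app-ring/cryptoKeys/app-key",
		Plaintext: []byte("sensitive payload"),
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("ciphertext is %d bytes\n", len(resp.Ciphertext))
}
```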
This requires securely managing the user lifecycle and application access, including authentication of the user and authorization of those users to appropriate services.
In Google Cloud, Cloud Identity is the IdP that provides the authentication options. It stores and manages digital identities for cloud users, and also provides 2-step verification and SSO integration with third-party identity providers such as Okta, Ping, ADFS, or Azure AD.
Once authenticated, Cloud IAM provides the authorization – “who can do what, and where, on Google Cloud” – with fine-grained access control and visibility for centrally managing cloud resources. IAM policies manage access control for Google Cloud resources, and IAM roles help set fine-grained permissions.
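A small, hedged example of “who can do what, and where”: granting a user a fine-grained role on a project with the Cloud Resource Manager Go client. The project ID, role, and member are hypothetical.

```go
package main

import (
	"context"
	"log"

	cloudresourcemanager "google.golang.org/api/cloudresourcemanager/v1"
)

func main() {
	ctx := context.Background()
	crm, err := cloudresourcemanager.NewService(ctx)
	if err != nil {
		log.Fatal(err)
	}
	project := "my-project"

	// Read-modify-write the project's IAM policy to add one binding.
	policy, err := crm.Projects.GetIamPolicy(project,
		&cloudresourcemanager.GetIamPolicyRequest{}).Do()
	if err != nil {
		log.Fatal(err)
	}
	policy.Bindings = append(policy.Bindings, &cloudresourcemanager.Binding{
		Role:    "roles/bigquery.dataViewer",
		Members: []string{"user:analyst@example.com"},
	})
	if _, err := crm.Projects.SetIamPolicy(project,
		&cloudresourcemanager.SetIamPolicyRequest{Policy: policy}).Do(); err != nil {
		log.Fatal(err)
	}
}
```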
BeyondCorp Enterprise enacts a zero-trust model for access to your applications and resources. No one can access your resources unless they meet all the rules and conditions codified in per-resource access policies.
Endpoint security is critical for protecting users and access. You need to make sure you apply patches, prevent compromises and manage user devices including the policies that define which device has access to which resources in your application or projects.
Safe Browsing or Web Risk API: Lets client applications check URLs against Google’s constantly updated lists of unsafe web resources (a Web Risk sketch follows this list). With Safe Browsing you can:
Check pages against our Safe Browsing lists based on platform and threat types.
Warn users before they click links in your site that may lead to infected pages.
Prevent users from posting links to known infected pages from your site.
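Below is a hedged sketch of the URL-check flow with the Web Risk Go client; the URL is hypothetical, and the exact package paths may differ by SDK version.

```go
package main

import (
	"context"
	"fmt"
	"log"

	webrisk "cloud.google.com/go/webrisk/apiv1"
	"cloud.google.com/go/webrisk/apiv1/webriskpb"
)

func main() {
	ctx := context.Background()
	client, err := webrisk.NewClient(ctx)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Check a URL against Google's malware and social engineering lists.
	resp, err := client.SearchUris(ctx, &webriskpb.SearchUrisRequest{
		Uri: "http://example.com/some/path",
		ThreatTypes: []webriskpb.ThreatType{
			webriskpb.ThreatType_MALWARE,
			webriskpb.ThreatType_SOCIAL_ENGINEERING,
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	if resp.GetThreat() != nil {
		fmt.Println("unsafe: matched threat lists", resp.GetThreat().ThreatTypes)
	} else {
		fmt.Println("no match on Safe Browsing lists")
	}
}
```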
From a Security Operations (SecOps) perspective, you need to detect, respond to, and remediate threats in the Cloud.
In Google Cloud this can be achieved through:
Security Command Center: Continuously monitors your Google Cloud environment for misconfigurations, detects threats and malicious activity, and helps maintain compliance. More on Security Command Center here.
Audit Logs: Cloud Logging offers Audit logs that record administrative activities and accesses within your Google Cloud resources. Audit logs help you answer “who did what, where, and when?”
Access Transparency: Logs record the actions that Google personnel take when accessing customer content.
Siemplify Security Orchestration, Automation and Response (SOAR): enables modern, fast and effective response to cyber threats by combining playbook automation, case management and integrated threat intelligence in one cloud-native, intuitive experience.
From a governance, risk, and compliance (GRC) perspective, this includes understanding security risk, defining and enforcing policy, and demonstrating compliance by achieving certifications and maintaining a sound security posture.
Google Cloud is compliant with major security certifications such as PCI DSS, FedRAMP, HIPAA, and more. Google Cloud products regularly undergo independent verification of their security, privacy, and compliance controls, achieving certifications, attestations, and audit reports to demonstrate compliance. To learn more, check out this page.
That was a bird’s-eye view of Google Cloud security services. For a more in-depth look at security, check out the whitepaper here.
For more #GCPSketchnote, follow the GitHub repo. For similar cloud content follow me on Twitter @pvergadia and keep an eye out on thecloudgirl.dev
Read More for the details.
Amazon ECS now fully supports multiline logging powered by AWS for Fluent Bit for both AWS Fargate and Amazon EC2. AWS for Fluent Bit is an AWS distribution of the open-source project Fluent Bit, a fast and lightweight log forwarder. Amazon ECS users can use this feature to recombine partial log messages produced by containerized applications running on AWS Fargate or Amazon EC2 into a single message for easier troubleshooting and analytics.
Read More for the details.
Amazon Textract is a machine learning service that automatically extracts text, handwriting, and data from any document or image. We continuously improve the underlying machine learning models based on customer feedback to provide even better accuracy. Today, we are pleased to announce a quality enhancement to our Forms extraction feature.
Read More for the details.
Announcing the general availability of GitOps with Flux v2 in Azure Kubernetes Service (AKS) and Azure Arc-enabled Kubernetes (Arc K8s).
Read More for the details.