Unmasking Deception: Harnessing the Power of MongoDB Atlas and Amazon SageMaker Canvas for Fraud Detection
Financial services organizations face growing risks from cybercriminals. High-profile hacks and fraudulent transactions undermine faith in the industry. As technology evolves, so do the techniques employed by these perpetrators, making the battle against fraud a perpetual challenge. Existing fraud detection systems often grapple with a critical limitation: relying on stale data. In a fast-paced and ever-evolving landscape, relying solely on historical information is akin to driving by looking into the rearview mirror. Cybercriminals continuously adapt their tactics, forcing financial institutions to stay one step ahead, and the newest tactics can often be seen in the data. That's where the power of operational data comes into play. By harnessing real-time data, fraud detection models can be trained on the most accurate and relevant clues available. MongoDB Atlas, a highly scalable and flexible developer data platform, coupled with Amazon SageMaker Canvas, an advanced machine learning tool, presents a groundbreaking opportunity to revolutionize fraud detection. By leveraging operational data, this synergy holds the key to proactively identifying and combating fraudulent activities, enabling financial institutions to safeguard their systems and protect their customers in an increasingly treacherous digital landscape.

MongoDB Atlas

MongoDB Atlas, the developer data platform, is an integrated suite of data services centered around a cloud database designed to accelerate and simplify how developers build with data. MongoDB Atlas's document-oriented architecture is a game-changer for financial services organizations. Its ability to handle massive amounts of data in a flexible schema empowers financial institutions to effortlessly capture, store, and process high-volume transactional data in real time.
This means that every transaction, every interaction, and every piece of operational data can be seamlessly integrated into the fraud detection pipeline, ensuring that the models are continuously trained on the most current and relevant information available. With MongoDB Atlas, financial institutions gain an unrivaled advantage in their fight against fraud, unleashing the full potential of operational data to create a robust and proactive defense system.

Amazon SageMaker Canvas

Amazon SageMaker Canvas revolutionizes the way business analysts leverage AI/ML solutions by offering a powerful no-code platform. Traditionally, implementing AI/ML models required specialized technical expertise, making it inaccessible for many business analysts. However, SageMaker Canvas eliminates this barrier by providing a visual point-and-click interface to generate accurate ML predictions for classification, regression, forecasting, natural language processing (NLP), and computer vision (CV). SageMaker Canvas empowers business analysts to unlock valuable insights, make data-driven decisions, and harness the power of AI without being hindered by technical complexities. It boosts collaboration between business analysts and data scientists by letting them share, review, and update ML models across tools. It brings the realm of AI/ML within reach, allowing analysts to explore new frontiers and drive innovation within their organizations.

Reference Architecture

The reference architecture above describes an end-to-end solution for detecting different types of fraud in the banking sector, including card fraud, identity theft, account takeover, money laundering, consumer fraud, insider fraud, and mobile banking fraud, to name a few. The architecture diagram shown here illustrates model training and near real-time inference.
The operational data stored in MongoDB Atlas is written to an Amazon S3 bucket using the Triggers feature in Atlas Application Services. The data stored there is then used to create and train the model in Amazon SageMaker Canvas. SageMaker Canvas stores the metadata for the model in the S3 bucket and exposes the model endpoint for inference. For step-by-step instructions on how to build the fraud detection solution described above with MongoDB Atlas and Amazon SageMaker Canvas, read our tutorial.
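As a rough illustration of the export step, the sketch below flattens nested transaction documents into the tabular CSV layout that SageMaker Canvas trains on. The document shape and field names here are hypothetical, not the tutorial's schema; in the real pipeline an Atlas Trigger would run a transformation like this and upload the result to S3.

```python
import csv
import io

def flatten_transaction(doc):
    """Flatten a nested transaction document (hypothetical shape)
    into a flat record suitable for a tabular CSV export."""
    return {
        "transaction_id": doc["_id"],
        "amount": doc["amount"],
        "currency": doc.get("currency", "USD"),
        "merchant": doc["merchant"]["name"],
        "country": doc["merchant"]["country"],
        "is_fraud": int(doc.get("is_fraud", False)),  # training label
    }

def to_csv(docs):
    """Serialize flattened documents to a CSV string, the tabular
    format a no-code ML tool such as SageMaker Canvas can ingest."""
    rows = [flatten_transaction(d) for d in docs]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

In production the trigger would stream only changed documents rather than rebuilding the full file, and the S3 upload itself (for example, a boto3 `put_object` call) is omitted here.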
MongoDB Achieves AWS Financial Services Competency
We are delighted to announce that MongoDB Atlas has successfully achieved the AWS Financial Services competency, highlighting our dedication to delivering exceptional data management solutions tailored specifically for the financial industry. This significant milestone demonstrates our ability to provide secure, scalable, and reliable database services that meet the stringent regulatory requirements and unique challenges faced by financial institutions. By attaining the AWS Financial Services competency, MongoDB Atlas reinforces its position as a trusted partner for financial organizations in their quest for modern and efficient solutions for secure data management. Our developer data platform empowers financial institutions to leverage the inherent flexibility of MongoDB's document model, coupled with advanced security features and automated operations. This enables organizations to expedite innovation, enhance customer experiences, and fuel business growth while maintaining compliance with regulatory frameworks. Achieving this competency further signifies our commitment to elevating the standards of data management within the financial services sector. By leveraging MongoDB Atlas, financial institutions can optimize their operations, increase agility, and leverage data-driven insights to make informed decisions. We provide a comprehensive suite of features and functionalities, including data encryption at rest and in transit, identity and access management controls, and built-in compliance certifications such as PCI DSS, HIPAA, and ISO/IEC 27001, which are essential for meeting the strict security and privacy requirements of the financial industry. By harnessing real-time information, fraud detection models can be trained on the most accurate and relevant data available.
MongoDB Atlas, a highly scalable and flexible database, coupled with Amazon SageMaker Canvas, an advanced machine learning tool, presents a groundbreaking opportunity to revolutionize fraud detection. By leveraging operational data, the synergy of MongoDB Atlas and SageMaker Canvas holds the key to proactively identifying and combating fraudulent activities, enabling financial institutions to safeguard their systems and protect their customers in an increasingly treacherous digital landscape. At MongoDB, we recognize that the regulated industry landscape is constantly evolving. As a result, we remain dedicated to continually expanding our capabilities, refining our offerings, and collaborating closely with our customers to address their evolving needs. Last year, MongoDB achieved the AWS Public Sector competency. Our attainment of the AWS Financial Services competency serves as a testament to our ongoing commitment to excellence and innovation in data management, ensuring that our customers in regulated industries can rely on MongoDB Atlas as a robust and secure platform to power their digital transformation initiatives. Learn more about MongoDB Atlas on AWS, and get started for free on AWS Marketplace.
MongoDB and BigID Delivering Scalable Data Privacy Compliance for Financial Services
Ensuring data privacy compliance has become a critical priority for banks and financial services. Safeguarding customer data is not only crucial for maintaining trust and reputation but also a legal and ethical obligation. In this blog, we will dive into why and how the financial services industry can adopt an effective approach to data privacy compliance using BigID and MongoDB.

Embracing a privacy-first mindset

To establish a robust data privacy compliance framework, banks and financial services must prioritize privacy from the outset. This entails adopting a privacy-first mindset throughout all aspects of their operations. Embedding privacy principles into the organizational culture helps create a foundation for compliance, ensuring that data protection is a core value rather than an afterthought.

Understand the regulatory landscape

Compliance with data privacy regulations is an ongoing process that requires a deep understanding of the applicable legal landscape. Banks and financial services should invest in comprehensive knowledge of regulations such as the General Data Protection Regulation (GDPR), California Consumer Privacy Act (CCPA), Digital Personal Data Protection (DPDP), and other relevant global and local regulations. This understanding helps organizations identify their obligations, assess risks, and implement necessary controls to ensure compliance.

Ensuring compliance with regulatory requirements

Data privacy compliance requirements vary based on the specific regulations applicable to each state, region, or country. Organizations must adhere to these regulatory requirements, as doing so is crucial to meeting legal obligations, maintaining trust, and mitigating risks.

Regularly Update Policies and Procedures: The data privacy landscape is constantly evolving, with new regulations and best practices emerging regularly. Banks and financial services should stay ahead of these developments and review and update their privacy policies and procedures accordingly.
Regular audits and risk assessments should be conducted to identify gaps and ensure that the organization remains compliant with evolving requirements.

Implement Data Discovery & Governance Frameworks: Effective data governance is a fundamental aspect of data privacy compliance. Banks and financial services should establish data governance frameworks with clear policies, procedures, and accountability mechanisms. This includes defining data ownership, identifying data across systems, implementing data classification, setting retention periods, and establishing secure data storage and disposal protocols. Regular audits and internal controls help ensure adherence to these policies and procedures.

Streamline Consent Management: Transparency and consent are vital components of data privacy compliance. Banks and financial services should provide clear and easily understandable privacy notices to customers, outlining the types of data collected, the purposes of processing, and any third-party sharing. Additionally, organizations should develop user-friendly consent mechanisms that enable individuals to make informed choices about their data.

Fulfill User Rights and Data Subject Access Requests: All privacy regulations grant individuals various rights over their data, including the right to access, correct, delete, and restrict the sale of data. The fulfillment of these rights requires mechanisms such as customer self-service portals and automated workflows for data subject access requests.

Conduct Privacy Impact Assessments (PIAs): Privacy Impact Assessments (PIAs) are essential tools for evaluating and mitigating privacy risks associated with data processing activities. Banks and financial services should regularly conduct PIAs to identify potential privacy concerns, assess the impact of data processing, and implement appropriate safeguards.
PIAs enable organizations to proactively address privacy risks, demonstrate compliance, and enhance transparency in data processing practices.

Prioritize Data Minimization and Purpose Limitation: Collecting and processing only the necessary personal data is a key principle of data privacy compliance. Banks and financial services should adopt data minimization strategies, limiting data collection to what is essential for legitimate business purposes. Furthermore, data should be processed only for specific, clearly defined purposes and not repurposed without an appropriate consent or legal basis. By embracing data minimization and purpose limitation, organizations can reduce privacy risks and respect individuals' privacy preferences.

Navigate Data Localization & Transfers: Data localization involves keeping data within the jurisdiction where it was collected. While this approach can help ensure data protection, it can also create challenges for businesses that operate in multiple countries. Implementing data localization practices ensures that customer data remains within the country's boundaries while adhering to cross-border data transfer requirements.

Strengthen Security Measures: Protecting customer data from unauthorized access, breaches, and cyber threats is crucial. Banks and financial services should implement robust security measures, including encryption, access controls, intrusion detection systems, and regular security assessments. Ongoing staff training on cybersecurity awareness and best practices is essential to mitigate the risk of human error or negligence.

Achieving privacy compliance with BigID and MongoDB

Financial institutions need the ability to find, classify, inventory, and manage all of their sensitive data, regardless of whether it’s on-prem, hybrid-cloud, or cloud-based.
Organizations must know where their data is located, replicated, and stored, as well as how it is collected and processed. This is a momentous task, and it requires addressing common challenges like siloed data, lack of visibility and accurate insight, and balancing legacy systems with cloud data, all while meeting a litany of compliance requirements. With a major shift towards geographically dispersed data, organizations must make sure they are aware of, and fully understand, the local and regional rules and requirements that apply to storing and managing data. Organizations without a strong handle on where their data is stored risk millions of dollars in regulatory fines for mishandling data, loss of brand credibility, and distrust from customers. A modern approach relying on modern technologies like BigID and MongoDB helps to solve data privacy, data protection, and data governance challenges. BigID, the industry leader for data security, privacy, compliance, and governance, is trusted by some of the world's largest financial institutions to deliver fast and accurate data discovery, classification, and correlation across large and complex data sets. BigID utilizes MongoDB as the internal data store for the platform to help generate data insights at scale, automate advanced discovery and classification, and accommodate complex enterprise requirements. As technology partners, MongoDB’s document model and distributed architecture enable BigID to deliver a scalable and flexible data management platform for data privacy and protection.

How BigID powered by MongoDB addresses privacy compliance challenges

By taking a privacy-first approach to data and risk, organizations can address the challenges of continuous compliance, minimize security risks, proactively address data privacy programs, and strengthen data management initiatives.
BigID, powered by MongoDB, helps organizations identify, manage, and monitor all personal and sensitive data activity to achieve compliance with several data privacy requirements. Organizations get:

Deep Data Discovery: BigID helps organizations discover and inventory their critical data, including financial information. This enables organizations to understand what data they have and where it is located, which is an important first step in achieving compliance.

Accurate Classification: With exact value matching, BigID's graph-based technology can identify and classify personal and sensitive data in any environment, such as email, shared drives, databases, data lakes, and many more.

Efficient Data Mapping: Automatically map PII and PI to identities, entities, and residencies to connect the dots in your data environments.

Streamlined Data Lifecycle Management: Accurately find, classify, catalog, and tag your data and easily enforce governance and control, from retention to deletion.

Fulfillment of Consent & Data Rights Requests: Automate consent and data rights management with a privacy portal that includes a seamless UX for managing data subject access requests (DSARs). Centralize DSARs with automated access and deletion workflows to fulfill end-to-end data rights requests.

Effective Privacy Impact Assessments (PIA/DPIA): Easily build seamless workflows and frameworks for privacy impact assessments (PIAs) to estimate the risk associated with all data inventory.

ML-based Data Access Management: BigID helps mitigate the risk posed by overly open access by remediating file access violations on critical data across all data environments, supporting full compliance with specific requirements.

Validated Data Transfers: Monitor cross-border data transfers and create policies to enforce data residency and localization requirements.
Effective Remediation: BigID helps define remediation actions related to critical data and provides audit records, with integration to ticketing systems like Jira for seamless workflows.

By adopting a privacy-first approach to data and risk, financial services organizations can tackle the challenges of continuous compliance, mitigate security risks, and enhance data management initiatives. BigID, powered by MongoDB, offers comprehensive solutions to help organizations identify, manage, and monitor personal and sensitive data activities, enabling them to achieve compliance with various data privacy requirements. Looking to learn more about how you can reduce risk, accelerate time to insight, and get data visibility and control across all your data, everywhere? Take a look at the resources below:

Control your data for data security, compliance, privacy, and governance with BigID
Data-driven privacy compliance and automation for new and emerging data privacy and protection regulation
Protect your data with strong security defaults on the MongoDB developer data platform
Manage and store data where you want with MongoDB
MongoDB for Financial Services
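To close out the data-rights discussion in this post with something concrete, here is a minimal, hypothetical sketch of fulfilling access and deletion requests against an in-memory record store. The field classification and function names are assumptions for illustration, not BigID or MongoDB APIs; a real system would route these requests through a privacy portal and automated workflows.

```python
# Hypothetical classification of which fields count as personal data.
PERSONAL_FIELDS = {"name", "email", "phone", "address"}

def fulfill_access_request(store, customer_id):
    """Right of access: return only the personal fields held about a customer."""
    record = store.get(customer_id, {})
    return {k: v for k, v in record.items() if k in PERSONAL_FIELDS}

def fulfill_deletion_request(store, customer_id):
    """Right of erasure: remove personal fields while keeping
    non-personal data the business may lawfully retain."""
    record = store.get(customer_id)
    if record is None:
        return False
    for field in PERSONAL_FIELDS & record.keys():
        del record[field]
    return True
```

Even this toy version shows why accurate classification matters: both workflows are only as good as the `PERSONAL_FIELDS` inventory driving them.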
How to Build Advanced GraphQL-based APIs With MongoDB Atlas and AWS AppSync Merged APIs
When businesses develop their own IT systems, sooner or later the complexity of managing APIs becomes a challenge. Breaking down monolithic architectures into multiple microservices often results in a proliferation of APIs associated with each microservice. Each API, in turn, has versioning, leading to further fragmentation of the APIs and driving up maintenance costs. If your microservices diagram looks like a hairball, you know you are living in API hell. Conway’s Law, which states that systems mirror the communication structure of the organization, also applies to API development. Different teams build separate and sometimes overlapping APIs, further contributing to the fragmentation. While this is especially true for REST-based APIs, it's also a challenge for GraphQL-based APIs. GraphQL has emerged as a powerful tool for building flexible and efficient APIs that empower developers and elevate user experiences. AWS AppSync is the go-to service for customers looking to accelerate application development with serverless GraphQL and Pub/Sub APIs. AWS AppSync offers a managed GraphQL service with additional features and capabilities. It simplifies the development of scalable, real-time applications by seamlessly integrating with various data sources, providing offline support, enabling fine-grained authorization and security, and automating infrastructure management. By embracing AppSync, you can harness the full potential of GraphQL while leveraging the benefits of a comprehensive portfolio of services and products provided by AWS. MongoDB Atlas on Amazon Web Services (AWS) and AWS AppSync combined help developers build scalable, secure, and serverless applications. By seamlessly integrating MongoDB as a data source within AppSync, you're able to leverage MongoDB's flexible document model and AppSync's GraphQL-based querying to efficiently retrieve and manipulate data. And you can leverage AppSync's automatic scaling, ensuring optimal performance.
This combined solution enables you to build high-performing serverless applications while simplifying application development. AWS AppSync recently added a feature called Merged APIs that allows you to compose multiple GraphQL source APIs into a single GraphQL API. Merged APIs give developers the ability to compose distinct APIs developed by different teams into a single, combined GraphQL schema. The merged API resolver function contains the logic to consolidate the source details, and the resulting single GraphQL API can be cached for better performance. You can then present the unified API to clients as a single API endpoint. AppSync Merged APIs can combine MongoDB Atlas-backed APIs with other APIs, allowing you to enrich operational data residing in MongoDB Atlas with data coming from additional sources. You can serve data with a unified GraphQL schema across multiple data sources, including MongoDB. If you're interested in learning more about this powerful integration, check out our new tutorial that demonstrates two ways to combine MongoDB Atlas with AWS AppSync: leveraging the Drivers and the Atlas Data API. Both approaches work with the AWS AppSync Merged API as well. Check out our tutorial on GitHub. Try out MongoDB Atlas on AWS (Atlas GraphQL) and AWS AppSync today.

Sign up for MongoDB Atlas on AWS Marketplace Today
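Conceptually, a merged API routes each top-level field to the source API that owns it while presenting one endpoint to clients. The Python sketch below models that idea with plain resolver maps; it is an analogy for the merge concept, not the AppSync implementation, and all field and function names are hypothetical.

```python
# Each "source API" exposes its top-level fields as resolver functions,
# standing in for the GraphQL schemas owned by different teams.
transactions_api = {
    "transaction": lambda args: {"id": args["id"], "amount": 99.5},
}
customers_api = {
    "customer": lambda args: {"id": args["id"], "name": "Ada"},
}

def merge_apis(*sources):
    """Compose distinct source APIs into one schema, rejecting
    conflicting field names (a merge must be unambiguous)."""
    merged = {}
    for source in sources:
        for field, resolver in source.items():
            if field in merged:
                raise ValueError(f"conflicting field: {field}")
            merged[field] = resolver
    return merged

def execute(merged, field, args):
    """Resolve a query against the merged API: the client sees a single
    endpoint, while each field is served by its owning source."""
    return merged[field](args)
```

The conflict check mirrors a real constraint of schema merging: when two source schemas define the same type or field differently, the merge has to be resolved explicitly rather than silently picking one.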
Fueling Pricing Strategies with MongoDB and Databricks
Deploying real-time analytics solutions with the right tech stack can have transformative benefits. Retailers want to grow their brand or improve customer experience with value-based pricing, whilst remaining competitive and cost-effective. Despite their ambition to become “data-driven” operations, companies often fail in their endeavors; at the core of this is the struggle to do real-time analytics. We will explore the architecture in Figure 1 and discuss the advantages of integrating MongoDB Atlas and Databricks as a perfect pairing for retail pricing strategies using real-time AI. The solution we’ll describe integrates concepts from event-driven architecture on the data generation and ingestion side, real-time analytics orchestration, machine learning, and microservices. Let’s get started!

Figure 1: Overview of a dynamic pricing solution architecture

Reduce friction with flexibility

The pricing data complexity for a retailer with a diverse product line increases due to factors like seasonal sales, global expansion, and new product introductions. Tracking and analyzing historical prices and product offerings becomes more challenging as they change throughout the year. Analytics solutions built around event-driven architectures try to explain what is happening in a specific system or solution based on any significant occurrence, such as user actions, data updates, system alerts, or sensor readings. Deciding which occurrences are crucial to understanding your customers, and instrumenting your business model around them, is when things start to become more intricate, especially when trying to instrument your data models using traditional relational database management systems, which are at a disadvantage when it comes to pairing their data structures with object-oriented applications.
The inability of a retailer to adapt its data model to customer behavior quickly translates into friction and a weaker presence in the market: for example, poor pricing strategies compared to competitors because they lack information about historic price points and how they vary between products.

Figure 2: An inflexible data model is a roadblock for innovation

That friction is contagious throughout the whole value chain of an organization, affecting the semantic layer of the business (a bridge between the technical data structures and business users' understanding), generating data inconsistencies, and reducing time to insight. The capacity of your conceptual data model to adapt to ever-changing customer behavior helps reduce that friction significantly, as its flexibility allows for a more intuitive data modeling of real-world events. For your real-time pricing strategy challenges, the MongoDB Atlas document model, with its embedding and extended reference capabilities, becomes the perfect tool for the job, as it allows for faster feature development and, as a consequence, stronger test-driven growth and talent retention. In combination with its high-performance queries and horizontal scalability, the solution becomes robust: it will handle the high throughput of clickstreams on your ecommerce applications and still be able to power real-time, data-driven decision-making features. Its ease of integration with other platforms, thanks to strong API capabilities and drivers, makes it the perfect solution on top of which to build your business's operational and intelligence layers, as you'll avoid vendor lock-in and data scientists can easily leverage AI frameworks to work with fresh data. Its distributed-by-default design, combined with following best practices, guarantees that your operational data layer will handle the workload needed.
As the AI algorithms analyze vast amounts of historical and real-time data to make pricing decisions, having a distributed platform like MongoDB enables efficient storage, processing, and retrieval of data across multiple nodes.

From what? To why? The intelligence layer

To unlock meaningful market growth and achieve it at scale, your analytics needs to evolve from just understanding what is happening, by querying and analyzing historical data, to understanding why the events measured by your operational data layer are happening, and even further, to forecasting them. For a pricing solution, retailers would need to gather historical price point data for their product lines and shape it through ETL (Extract, Transform, Load) pipelines to feed machine learning algorithms. This process is often complicated and brittle using the traditional data warehousing approach, often incurring data duplication that makes it difficult and costly to manage.

Figure 3: Reduced friction thanks to seamless integration of the different data layers

The advantage of using MongoDB Atlas as your operational data layer is that through its Aggregation Pipelines you can shape your data in any way you need; then, through MongoDB App Services, you can instrument Triggers and Functions to simplify this process and consume that data in Databricks by leveraging the MongoDB Atlas Spark connector. Databricks provides you with a streamlined way of working with your models, by writing Python code in notebooks on hosted clusters. You can leverage its MLflow integration to register experiments, which can then be turned into models deployed behind an endpoint.
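To make the shaping step above concrete, the sketch below shows a MongoDB aggregation pipeline (with illustrative field names, not a fixed schema) that reduces raw price events to the average price per product per day, the kind of feature a pricing model consumes, alongside a pure-Python equivalent of the `$group` stage so the logic is visible end to end. In practice you would run the pipeline with `collection.aggregate(pipeline)` and read the results into Databricks via the Spark connector.

```python
# Aggregation pipeline (MongoDB syntax): filter to price events, then
# group by product and day, averaging the observed price.
pipeline = [
    {"$match": {"event": "price_point"}},
    {"$group": {
        "_id": {"product": "$product_id", "day": "$day"},
        "avg_price": {"$avg": "$price"},
    }},
]

def average_price_per_product(events):
    """Pure-Python equivalent of the $match + $group stages above,
    for illustration: returns {(product_id, day): average price}."""
    totals = {}
    for e in events:
        if e.get("event") != "price_point":
            continue  # $match: keep only price events
        key = (e["product_id"], e["day"])
        total, count = totals.get(key, (0.0, 0))
        totals[key] = (total + e["price"], count + 1)
    return {key: total / count for key, (total, count) in totals.items()}
```

Because the pipeline runs inside the database, only the aggregated features cross the wire to the intelligence layer, which is what keeps this approach cheaper than copying raw events into a warehouse.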
By transforming your data and integrating your operational layer (through connectors, and API calls as triggers and functions) with your intelligence layer for machine learning and AI, you can easily build a pricing solution that will generate market growth for your organization from its core, with a semantic layer acting as a bridge between the technical aspects of data storage and the business requirements of data analysis.

Uncover new growth opportunities

Designing a real-time analytics solution with MongoDB Atlas and Databricks is not only the fastest way to unlock your team's capabilities to devise pricing strategies, it also sets the cornerstone for building automated rules for more complex solutions. Other ways of automating your retail application with AI-driven insight include optimizing your marketing mix budget by each product's price elasticity, adding another analytical layer of customer segmentation data to achieve targeted dynamic pricing, or optimizing your supply chain with sales forecasting in real time. By taking advantage of MongoDB Charts or the MongoDB BI Connector, you can fuel your business dashboards, making the semantic layer of the business model the central point for your teams' alignment.

Foundations for growth

Modern ecommerce sites unleash the power of real-time analytics and automation to create better experiences for customers and a more profound approach to customer analytics, unlocking the power of machine learning to discover trends in behavioral data and effectively turning companies into automated growth machines. If you want to discover how to build a simple dynamic pricing solution integrating MongoDB Atlas and Databricks, make sure to read this guide.
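As a brief aside on the price-elasticity idea mentioned above: elasticity is the percentage change in quantity demanded divided by the percentage change in price, and the midpoint (arc) form below is a common way to estimate it from two observed price points of the kind stored in the operational data layer.

```python
def price_elasticity(p0, q0, p1, q1):
    """Arc price elasticity of demand between two observations:
    percentage change in quantity over percentage change in price,
    using midpoints so the result is symmetric in direction."""
    dq = (q1 - q0) / ((q0 + q1) / 2)  # % change in quantity
    dp = (p1 - p0) / ((p0 + p1) / 2)  # % change in price
    return dq / dp
```

A magnitude above 1 means demand is elastic (a price cut raises revenue); below 1, inelastic. For example, raising a price from 10 to 12 while demand drops from 100 to 80 units gives an elasticity of about -1.22, i.e., mildly elastic.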
Modernize Your Factory Operations: Build a Virtual Factory with MongoDB Atlas in 5 Simple Steps
Thank you to Karolina Ruiz Rojelj for her contributions to this post. Virtual factories are revolutionizing the manufacturing landscape. Coined the "revolution in factory planning" by BMW Group at NVIDIA, this cutting-edge approach is transforming the way companies like BMW and Hyundai operate, thanks to groundbreaking partnerships with technology companies such as NVIDIA and Unity. At the heart of this revolution lies the concept of virtual factories: computer-based replicas of real-world manufacturing facilities. These virtual factories accurately mimic the characteristics and intricacies of physical factories, making them a powerful tool for manufacturers to optimize their operations. By leveraging AI, they unlock a whole new world of possibilities, revolutionizing the manufacturing landscape and paving the way for improved productivity, cost savings, and innovation. In this blog, we will explore the benefits of virtual factories and guide you through the process of building your own virtual factory using MongoDB Atlas. Let’s dive in!

Unlocking digital transformation

The digitalization of the manufacturing industry has given rise to the development of smart factories. These advanced factories incorporate IoT sensors into their machinery and equipment, allowing workers to gather data-driven insights on their manufacturing processes. However, the evolution does not stop at smart factories automating and optimizing physical production. The emergence of virtual factories introduces simulation capabilities and remote monitoring, leading to the creation of factory digital twins, as depicted in Figure 1. By bridging the concepts of smart and virtual factories, manufacturers can unlock greater levels of efficiency, productivity, flexibility, and innovation.

Figure 1: From smart factory to virtual factory

Leveraging virtual factories in manufacturing organizations provides many benefits, including: optimization of production processes and identification of inefficiencies.
This can lead to increased efficiency, reduced waste, and improved quality. They also aid quality control by contextualizing sensor data with the manufacturing process, which allows analysis of quality issues and implementation of necessary control measures while dealing with complex production processes. And they enable simulating manufacturing processes and testing new products or ideas without the need for physical prototypes or real-world production facilities, which significantly reduces costs associated with research and development and minimizes the risk of product failure. However, setting up a virtual factory for complex manufacturing is difficult. Challenges include managing system overload, handling vast amounts of data from physical factories, and creating accurate visualizations. The virtual factory must also adapt to changes in the physical factory over time. Given these challenges, having a data platform that can contextualize all the data coming in from the physical factory and then feed it to the virtual factory, and vice versa, is crucial. And that is where MongoDB Atlas, our developer data platform, comes in, providing synchronization capabilities between the physical and virtual worlds, enabling flexible data modeling, and providing access to the data via a unified query interface, as seen in Figure 2.

Figure 2: MongoDB Atlas as the data platform between physical and virtual factories

Now that we’ve discussed the benefits and the challenges of building virtual factories, let’s unpack how simple it is to build a virtual factory with MongoDB Atlas.

How to build a virtual factory with MongoDB Atlas

1. Define the business requirements

The first step of the process is to define the business requirements for the virtual factory. Our team at MongoDB uses a smart factory model from Fischertechnik to demonstrate how easily MongoDB can be integrated to solve the digital transformation challenges of IIoT in manufacturing.
This testbed serves as our foundational physical factory and the starting point of this project. Figure 3: The smart factory testbed We defined our set of business requirements as the following: Implement a virtual run of the physical factory to identify layout and process optimizations. Provide real-time visibility of the physical factory conditions, such as inventory, for process improvements. This last requirement is critical; while standalone simulation models of factories can be useful, they typically do not take into account the real-time data from the physical factory. By connecting the physical and virtual factories, a digital twin can be created that takes into account the actual performance of the physical factory in real time. This enables more accurate predictions of the factory's performance, which improves decision-making and process optimization, and enables remote monitoring and control, reducing downtime and improving response times. 2. Create a 3D model Based on the previous business requirements, we created a 3D model of the factory in a widely used game engine, Unity. This virtual model can be visualized using a computer, tablet, or any virtual reality headset. Figure 4: 3D model of the smart factory in Unity Additionally, we also added four different buttons (red, white, blue, and “stop”) which enable users to submit production orders to the physical factory or stop the process altogether. 3. Connect the physical and virtual factories Once we created the 3D model, we connected the physical and virtual factories via MongoDB Atlas. Let’s start with our virtual factory software application. Regardless of where you deploy it, be it a headset or a tablet, you can use Realm by MongoDB to present data locally inside Unity and then synchronize it with MongoDB Atlas as the central data layer. This allows us to run embedded databases on resource-constrained devices while using MongoDB Atlas as a powerful and scalable cloud backend.
And lastly, to ensure data synchronization and communication between these two components, we leveraged MongoDB Atlas Device Sync, which provides a bi-directional synchronization mechanism and network handling. Now that we have our virtual factory set up, let’s have a look at our physical one. In a real manufacturing environment, many of the shopfloor connectivity systems can connect to MongoDB Atlas, and for those that don't connect natively, it is very straightforward to build a connector. At the shopfloor layer, you can have MongoDB set up so that you can analyze and visualize your data locally and set up materialized views. On the cloud layer, you can push data directly to MongoDB Atlas or use our Cluster-to-Cluster Sync functionality. A single IoT device, by itself, does not generate much data. But as the number of devices grows, so does the volume of machine-generated data, and therefore the complexity of the data storage architecture required to support it. The data storage layer is often one of the primary causes of performance problems as an application scales. A well-designed data storage architecture is a crucial component in any IoT platform. In our project, we have integrated AWS IoT Core to subscribe to MQTT messages from the physical factory. Once these messages are received and filtered, they are transmitted to MongoDB Atlas via an HTTP endpoint. The HTTP endpoint then triggers a function which stores the messages in the corresponding collection based on their source (e.g., messages from the camera are stored in the camera collection). With MongoDB Atlas, as your data grows, you can archive it using our Atlas Online Archive functionality. Figure 5: Virtual and physical factories data flow In Figure 5, we can see everything we’ve put together so far: on the left-hand side, we have our virtual factory where users can place an order.
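The message routing described above (AWS IoT Core forwarding filtered MQTT messages to an HTTP endpoint, which triggers a function that stores each message in a collection matching its source) can be sketched roughly as follows. This is an illustrative sketch only: the source names, message fields, and the catch-all collection are assumptions, not details from the actual project.

```python
# Hypothetical sketch of the per-source routing done by the trigger function.
# Source names, message shape, and the "unclassified" fallback are assumptions.

def collection_for_source(message: dict) -> str:
    """Map a message's source device to its target collection name."""
    routing = {
        "camera": "camera",               # camera frames -> camera collection
        "temperature_sensor": "sensor_readings",
        "conveyor": "events",
    }
    # Fall back to a catch-all collection for unknown sources.
    return routing.get(message.get("source"), "unclassified")

def store_message(db, message: dict) -> None:
    """Insert a message into the collection matching its source."""
    db[collection_for_source(message)].insert_one(message)
```

In a real deployment, `db` would be the database handle available inside the Atlas function; the routing table would grow with each new device type on the shopfloor.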
The order information is stored inside Realm, synced with MongoDB Atlas using Atlas Device Sync, and sent to the physical factory using Atlas Triggers. On the other hand, the physical factory sends out sensor data and event information about the physical movement of items within the factory. MongoDB Atlas provides the full data platform experience for connecting both physical and virtual worlds! 4. Data modeling Now that the connectivity has been established, let's look at how to model the incoming data. As you may know, any piece of data that can be represented in JSON can be natively stored in and easily retrieved from MongoDB. The MongoDB drivers take care of converting the data to BSON (binary JSON) and back when querying the database. Furthermore, you can use documents to model data in any way you need, whether it is key-value pairs, time series data, or event data. On the topic of time series data, MongoDB Time Series allows you to automatically store time series data in a highly optimized and compressed format, reducing customer storage footprint, as well as achieving greater query performance at scale. Figure 6: Virtual and physical factories sample data It really is as simple as it looks, and the best part is that we are doing all of this inside MongoDB Atlas, making a direct impact on developer productivity. 5. Enable computer vision for real-time inventory Once we have the data modeled and connectivity established, our last step is to run event-driven analytics on top of our developer data platform. We used computer vision and AI to analyze the inventory status in the physical factory and then pushed notifications to the virtual one. If the user tries to order a piece in the virtual factory that is not in stock, they will immediately get a notification from the physical factory.
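The time series modeling mentioned in step 4 can be sketched in Python. The field and station names below are illustrative assumptions (they are not from the actual project); the `timeseries` options shape matches what the MongoDB drivers accept when creating a time series collection.

```python
# Illustrative sketch: a factory sensor reading modeled as a document, plus
# the options for a MongoDB time series collection that would store such
# readings in optimized, compressed form. Field names are assumptions.
from datetime import datetime, timezone

sample_reading = {
    "ts": datetime(2023, 6, 1, 12, 0, tzinfo=timezone.utc),  # timestamp
    "meta": {"station": "high_bay_warehouse", "sensor": "temperature"},
    "value": 21.7,
}

# Options for creating the collection with a driver such as PyMongo, e.g.:
#   db.create_collection("sensor_readings", timeseries=timeseries_options)
timeseries_options = {
    "timeField": "ts",         # required: the field holding the timestamp
    "metaField": "meta",       # optional: per-series metadata for grouping
    "granularity": "seconds",  # hint for the expected ingest interval
}
```

Documents inserted into such a collection are bucketed and compressed automatically, which is what reduces the storage footprint described above.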
All this is made possible using MongoDB Atlas and its connectors to various AI platforms. If you want to learn more, stay tuned for part 2 of this blog series, where we’ll dive deep into the technical considerations of this last step. Conclusion By investing in a virtual factory, companies can optimize production processes, strengthen quality control, and perform cost-effective testing, ultimately improving efficiency and innovation in manufacturing operations. MongoDB, with its comprehensive features and functionality that cover the entire lifecycle of manufacturing data, is well-positioned to implement virtual factory capabilities for the manufacturing industry. These capabilities put MongoDB in a unique position to fast-track the digital transformation journey of manufacturers. Learn more: MongoDB & IIoT: A 4-Step Data Integration Manufacturing at Scale: MongoDB & IIoT Manufacturing with MongoDB
Ulta Beauty Solves Seasonal Shopping
The holiday season can feel like a whirlwind to retailers. Between keeping up with sudden shifts in shopper preferences, supply chain nuances, and a massive increase in demand, the hardest part can be ensuring great customer experiences, both in store and online. This year, retailers have a unique challenge ahead, as those on Google Cloud have experienced more online traffic in the first half of 2022 than in all of 2019. Retailers will need to get an early start to the 2023 shopping season, since 50% of consumers plan to get a jump on their holiday shopping activity before the traditional start on Black Friday. Luckily, modern advancements in automation and infrastructure can help retailers survive seasonal spikes and stay innovative year-round. A platform for innovation Ulta Beauty's system overhaul proved to be a success when they implemented Google Kubernetes Engine (GKE) and enabled the development of cloud-native applications. Ulta Beauty was able to take advantage of the new strategic change to swiftly fix bugs, test new offerings, and provide customers with far better experiences. Thanks to this transformation and using GKE, their developers are now able to launch new products and services faster and create better customer interactions. Now Ulta Beauty's guests have dynamic and personal connections with beauty and wellness, with their preferences taken into account. Despite this, the company had to overcome some difficulties. As Sethu Madhav Vure, IT Architect, Ulta Beauty, explains, "Microservices are not a silver bullet. For us, the struggle was breaking up a monolithic environment into multiple applications while keeping existing services functional and preparing for the future." Ulta Beauty sought to simplify and scale through a domain-driven design approach. Identifying and grouping similarly structured operations enabled them to create a modern architecture.
To match their needs for dynamic scaling, MongoDB Atlas was selected and leveraged with Google Cloud integrations, resulting in a quick proof of concept. Thanks to comprehensive resource allocation, Vure remarked, "The free tier of MongoDB Atlas allowed us to prove the value of the technology before we invested in it." The innovative partnership between MongoDB Atlas and Google Cloud has enabled Ulta Beauty to maximize efficiency and take a rapid, iterative approach to their newest projects. It allows them to better manage their expansive data, and to deploy and scale offerings quickly and successfully, as seen in their recent unplanned traffic surge that took them less than an hour to manage. Equipped with the dynamic scalability of GKE and the on-demand functionality of MongoDB Atlas, Ulta Beauty can face any challenge with confidence. Seasonal prep Ulta Beauty has drastically improved its technical infrastructure to meet the demands of the holiday season. With just 20 GKE pods, the company is now able to scale up to 2,400 transactions per second! Partnering with Google Cloud, Ulta Beauty leveraged event-based integrations and Cloud Pub/Sub middleware on top of MongoDB Atlas integrations, resulting in an efficient process that maximizes the power of their platform. By optimizing their technology partner stack, Ulta Beauty was able to make a dramatic shift in their IT culture, allowing them to trial new solutions faster and with the full support of leadership. The result? They were able to handle increased traffic during the holiday shopping season with ease, delivering the customer service they promised, free from the worry of outages. As Vure puts it, "We are now better prepared for a stress-free holiday season, enabling us to focus on creating even more great service for our customers." Check out MongoDB on Google Cloud Marketplace to learn more about what these partners can do for your business.
Improved Developer Experience with the Atlas Admin API
With MongoDB Atlas, we meet our developers where they are and offer multiple ways to get started and work with Atlas. One of the ways to get started programmatically with Atlas is through the Atlas Administration API. It provides programmatic access to Atlas resources such as clusters, database users, or backups, to name a few, enabling developers to perform operational tasks like creating, modifying, and deleting resources. We are excited to announce two key capabilities that will improve the developer experience when working with the Atlas Administration API. Versioned Atlas Administration API If you use the Atlas Administration API today, you are working with the unversioned Administration API (/v1). We have heard your feedback on the lack of a clear policy around API changes, as well as the communication gaps around new features and deprecations. To address this, we are excited to introduce resource-level versioning with our new versioned Atlas Administration API (/v2). Here is what you can expect: More predictability and consistency in handling API changes: With this new versioning, any breaking changes that can impact your code will only be introduced in a new resource version. You can rest assured that no breaking changes will affect your production code running the current, stable version. Also, deprecation will occur with the introduction of a new stable API resource version. This will give you at least 1 year to upgrade before the removal of the deprecated resource version. It adds more predictability to what’s coming to the API. Minimum impact with resource-based versioning: With resource-level versioning, whenever we talk about API versions we’re referring to the actual API resource versions, represented by date. So once you migrate from the current unversioned Administration API (/v1) to the new versioned Administration API (/v2), this will point to version 2023-02-01.
To make the initial migration process smooth and easy, this first resource version applies to all API resources (e.g., /serverless, /backup, /clusters, etc.). However, moving forward, each resource can introduce a new version (e.g., /serverless can move to 2023-06-01 while /backup stays on 2023-02-01) independently of the others at various points in time. The advantage is that if you have not implemented a given resource (e.g., /serverless) and a new version of it is introduced, you will not need to take any action. You will only need to take action if and when the resources you are utilizing are deprecated. More time to plan your migration: Once a particular resource version is deprecated, there will be enough time (12 months) before it is removed, so this will give you ample time to plan and transition to the new version. Improved context and visibility: Our updated documentation has all the details to guide you through the versioning process. Whether it’s the release of a new endpoint, the deprecation of an existing resource version, or a non-breaking change to a stable resource, all of them are now tracked on a dedicated and automatically updated Changelog. Beyond that, we provide more visibility and context on any API changes through the API specification, which presents information for all stable and deprecated resource versions, enabling you to access documentation that’s relevant for your particular case. We have made it very simple for you to get started with the new versioned Administration API (/v2). To start using it, you need to migrate from the current unversioned Administration API (/v1) to the new versioned Administration API (/v2). This migration can be done smoothly by making two changes in your code: Update the path of each endpoint from /v1 to /v2. Add a version header to each endpoint: application/vnd.atlas.2023-02-01+json. The 2023-02-01 version will have a long support timeframe of 2 years from its deprecation, giving you ample time to transition.
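The two migration changes above can be sketched in Python using only the standard library. The endpoint path and `{GROUP-ID}` placeholder are illustrative; the base URL and the Accept header value come from the migration steps described in this post.

```python
# Sketch of the two code changes for migrating to the versioned Admin API:
#   1. move the endpoint path from /v1 to /v2
#   2. pin the resource version via the Accept header
# The endpoint path below is a placeholder for illustration.

BASE = "https://cloud.mongodb.com/api/atlas"

def versioned_request(path: str, version: str = "2023-02-01") -> dict:
    """Build the URL and headers for a versioned Admin API call."""
    return {
        "url": f"{BASE}/v2{path}",  # change 1: /v1 -> /v2
        "headers": {
            # change 2: request a specific resource version by date
            "Accept": f"application/vnd.atlas.{version}+json",
        },
    }

req = versioned_request("/groups/{GROUP-ID}/clusters")
```

Because the version is a per-resource date, a script can pin different resources to different versions simply by passing a different `version` argument per call.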
Read more about the versioned Admin API. To learn more about the transition process, read our migration guide. Go SDK for the Atlas Administration API One of the mechanisms that simplifies interaction with APIs is the availability of SDKs. To make it easy to get started and work with the versioned Administration API, we are excited to introduce the new Go SDK for the Administration API. If you are a Go developer and have interacted with the Atlas Administration API, you may be familiar with our current Go client. We are now introducing the new Go SDK for the Atlas Admin API. This provides a significantly improved developer experience for Go developers, as it supports full endpoint coverage, improves the speed of getting started with the versioned Admin API, provides a consistent experience when working with the Admin API, and gives you a choice of version for better control over changes and their impact on your scripts. Let’s look at some of the benefits you can expect: Full Atlas Administration API endpoint coverage: The new Go SDK allows you to access all the features and capabilities that the Atlas Administration API offers today with full endpoint coverage. This ensures you can programmatically leverage the full breadth of the developer data platform. Flexibility in choosing the API resource version: When interacting with the new versioned Atlas Administration API through the Go SDK client, you can choose a particular version of the Admin API, giving you control over when you are impacted by breaking changes, or always work with the latest version. Ease of use: Your getting-started experience with the Admin API through the Go SDK client is now much more simplified, with fewer lines of code, since it includes pre-built functions, structs, and methods that encapsulate the complexity of HTTP requests, authentication, error handling, versioning, and other low-level details.
Immediate access to updates: When using the new Go SDK, you can immediately access any newly released Atlas Admin API capabilities. And every time a new version of Atlas is released, the SDK will be quickly updated and continuously maintained, ensuring compatibility with any changes in the API. Get started today with the Go SDK client. Also, refer to our migration guide to learn more.
Unleashing Innovation in the Start-Up Nation: Inside MongoDB Israel
Israel, often referred to as the “start-up nation”, has a thriving tech scene and drive for innovation. In the bustling city of Tel Aviv, our MongoDB office offers a space for employees from across the organization to collaborate, build community, and empower MongoDB customers to unleash the power of software and data. Now with over 50 employees in Tel Aviv, MongoDB Israel is paving the way for data platform transformation in this start-up nation. Read on to learn more about life at MongoDB in Israel. MongoDB has long had a presence in Tel Aviv, but we opened a new office space in December 2022. Located in WeWork Midtown, our Tel Aviv office offers amazing views of the city and coastline, and employees have full access to all of WeWork’s amenities and activities. Not to mention, we offer some great benefits to our employees in Israel, too! 25 days annual leave and 13 Israeli holidays Private medical and life insurance at no cost to employees 90% reimbursement for fertility, surrogacy, and adoption expenses through Carrot, up to a lifetime maximum of 54,500 shekel, plus new parent support through Cleo Twenty weeks of fully paid parental leave (regardless of gender) for employees who have passed their one-year work anniversary Keren Hishtalmut continuing education fund, travel allowance, meal vouchers, and an employee stock purchase program Global company initiatives to support mental well-being, including mental health resources, a free subscription to Headspace, and an employee assistance program What really makes MongoDB unique is its people. According to one team member, “The MongoDB Israel team is truly a standout in the tech industry, characterized by its vibrant energy, diverse talent, and collaborative spirit. We're a collection of individuals from various backgrounds, each contributing unique skills and experiences. 
This diversity fuels our innovative approach and fosters a dynamic work environment where everyone has an opportunity to grow, learn, and excel.” Hear what some of our employees have to say about working for MongoDB in Israel. Nitzan Aloni, Enterprise Account Executive Israel is a start-up nation; full of unicorns and full of innovation. This is why MongoDB is so perfect for our customers, and we feel it as a sales team. Customers love the technology and are happy to work and engage with us. It has been really wonderful to see the development of MongoDB in Israel, which is now the core developer data platform for many successful companies. Personally, one of my biggest achievements so far has been working with companies within the finance and healthcare industries. These sectors can sometimes be more legacy and less technologically advanced, but my team and I have worked with these customers to increase their confidence and knowledge. Today, almost all the big finance companies in Israel are using Atlas, our cloud managed database service. MongoDB is a major part of their modernization journey at its core. Our team in Israel is full of great minds, a lot of experience and ambition, and amazing energy! If you want to learn and develop yourself, this is the right place for you. Not to mention, the market is full of opportunities for MongoDB. Every company is looking to modernize itself, because today, more than ever, organizations need to be on the cutting edge in order to survive. MongoDB is a driving force for modernization in every sector. We are far from slowing down! Learn more about MongoDB’s opportunity and regional leadership from Vice President Gabriella Cohen. Itay Tevel, Sr. Solutions Architect My journey at MongoDB has been challenging and rewarding. I firmly believe that developers will build the future, and MongoDB empowers them by offering a developer data platform with an optimal user experience that extends far beyond the core database. 
My role as a Solutions Architect is multi-faceted and keeps me continuously learning while also giving me the opportunity to educate others. One of the most rewarding aspects of my role is the deep, meaningful discussions I have with developers about complex data problems. Even when engaging with teams that have been using MongoDB extensively for many years, we're often able to offer value and insights, thanks to MongoDB Atlas' ability to meet a broad range of customer demands. What sets MongoDB apart is our relentless commitment to the customer's experience. We're not just about developing outstanding technology; we're about ensuring that technology serves our customers in the best way possible. We actively work to remove roadblocks and streamline their journey with us. This customer-centric ethos permeates every level of our organization and informs every decision we make. The MongoDB Israel team is particularly known for its commitment to holistic professional development. Our Solutions Architects don't just focus on the technical aspects of their roles; they also invest considerable time and effort into enhancing their soft skills. This includes honing abilities like negotiation, time management, teamwork, and presentation skills, amongst others. Solutions Architects have the opportunity to evolve into advisory or principal roles, tackling more complex challenges, and expanding their spheres of influence. The interplay between the technical and business aspects of our customer engagements presents a fascinating challenge, one that keeps our Solutions Architects engaged, evolving, and motivated. MongoDB presents an exciting blend of technological innovation, customer engagement, and personal development opportunities. This makes it an ideal place for Solutions Architects looking to make a difference and advance their careers. 
Daphne Levy, Technical Services Engineer As a Technical Services Engineer (TSE) at MongoDB, every day I have the opportunity to learn and grow in my role. Apart from diving into diverse topics, such as core server performance, queries and indexes, Atlas Search, replication, change streams, and more, I have also had the opportunity to step out of my comfort zone. TSEs at MongoDB have the opportunity to work on challenging, impactful, and complex projects with the help of colleagues. MongoDB supports, respects, and challenges me to exceed my own expectations. Even when I feel uncertain or lacking in knowledge, I always know that someone has my back and people are always available to help me solve a new challenge. That being said, autonomy and ownership are also highly valued. Our managers trust in our abilities. This empowerment enables individuals to be independent in their work and be comfortable taking risks in order to influence the overall success of the team. As a TSE, I feel fulfilled knowing that my contributions play a significant role in the company’s success by helping customers resolve intricate technical issues. The sense of unity and cohesiveness within the team is strong; in Hebrew we call it ‘gibush’ (גיבוש). What I particularly appreciate about the Israeli team is its commitment to diversity and inclusion. We come from various cultures, sometimes different countries, and speak several languages. This diversity of backgrounds leads to a rich array of perspectives and ideas, sparking innovation and creativity. Collaboration among team members is a defining aspect of our work environment. It would be hard to find another tech company that invests so much into its employees. We constantly have the opportunity to develop and participate in training programs, conferences, workshops, and mentorships, along with participating in fun team bonding events that bring us together and help us build a more unified team.
Lastly, MongoDB places great importance on maintaining a healthy work-life balance with a supportive environment that prioritizes employee well-being and offers flexibility, which has made it particularly advantageous for me as a working mum. Irina Sidorova, Customer Success Manager Joining MongoDB in Israel was an easy decision for me. I was blown away by how excited and passionate the developer community is about MongoDB. Their energy and genuine love for the technology made me curious and excited to dive deeper into learning more about it. During the interview process, I was impressed by the level of professionalism and humanity displayed by everyone I interacted with. In addition to having great and open conversations with each one of the team members, they provided me with clear expectations and learning materials, which gave me confidence and left me with a feeling that the company sets its people up for success. In my opinion, what really sets MongoDB apart is the culture of constructive feedback. After each interview, I received honest and valuable feedback that helped me to improve for the next round. I believe that this is crucial for personal and professional growth, and it's something that I really appreciate about MongoDB. One of my biggest highlights so far has been leading the Israeli MongoDB Certification Program. The program allowed participants to enrich their knowledge of MongoDB technology with the help of MongoDB University courses and Online Live Webinars with our Israeli-based team of Solution Architects. It was a great reminder of how much developers love the technology and how eager they are to learn more about it. At MongoDB, you'll have the opportunity to work on projects that challenge the technology market and be part of a company that is constantly growing, improving, and evolving. On top of it all, MongoDB is committed to providing employees with the support and resources they need to succeed. 
The culture is collaborative, supportive, and inclusive, with everyone in the company working together towards a common goal. If you're looking for a dynamic, challenging, and supportive work environment, join us at MongoDB. And in case you’re still debating, the Tel Aviv office is simply stunning – boasting the best view of the city and providing a perfect environment to work in! Join this dynamic team in Tel Aviv - view open roles on our careers site.
MongoDB University Expands Education Outreach With New Partnerships
MongoDB is paving the way for the future of development and looking to bridge the software development skills gap through partnerships with leading online learning platforms. New partnerships with LinkedIn Learning and Coursera will make MongoDB University content widely available to new learners. And, to enable traditionally underrepresented groups to advance their skills with MongoDB, we're providing free certifications to developers through Women Who Code, MyTechDev, and Lesbians Who Tech & Allies. The new partnerships are part of the MongoDB for Academia program, which gives educators access to free MongoDB Atlas credits and certifications. We're also excited to announce new online learning courses to teach database administrators and SQL professionals the power of non-relational database technologies. Supply, demand, and closing the skills gap Today's tech industry faces immense demand for software engineers to create new applications, and choosing the right database technology is essential for success. It establishes the backbone and influences the speed of building, deploying, and updating an application. These newly announced initiatives will make it easier than ever for developers to quickly learn how to leverage MongoDB and use it to build and deploy successful applications within high-demand industries across the globe — rapidly closing the gap on the current software engineer shortage. MongoDB for Academia will provide educators and students with the opportunity to hone their database development skills for free. The initiative offers more than $400,000 of MongoDB Atlas credits, free certification, access to free curriculum resources, and the GitHub Student Developer Pack, giving students around the world the opportunity to enter the workforce with relevant certifications and skills. 
With the addition of MongoDB University developer courses on LinkedIn Learning, millions of people across the globe have access to an expansive range of software development skills. The courses, which cover Introduction to MongoDB as well as Java, Python, C#, and Node.js, have been bundled into Learning Paths, creating the perfect platform for students aiming to pass MongoDB's Associate Developer certification. This partnership between MongoDB University and LinkedIn provides a chance to upskill the knowledge base of existing and aspiring software developers. Last year, 446 million skills were added to LinkedIn profiles. By joining forces with Coursera, MongoDB University is taking its Introduction to MongoDB course to a new level. The course will now be available to 124 million global learners through the Coursera platform. Upon completion, learners will receive an electronic and physical certificate to add to their education portfolio. With this new partnership, learners will have access to official educational materials actively developed and maintained by MongoDB experts. For experienced SQL professionals looking to expand their skills, MongoDB University is rolling out a new online learning path: MongoDB for SQL Professionals. With this course, students will gain the knowledge and expertise they need to tackle complex non-relational databases and develop powerful, innovative applications. Since refreshing MongoDB University last November, more than 50,000 developers per month have taken advantage of the free, ungated courses, with more than 600 individuals becoming certified professionals. Normalizing diversity MongoDB is proud to also be partnering with leading diversity and inclusion advocacy groups to make software development a more welcoming profession for underrepresented groups. Through the new partnership with Women Who Code, MongoDB seeks to certify 100 members by the end of 2023.
Women Who Code is an international non-profit organization that provides programming assistance to women pursuing technology careers. MongoDB University is also partnering with Lesbians Who Tech & Allies to certify 100 members beginning in October 2023. Lesbians Who Tech & Allies is a community of LGBTQ women, non-binary, and transgender individuals in and around tech. Finally, by partnering with MyTechDev , a non-profit organization focused on empowering African students, MongoDB hopes to provide practical coding skills and specialization pathways in enterprise technologies. The initiative seeks to certify 500 people in Nigeria, South Africa, Kenya, and Egypt over the next two years. These exciting new partnerships are all part of our mission to empower innovators to create, transform, and disrupt industries by unleashing the power of software and data. To start learning with MongoDB University, visit learn.mongodb.com .
New Online Archive with Performance Improvements and Enhanced Metrics
We are excited to announce several new features coming to Atlas Online Archive. With these improvements, customers will observe higher performance, be able to choose which region hosts their archival data, and have greater insight into the data stored in the archive through improved metrics. New storage engine We have optimized Online Archive through the introduction of a new storage engine that captures metadata during archival to power faster query performance. The new underlying storage service minimizes the overall data scan when querying the Online Archive. Additionally, the new storage engine delivers further benefits in the form of storage optimizations like sorting, as well as ongoing rebalancing to deliver consistent performance over time. With this feature, you will experience faster Online Archive query performance while your overall costs decrease when querying against the archives. Ability to choose a storage region With the new Online Archives, users can select the storage region at the time of creating an Online Archive by choosing one of the supported Data Federation regions from a dropdown during the creation process. In addition, we display the closest region so users can make an informed decision about their storage region. Enhanced metrics With these new enhanced and improved metrics, we are helping customers better understand what data is in their archive, so that they can connect to the appropriate endpoint for their use case and have insight into the performance of archival jobs. With these enhanced metrics, users can see a dashboard that includes the total number of documents in the archive, the overall archive data size (with a tooltip to explain the metrics), the rate at which archival jobs are archiving data, the minimum and maximum date fields in the archive, and other statistics that help users increase the visibility of their archives.
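To make the region-selection feature concrete, the sketch below shows what an archiving rule with an explicitly chosen storage region might look like as a plain document. This is a hedged illustration: the database, collection, and date field names are invented for this example, and the exact field names should be checked against the current Online Archive documentation.

```python
# Hedged sketch: an Online Archive rule that ages out documents older than
# 90 days, with an explicitly chosen storage region. All names here are
# illustrative and should be verified against the Online Archive docs.

archive_rule = {
    "dbName": "factory",                 # source database (example name)
    "collName": "sensor_readings",       # source collection (example name)
    "criteria": {
        "type": "DATE",                  # archive based on a date field
        "dateField": "ts",               # field that determines document age
        "expireAfterDays": 90,           # move documents older than 90 days
    },
    # New capability described above: pick the Data Federation region
    # that hosts the archived data.
    "dataProcessRegion": {"cloudProvider": "AWS", "region": "US_EAST_1"},
}
```

Choosing a region close to the cluster (the UI suggests the closest one) keeps query latency against the archive low.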
Newly created archives

Note that only Online Archives created on or after 06/07/2023 will initially see the performance improvements and enhancements. All existing archives created before 06/07/2023 will continue to function as-is (without the new improvements and enhanced metrics). Over the coming year, we will migrate all existing archives created before 06/07/2023 to the new backend storage service. See the documentation for additional information about creating new Online Archives and Enhanced Metrics.
Dissecting Open Banking with MongoDB: Technical Challenges and Solutions
Thank you to Ainhoa Múgica for her contributions to this post.

Unleashing a disruptive wave in the banking industry, open banking (or open finance), as the term indicates, has compelled financial institutions (banks, insurers, fintechs, corporates, and even government bodies) to embrace a new era of transparency, collaboration, and innovation. This paradigm shift requires banks to openly share customer data with third-party providers (TPPs), driving enhanced customer experiences and fostering the development of innovative fintech solutions by combining ‘best-of-breed’ products and services. As of 2020, 24.7 million individuals worldwide used open banking services, a number that is forecast to reach 132.2 million by 2024. This rising trend fuels competition, spurs innovation, and fosters partnerships between traditional banks and agile fintech companies. In this transformative landscape, MongoDB, a leading developer data platform, plays a vital role in supporting open banking by providing a secure, scalable, and flexible infrastructure for managing and protecting shared customer data. By harnessing the power of MongoDB's technology, financial institutions can lower costs, improve customer experiences, and mitigate the potential risks associated with the widespread sharing of customer data through strict regulatory compliance.

Figure 1: An Example Open Banking Architecture

The essence of open banking/finance is leveraging common data exchange protocols to share financial data and services with third parties. In this blog, we will dive into the technical challenges and solutions of open banking from a data and data services perspective and explore how MongoDB empowers financial institutions to overcome these obstacles and unlock the full potential of this open ecosystem.

Dynamic environments and standards

As open banking standards continue to evolve, financial institutions must remain adaptable to meet changing regulations and industry demands.
Traditional relational databases often struggle to keep pace with the dynamic requirements of open banking due to their rigid schemas, which are difficult to change and manage over time. In countries without standardized open banking frameworks, banks and third-party providers face the challenge of developing multiple versions of APIs to integrate with different institutions, creating complexity and hindering interoperability. Fortunately, open banking standards or guidelines (e.g., in Europe, Singapore, Indonesia, Hong Kong, and Australia) have generally required or recommended that open APIs be RESTful and support the JSON data format, which creates a basis for common data exchange. MongoDB addresses these challenges by offering a flexible developer data platform that natively supports the JSON data format, simplifies data modeling, and enables flexible schema changes for developers. With features like the MongoDB Data API and GraphQL API, developers can reduce development and maintenance effort by easily exposing data in a low-code manner. The Stable API feature ensures compatibility during database upgrades, preventing code breaks and providing a seamless transition. Additionally, MongoDB provides productivity-boosting features like full-text search, data visualization, data federation, mobile database synchronization, and other app services, enabling developers to accelerate time-to-market. With MongoDB's capabilities, financial institutions and third-party providers can navigate the changing open banking landscape more effectively, foster collaboration, and deliver innovative solutions to customers. One client that leverages MongoDB's native JSON data management and flexibility is NatWest, a major retail and commercial bank in the United Kingdom based in London, England. The bank has grown from zero to 900 million API calls per month in a matter of years as open banking uptake grows, and expects that volume to increase tenfold in the coming years.
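The flexible, JSON-native document model described above can be illustrated with a short sketch; the collection, field names, and sample documents below are hypothetical, not taken from any specific open banking standard or bank:

```python
# Two account documents with different shapes stored side by side.
# All names and values here are illustrative assumptions.

# A current account as one bank might report it...
uk_account = {
    "accountId": "acc-001",
    "currency": "GBP",
    "scheme": "UK.OBIE.SortCodeAccountNumber",
    "sortCode": "12-34-56",
}

# ...and a wallet from a fintech partner, with extra nested fields.
# No schema migration is needed for the differing shapes.
sg_wallet = {
    "accountId": "acc-002",
    "currency": "SGD",
    "wallet": {"provider": "PayNow", "alias": "+65-8123-4567"},
}

def store_accounts(docs, uri="mongodb://localhost:27017"):
    # PyMongo is imported here so the sample documents above can be
    # inspected without the driver installed.
    from pymongo import MongoClient

    coll = MongoClient(uri)["openbanking"]["accounts"]
    coll.insert_many(docs)
    # Both shapes remain queryable through their shared fields.
    return list(coll.find({"currency": {"$in": ["GBP", "SGD"]}}))
```

Because both documents live in the same collection, a new partner's payload shape can be onboarded without altering existing records.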
At a MongoDB event on 15 Nov 2022, Jonathan Haggarty, NatWest's Head of “Bank of APIs” Technology – an API ecosystem that brings the retail bank's services to partners – shared in his presentation, titled Driving Customer Value using API Data, that NatWest's growing API ecosystem lets it “push a bunch of JSON data into MongoDB,” which makes it “easy to go from simple to quite complex information” and also makes it easier to obfuscate user details through data masking for customer privacy. This enables NatWest to surface customer data insights for partners via its API ecosystem – for example, “where customers are on the e-commerce spectrum,” the “best time [for retailers] to push discounts,” as well as insights on the “most valuable customers” – with the data being used for problem-solving, analytics and insight, and reporting.

Performance

In the dynamic landscape of open banking, meeting unpredictable demands for performance, scalability, and availability is crucial. The efficiency of applications and the overall customer experience heavily rely on the responsiveness of APIs. However, building an open banking platform becomes intricate when accommodating third-party providers with undisclosed business and technical requirements. Without careful management, this can lead to unforeseen performance issues and increased costs. Open banking demands high API performance under all kinds of workload volumes. The OBIE recommends an average TTLB (time to last byte) of 750 ms per endpoint response for all payment initiations (except file payments) and account information APIs. Compliance with regulatory service level agreements (SLAs) in certain jurisdictions further adds to the complexity. Legacy architectures and databases often struggle to meet these demanding criteria, necessitating extensive changes to ensure scalability and optimal performance. That's where MongoDB comes into play.
MongoDB is purpose-built to deliver exceptional performance with its WiredTiger storage engine and its compression capabilities. MongoDB Atlas improves performance further through intelligent index and schema suggestions, automatic data tiering, and workload isolation for analytics. One prime illustration of these capabilities comes from Temenos, a renowned financial services application provider, which achieved remarkable transaction-processing performance and efficiency with MongoDB Atlas. In a recent benchmark with MongoDB Atlas and Microsoft Azure, Temenos successfully processed an astounding 200 million embedded finance loans and 100 million retail accounts at a record-breaking 150,000 transactions per second. This showcases the power and scalability of MongoDB, with unparalleled performance that empowers financial institutions to effectively tackle the challenges posed by open banking. MongoDB ensures outstanding performance, scalability, and availability to meet the ever-evolving demands of the industry.

Scalability

Building a platform to serve TPPs, who may not disclose their business usage or technical and performance requirements, can introduce unpredictable performance and cost issues if not managed carefully. For instance, a bank in Singapore faced an issue where its Open APIs experienced peak loads and crashes every Wednesday. After investigation, the bank discovered that one of its TPPs ran a promotional campaign every Wednesday, resulting in a surge of API calls that overwhelmed the bank's infrastructure. Beyond meeting the performance requirements of a known volume of transactions, a scalable solution that can perform under unpredictable workloads is critical. MongoDB's flexible architecture and scalability features address these concerns effectively. With its distributed document-based data model, MongoDB allows for seamless scaling both vertically and horizontally.
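As a hedged sketch of what enabling horizontal scaling might look like in practice, a collection can be sharded with a couple of admin commands; the database, collection, and key names below are illustrative, and the commands must run against a sharded cluster (mongos):

```python
# Illustrative shard key: hashing the account ID spreads writes evenly
# across shards. The names here are placeholder assumptions.
SHARD_KEY = {"accountId": "hashed"}

def shard_transactions(uri="mongodb://localhost:27017"):
    # PyMongo is imported here so SHARD_KEY can be inspected without
    # the driver installed.
    from pymongo import MongoClient

    client = MongoClient(uri)
    # Both commands are issued against the admin database.
    client.admin.command("enableSharding", "payments")
    client.admin.command(
        "shardCollection",
        "payments.transactions",  # namespace: <db>.<collection>
        key=SHARD_KEY,
    )
```

A hashed key trades range-query locality for even write distribution, which suits high-volume transactional workloads like the surge scenario described above.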
By leveraging sharding, data can be distributed across multiple nodes, ensuring efficient resource utilization and enabling the system to handle high transaction volumes without compromising performance. MongoDB's auto-sharding capability enables dynamic scaling as the workload grows, giving financial institutions the flexibility to adapt to changing demands and ensuring a smooth and scalable open banking infrastructure.

Availability

In the realm of open banking, availability becomes a critical challenge. With increased reliance on banking services by third-party providers (TPPs), ensuring consistent availability becomes more complex. Previously, banks could bring certain services down during off-peak hours for maintenance. However, with TPPs offering 24x7 experiences, any downtime is unacceptable. This places greater pressure on banks to maintain constant availability for Open API services, even during planned maintenance windows or unforeseen events. MongoDB Atlas, the fully managed global cloud database service, addresses these availability challenges effectively. With its multi-node cluster and multi-cloud DBaaS capabilities, MongoDB Atlas ensures high availability and fault tolerance. It offers the flexibility to run on multiple leading cloud providers, allowing banks to minimize concentration risk and achieve higher availability through a distributed cluster that spans different cloud platforms. The robust replication and failover mechanisms provided by MongoDB Atlas guarantee uninterrupted service and enable financial institutions to provide reliable, always-available open banking APIs to their customers and TPPs.

Security and privacy

Data security and consent management are paramount concerns for banks participating in open banking. The exposure of authentication and authorization mechanisms to third-party providers raises security concerns and introduces technical complexities around data protection.
Banks require fine-grained access control and encryption mechanisms to safeguard shared data, including managing data-sharing consent at a granular level. Furthermore, banks must navigate data privacy laws like the General Data Protection Regulation (GDPR), which impose strict requirements distinct from traditional banking regulations. MongoDB offers a range of solutions to address these security and privacy challenges effectively. Queryable Encryption provides a mechanism for managing encrypted data within MongoDB, ensuring sensitive information remains secure even when shared with third-party providers. MongoDB's comprehensive encryption features cover data at rest and data in transit, protecting data throughout its lifecycle. MongoDB's flexible schema allows financial institutions to capture diverse data requirements for managing data-sharing consent and to unify user consent from different countries into a single data store, simplifying compliance with complex data privacy laws. Additionally, MongoDB's geo-sharding capabilities enable compliance with data residency laws by keeping the relevant data and consent information in the closest cloud data center, while providing optimal response times for accessing that data. To enhance data privacy further, MongoDB offers field-level encryption, enabling symmetric encryption at the field level to protect sensitive data (e.g., personally identifiable information) even when shared with TPPs. Randomized encryption of fields adds an additional layer of security, and Queryable Encryption still allows query operations on that encrypted data while defending against cryptanalysis, ensuring customer data remains protected and confidential within the open banking ecosystem.
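The granular consent management described above can be modeled as a single flexible document; the field names and the helper below are a hypothetical sketch, not a prescribed schema or a specific MongoDB feature:

```python
# A granular data-sharing consent record as one document. Per-scope
# flags let a customer share balances with a TPP but not transactions.
# All names and dates here are illustrative assumptions.
from datetime import datetime, timezone

consent = {
    "customerId": "cust-42",
    "tpp": "budget-app.example",
    "grantedAt": datetime(2023, 6, 1, tzinfo=timezone.utc),
    "expiresAt": datetime(2023, 9, 1, tzinfo=timezone.utc),
    "scopes": {
        "accounts.balance": True,
        "accounts.transactions": False,
    },
}

def is_allowed(consent_doc, scope, now=None):
    # A request is allowed only if the grant has not expired AND the
    # specific scope was granted.
    now = now or datetime.now(timezone.utc)
    if now >= consent_doc["expiresAt"]:
        return False
    return bool(consent_doc["scopes"].get(scope, False))
```

Because the scopes map is just a nested document, new scope names from other jurisdictions can be added without any schema migration.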
Activity monitoring

With the numerous APIs offered by banks in the open banking ecosystem, activity monitoring and troubleshooting become critical to maintaining a robust and secure infrastructure. MongoDB simplifies activity monitoring through its monitoring tools and auditing capabilities. Administrators and users can track system activity at a granular level, monitoring both database system and application events. MongoDB Atlas also provides Administration APIs, which can be used to manage the Atlas service programmatically. For example, the Atlas Administration API can create database deployments, add users to those deployments, monitor those deployments, and more. These APIs help automate CI/CD pipelines as well as the monitoring of activity on the data platform, freeing developers and administrators from this mundane effort to focus on generating more business value. Performance monitoring tools, including the Performance Advisor, help gauge and optimize system performance, ensuring that APIs deliver exceptional user experiences.

Figure 2: Activity Monitoring on MongoDB Atlas

MongoDB Atlas Charts, an integrated feature of MongoDB Atlas, offers analytics and visualization capabilities. Financial institutions can create business intelligence dashboards using MongoDB Atlas Charts, eliminating the need for the expensive licensing associated with traditional business intelligence tools and making it cost-effective as more TPPs utilize the APIs. With MongoDB Atlas Charts, financial institutions can offer comprehensive business telemetry data to TPPs, such as the number of insurance quotations, policy transactions, API call volumes, and performance metrics. These insights empower financial institutions to make data-driven decisions, improve operational efficiency, and optimize the customer experience in the open banking ecosystem.
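As a hedged sketch of the programmatic monitoring mentioned above, the snippet below pulls process measurements from the Atlas Administration API; the project and process IDs, the API key pair, and the query parameters are placeholder assumptions:

```python
# Fetch recent measurements for an Atlas process (a hedged sketch;
# all identifiers are placeholders).

ATLAS_BASE = "https://cloud.mongodb.com/api/atlas/v1.0"

def process_measurements_url(group_id: str, process_id: str) -> str:
    # A "process" is a host:port of one mongod/mongos in the cluster.
    return f"{ATLAS_BASE}/groups/{group_id}/processes/{process_id}/measurements"

def fetch_measurements(group_id, process_id, public_key, private_key):
    # requests is imported here so the URL helper above stays usable
    # without the library installed.
    import requests
    from requests.auth import HTTPDigestAuth

    resp = requests.get(
        process_measurements_url(group_id, process_id),
        # Last hour of data at one-minute granularity (ISO 8601 durations).
        params={"granularity": "PT1M", "period": "PT1H"},
        auth=HTTPDigestAuth(public_key, private_key),
    )
    resp.raise_for_status()
    return resp.json()
```

A scheduled job built on a call like this could feed API-facing telemetry dashboards without manual effort.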
Figure 3: Atlas Charts Sample Dashboard

Real-Timeliness

Open banking introduces new challenges for financial institutions as they strive to serve and scale amidst unpredictable workloads from TPPs. While static content poses fewer difficulties, APIs requiring real-time updates or continuous streaming, such as dynamic account balances or ESG-adjusted credit scores, demand near-real-time data delivery. To let applications react immediately to changes as they occur, organizations can leverage MongoDB Change Streams, which are built on the aggregation framework, to react to data changes in a single collection, a database, or even an entire deployment. This capability further enhances MongoDB's real-time data and event processing and analytics capabilities. MongoDB also offers multiple mechanisms to support data streaming, including a Kafka connector for event-driven architectures and a Spark connector for streaming with Spark. These solutions empower financial institutions to meet the real-time data needs of their open banking partners effectively, enabling seamless integration and real-time data delivery for enhanced customer experiences.

Conclusion

MongoDB's technical capabilities position it as a key enabler for financial institutions embarking on their open banking journey. From managing dynamic environments and accommodating unpredictable workloads to ensuring scalability, availability, security, and privacy, MongoDB provides a comprehensive set of tools and features to address the challenges of open banking effectively. With MongoDB as the underlying infrastructure, financial institutions can navigate the ever-evolving open banking landscape with confidence, delivering innovative solutions and driving the future of banking. Embracing MongoDB empowers financial institutions to unlock the full potential of open banking and provide exceptional customer experiences in this era of collaboration and digital transformation.
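The Change Streams capability discussed under Real-Timeliness can be sketched in a few lines of PyMongo; the collection, field names, and filter below are illustrative assumptions, and change streams require a replica set or sharded cluster:

```python
# Watch only updates that touch the balance field, so a consumer can
# push fresh balances to TPPs as they change. Names are illustrative.
BALANCE_PIPELINE = [
    {"$match": {
        "operationType": "update",
        "updateDescription.updatedFields.balance": {"$exists": True},
    }}
]

def watch_balance_updates(uri="mongodb://localhost:27017"):
    # PyMongo is imported here so BALANCE_PIPELINE can be inspected
    # without the driver installed.
    from pymongo import MongoClient

    coll = MongoClient(uri)["openbanking"]["accounts"]
    # watch() opens a change stream; iteration blocks until changes arrive.
    with coll.watch(BALANCE_PIPELINE) as stream:
        for change in stream:
            account = change["documentKey"]["_id"]
            new_balance = change["updateDescription"]["updatedFields"]["balance"]
            print(f"account {account} balance is now {new_balance}")
```

The same pipeline shape could feed a Kafka producer instead of a print loop for the event-driven architectures mentioned above.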
If you would like to learn more about how you can leverage MongoDB for your open banking infrastructure, take a look at the below resources:

Open banking panel discussion: future-proof your bank in a world of changing data and API standards with MongoDB, Celent, Icon Solutions, and AWS
How a data mesh facilitates open banking
Financial services hub