Discover the consequences of neglecting meticulous record-keeping and understand why it’s crucial for sustained success in the financial industry.
Shelby and Abrar, could you kindly share your professional journey and involvement in developing Arteria AI, emphasizing the significance of data and documentation in the banking industry?
Shelby: I was first inspired by the documentation opportunity while I was a partner (many years ago) at one of the “Seven Sister” Canadian law firms. I specialized in M&A and litigation and watched thousands of documents get printed and put into folders for due diligence and document reviews. Large teams of lawyers would then sift through everything, pencils in hand, without the use of technology. This seemed so crazy to me that I left the partnership and solo-founded my first startup, with Abrar as one of my first employees. It combined early principles of data and technology with innovative organizational design to run large-scale due diligence processes.
The growth was meteoric and we were acquired by Deloitte. I always joke that if I hadn’t been pregnant with my first child I might have done something more with it, but the acquisition kicked off an amazing ride at Deloitte. With Abrar’s help, I ran several portfolios for Deloitte, including data, analytics, growth, innovation and AI, and it was during that time that we met our other co-founder, Jonathan Wong.
At some stage we realized we weren’t quite finished with documentation, and there was a burning need on the industry side for an enterprise-grade solution. There were a few players in the market that were largely focused on making corporate legal and procurement functions more efficient, but nothing that supported the documentation pain points of the business. This was particularly relevant in financial services.
So, we started building something inside of Deloitte, incubated it for a couple of years, and then spun it out into a now wholly independent startup (Arteria). Financial services has always been our core area of focus, and we’re super proud to work with some of the largest institutions in the world.
Considering the pivotal role of documentation in the banking industry, could you elaborate on why it’s so imperative for banks to prioritize data and documentation? What consequences might arise from neglecting this vital aspect?
Nearly every critical process in a financial institution is underpinned by a series of documents (e.g., booking a trade, originating a loan, onboarding a new client). It’s estimated that approximately 90% of data inside an enterprise is unstructured, and less than 1% of data is used in decision-making. Typically, the most prevalent medium for unstructured data is the document.
Information becomes static as soon as it moves into the document, which forces the overall process to become highly manual (think: email-driven workflows, CTRL+F for previous language, manual data entry, etc.). As a result, core business processes are hamstrung – very limited data is used in decision-making and end-to-end automation is not possible.
Could you provide insights into how challenges with unstructured data impact financial institutions? Why is it crucial for them to address these challenges, especially in the current regulatory environment?
It’s probably helpful to first understand how documentation infrastructure addresses the unstructured data problem. Arteria’s north star is to be data-first, meaning we digitize the document (i.e., structure its data) at the outset. If we are generating a document, a data model is applied automatically; if the document already exists, we append one. In either case, the data model continues to update as the document evolves, meaning we understand what’s inside the document at all times. As a result, we can serve up insights to augment decision-making, drive intelligent automation, and enable real-time transparency into the documentation portfolio.
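The data-first idea can be sketched in a few lines: the document is paired with a structured record that is populated when the document is created and refreshed on every amendment, so its data never goes static. This is a minimal illustration only; the class and method names here are hypothetical and do not represent Arteria's actual API or extraction models.

```python
from dataclasses import dataclass, field

@dataclass
class DocumentRecord:
    """A document paired with a structured data model kept in sync."""
    text: str
    data: dict = field(default_factory=dict)

    def extract(self) -> None:
        # Stand-in for a real extraction model: pull "Key: Value"
        # lines out of the document text into structured fields.
        self.data = {}
        for line in self.text.splitlines():
            if ":" in line:
                key, value = line.split(":", 1)
                self.data[key.strip()] = value.strip()

    def amend(self, new_text: str) -> None:
        # When the document evolves, the data model updates with it,
        # so downstream systems always see current values.
        self.text = new_text
        self.extract()

doc = DocumentRecord("Counterparty: Acme Bank\nNotional: 5,000,000")
doc.extract()

doc.amend("Counterparty: Acme Bank\nNotional: 7,500,000")
```

Because the record updates with the document, reporting and downstream systems can read `doc.data` directly instead of waiting for manual re-keying.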
On the regulatory side, the combination of new requirements and the ever-growing complexity of products and business processes creates a significant reporting pain point. Almost always, the document is the golden source of information, and its data must be structured for it to be useful to core systems and processes (such as reporting). This is currently handled by manual data-entry processes that are expensive, inefficient, and limited in scope.
With documentation infrastructure, data seamlessly flows out of the document and reporting is available in real time, ensuring the institution can respond quickly to regulatory requests and monitor obligations.
Moreover, if there is a regulatory event that triggers repapering, the process is completely streamlined as the data already exists in the documentation portfolio.
Of course, the regulatory angle extends beyond reporting and repapering alone, and specific compliance initiatives can often be augmented by documentation. For example, the industry’s move to T+1 requires end-to-end efficiency in the post-trade lifecycle. This is a hot use case for us, as automating the documentation in post-trade adds significant gains to process-level efficiency.
In your opinion, how can banks effectively leverage AI to address the prevalent issue of unstructured data? What specific benefits might an AI-driven approach bring to modernizing documentation infrastructure in the banking sector?
Historically, there have been different vendors for digitizing documentation (i.e., using AI models to structure document data) and for managing the generation and workflow around documentation. To the end user, the distinction is minimal – whether or not a document already exists, the capability set that drives automation and intelligence should be the same. The capability to ingest an existing document, stack a data model, and drive digitization in the documentation layer is where our clients are getting the most value with AI. In particular, models must be able to work with longer-form, complex documents in order to be widely applicable across an institution.
There is also the element of configuring the platform. Arteria has a no-code module that equips functional business users with the tools to effectively self-implement without requiring technical expertise. We are using AI to further drive down time to value by automating Arteria’s configuration.
How does the implementation of AI in documentation processes contribute to revenue enhancement, improved efficiency, cost-effectiveness, enhanced risk management, and informed decision-making for financial institutions?
AI-enabled documentation infrastructure is effectively a layer to connect documentation, a core part of nearly every process, to the automation lifecycle at an institution.
This drives automation and intelligence to all involved stakeholders to increase efficiency at a process level.
The business value varies by use case, but typically falls into speed, capital efficiency, risk controls and client experience. In trading onboarding, for example, documentation infrastructure automates document generation, accelerates negotiation through intelligent workflow tools, and enables the seamless flow of data between documents and core systems. The result is reduced time to trade (i.e., faster time to revenue), decreased operational cost, standardized risk controls and full visibility into the document portfolio. It’s also one of the first and most important touchpoints between the institution and the client, so speed adds significant value from a client service perspective.
In commercial lending, the origination process requires several documents to be created at different points of the workflow, each building on the previous document. The ability to automatically generate the next document using the data from the last significantly increases efficiency, which is particularly important as this is a front-office activity.
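The chained generation described above can be sketched simply: data captured from one origination document seeds the template for the next, so nothing is re-keyed by hand. The field names and template below are invented for illustration and assume the structured data from the prior document is already available.

```python
from string import Template

# Hypothetical structured data captured from an earlier document
# in the origination workflow (e.g., a term sheet).
term_sheet_data = {
    "borrower": "Acme Corp",
    "amount": "5,000,000",
    "rate": "7.25%",
}

# The next document in the chain is generated directly from that
# data, rather than being drafted and re-keyed manually.
commitment_letter = Template(
    "Commitment Letter\n"
    "Borrower: $borrower\n"
    "Facility amount: $amount at $rate"
).substitute(term_sheet_data)

print(commitment_letter)
```

In practice the templates and data models are far richer, but the principle is the same: each document's structured data becomes the input to the next step of the workflow.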
On a personal level, could you kindly share some strategies you employ to stay ahead in the rapidly evolving field of AI and data management?
Staying on top of the latest technology and techniques is critical, especially because the rate of change in AI is so fast. We are fortunate to have an incredible Head of Data Science, Dr. Amir Hajian, who holds a PhD from Princeton and led science at Thomson Reuters. He and his team monitor the latest developments in AI on a daily basis, and he makes sure we are approaching AI with the latest techniques and thinking. We also invest in research that is relevant to financial services documentation. As an example, we are thrilled that a research paper we wrote at Arteria has recently been accepted into a leading AI conference. Our clients trust us to be strong thought leadership partners in the space. We have a real responsibility not just to be up to speed on the latest trends, but also to work on solutions to problems the scientific community hasn’t quite figured out yet.
For professionals entering AI or data management, what personal advice would you offer to help them navigate challenges and contribute meaningfully to the industry?
The AI ecosystem looks very different now than it did when we entered the space. One strategic approach that has stood the test of time is the importance of a tangible value proposition. We strongly believe that AI-focused businesses need to be anchored in business value. It’s easy to get sucked into research for research’s sake – AI must be deployed to solve specific problems that drive tangible business value for clients. All businesses will eventually fail if there isn’t a market for the solution.
As pioneers in AI-driven documentation, what future trends do you foresee in how banks manage their data and documentation, and how can they prepare for these evolving dynamics?
Documentation infrastructure will follow a similar cycle to previous significant enterprise technologies in that it will soon become table stakes. The problem set is ubiquitous across every part of the institution. In terms of preparation, it’s critical that institutions evaluate vendors for applicability across the enterprise. Having unique widgets cover specific documentation use cases may work in the short term, but in time, tech stacks will consolidate and the players who chose enterprise-grade software from the outset will be ahead of the pack. It’s also critical to have the right people at the table when evaluating potential solutions. In almost every case, documentation infrastructure has a front-office value proposition, so business users should have a strong say when it comes to evaluation.
In conclusion, could you both share any final thoughts or key takeaways regarding the transformative impact of AI on documentation infrastructure in the banking sector and its broader implications for the industry?
The only thing I’ll add on top of what we’ve already covered is the size of the opportunity. Financial institutions have gone through several waves of digital transformations, and nearly every one has focused on systems and processes with structured data. New capabilities that extend the transformation to unstructured parts of the institution have created a vast opportunity of untapped value. McKinsey estimates that there is a $20 trillion tech-enabled value creation opportunity in banking alone, and we strongly believe that unstructured data will be a critical factor in the next iteration of change.