Workforce intelligence
AI in Finance

[CFO Roundtable] Scaling Finance with AI: What’s working, and what’s next

Key Highlights

Complex, non-deterministic questions are the next AI frontier
AI has gotten good at answering specific, structured questions — but finance teams still struggle to get reliable answers to open-ended questions like "why did churn increase last month?" Marten Abrahamsen describes a framework Vercel uses to categorize workflows by complexity and predictability, and explains why cracking non-deterministic, complex analysis would be a genuine game-changer for FP&A teams.
Jump to
20:58
The CFO-CPO relationship is more important than ever
As AI creates pressure to reduce headcount and restructure workflows, the partnership between Finance and People leaders has become critical. Ben Gammell describes how he and Brex's Chief People Officer hold weekly one-on-ones specifically to work through organizational changes together, making sure efficiency goals don't come at the expense of a humane employee experience.
Jump to
34:56
How to run an AI pilot that actually leads to a decision
Pilots fail when success isn't defined upfront. Ben Gammell walks through how Brex validated their AI customer service platform by running a structured back-test first, then gradually exposing a small percentage of live chats to the tool before scaling — giving them a clear confidence interval on efficacy before committing to a full deployment.
Jump to
43:29
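The staged rollout in the third highlight turns on having "a clear confidence interval on efficacy" before scaling. As a rough illustration of that idea, here is how a back-test sample converts into such a bound; the sample numbers and the choice of a Wilson score interval are illustrative assumptions, not figures from the panel:

```python
import math

def wilson_interval(successes: int, trials: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a success proportion."""
    if trials == 0:
        return (0.0, 1.0)
    p = successes / trials
    denom = 1 + z**2 / trials
    center = (p + z**2 / (2 * trials)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2))
    return (center - margin, center + margin)

# Hypothetical back-test: the agent resolved 870 of 1,000 historical chats correctly.
lo, hi = wilson_interval(870, 1000)
print(f"Back-test resolution rate: 87.0% (95% CI: {lo:.1%} to {hi:.1%})")
```

On these assumed numbers the interval comes out to roughly 84.8% to 88.9%, the kind of bound that lets a team decide whether to expose a small slice of live chats next.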

AI is reshaping how finance teams operate, but the biggest change is not the tools. It is how CFOs and Finance leaders rethink workflows, accountability, and scale. In this panel discussion, finance executives from SeatGeek, Brex, and Vercel share how they are adopting AI inside the Office of Finance without sacrificing trust, controls, or compliance.

Whether you are modernizing FP&A, improving close processes, or evaluating new AI tools, this conversation offers real-world guidance from operators scaling high-growth companies.

In this session, you will learn how finance teams are using AI to:

  • Move from reactive month-end workflows to proactive, predictive operations
  • Automate variance analysis, flux reporting, and repetitive manual tasks
  • Build internal AI tools without dedicated engineering resources
  • Decide what to build versus buy and measure ROI
  • Create secure sandbox environments for experimentation
  • Balance automation with auditability, compliance, and human review

About the speakers

Teddy Collins, EVP Finance, SeatGeek

Teddy Collins is EVP of Finance at SeatGeek, a live entertainment ticketing platform. He has been an early adopter of AI within the Office of Finance and moderated this panel discussion. At SeatGeek, his team has gone from zero internally developed AI tools to roughly ten agents, with new ones being added on a weekly basis. He is also a longtime TeamOhana customer.

Ben Gammell, President & CFO, Brex

Ben Gammell is President and CFO of Brex, a corporate spend management platform serving tens of thousands of businesses. Before taking on the finance and go-to-market leadership at Brex, he led operations at SoFi and held investment roles at Goldman Sachs. 

His background spans both the operational and financial sides of high-growth companies, and at Brex he sits at the center of one of the most active AI transformations in fintech — both in how the company builds products for finance teams and how its own Office of Finance operates.

Marten Abrahamsen, CFO, Vercel

Marten Abrahamsen is CFO of Vercel, a developer tools company known for its AI-native infrastructure platform. He previously served as CFO of Fundbox, a working capital solutions provider for small businesses, and began his career in private equity. 

At Vercel, Abrahamsen has built a finance culture centered on experimentation and internal tool-building, where team members are expected to use AI in their daily workflows and are evaluated on that ability starting from the interview process.

Takeaway #1: AI shifts Finance from reactive to proactive

For years, finance teams have lived in a reactive loop: close the books, chase receipts, explain last month's numbers. AI is starting to break that pattern. As Ben Gammell puts it, the goal is to embed intelligence directly into financial workflows so that issues get flagged — or resolved — before they become problems.

“Historically, finance has been quite a reactive function. AI is taking us a step further — how do you actually make it more predictive?”

— Ben Gammell, President & CFO, Brex

At Brex, that looks like an audit agent that reviews all company expenses, identifies anything outside policy, and proactively surfaces repayment recommendations — without a finance professional in the loop for the routine work. 

At Vercel, it looks like an AI-powered budget versus actuals tool connected to NetSuite and Stripe that lets anyone on the finance team ask plain-language questions about spend changes without waiting on the data team to run a SQL query.
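A tool like this, where LLM-generated SQL runs straight against the warehouse, generally needs a guard between the model and the database. Below is a minimal sketch of such a read-only check; the table names are hypothetical, and this illustrates the general pattern rather than Vercel's actual implementation:

```python
import re

# Hypothetical allowlist of warehouse tables the finance tool may read.
ALLOWED_TABLES = {"netsuite_gl", "stripe_charges"}

def is_safe_query(sql: str) -> bool:
    """Accept only a single read-only SELECT over allowlisted tables."""
    stmt = sql.strip().rstrip(";")
    if ";" in stmt:  # reject stacked statements
        return False
    if not re.match(r"(?is)^\s*select\b", stmt):  # must start with SELECT
        return False
    if re.search(r"(?i)\b(insert|update|delete|drop|alter|merge|grant|create)\b", stmt):
        return False
    tables = {t.lower() for t in re.findall(r"(?i)\b(?:from|join)\s+([a-z_][a-z0-9_]*)", stmt)}
    return tables.issubset(ALLOWED_TABLES)

print(is_safe_query("SELECT vendor, SUM(amount) FROM netsuite_gl GROUP BY vendor"))  # True
print(is_safe_query("DROP TABLE netsuite_gl"))  # False
```

A real deployment would also run the generated query under a read-only database role; an allowlist check like this is defense in depth, not a substitute for warehouse permissions.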

The common thread is speed and access. AI doesn't replace the finance team's judgment — it removes the friction between a question and an answer, so the team can spend more time on decisions instead of data retrieval.

Takeaway #2: Finance teams can build their own tools now

One of the more surprising themes in this conversation is how much finance teams are building themselves. Marten Abrahamsen walked through how Vercel's finance and accounting teams built their own internal tools — including a revenue recognition app for ASC 606 compliance — using natural language prompts, with no engineering help required. 

The accounting team even built their own version with a slightly different output format because they had different preferences. This matters because, in the past, getting engineering resources to build internal finance tools was nearly impossible at a high-growth tech company, where engineers prioritize product work above all else.

To make self-serve tool-building work at scale, both Vercel and Brex have put some structure around it. A few things they've found helpful:

  • Separating "small software" — lightweight, individual-use tools — from more centrally managed applications that connect to sensitive data sources like Snowflake or NetSuite, which require tighter security controls
  • Creating a sandboxed internal agent platform where employees can experiment with prompts and agentic workflows without risking company or customer data
  • Recognizing that junior team members are often the most active builders, because they're closest to the repetitive, automatable work

The more senior you get, the more your work involves judgment calls that are harder to hand off to an agent, but the tools junior team members build tend to benefit everyone. As Abrahamsen's CTO likes to say: ask your employees what they hate most about their job, and the answer to that question is what you should build.

Takeaway #3: Build vs. buy requires deliberate thinking

Every finance team faces the same question: should we build this ourselves or buy a vendor solution? The answer, according to the panelists, depends on two things — how unique your use case is, and how much ongoing maintenance you're willing to take on.

Ben Gammell described Brex's decision to build its own automated customer onboarding flow, because it required combining proprietary KYC data with internal credit underwriting models in a way no outside vendor could replicate. 

On the other hand, for customer service AI, Brex bought from Sierra AI — because building a best-in-class chat agent wasn't where Brex had a unique advantage, and a dedicated vendor would build a better product by virtue of serving many customers across many industries. The hidden cost of building is maintenance, and that ongoing work adds up.

"Whilst a lot of these tools and agents are very easy to build, you also have to maintain them and make sure they're kept up to date. There is an inherent tax to that."

— Ben Gammell, President & CFO, Brex

Brex learned this firsthand during the zero-interest-rate era, when they built almost everything internally. The lesson they took from that period was to be more intentional: build only where you have a genuine edge, and buy where a vendor is better positioned. 

Marten Abrahamsen echoed the same approach at Vercel, pointing to their use of Ironclad AI for contract review as an example of a tool worth buying. During a fundraise, what previously required hiring a law firm to manually review hundreds of customer contracts was reduced to a single prompt — instantly producing a list of contracts with M&A termination clauses.

Takeaway #4: AI adoption is a culture challenge, not a technology one

Getting a finance team to actually use AI is more about culture than capability. The tools are accessible. The barrier is inertia — especially for finance professionals who aren't used to starting their day by building something. Ben Gammell admitted that even he was slow to adopt AI at work, because between back-to-back meetings and pressing deliverables, it was hard to carve out time to learn a new tool.

Both Brex and Vercel have tackled this by being deliberate about how they create space for experimentation. Some of the approaches that have worked:

  • Hackathons and demo days give team members unstructured time to play with AI tools without the pressure of a deadline or deliverable
  • Vercel builds AI tool-building into their hiring process — candidates are asked to use an AI development platform as part of their case study interview, setting the expectation before someone joins the team
  • Monthly team showcases where employees share what they've built reinforce a culture where experimentation is recognized and encouraged

Marten Abrahamsen summarized Vercel's internal philosophy as wanting people to get "speeding tickets, not parking tickets." The goal is to move fast and experiment, and the real failure is not trying. Leadership modeling matters too: when a CFO shows up to a team meeting and says they spent an hour on the weekend building something rough, it gives everyone else permission to do the same.

Transcript

Teddy Collins: [00:00:00] Hey everyone. Thanks. Thanks Tushar. Uh, my name's Teddy Collins. I'm the EVP of, of finance and happy longtime, uh, TeamOhana customer. And so, got, uh, two, yeah, great CFOs here with kind of long storied histories across operations and, and other verticals.

First, we've got, uh, Ben Gammell, who's the CFO and president of, of Brex, and he kind of came in, kind of headed operations, then took over finance, then the go to market motion for tens of thousands of customers. So great to see there. Uh, and then before that, um, led operations at, at SoFi and then, um, various kind of investment, uh, positions, uh, starting with, with Goldman Sachs. And similar theme, uh, Marten, um, also kind of got his start in the investment area and, and private equity, uh, and then was CFO, um, at, at Fundbox, which provided working capital solutions for, for small businesses.

And, um, you know, most recently, and, and [00:01:00] currently, CFO at, at Vercel, which provides, uh, developer tools for, uh, for folks to build and deploy and scale, you know, high performance AI apps. So thank you. Thank you for joining.

Ben Gammell: Thank you for having us. Thank you. Yeah. 

Teddy Collins: And I just kind of wanna get started. Both your companies are, you know, AI native, so kind of on the cutting edge of, of AI and kind of given your, you know, background in operations and finance, how does that kinda shape your vision on what the purpose, what AI can do for your company and specifically within the, the Office of Finance?

Ben, I love for you to kick that off for us. 

Ben Gammell: Sure. Happy to. Well, firstly, um, unlike Marten, I, I don't have the luxury of being an AI native company. But we're going through that transformational shift as a company to, to become one. So we were started in 2017, for context, so existed far beyond, uh, far before I should say, this, uh, recent AI wave.

As we think about finance, I think that, you know, historically finance has been quite a reactive function and [00:02:00] role where we're closing the books after the month closes, or we're chasing receipts of payments that have already gone out. And I think what we applied at Brex from like 2018 onwards was trying to create a sort of proactive motion within finance.

So how do we embed sort of intelligence into workflows that finance professionals engage with, whether it's expenses, cards, payment flows in general, so that you can be a bit more on the front foot. I think AI for us is taking us a step further than that, which is like, how do you actually make it more predictive?

So how can AI use the context or the agents, use the context in which it operates around your expense policy, how you think about workflows to actually be able to proactively and predictively sort of resolve issues before they even become issues. So an example of this is we have an audit agent whereby that agent can go and summarize all the expenses for our company.

It can actually proactively suggest these are the repayments that should exist because they fall outside of your expense policy, and even, you know, take it a step further of actually automating and taking action on that workflow. So you don't actually need a finance person [00:03:00] in the loop for the sort of more mundane and execution oriented aspects of the work.

So that's how we think about it. Marten?

Marten Abrahamsen: Yeah, so we don't sell into CFOs, we sell mostly into developers. So AI and the finance org at Vercel has been mostly around improving internal workflows. Um, so just a couple of examples. Um, we have like an ASC 606 revenue recognition, uh, tool that we just built ourselves.

You can take any contract or any bulk upload of contracts, uh, plug it in there, and you sort of get a highlight of which are the contracts that have, like, some, you know, special terms that you should make sure you, um, address, uh, individually. And here are all the contracts that are like super boilerplate, uh, straightforward and can just be recognized, um, pro rata.

Um, so that's like one tool that we have. Uh, we use it for BvA, so we connect into, um, NetSuite and, and Stripe data. It's kind of like a B minus quality, candidly. Um, but it's, it's, it's pretty fast, right? So you can ask a quick question like, hey, what was the change in [00:04:00] spend from this vendor month over month or quarter over quarter, or whatever.

And you can get an answer really, really fast. So it helps sort of speed up the process and make data more accessible, sort of at the fingertips. Um, we, in the past, were pretty dependent on help from like our data team to run like advanced SQL queries. Uh, so the data team actually built this, um, it's called internally d0.

We have a tool called v0, so the D is for data, um, where you can now just write pretty complex, uh, SQL queries using an LLM and have that be, um, run straight against like our Snowflake database. So you're much less dependent on, um, sort of, help from the data team to run more complex, uh, you know, requests and workflows.

So for us, it's just been making every individual on the team more efficient, uh, able to self-serve more. Uh, and as a result, I think we need slightly fewer people. But we're pretty far from, I think, the, the stage of like, you no longer need anyone in finance and you can just have AI like [00:05:00] close the books for you and do everything.

I think that will take some time. Um, we're of course biased, but, uh, huge sort of efficiency improvements. Uh, you might be able to do it with a slightly leaner team than, than in the past.

Teddy Collins: Yeah. I know, Marten, you said you don't kind of sell these applications outside of Vercel, but I, from everything you've described so far, I, I think you kind of maybe owe it to the world to, you know, release those.

Marten Abrahamsen: I think some of them are actually up as case studies on our website. Um, uh, so you can go and download them there. They're free, open source. Um, but yeah, some of these other ones, I mean, you can just make them yourself. Like the ASC 606 one was like, I, I actually built the first version of it by prompting in plain English language.

It's gonna sound like a huge plug for Vercel, but, uh, I, like, literally went to v0, which is like a Vercel product. I typed in what I wanted, I got a draft. Uh, the output, it looked how I thought it would look, but then when you uploaded the contract, the, uh, output was not how I wanted it. So then I just asked it to change, and I needed a couple of attempts because maybe I wasn't [00:06:00] too crisp on exactly what I wanted.

Um, but then I did that and now it spits out sort of the results the way I wanted. Um, but the revenue team on my accounting side, they actually built their own, because they have a slightly different preference for how they wanna have that information. Uh, and they just do it themselves. Uh, and I think that's like the key unlock. In the past, even if you wanted to build some internal tools, you had to get engineering resources, and that's kind of difficult, especially working at a tech company where, like, you want all your engineers working on stuff that's gonna move the business forward.

Um. Asking them to build some, you know, revenue recognition tool for you is probably not gonna be a super high priority. Um, whereas now you can just do it yourself and get pretty decent outputs. 

Teddy Collins: Yeah, and I want to kind of stay on that topic for, for one second. You've already kinda described kind of going from zero to one, and kind of, a lot of people have kind of built the apps, experimented, you know, got these different unlocks.

But how do you take those apps and kind of bring them to scale? Hey, that might work [00:07:00] for you, but like the 50 or a hundred different applications that are kind of all sprawling, sprawling about: what kind of happens internally to, to validate and to use and to poke and to improve, uh, those, those applications?

Marten Abrahamsen: Yeah. Uh, I think we definitely have a, uh, a surge in what we internally call small software, which is basically almost like individual-use, uh, software. Um, and that's sort of meant to improve the efficiency of any one individual. Then we do have some tools that are a little bit more centrally, um, run.

And there we have, you know, real sort of, you know, Okta security and that kind of stuff, making sure we're not sort of leaking information. And the, the difference between those two is like, do they have access to sort of underlying, uh, information? So like if there's a connection to, say, like a Snowflake database or, uh, NetSuite or, you know, Greenhouse or TeamOhana data or something that is more sensitive, then we tend to lock it down. Uh, if it is, you put in an Excel sheet and, you know, perhaps there's some outputs, uh, from it, but you're not feeding sort of live information, those have much lower sort of security requirements, um, for, for those, um, types of instances.

But boy, do we have a, uh, a surge in, in tools that people use. Even like a personal example: I, I haven't built this yet, but all the people on my team have. I've never found a good to-do list that I like. I've tried everything, but I still sort of come back to like pen and pencil. But I have all the people on my team that have sort of built their own ideal version of a to-do list that fits their workflow, as opposed to having to try to find, uh, some third party tool where someone else decided on the workflow and you're now adapting to that.

Uh, and that's obviously super easy to do. It's like a database of sorts. Uh, and uh, yeah, you can have as many of those as you want in an organization. 

Teddy Collins: Yeah, I mean, I'd love to, to geek out about productivity tools, which is probably literally my favorite, favorite topic, but wanna get back over to, to Ben. And, you know, you serve [00:09:00] kind of tens of thousands of kind of offices of, of finance, but I also know you're kind of a, a big proponent of kind of investing in bringing kind of early stage companies into your finance stack. Can you tell me a bit more about kind of your approach to your, to your CFO finance stack?

Ben Gammell: Yeah. I think that a, a big part of this is really just like trying to meet customers where they are. I think that there's a, a tension at play with any company that is providing an AI tool or platform externally to their customers about recognizing the audience you're speaking to.

I think when we think about our, like, startup customers, smaller customers, like, they're more technologically enabled. They're more technologically fluent. They will probably, especially if they've started within the last year or so, be sort of AI native, and so their propensity to adopt and, and really run with more sort of AI forward features and functionality is like just much, much higher.

I think you have this sort of, you know, there's a concept, concept of crossing the chasm. There's sort of this longer tail of customers that are maybe a bit more established, have already pre-existing solutions or [00:10:00] workflows that you're now having to sort of, you know, help them evolve through that change.

And, and for what we do is, is we recognize that, like, we think of AI as, it needs to be the icing on the cake. And, and what I mean by that is, especially when you think about finance, like, when you think about movements of money and the fact that you still have to close your books and you have all these sort of processes in place that still need to pass muster, you need to make sure you just do those table stakes things really, really well in a really reliable way that instills trust and confidence in your product.

And I think once you've gotten that trust from the customer, you can then start to prompt them and show them like, hey, there's a better way of doing this, or there's this other approach you could be taking, or here's how you can, um, sort of move your company forward in terms of enabling 'em to move faster and focus on the things that matter, maybe less so on expenses and things on the backend.

And that is a bit of an education journey for us, though. Like, honestly, just last, uh, week, I was having a conversation with a CFO who, they are still in the mindset of, they really like, say, personal liability cards, [00:11:00] because, like, that for them just feels like, okay, these employees will care more about how they're spending if they feel like they are personally liable and they may not get reimbursed by their company.

At Brex, we're obviously biased, but we think that's like a relatively antiquated way of thinking, and that you actually should be able to, with the right controls in place, move forward with more of a corporate liability and, and actually have your employees on a card. Just makes it easier for everyone on the back end.

But like obviously if I'm talking to that, that CFO and I'm like, hey, here's all these AI things you could do automatically, and you would decentralize decision making autonomy to these agents that you would program and you'd be able to verify, like, that is so far away from where they currently are in their journey that that would fall on deaf ears.

And so I think for us, the way we try to do it is, like, we want to be as on the leading edge as our customers that are very AI native. For example, like, Anthropic is a customer of Brex. What we're beta testing with them is far more down the line of what agents could do from a finance professional perspective than, you know, if you're a Fortune 500 company on Brex and maybe you're a more established business and maybe it's sort of a little slower to adopt.

So really just tailoring it for the audience. 

Teddy Collins: That's good. I want to give a, a little plug to, to Brex. So we've been a customer at SeatGeek for a few years, and I think with the, I'd say the prior, you know, pre-AI versions of, of the product, we've got a case study on the site, we achieved like 99.7% GL account coding accuracy, just with, um, I think, what we built through largely deterministic workflows, you know, elegant structuring of kind of budgets and sub budgets and, you know, intake flows, like routing things to the correct place.

And I think one thing that AI has done since, you know, soon after we launched the product, is kind of bring that knowledge and know-how to the organization, just automatically configure, and it's just there and it just works and gets to that 99%, instead of having to be just an expert that knows everything about every, um, every end of the product.

Ben Gammell: hopefully. Yeah. 

Teddy Collins: Yeah, definitely. Yeah. [00:13:00] Um, and then I wanna go back, back to you, Marten, and, and just talk a bit more about the kind of culture around kind of learning and development. And kind of, let's say you've got a new hire coming in who may be coming from a larger company where they don't have this kind of robust, uh, environment for, for learning and, and launching apps.

What do you kind of do structurally and organizationally to encourage this? Like you mentioned the demo days, yeah, um, these, you know, time-off hackathons. Like what, what, what's the special sauce?

Marten Abrahamsen: Yeah, for us it even starts in the interviewing process. So, uh, we found interviews, this is not gonna be particularly revolutionary, to be relatively low signal.

Um, some people are good at interviewing and terrible employees, and vice versa. Uh, so we always have like a case study as part of an interview. And I think in the past they used to be like a modeling exercise in Excel, or like you have to create some slides, and we're increasingly asking people to experiment using something in v0,

uh, to actually build an application of sorts as part [00:14:00] of their interviewing cycle. And for most people, this is their first time using v0, so you can get a sense for, like, are they able, willing to sort of experiment. Uh, and you don't need to know any coding, it's just, you know, you kind of gotta do some trial and error, um, and figure out, uh, how to do some good prompting.

Uh, but it, it really starts with that. So even to get hired, you must sort of be able to demonstrate that you can use, uh, some of these tools. Uh, and then after you start, we, you know, even have like channels, so like a kudos channel where you sort of give credit to people, and, uh, you know, you have a monthly team meeting where we try to have people showcase some stuff that they've built.

Uh, we're very big on automations, um, very big on experimentation. Um, like, we even talk, talk about this internally: we want people to get sort of speeding tickets, not parking tickets. Um, move really, really fast, push boundaries, uh, experiment and, and try new stuff. And you're not gonna [00:15:00] get in trouble if something, like, doesn't quite work.

Um, you will get in trouble if you don't experiment. Um, we don't, we don't wanna have that, uh, type of a culture. Um, so like, what does that mean from a culture perspective? Like, we're not afraid. Uh, I think that's like a very big one. There's no, um, oh, I, like, tried something and it didn't work. Uh, it's almost like a failed founder, that's like a status symbol almost, uh, that you've sort of been out there and tried to build something and you're gonna try again to build something else.

Guest 2: Yeah. 

Ben Gammell: I think to jump onto what Marten's saying, I do think that creating an environment where experimentation is encouraged and is able to be done at scale, I think, is super important. So we created like a, a platform, like an agent platform, that any employee can go to, and in a, you know, relatively risk-controlled way, they can, you know, build agents.

There's prompt engineering, multimodal testing, evaluation frameworks, and they can really just, like, start to play around with the data that Brex has, or like what we're doing, and know it's in like a [00:16:00] contained way, so that they're not worried about sort of, oh, you've signed up to some AI platform or tool externally,

um, and then, you know, either Brex or our customer data is just, like, out into the wild. And so I think having sort of almost a sandbox environment, which is what this essentially is, to enable employees just to, like, geek out and spend time just, like, iterating on problems they're facing, and they can do it in, like, a relatively unstructured way, I think has been really helpful.

And definitely on the hiring side as well. I think, um, we're almost going so far as to structuring our interviews that there are certain elements you can't do if you don't use AI. And it's not that you have to be, like, AI, you know, fluent and, and be sort of really progressed in your journey and, um, familiarity with these different tools and platforms, but more like there's a certain degree of intellectual curiosity you need to be able to demonstrate.

And in those sort of, um, hiring frameworks, show that you are actually capable of thinking about, okay, starting from a blank sheet of paper essentially, how can I get to an end outcome?

Teddy Collins: Yeah, and just tell me a bit, a bit more about [00:17:00] the adoption. You, you know, described this, you know, platform, the agents and, and building, but, you know, let's just say within finance, like what percentage of your org is, like, actually building?

Is it just, like, an 80/20 rule where, you know, a small number of people are, like, doing most of the development and people are kind of getting the, the benefits of that? Um, like how do you, how do you teach people in a more kind of structured way to, to take advantage of this?

Ben Gammell: Yeah, I think finance, actually, ironically, even though we're sort of a company sort of aimed at bringing AI to finance professionals, we were actually relatively slow to adopt, because I think for a lot of people,

it was just, like, getting started was the inertia we had to overcome. I think if you are in a more technical field, so engineering and, uh, design, you maybe have more sort of fluency in terms of just, like, starting with a new tool. I think in finance it's maybe just, like, less part of the job day to day.

And so a big part for us was just sort of saying to people, like, hey, just go spend time. [00:18:00] Like, use a hackathon, like, use a demo day to actually really just play around with something, without there necessarily having to be a deliverable that you are on some deadline or time pressure to meet. Um, so that people could just start to get used to engaging with these platforms and, and these tools.

And so I think for us it was just around getting people comfortable with starting to, like, use this other mechanism or tool to get their work done. And then, you know, I would say creativity abounds from there, because as they get more familiar about where, you know, different sort of agentic workflows help or, or hinder, they can start to know when to deploy it.

So the ASC 606 kind of example is, is, you know, we actually have a lot of our, most of our sort of agentic workflow from the finance perspective is actually built by, I would say, like, our more entry level, like, junior level, uh, folks, because they're the ones that probably get most exposed to tasks or workflows that can be automatable, can be improved upon with AI. And, you know, the more senior you go, maybe it becomes a little bit more [00:19:00] challenging, because suddenly you are thinking much more about, okay, well, how do I think about the overall annual plan, and how do I think about, like, the three year, like, direction of the business, and what's the resource allocation like?

Sure, you can maybe like jam with chat gpt and other sort of tools for that, but it's, it may be harder you, for you to build, uh, a sort of an agent that does that on your behalf. 

Marten Abrahamsen: Mm-hmm. It's obviously our business, right? But everyone's expected to use these tools. I have someone from my HR team here with me today, because they were curious and we were also meeting with customers.

They're using v0. We have a quarterly recruiting review later today, and the dashboards we're going to go through are a v0 app. You can click around; it's effectively a relatively simplistic mini application built specifically for this meeting. I'm not sure how many prompts it actually took.

As I was reviewing it last night in preparation, you can navigate it and get the information you want. That's become the way we communicate internally: using these different [00:20:00] applications. So there's almost an expectation that you always use some kind of AI tool to make your job easier. Our CTO has a saying he's shared on stage in a few different presentations: ask your employees what they hate most about their job, and the answer is what you should build. At our level, maybe the answer is slightly less manual work, so we might not be the best people to come up with ideas. But if you ask the people who actually do very repetitive, mundane work day in and day out, they can give you a pretty good list of things that can probably be automated with an agent, or at least improved by building some kind of special interface to interact with.

Teddy Collins: Yeah. So I'm definitely hearing a lot about efficiency gains, automation, making things operationally easier. But what gets you excited about what could be coming, and enabled, over the next two or three years? Where are we headed?

Marten Abrahamsen: Yeah. We have this framework internally of [00:21:00] deterministic versus non-deterministic workflows, and simple versus complex.

We've basically done a pretty good job in three of the four quadrants. The one we haven't quite captured yet, and I think this is true for the whole field, is non-deterministic and complex. Like: why did churn increase last month? Why did signups improve?

You're not asking for one specific answer, like, was this above or below target, is this in policy, or can you draft me a standard template NDA. You have to dig into different systems. You don't even know where to look for the answer, and there isn't a single answer; you have to look for multiple things. And how does the AI even know when to stop searching, that it has reached the end of the potential solutions?

So as I think about what improvements could be coming: ChatGPT has really only been around about two and a half years, since GPT-3.5 came out, and we've seen [00:22:00] tremendous improvements in those two and a half years. Imagine what will happen in another two and a half. I think we're going to see exponential improvements in these models and capabilities. So really good non-deterministic, complex workflows could be a genuine game changer for finance teams in providing insights to the rest of the organization.
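Marten's two-by-two can be sketched as a tiny lookup. The quadrant placements of the example questions are my own reading of the conversation, not Vercel's actual taxonomy, and the non-deterministic/simple example is invented for illustration:

```python
# Marten's framework: (predictability, complexity) -> (status, example).
# Three quadrants are reasonably well served by current tooling; the
# non-deterministic + complex quadrant is the open frontier he describes.
QUADRANTS = {
    ("deterministic", "simple"):      ("solved",   "Was this above or below target?"),
    ("deterministic", "complex"):     ("solved",   "Draft a standard template NDA."),
    ("non-deterministic", "simple"):  ("solved",   "Summarize this week's pipeline notes."),
    ("non-deterministic", "complex"): ("frontier", "Why did churn increase last month?"),
}

def ai_ready(predictability: str, complexity: str) -> bool:
    """True if current AI tooling handles this quadrant reliably."""
    status, _example = QUADRANTS[(predictability, complexity)]
    return status == "solved"
```

The point of the frontier quadrant is that there is no single system to query and no obvious stopping condition, which is exactly what makes "why did churn increase?" hard to delegate today.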

Teddy Collins: Yeah. I really like the saying that the human brain literally cannot comprehend an exponential function. I can see that from just two or three years ago, even six or seven months ago: within SeatGeek we had no internally developed AI for the office of finance. Now we have nine or ten agents and we're building a new one every week or so. Our first experience with AI was actually with Brex and the AI assistant, seeing the operational leverage you get when the agent can go out, look at an anomaly, and reach out to the [00:23:00] person. So we've been able to keep headcount closer to constant over the last few years from a G&A perspective, even as the business keeps getting bigger.

Ben Gammell: To echo your point, Marten, one thing I'm really interested to see is whether AI, with the right structured data sets, can get to a place where it can answer those more amorphous questions. What hopefully finance, and people generally, can still do better than AI in the fullness of time is knowing what questions to ask: what is the thing you need to know about your business to make decisions, and having that innate sense.

You may know to ask, hey, why is churn doing this, and what should we do to drive it in a better direction? When I ask that of my team today, it's: we'll have to circle back. We're going to have to look at the data and spend time and cycles to understand the full root cause, then come back with a proposal. If AI is able to speed up [00:24:00] those cycle times, that would be a huge unlock for sure.

Marten Abrahamsen: And it's not just knowing the questions to ask, but also quality-checking the answer.

Right? My doctor told me she's allowed to use ChatGPT for medical advice but I'm not, because she can look at the answer and say: that is just dead wrong, it's hallucinating, you're not going to die in the next 45 minutes. Whereas someone who doesn't understand the business might ask why churn is up, and the model might just say, oh, there are more days in October than in September. You'd say, well, that's definitely not the answer, right? But that's something an LLM could say.

Ben Gammell: Yeah. 

Marten Abrahamsen: So it's being able to know what questions to ask, and also to sense-check: does this answer actually make sense?

Ben Gammell: Yeah. 

Teddy Collins: Yeah. So, Ben, back to you. Thinking about your finance stack: you've got a best-of-breed mix with a lot of apps, and you're also developing a lot internally. When you're out there looking to solve a certain problem and you're evaluating vendors versus building it yourself, what does [00:25:00] that pros-and-cons list look like? How do you find the killer app that's going to help you?

Ben Gammell: Yeah, we have very much a build-and-buy mindset when it comes to all these tools and applications, because the way we think about it is: where do we believe we can drive the greatest ROI with what we build internally?

For example, we decided to automate our onboarding flow for customers. We said: if we combine our know-your-customer workflow and model with our underwriting credit model, we can create a better customer experience, because we can more seamlessly ingest both the structured data of the application flow and behavioral and risk signals, and automate a lot of the onboarding our customers go through.

That was something we decided to build internally because it's so unique to us and leverages so much of our own data to create a better customer experience. It would be very hard for us to outsource that or [00:26:00] buy a product that has that capability.

On the flip side, for customer service agents we use Sierra AI. We've done a really deep integration with them, and through Sierra's product and the integration with Brex, they now handle most of our inbound customer service cases.

That was a case where we said: look, we can surface to them the detail and context they need to have the agent respond on our behalf, but the customer service agent in and of itself doesn't have to be something Brex uniquely builds. There are other companies for whom that's their entire business model, and they're probably better set up to do it, because they'll learn not just from the Brex instance but from customers in other industries, all of which feeds into a better product experience for that agent.

On the finance side specifically, there are some platforms we use, for example Numeric, that help us with our BvA and the cash flux analysis we do [00:27:00] every month. We feel that was a good tool to buy, by virtue of its integration with NetSuite, and because we didn't feel we had a unique ability to build it in a better way. So we said: let's use that tool.

Because, to your point about small software, while a lot of these tools and agents are very easy to build, you also have to maintain them to a certain extent. You have to make sure they're kept up to date and continue to advance. So when you think about which systems to build yourself, there's an inherent tax, because you're supporting it; just as, if you buy a tool, there's the inherent cost of paying a vendor on an ongoing basis.

But one of the mistakes we made historically at Brex: we grew up in the zero interest rate period (and we were zero income as well), and we just decided to build everything. We said, oh, we can build everything, cash is free, let's go crazy.

What we've learned over the years is that we need to be much more intentional about where we believe we're uniquely positioned to do a better [00:28:00] job building something versus going to an outside party, because of all those other costs that are incurred.

Marten Abrahamsen: Yeah, as I said, we like to build stuff. But we also recognize when there are preexisting tools that are really good. For legal, we use GC AI; we didn't feel the need to reinvent the wheel for something specifically trained on legal work. We also like really good integrations with existing systems of record.

Ironclad, for example: we use Ironclad AI. I've done two fundraises at Vercel, and the first time around we got questions from investors like, how many of your contracts don't roll over in the event of an M&A? We literally had to hire a law firm to go through every single customer contract and give us that number. It probably cost us $50,000 or so. With Ironclad AI, it was just a prompt, and it produced the list of customers who have the right to end their contract in the event of M&A.

So that went from a huge pain, and a [00:29:00] ruined weekend for some poor associates at that law firm, to a quick prompt over all the data. Now, there's some work that goes into that, right? We have to make sure all the contracts are in Ironclad. But if the AI works really well with the underlying data, then it's worth it.

Teddy Collins: That just gave me flashbacks to the same exercise in various other financings: the tagging, and the hundreds and hundreds of contracts...

Marten Abrahamsen: they actually asked a follow up question after we had done it, which was like, and how many of them have like a non-standard payment terms? And we just said we're not gonna answer that. Which would've been another easy prompt on iron. 

Ben Gammell: We're 

Marten Abrahamsen: oversubscribed. Sorry. 

Ben Gammell: It's like you can, I'll say investor due diligence is a huge, it's so much easier now with AI because you've, as soon as, especially if you've gone like deep, really deep with one investor and they've asked you like.

More or less the universe of questions. You'll get almost like the same flavor of questions, but maybe framed slightly differently or phrased slightly differently from other [00:30:00] investors. And your ability to just like create a central repository of all your due diligence materials and then actually just have AI write the first responses is kind of remarkable.

Marten Abrahamsen: We're pretty acquisitive, so we do a bunch of corp dev. And these aren't particularly sophisticated data rooms at the companies we acquire; it's usually an investor presentation, some kind of Excel sheet, some employee data. We built little bots that review the data room and give us the output on one page, which we then look at when we discuss the deal in our weekly M&A meetings.

It's just stuff like that. But as a consequence you have time to look at more deals simultaneously, and you make it really easy for the people on the M&A committee: I read the one-pager, not interesting, let's move on. So that's another use case for being more efficient.

Sure. 

Teddy Collins: Very cool. I do want to give an early plug: in a couple of minutes we're going to open up to audience questions, so start thinking, and we'll definitely circle back. [00:31:00] But back to you, Marten. We're all private companies up here, hopefully for now, and you've built this ASC 606 app. As you think about SOX compliance and going to your audit committee meeting, and saying, we did what with our contracts? How do you prove it, and how are you building comfort...

Marten Abrahamsen: Yeah. 

Teddy Collins: ...around controls and audit?

Marten Abrahamsen: It's definitely a tough topic.

We're actually working directly with Deloitte to make sure they're okay with it. While we make life easier for ourselves, we still do manual review of these contracts, right? But we're getting feedback from the auditors on what is an appropriate use of AI in this process and what is not. So hopefully the field of auditing will also evolve in the AI age. Candidly, we're not quite there yet: there are still manual reviews involved. We just use the tools to make life easier for ourselves, not to completely automate the [00:32:00] workflow.

Ben Gammell: Yeah. Not to make another plug for Deloitte necessarily, but having an upfront conversation with your auditors to understand their appetite and comfort level around your use of these tools matters. That was actually one of the reasons we went with Deloitte as our auditor: at least in how they articulated it to us, and we've experienced this going through an audit with them, they were a bit more on the front foot about the fact that we're going to be using AI to improve workflows and our finance work. We wanted to make sure that wasn't a non-starter for them, so that our audit work wouldn't become the thing that hamstrings our ability to move forward from a workflow perspective.

Marten Abrahamsen: They want to say yes.

Ben Gammell: they do. 

Marten Abrahamsen: Yeah. 

Ben Gammell: Yeah. It's hard, but look, one thing that's actually promising from an AI perspective: in finance there's a ton of rich data underpinning financial information, and it's relatively verifiable. You know whether it's correct or not. So in theory AI [00:33:00] should do really well in finance, because you can tell it, you were right, you were wrong, and it learns. There's huge potential for AI in finance in my mind. It's just getting people comfortable with a little less visibility, a little less of the manual chain where you can go say to person X, hey, you messed up, or person Y, you messed up. You can't fire an agent, so to speak.

Marten Abrahamsen: It is interesting, because quality assurance is obviously an important part of a normal finance team, right? Some junior analyst does some work, and a senior manager looks at it and says: that was wrong, redo it. But somehow, if it's AI that does the work, we trust it less than a kid straight out of college with no work experience.

So I do think the world will evolve here as well, and the burden will fall on the people reviewing the information: what tests have you done to ensure that what came out of the underlying foundation models is actually accurate and fairly represented? It might end up looking like what auditors do when they do [00:34:00] sampling: you might be expected to do your own sampling of what the LLMs spit out.

Teddy Collins: Right. So, basically, probably more augmentation than replacement, with humans still in the loop. Any brave souls out there with questions? Please, stand up in the front.

Guest 3: Hi, I'm Anju. We're one of Brex's customers. Love it. Ben didn't pay me to say that. I would love to hear what the partnership between the CFO role and the chief people officer role looks like: where are the pushes and pulls? And with AI in the mix, how do you see this relationship evolving? I'm on the other side, I'm a chief people officer, so I'm curious how you see this partnership in the workplace.

Ben Gammell: I will be the first to say that the CFO-CPO relationship is one of the most important relationships you can have in an organization, because you need to make sure you're both on the same page and that you have each other's backs as it relates to the [00:35:00] organizational change you're trying to drive.

There's always going to be a natural, almost inherent tension between those organizations, because, to be overly reductive, finance people care about money and how much we're spending on certain things, and people officers care about the employee experience and how we're setting up our employees for success.

That obviously involves investments in learning and development, tools, access, and systems. With our chief people officer, who's phenomenal, a woman by the name of Heather Dunn, it's really about making sure we have continued conversations about our objectives and what we're trying to optimize for. We have weekly one-on-ones where we go through a tear-down of all the big problems we're trying to solve, and we work on them together.

Because one thing I will say is that it's challenging for an employee base (Brex is about a 1,200-person company) to go through this kind of quasi-transformation. You've hired a lot of people who maybe weren't [00:36:00] as AI-fluent, and now they're going through this organizational shift, and how do you set them up for success? There's also the lingering question in the back of a lot of people's minds: does this mean role eliminations? Does this mean reductions in headcount?

The way we've articulated it, and Brex is fortunate that we're growing fast enough to have this luxury, is: hey, we don't anticipate this will lead to role reductions. It just means we'll be able to go further with the people we have on staff today.

That's been a really helpful message to be able to articulate, again because of the position we're in; that may not be the case for all organizations, and I really respect that. Having that close partnership is really important so that you can be humane through all of this. Everyone is going through this change, and you want to make sure a finance person like myself isn't coming in and saying, oh, we want a 10% reduction in G&A arbitrarily because of AI operational improvements. Instead, work through with the people leader how feasible that is, on what time horizon, and whether that's even where your G&A costs sit. I mean, we're [00:37:00] now an office-first company. I can't make office space agentic; maybe theoretically I could, in terms of access controls, but we still occupy the space. So make sure you have a really good grasp, and a shared understanding, of what the true drivers of these costs are and how to move them.

Marten Abrahamsen: Yeah. Our head of people and I also work very closely, and we have way more problems than we have people to support those problems. So we look at everything: what are the most urgent needs, how do we staff toward them, and where can we still make progress on things we'd like to fix? And are there areas where, rather than hiring three more people, we can use AI to support the team, make them more efficient, and push the envelope a little more on efficient teams?

It's not that we can say, oh, because of AI we're not going to make drastic cuts in the business. But I do think you can make efficiency improvements, so we spend a lot of time and energy on [00:38:00] that, making sure we resource and staff the business correctly.

Teddy Collins: Great.

Anyone else? Don't all be too shy.

Guest 1: Yeah. All three of you are fairly well-funded companies. So when you think about AI, and Brex, you've gone through a transformation, so I'd love your thoughts on this as well: how do you think about the efficiency calculations, or the ROI, on each of these tools you want to implement and have your teams use, whether internal or external? Is it a dollar number? Is it time savings?

Ben Gammell: Yeah, that is definitely a hard dynamic. Even before AI, when you tried to assess productivity, it was always a hard question for a CFO when someone said, hey, I'll get this new software and it'll make us more productive. Okay, well, how are you quantifying that? Will it actually? There's always been a bit of wariness on the [00:39:00] finance side about productivity-gain assumptions when deciding to adopt a new software or tool.

But I do think there's something different about AI in general. Today, to Marten's point, it's really helping replace a lot of the more deterministic, rote work, where you can quantify the gain naturally. For example, I can fairly easily say that our Sierra chatbot instance is saving us something like 15,000 hours of customer experience time over the course of the year, by virtue of the chat containment rate we're seeing and the speed to resolution for our customers. Then it's relatively simple math to understand that this saves us X number of customer agents we otherwise don't need to hire as the company scales.
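The back-of-the-envelope math Ben gestures at can be sketched like this. Only the roughly 15,000-hours-per-year figure comes from the conversation; every other number is an illustrative assumption:

```python
# Chatbot ROI sketch: contained chats -> hours saved -> headcount avoided.
annual_chats = 120_000          # inbound chats per year (assumed)
containment_rate = 0.50         # share fully resolved by the bot (assumed)
avg_handle_minutes = 15.0       # human handle time per chat (assumed)

hours_saved = annual_chats * containment_rate * avg_handle_minutes / 60
# 120,000 * 0.50 * 15 / 60 = 15,000 hours/year

productive_hours_per_agent = 1_700   # per agent per year (assumed)
agents_avoided = hours_saved / productive_hours_per_agent

print(f"{hours_saved:,.0f} hours saved ~= {agents_avoided:.1f} agents not hired")
```

Multiplying `agents_avoided` by a fully loaded cost per agent then gives the dollar figure a finance team would put next to the vendor contract.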

It becomes a little harder to determine on the engineering side, when you look at productivity tools like Cursor that a lot of our engineers are using. What's the measurement? It isn't necessarily lines of code written, because that could create somewhat perverse incentives: you'd get a really long code base rather than a more efficient one. So what we look at is the rate of pull requests, how frequently our engineers ship code, as a proxy metric for the overall efficiency of the engineering team. But the proof will be in the pudding: do we actually find that we're increasing product velocity overall?

That's something we'll have more of a sense of over a six- or twelve-month horizon; you're probably not going to get much signal in the first month or so. So be upfront with folks when they're adopting tools, especially when they're making a meaningful investment in an AI platform: what do you think success looks like? Even just asking how they would define success with this tool, and what outcome they want to see, gets the buyer into the mindset that there's an outcome we're expecting. Then we can see, whether through a proxy metric or something more direct, as with Sierra, whether we actually achieved it. And it's not that if we miss, someone gets rapped on the wrist; it's, okay, what was wrong in our assumptions going in, and what can we learn for future deployments of different tooling?

Marten Abrahamsen: I think it should be pretty clear what's driving value and what's not. A couple of examples on our end. When I first joined, we had one lawyer and a massive external legal counsel bill. Then we started using GC AI, which is not free but not super expensive either, and suddenly our external counsel bill dropped, because we no longer needed to send a bunch of extra work to external counsel. GC AI can do the easy stuff, so internal counsel can spend more of their time on value-added work.

The other example: we built an agent to review all our inbound leads. If you hit "contact sales" and fill out the form, we used to have a ten-person team that would review that list [00:42:00] and delete the people who just scream obscenities at us, or the ones using the form to contact support: I have a question, my XYZ isn't working.

Once they'd drilled that list down to actual leads, they would spend time on what we call hydrating the lead: is this a real business, what does their website infrastructure look like, what are their needs, what geo are they in? Now all of that is automated.

So we went from ten people on inbound leads to one person, and the only thing that person does is make sure the deal bot is working. That one is pretty clear: obviously it's working. We have a bunch of other things that are probably not working quite as well. But when AI is working, you know it; you don't need some ROI analysis to assess whether it's working or not.
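The triage flow Marten describes (drop spam and misrouted support requests, then enrich what's left) can be sketched as a small pipeline. Every function, field, and rule here is a hypothetical stand-in; in practice the classification and enrichment steps would call an LLM and external data sources:

```python
from dataclasses import dataclass, field

@dataclass
class Lead:
    email: str
    message: str
    tags: dict = field(default_factory=dict)

def classify(lead: Lead) -> str:
    """Crude stand-in for the classifier: route to spam, support, or sales."""
    text = lead.message.lower()
    if "isn't working" in text or "how do i" in text:
        return "support"            # misrouted support request
    if "!!!" in text:
        return "spam"
    return "sales"

def hydrate(lead: Lead) -> Lead:
    """Enrich a real lead before a human sees it."""
    lead.tags["domain"] = lead.email.split("@")[-1]
    # A real system would look up firmographics, website stack, geo, needs.
    return lead

def triage(leads: list[Lead]) -> list[Lead]:
    """Keep only real sales leads, hydrated and ready for review."""
    return [hydrate(l) for l in leads if classify(l) == "sales"]
```

The human's job then shrinks from reviewing every submission to spot-checking that the bot's routing stays accurate.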

Guest 1: When you started thinking about Sierra, or about GC AI, did you ask them to give you a guesstimate, some kind of [00:43:00] ROI calculator of sorts?

Ben Gammell: So we went through an initial back-test with them. We essentially ran an upfront test: if we were to run all these chats through Sierra, what would its efficacy be in getting to a resolution? That was one of the reasons we went with Sierra over some of the other platforms we ran similar tests with: understanding our confidence interval that they would actually be able to resolve the cases being chatted in.

We then did a pilot period of about three months where we exposed, call it 5% of chats to Sierra, and then 10%, as we gained confidence both in its efficacy and that it wasn't going to cause a worse customer experience. That also helped us better quantify the savings it could drive. So the ROI was actually pretty easy to determine upfront, even before signing up to the more [00:44:00] established agreement that now covers essentially all of our chat cases.
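The "confidence interval on efficacy" from a back-test like Ben's can be computed with a standard Wilson score interval over resolved versus unresolved chats. The counts below are made up for illustration; nothing here reflects Brex's or Sierra's actual numbers:

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - margin, center + margin

# Hypothetical back-test: the bot fully resolved 412 of 500 historical chats.
lo, hi = wilson_interval(412, 500)
print(f"resolution rate {412/500:.1%}, 95% CI [{lo:.1%}, {hi:.1%}]")
```

A gradual rollout (5%, then 10% of live chats) then checks that the live resolution rate stays inside the interval the back-test predicted before committing to full deployment.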

Guest 1: So is that the model for bringing AI into the enterprise? That you have to go through this try-before-you-buy?

Ben Gammell: Yeah, that's a good question. I don't think you necessarily have to, though it's certainly easier if you can. What we're seeing with a lot of our enterprise customers is that pilots, and this is one man's opinion, so take it with a grain of salt, have almost a 50-50 success rate.

What I mean is: a pilot can be really good if you have a very defined "this is what the pilot is for, this is how it will be evaluated, this is what success looks like," and you align with your counterparty, whether customer or vendor, upfront on those terms. Then you work through the pilot toward them and get to the end with real signal to make a decision, to buy or not.

Where pilots go horribly wrong is when you're vague [00:45:00] upfront about what success looks like. You just run a three-month experiment with a vendor or a customer, you get to the end, and you ask: wait, how do we even evaluate whether we're happy with what the last three months achieved?

So pilots can be a useful tool, whether to win customers by offering one or as a buyer signing up for one. It's just really important, in either case, that you have a clear, quite targeted definition of success, so that at the end of the period you can actually make a go/no-go decision.

Marten Abrahamsen: So actually, just thinking through this: all of our big customers for v0, which is our AI product today, I think all of them started as like a 10-seat account, and then it became a 20-seat account, and then 50, and then a hundred, and then 800. And the few times we've done the "oh, they want to do a thousand trial seats for three months," they come back after three months and be like, well, it turns out only like 60 people [00:46:00] ended up using it.

So like, we don't think this is for us, it didn't work out. We've had much better success with the gradual approach, where you get in there with a group of people who actually want to use it, even if it's smaller. And then they hear, hey, did you hear what the team over in marketing is doing?

They did this prototyping thing, we should check that out. And then you get a few people from the design team, and then someone else hears about it. You kinda get that organic growth, as opposed to the top-down push of "hey, you all now have seat access to this thing," where they feel like the tools are being forced upon them, which has not really worked out at all for us. The bottoms-up approach takes a little longer.

NDR looks really good, though, so the finance person in you should like it too.

Teddy Collins: Sharp. Maxwell, you have your go-to-market strategy, calibrate it now. Are we ready?

Guest 1: We got all three of you there.

Teddy Collins: Land and expand. There you go.

Guest 1: The buyers.

Teddy Collins: And just kind of a time check, I know we're coming up on it. Do we want to end at...

Guest 1: We've got 10 minutes.

Teddy Collins: Okay, great. [00:47:00]

Guest 4: Thank you, Sarah, by the way. I'm a finance leader thinking about driving efficiency within FP&A. Some of the things that take us the longest time and add the most value are identifying insights across the business.

Also, as strategies change throughout the year, how do we pivot the business, and what are the downstream impacts? Have you used anything in your companies to identify those insights and/or understand the downstream impacts of a strategic change across the company?

Marten Abrahamsen: That's probably it; I talked earlier about the complex, non-deterministic workflows.

That is exactly it. The closest we've done is the ability to make it easier for people to prompt our Snowflake database, so that you can at least get access to information. You can experiment a little bit more: what if I cut the data this way, what about this thing? But it's still limited.

And I think of this as the best part of [00:48:00] FP&A and strategic finance: when they think strategically about the business and insights. So that's probably one of the last things to be able to be truly replaced, or even enhanced meaningfully.

Guest 5: Enhanced by AI would be ideal.

Marten Abrahamsen: Yeah, yeah. We can make it a little bit faster today. And actually, I just saw a demo of something called Concourse, which is basically middleware that connects all the different data sources, with an LLM on top that allows you to search, say, Snowflake, Stripe, NetSuite, and a few other things. But it's still pretty early days for all of these. But yeah, there's hope for the future.

Teddy Collins: I really like to geek out on the Gartner and Forrester type research, and at the Gartner CFO conference earlier this year, they presented studies on how you can't really solve the FP&A burnout issue just through being more efficient and getting automation.

You need to create these decision support [00:49:00] tools for folks so that they can use them, and then the FP&A business partner doesn't have to be present when that decision's being made. So it's, how can you leverage the technology and multiply yourself, to make yourself available to the entire organization?

For us internally at SeatGeek (we're a live entertainment ticketing company), we've had a lot of success with rhythm-of-the-business type use cases for AI. Hey, that variance report from Looker, from Tableau, that would say, here is our market share data, here is what the year-over-years look like: then go out into the news and pull all these other factors into your reports. Helping free up our resources so that we can build those decision support tools for folks, I think, has been the unlock for us.

Guest 4: Yeah. Awesome.

Thank you. [00:50:00] 

Teddy Collins: I'm gonna stay up in the front.

Guest 2: Yeah. Ben, you said you use Numeric for some BVA analysis. So do you have some examples of how a tool has been able to use AI to do better analysis? Maybe better than, Marten, you were saying, like churn that just gives you days in the month, really looking deeper.

Maybe hiring happened sooner rather than later, or it was just timing from an event or trade show. Any examples of how it really works well for you?

Ben Gammell: Yeah, happy to. It sounds like you've also built this internally. But essentially, when we do our close and we're doing our flux analysis, Numeric has an integration with the GL, with NetSuite, and so it's able to use both of those contexts.

To understand, when we see variance to plan or variance month over month, you can double-click into a specific GL and understand what the drivers of that GL change are. Is it that a certain vendor is now being billed that wasn't previously? Is it that your usage of a given vendor has meaningfully shifted?

And so that helps us really tighten in on, when you see BVA, it's [00:51:00] like, okay, how much of this is true favorability or unfavorability, and how much is timing? You can just get to that answer much faster, and it feeds into, whether it's a flash or the explanation to the finance team and/or me: okay, this is actually what we think is going on underneath the hood. When it comes to your point about the more rote work of finance, our flash reporting is becoming very much automated, because it's something that follows a pretty similar format month over month.

I don't know that you would radically change your flash every month; that would be kind of crazy. So AI's really good at that. You have a pretty deterministic format with information coming into it, and it's able to do a pretty good first pass at "here's the flash," and then a human in the loop can go and correct wording and correct understandings of what's driving it.
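A rough sketch of that deterministic first pass, using invented GL detail rather than any real Numeric or NetSuite integration: diff vendor-level totals month over month, and draft one flash line per material driver for a human to correct.

```python
from collections import defaultdict

# Invented GL detail rows: (gl_account, vendor, amount), one set per month.
prior = [("6100-Software", "Snowflake", 42_000), ("6100-Software", "Figma", 9_000),
         ("6200-Marketing", "AdCo", 120_000)]
current = [("6100-Software", "Snowflake", 61_000), ("6100-Software", "Figma", 9_000),
           ("6200-Marketing", "AdCo", 118_000), ("6100-Software", "NewVendor", 7_500)]

def totals_by_vendor(rows):
    totals = defaultdict(float)
    for gl, vendor, amount in rows:
        totals[(gl, vendor)] += amount
    return totals

def flux_first_pass(prior_rows, current_rows, threshold=5_000):
    """Draft month-over-month commentary: which vendors drove each GL's change."""
    p, c = totals_by_vendor(prior_rows), totals_by_vendor(current_rows)
    lines = []
    for gl, vendor in sorted(set(p) | set(c)):
        delta = c.get((gl, vendor), 0) - p.get((gl, vendor), 0)
        if abs(delta) < threshold:
            continue  # immaterial; leave it out of the flash draft
        if (gl, vendor) not in p:
            lines.append(f"{gl}: new vendor {vendor} billed ${c[(gl, vendor)]:,.0f} this month")
        else:
            direction = "up" if delta > 0 else "down"
            lines.append(f"{gl}: {vendor} {direction} ${abs(delta):,.0f} month over month")
    return lines

for line in flux_first_pass(prior, current):
    print(line)
```

A human in the loop still decides whether each driver is true favorability or just timing; the draft only gets them to that question faster.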

Marten Abrahamsen: Yeah. A high-level flux it can do; super detailed, real insights, not really.

Teddy Collins: Yeah. One other story: [00:52:00] just talking with peers in the industry, one approach I've seen is to get your budget data, your GL data, and your procurement tool data into one Snowflake database, and then unleash AI there.

But I've also seen good use cases where tools can go into all these sources and triangulate: the budget information, the trial balance information. Some tools can do deep flux analysis on just NetSuite data, but if you have a big increase in marketing spend and that was already a campaign in the budget, that's not as useful information.

So I think that orchestration layer, that single pane of glass across everything, however you structure your data to get there, is where I've seen some good results.
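A toy illustration of that single-pane idea, with invented numbers: once budget deltas and actuals deltas sit side by side, a variance that was already planned (the budgeted campaign) can be netted out before anyone investigates it.

```python
# Invented month-over-month increases by GL account.
actuals_delta = {"6200-Marketing": 95_000, "6100-Software": 12_000}  # what the GL shows
budget_delta  = {"6200-Marketing": 90_000, "6100-Software": 0}       # planned step-ups

def unplanned_variances(actuals, budget, tolerance=10_000):
    """Surface only the variance not already explained by the budget itself."""
    flagged = {}
    for account, actual in actuals.items():
        unexplained = actual - budget.get(account, 0)
        if abs(unexplained) > tolerance:
            flagged[account] = unexplained
    return flagged

print(unplanned_variances(actuals_delta, budget_delta))
```

Marketing's jump was mostly a budgeted campaign, so only the software line is flagged; on NetSuite data alone, both would have looked like findings.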

Marten Abrahamsen: You gotta really love paying Snowflake invoices, though, to put all your data from all your different sources in there.

Teddy Collins: You're paying for it somewhere. Yeah.

Marten Abrahamsen: Well, [00:53:00] you don't pay for a NetSuite query, right? Whereas you do pay for a Snowflake query.

Guest 5: Hi, Johnny from Blue Light. Question: have any of the three of you incorporated AI as a decision tree when evaluating headcount requests, as part of your intake or backfill process?

Teddy Collins: Short answer? No, not yet. Any ideas?

Ben Gammell: No, I would say we haven't relied on AI for that. We have thought through the hiring dynamic: okay, think through the interview loops, how do you make sure you're hiring people who are more AI-oriented? One thing we've used AI for in the hiring process, less from a headcount forecast perspective, is actually on the interview loop side, where you obviously have, let's call it, half a dozen-plus people who will interview a [00:54:00] candidate.

They all have their scorecards, and they all talk about the pros and cons of the candidate in their specific areas of domain. What we are doing, which I think is a little, what's the word I want, a little archaic, but our CEO wants to do it and our CPO is accommodating it, and so am I, is that our CEO and myself will actually sign off on every hire we make.

We're also not hiring crazy amounts of people, so it is possible, though I wouldn't necessarily recommend it. But what is helpful is we're using AI to go through all of the scorecard data for an interview candidate and then surface up to myself or our CEO: here are the areas of strength of this candidate.

Here are the areas of risk that were identified as part of the interview loop, and here are some considerations to have. That enables us to have a faster look into a given candidate, to understand which areas we maybe want to go in and read the actual notes on, a specific area of strength or risk, versus the pages and pages of [00:55:00] interview notes.
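The surfacing step Ben describes presumably ends in an LLM-written summary, but the roll-up underneath it can be sketched deterministically. Competency names, the 1-to-4 scale, and the cutoffs below are all invented for illustration:

```python
from collections import defaultdict

# Invented scorecards: each interviewer rates a few competencies from 1 to 4.
scorecards = [
    {"interviewer": "A", "ratings": {"technical": 4, "communication": 2, "ownership": 3}},
    {"interviewer": "B", "ratings": {"technical": 4, "communication": 2, "ownership": 4}},
    {"interviewer": "C", "ratings": {"technical": 3, "communication": 3, "ownership": 4}},
]

def summarize(cards, strength_cutoff=3.5, risk_cutoff=2.5):
    """Roll interviewer ratings up into strengths and risks for a final approver."""
    scores = defaultdict(list)
    for card in cards:
        for competency, score in card["ratings"].items():
            scores[competency].append(score)
    averages = {c: sum(s) / len(s) for c, s in scores.items()}
    return {
        "strengths": sorted(c for c, avg in averages.items() if avg >= strength_cutoff),
        "risks": sorted(c for c, avg in averages.items() if avg <= risk_cutoff),
    }

print(summarize(scorecards))
```

The approver then reads the raw interview notes only for the flagged areas, rather than every page.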

I will also say, we're obviously using AI tools as part of the interview loop for taking notes as you're interviewing candidates. That's really helpful because you're getting more robust, richer note-taking for the reviewer of those notes, but the candidate experience is also better, because you don't have someone half paying attention because they're taking notes.

They're actually able to engage more fluently with the candidate. 

Marten Abrahamsen: Basically all the same, sorry. Our CEO approves every single hire, and we do hire a lot of people, so that's somewhat of a burden. And we just use BrightHire in all the interviews to take notes. Not to plug BrightHire, but it also gives you feedback on yourself.

Like, how long is your longest monologue? What was the candidate's speaking ratio versus yours? How many words per minute?

Ben Gammell: How do you score?

Marten Abrahamsen: I'm like in the high eighties. But now that I know the criteria (you get bonus points if you set an agenda at the beginning of the interview, which is something I've never done before), to make sure I get a good score, I'm like, today we're gonna talk about this.

Which apparently the candidates also like. So, [00:56:00] we're learning, and keeping monologues short. Anyway, we use BrightHire for that. And then we built our own little tool that similarly takes all the scorecards and comes up with a summary for our CEO. And then he says no, and then we gotta fight a little bit.

And then, yeah, we get the hire approved. I did actually see this today on my report from the HR team, written in v0: 100% offer acceptance.

Ben Gammell: wow. Congrats. 

Marten Abrahamsen: But that was not without a fight. There's always a little...

Ben Gammell: And at risk of saying this in front of a group of, maybe, People people: maybe that means your offers are too high.

Teddy Collins: Sure. And confirming you don't ask them if they're gonna accept before you offer it, right? No? Okay, good stuff. So, just a couple minutes left. I'd love for each of you to give one piece of wisdom, looking back on all the success you've had reaching AI at scale within your finance stack and operations, [00:57:00] for a room of finance and HR professionals. One piece of advice.

Ben Gammell: I would say just start. Just use it, and try to find the time in your busy schedule to use it and get more familiar with it. And, to your point earlier, Marten, don't be fearful of it. Don't fear failure, don't fear using it. I actually was relatively slow to adopt AI at work, because I was just like, I've got a million things to do.

I've got back-to-back meetings, I've got some deliverable I need to ship next week, and where am I gonna find the time to go jam for three or four hours on this tool? So I was slower to adopt than I probably should have been, and if I did it all again, I'd be much quicker off the bat.

Marten Abrahamsen: Yeah. I think it's a cultural thing, so lead by example. You can even say, hey, I had an hour on the weekend and I built this thing. It's probably terrible. I bet some of you who deal with this problem every single day, and can commit a little more time to it, can probably figure out how to build a much [00:58:00] better and more useful application than I did.

But I'm experimenting with it, I'm trying it out. I don't think it's too crazy to say that AI is the future. So yeah, just have fun, honestly. It's easy to be afraid of it, because there are lots of new words and complexity. But it just comes down to: it's almost easier than Microsoft Excel,

'cause you don't even need to do any formulas. You can just write plain English, and then you can say, "you did it wrong, do this instead," and it just does it. So you need some patience, maybe. But yeah, experiment and have some fun and learn. And sometimes it doesn't work, and you have to ask it to do the exact same thing again.

Because somehow the machine started doing something completely different than you asked it to. You asked it to change the location of a box, and it rewrote the entire application instead. Sometimes weird stuff happens. So be patient and have some fun.

Teddy Collins: Great. Thanks, Marten. Ben, thank you. Let's thank you all. [00:59:00]

Marten Abrahamsen: Thank you.

Frequently asked questions

No. Finance teams at companies like Vercel are building internal tools—including revenue recognition apps and budget versus actuals dashboards—using plain English prompts with no coding required. The barrier to getting started is much lower than most finance leaders expect.
Start with a structured back-test or a small pilot with clearly defined success metrics before scaling. Brex validated their AI customer service platform by exposing a small percentage of live cases to the tool first, which gave them a reliable cost savings number and confidence in the product's efficacy before expanding to full deployment.
The most effective approach is creating low-pressure space for experimentation—hackathons, demo days, or monthly showcases where there's no deliverable attached. Leaders who experiment publicly and share what they're building, even when the results are rough, give their teams permission to do the same.
Build when your use case depends on proprietary data or workflows that an outside vendor can't replicate. Buy when a dedicated vendor is better positioned to maintain and improve the product over time—and factor in the ongoing maintenance cost of anything you build yourself.
Most teams are using AI to accelerate workflows while keeping humans in the review loop, not to fully automate them. Working directly with your auditors early—as both Brex and Vercel have done with Deloitte—helps establish what's acceptable and ensures AI-assisted processes don't create problems when it's time for your audit.