Service as a Software over Software as a Service
AI is taking us back to selling technology as a solution.
I’ve been talking a lot over the last couple of weeks about how we can leverage AI, or how AI can even leverage itself, to make better forex trading decisions. We explored how this is possible and how social media-sourced LLMs have driven impressive results through their community-owned models. One of the main conclusions we can draw from those pieces, and from this AI wave in general, is the importance of data accessibility to train and improve the model. It is not only the amount of data we use to train our machine, but its quality, that enables us to drive the expected result in a given field.
I have been writing about trading because it is a great example of how a data flow, whose quality we can analyze and weigh, can be added to a workflow to make more informed and risk-balanced decisions, at whatever pace trading demands, with the help of our agent.
Another popular use case for AI is customer support. I haven’t written about this use case because I don’t find it particularly exciting. But let’s take a minute to understand that in a customer success use case you also have specific workflows that determine results or endpoints, similar to trading. You know what to do step by step from the beginning, and there are not that many variations on what counts as a good or bad result. It works like an exact science. Maybe some of you have heard about Sierra or Cresta, generative AI-based companies aiming to build your own autonomous contact center.
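To make that concrete, here is a minimal sketch of what such a fixed workflow can look like in code. The step order, issue types, and outcomes are assumptions I made up for illustration; they are not how Sierra or Cresta actually build their agents.

```python
from enum import Enum

# A toy, hypothetical support workflow: step names, order, and outcomes are
# illustrative assumptions, not any vendor's actual design.
class Outcome(Enum):
    RESOLVED = "resolved"
    ESCALATED = "escalated"

def handle_ticket(ticket: dict) -> Outcome:
    """Walk a fixed sequence of steps; every path ends at a known endpoint."""
    if not ticket.get("verified_identity"):
        return Outcome.ESCALATED            # step 1: identity check failed
    if ticket["issue"] in {"password_reset", "refund_status"}:
        return Outcome.RESOLVED             # step 2: known issue -> scripted fix
    return Outcome.ESCALATED                # step 3: anything else goes to a human

print(handle_ticket({"verified_identity": True, "issue": "password_reset"}))
```

The point is that every ticket ends at one of a small set of known endpoints, which is exactly what makes the result measurable.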
Besides being early movers on the application layer for contact center solutions with an innovative generative AI approach, Sierra and Cresta bring another important innovation to the AI landscape and its future.
Service as a Software is a shift arriving at the hands of generative AI and the ability to charge for jobs to be done or customer resolutions, as is the case with Sierra and Cresta. It means paying for the problem, or the number of problems, the service takes care of for you, instead of paying a monthly fee for each seat.
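A quick back-of-the-envelope comparison shows the difference. The numbers below are invented for illustration only; they are not Sierra’s or Cresta’s real pricing.

```python
# Illustrative numbers only (made up, not vendor pricing): a per-seat SaaS
# contract charges for access, an outcome-based contract charges for work done.
seats, price_per_seat = 40, 150.0       # 40 agent seats at $150 per seat per month
price_per_resolution = 0.80             # hypothetical fee per resolved ticket

for month, resolutions in [("slow month", 2_000), ("busy month", 9_000)]:
    per_seat_bill = seats * price_per_seat                  # static, usage-independent
    per_outcome_bill = resolutions * price_per_resolution   # scales with results
    print(f"{month}: per seat ${per_seat_bill:,.0f} vs per resolution ${per_outcome_bill:,.0f}")
```

The seat-based bill never moves; the outcome-based bill tracks how much of the problem the service actually took off your hands.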
This is important in many ways; maybe the most obvious is the question raised by VC heavyweights such as a16z and Sequoia, both of which have been open advocates of the end of the SaaS era. We do not doubt that AI is going to change the way we do things and the way we approach solving problems, but it will also change the way we measure success when a service is provided. Every business will have its own customization needs, its own data sets, and its own jobs to be done. This is where the one-size-fits-all SaaS approach becomes obsolete.
While companies are doing this customization, they also have ways to measure its success through the AI tools themselves. This feels like going back to tailor-made legacy solutions. During college I founded my own app development studio; personalizing workflows and turning them into software solutions was the way I paid for my booze, dates, and fun. When you tailor software, or in this case AI, you are in an iterative process where you have a clear vision of the solution, and you should have workflows in place to measure what the path to that solution should look like. But in the end, this AI service should be treated as a solution, not as a service.
This is exactly what solutions like Cresta, Sierra, Cerebras, and Scale AI, and big-tech offerings like Nvidia and AWS Rekognition, are doing. They build the solution with you and charge for the progress that solution drives.
Sequoia, “Generative AI’s Act o1”
Two key points to understand why this shift is happening:
Increased Focus on Outcomes Over Products: As AI advances, clients are looking to maximize the specific results AI can drive, like optimized processes, improved safety, or enhanced customer experiences. A pay-for-results model (akin to Sierra.ai’s “per resolution” pricing) aligns more closely with this demand, emphasizing utility over just access.
Advances in AI Enable Specific, Granular Billing: With modern AI’s ability to track usage and effectiveness, companies can offer highly targeted billing structures. Customers are increasingly willing to pay for concrete outcomes and avoid long-term, static subscriptions.
This means you have a clear path to deliver a result, and you have the tools to charge for, or get metrics on, every single step.
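Here is a rough sketch of what that per-step instrumentation could look like. The step names and the flat per-resolution fee are assumptions, just to show how outcome-based billing can hang off the metrics you are already collecting.

```python
import time
from dataclasses import dataclass, field

# A minimal sketch of step-level instrumentation: every step of a job emits a
# metric event, and only jobs that end in a billable outcome are charged.
# Step names and the flat fee are assumptions for illustration.

@dataclass
class JobTrace:
    job_id: str
    events: list = field(default_factory=list)

    def record(self, step: str, ok: bool) -> None:
        self.events.append({"step": step, "ok": ok, "ts": time.time()})

    def bill(self, fee_per_resolution: float) -> float:
        resolved = len(self.events) > 0 and all(e["ok"] for e in self.events)
        return fee_per_resolution if resolved else 0.0

trace = JobTrace("ticket-42")
trace.record("classify_intent", ok=True)
trace.record("apply_refund_policy", ok=True)
trace.record("send_confirmation", ok=True)
print(trace.bill(fee_per_resolution=0.80))  # 0.8 -- charged only because the job resolved
```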
Another important factor driving this change is the cost of training LLMs. We now understand that part of our future will look like this: a handful of winning super large language models acting as generalist frameworks, while customized solutions with more controlled cloud computing costs become the product sold by adventurous entrepreneurs like us.
But remember that for this Substack the most important part of the AI conversation is decentralization. Here is where I raise some questions and share some reflections.
I think we might be arriving at a wave of hyper-personalization, for individual consumers but also for small and medium businesses and enterprises. But personalization is expensive: it requires tailored workflows, tailored data flows, and tailored endpoints to reach the expected results. I am sure that if we build a way to share resources and information about the challenges we need to tackle as users and businesses, we will find out that many of our challenges are shared, and that there are plenty of workflows and datasets we can use to train our agents to tackle them. Guess what: on a blockchain, through ZK proofs, oracles, and data liquidity pools, this is possible.
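To give a flavor of what I mean, here is a toy sketch of a shared pool where users and businesses register workflows or datasets by commitment. I use a plain hash as a stand-in for a real ZK proof (a hash only proves you hold the exact bytes, not any property about them), and nothing here corresponds to an existing protocol; it is purely illustrative.

```python
import hashlib
import json

# A toy "data liquidity pool": contributors register datasets or workflows by
# commitment. The SHA-256 hash below is only a placeholder for a real ZK proof;
# all names and structures here are assumptions for illustration.

pool: dict = {}

def commit(payload: bytes) -> str:
    return hashlib.sha256(payload).hexdigest()

def register(owner: str, description: str, payload: bytes) -> str:
    digest = commit(payload)
    pool[digest] = {"owner": owner, "description": description}
    return digest

def verify(payload: bytes) -> bool:
    """Anyone can check that a shared payload matches a registered commitment."""
    return commit(payload) in pool

data = json.dumps({"workflow": "refund_resolution", "steps": 3}).encode()
digest = register("acme-support", "shared refund workflow", data)
print(verify(data), digest[:12])  # True, plus the first characters of the commitment
```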
So maybe I am coming back to building tailored solutions, this time in the form of customized AGI.