Over the past decade, the divide between technical and sales teams has all but disappeared, and I see that as a good thing. Not all technical teams work at high-tech companies. Blurring the lines between sales and technology lets you build and ship products with confidence that they will be well-received, widely adopted (not always, admittedly) and contribute significantly to the bottom line. If there’s a better way to motivate a high-performing technical team, please let me know!
This is a change that was accelerated, if not driven, by data technology. We’ve been working through the hype cycles of big data, business intelligence, and AI for decades. Each brought new skills, problems, and collaborators for CTOs and their teams to figure out, distancing us a little more from the rest of the organization. No one else can do what we do, but everyone needs it.
Technical teams are not commercial by nature. As these roles expand to include building and delivering tools that support various teams across the organization, this gap becomes increasingly apparent. In particular, we’ve all seen the statistics about the number of data science projects that never reach production. The reason is clear: tools built by people who don’t fully understand the needs, goals, and processes of commercial teams will always be of limited use.
In the early days of AI, throwing money at technology was highly justified because investors wanted to invest in technology, not outcomes. But the technology has matured and the market has changed. Now we need to show a real return on our technology investments, and that means delivering innovations that have a measurable impact on the bottom line.
Transitioning from Support to Core Functionality
The growing pains of the data technology hype cycle have brought two great benefits to modern CTOs and their teams, beyond the adoption of tools like machine learning (ML) and AI. The first is a mature, centralized data architecture, which eliminates historical data silos across the business and gives, for the first time, a clear picture of what’s going on at a commercial level and how one team’s actions impact others. The second is the shift from a support function to a core function.
This second point is important: As a core function, technical employees now sit in meetings side-by-side with their sales colleagues, and these relationships give them a better understanding of processes outside of the technical team, including what their colleagues need to accomplish and how that impacts the business.
The result was a new way of working: for the first time, technical people were no longer isolated, fielding unrelated requests from across the company to pull this statistic or analyze that data. Instead, they could finally see the impact they were having on the business in financial terms. This shift in perspective was challenging, and it gave rise to an approach aimed at maximizing that contribution and creating the most value as quickly as possible.
Introducing Lean Value
While we hesitate to add yet another project management methodology to the lexicon, Lean Value is worth considering, especially in an environment where the return on technology investments is under intense scrutiny. Its guiding principle is “relentless prioritization to maximize value.” For my team, that means prioritizing the research that is most likely to deliver value or advance organizational goals. It also means deprioritizing non-critical tasks.
We focus on achieving a minimum viable product (MVP), applying lean principles across engineering and architecture, and (here’s the hard part) actively resisting the urge to build the perfect thing on the first pass. Every week, we review non-functional requirements and re-prioritize them against our objectives. This approach reduces unnecessary code and ensures that the team doesn’t get sidetracked or lose sight of the bigger picture. And because we have a very clear framework, we’ve found it to be an inclusive way of working for neurodiverse individuals on our team.
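To make “relentless prioritization” concrete, here is a minimal sketch of value-over-effort backlog scoring of the kind a weekly review might use. This is an illustration, not the team’s actual tooling; the field names, weights and examples are assumptions:

```python
from dataclasses import dataclass

@dataclass
class BacklogItem:
    name: str
    expected_value: float  # estimated business impact, 1-10
    effort: float          # estimated engineering effort, 1-10
    critical: bool = False # non-negotiable (security, compliance)

def prioritize(items: list[BacklogItem]) -> list[BacklogItem]:
    """Order the backlog: critical items first, then by value per unit of effort."""
    return sorted(
        items,
        key=lambda i: (not i.critical, -(i.expected_value / i.effort)),
    )

backlog = [
    BacklogItem("polish admin UI", expected_value=2, effort=5),
    BacklogItem("fix auth token expiry", expected_value=6, effort=2, critical=True),
    BacklogItem("ship MVP export feature", expected_value=8, effort=4),
]

for item in prioritize(backlog):
    print(item.name)
```

Re-running this kind of scoring each week is what keeps low-value polish work from crowding out MVP delivery.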
As a result, product deployment has accelerated. We have a distributed, international team and operate a modular microservices architecture that fits with a lean value approach. Weekly reviews keep us focused and prevent unnecessary development, which also saves time. They also allow us to make changes incrementally, avoiding major redesigns.
Leveraging LLMs to improve quality and speed up delivery
We set a level of quality that we must achieve, but choosing efficiency over perfection means making pragmatic use of tools like AI-generated code. GPT-4o can save time and money by generating architecture and functional recommendations. Senior staff then spend time critically evaluating and refining those recommendations, rather than writing code from scratch.
Many will frown upon this particular approach or find it shortsighted, but we are careful to mitigate risk. Each build increment must be production-ready, refined, and approved before moving to the next stage. There is never a stage that doesn’t involve humans. All code, especially generated code, is monitored and approved by experienced team members following our ethical and technical code of conduct.
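The “no stage without humans” rule can be enforced mechanically rather than by convention. Below is a minimal sketch of such a gate; the data model, reviewer counts and statuses are hypothetical, not a description of the team’s actual pipeline:

```python
from dataclasses import dataclass, field

@dataclass
class Increment:
    description: str
    generated: bool                       # produced with LLM assistance?
    approvals: set[str] = field(default_factory=set)

REQUIRED_REVIEWERS = 1
GENERATED_REQUIRED = 2  # assumption: generated code gets extra scrutiny

def ready_to_ship(inc: Increment) -> bool:
    """An increment advances only after enough human sign-offs."""
    required = GENERATED_REQUIRED if inc.generated else REQUIRED_REVIEWERS
    return len(inc.approvals) >= required

inc = Increment("LLM-scaffolded data loader", generated=True)
print(ready_to_ship(inc))       # no human has signed off yet
inc.approvals.add("senior_eng_1")
inc.approvals.add("senior_eng_2")
print(ready_to_ship(inc))       # two experienced reviewers approved
```

The point of the design is that the gate is a property of the build, so a generated increment cannot silently skip human review.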
Data Lakehouse: A Lean Value Data Architecture
Inevitably, the Lean Value framework has spilled over into other areas of the process, giving rise to the data lakehouse, a combination of a data lake and a data warehouse, with large language models (LLMs) adopted as a time-saving tool.
Standardizing data and structuring unstructured data to achieve an Enterprise Data Warehouse (EDW) is a multi-year process that has drawbacks: EDWs are inflexible, expensive, and have limited usefulness for unstructured data and different data formats.
While a data lakehouse can store both structured and unstructured data, processing it with an LLM reduces the time required to standardize and structure the data, automatically converting it into valuable insights. The lakehouse provides a single platform for data management that can support both analytics and ML workflows, and requires fewer team resources to set up and manage. Combining an LLM with a data lakehouse accelerates time to value, reduces costs, and maximizes ROI.
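As a rough illustration of the pattern, unstructured records landing in the lakehouse can be passed through an LLM that emits structured fields ready for analytics. The `extract_fields` function below is a stub standing in for a real model API call, and the schema is invented for the example:

```python
import json

def extract_fields(note: str) -> str:
    """Stub for an LLM call that turns free text into JSON.

    In production this would call a model API with a schema-constrained
    prompt; here we hard-code a plausible response for demonstration.
    """
    return json.dumps({"site": "Boston", "enrolled": 42, "status": "active"})

def structure_record(raw_note: str) -> dict:
    """Normalize one unstructured lakehouse record into typed columns."""
    fields = json.loads(extract_fields(raw_note))
    return {
        "site": str(fields["site"]),
        "enrolled": int(fields["enrolled"]),
        "status": str(fields["status"]),
    }

row = structure_record("Boston site now active, 42 participants enrolled.")
print(row)  # a structured row ready for a warehouse-style table
```

The lean-value appeal is that structuring happens on demand, per record, instead of through a multi-year enterprise warehouse modeling effort.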
Similar to a lean value approach to product development, this lean value approach to data architecture requires some guardrails. Teams must implement robust, well-thought-out data governance to maintain quality, security, and compliance. Balancing the performance needed to query large data sets against cost-effectiveness is also an ongoing challenge that requires continuous performance tuning.
A seat at the table
The Lean Value approach is a framework that has the potential to transform how technology teams integrate AI insights into strategic planning, helping to deliver meaningful outcomes for the organization and keeping high-performing teams operating at maximum efficiency. For CTOs, it is important that the return on technology investments is clear and measurable. This creates a culture where technology drives commercial goals and contributes to the bottom line as much as departments like sales and marketing.
Raghu Punnamraju is CTO of Velocity Clinical Research.
DataDecisionMakers
DataDecisionMakers is a place where experts, including technologists working with data, can share data-related insights and innovations.