The following partner insight was authored by Jeff Briner, a Principal in PricewaterhouseCoopers' (PwC's) SAP practice, where he leads the Digital Supply Chain Capability.
In 2025, investment has poured into asset development, from data centers to wind, solar, and nuclear projects, creating vast flows of engineering and procurement data that should move smoothly from design to operation. In practice, the process is fragmented. Owners, engineers, and contractors exchange thousands of documents without a reliable way to confirm that what is designed is what gets built and operated. The result is inefficiency, higher costs, and, at worst, operational risk.
Governance should be injected into the process from the start, not added later. Data to Operations (DTO) does this by transforming unstructured artifacts—engineering designs, procurement records, and piping and instrumentation diagrams (P&IDs)—into structured information that can form a dependable operational foundation.
Using AI, DTO captures as-built conditions with accuracy and consistency, giving utilities a reliable record of what was constructed, at what capacity, and with what implications for safe and efficient operation.
Aligning design, construction, and operational data into a single “golden record” allows enterprises to plan maintenance, control costs, and meet compliance obligations. DTO’s holistic configuration management closes a longstanding gap in asset-intensive industries: providing assurance that the asset conceived at the start is the one operators can rely on for years to come.
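To make the "golden record" idea concrete, here is a minimal sketch in Python of how design, construction, and operational records for a single asset tag might be reconciled. The field names and sample values are hypothetical illustrations, not DTO's actual data model; the point is that disagreements between phases are flagged for review rather than silently overwritten.

```python
# Minimal sketch of a "golden record" merge for one asset tag.
# All field names and values below are hypothetical illustrations,
# not DTO's actual data model.

DESIGN = {"tag": "P-101", "type": "centrifugal pump", "rated_kw": 75}
AS_BUILT = {"tag": "P-101", "type": "centrifugal pump", "rated_kw": 90,
            "vendor": "Acme Pumps"}
OPERATIONS = {"tag": "P-101", "maintenance_interval_months": 6}

def build_golden_record(*sources):
    """Merge per-phase records; collect conflicts instead of overwriting."""
    golden, conflicts = {}, {}
    for source in sources:
        for field, value in source.items():
            if field in golden and golden[field] != value:
                # Keep the first-seen value, but record the disagreement.
                conflicts.setdefault(field, {golden[field]}).add(value)
            else:
                golden.setdefault(field, value)
    return golden, conflicts

record, conflicts = build_golden_record(DESIGN, AS_BUILT, OPERATIONS)
print(record)     # unified view of the asset
print(conflicts)  # e.g. {'rated_kw': {75, 90}} -> route to engineering review
```

Keeping the first-seen (design) value while logging every later disagreement is only one possible policy; the essential behavior is that no mismatch between what was designed and what was built disappears without an engineering decision.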
The financial impact of incomplete asset data shows up quickly. Budgets often separate capital costs, which are planned and financed at the start, from operating costs, which recur year after year. When records are missing, work shifts into the operating category, where it becomes far more expensive.
Information that could have been captured during construction for a few dollars must later be recovered by senior engineers or field crews at rates of hundreds of dollars an hour. These unanticipated overruns can send an entire project over budget.
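The order-of-magnitude difference is easy to see with a back-of-the-envelope calculation. The figures below are illustrative assumptions, not PwC benchmarks: a few dollars per attribute to record data during construction versus a field crew's hourly rate to walk it down later.

```python
# Back-of-the-envelope comparison; all figures are illustrative assumptions.
ATTRIBUTES = 10_000            # asset attributes to capture across a project
CAPTURE_COST = 3.0             # $ per attribute if recorded during construction
CREW_RATE = 250.0              # $ per hour for senior engineers or field crews
HOURS_PER_ATTRIBUTE = 0.5      # time to locate and verify one attribute later

capture_total = ATTRIBUTES * CAPTURE_COST
recovery_total = ATTRIBUTES * HOURS_PER_ATTRIBUTE * CREW_RATE

print(f"Capture during construction: ${capture_total:,.0f}")   # $30,000
print(f"Recover during operations:   ${recovery_total:,.0f}")  # $1,250,000
```

Under these assumed figures, recovering the data after the fact costs roughly forty times as much as capturing it up front, and the bill lands in the operating budget rather than the capital plan.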
The safety risks are even greater. Running a facility safely depends on knowing exactly what equipment is in place, what it can handle, and how it should be maintained. Without clear configuration control, crews may arrive with the wrong replacement part, push equipment beyond its limits, or act on faulty assumptions.
In some cases, these gaps have led directly to serious accidents that could have been avoided if accurate information had been captured from the start.
To address these risks, organizations need a way to carry accurate data through the entire project lifecycle. DTO does this by taking in unstructured material and converting it into structured data that can be trusted as the basis for asset management. The system also coordinates inputs from contractors, vendors, and owners through a shared portal, creating a common record and reducing the need for the slow, error-prone field-mapping exercises of the past.
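As a rough illustration of that ingestion step (not DTO's actual pipeline), the sketch below pulls structured fields out of a free-text equipment line of the kind that might appear in a procurement record. It uses simple pattern matching where a production system would apply AI extraction against a governed schema; the sample line and field names are hypothetical.

```python
import re

# Illustrative only: a production system would use AI extraction and a
# governed schema. This sketch shows the unstructured -> structured step
# with simple pattern matching on a hypothetical procurement line.
LINE = "Tag P-101: centrifugal pump, 90 kW, vendor Acme Pumps, PO 44812"

PATTERN = re.compile(
    r"Tag (?P<tag>\S+): (?P<type>[^,]+), (?P<rated_kw>\d+) kW, "
    r"vendor (?P<vendor>[^,]+), PO (?P<po_number>\d+)"
)

match = PATTERN.match(LINE)
if match:
    record = match.groupdict()
    record["rated_kw"] = int(record["rated_kw"])  # normalize numeric fields
    print(record)
    # {'tag': 'P-101', 'type': 'centrifugal pump', 'rated_kw': 90,
    #  'vendor': 'Acme Pumps', 'po_number': '44812'}
else:
    print("Could not parse; route to manual review")  # the governance hook
```

The fallback branch is where governance enters: inputs that cannot be parsed to the schema are queued for human review instead of being dropped or guessed at.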
This produces integration at the information level rather than only at the database layer: users see consistent context across systems instead of digging through mismatched ones, and leaders gain a clear view of what has been built. When owners can specify the structures and content they need, the output improves further, but DTO still delivers value even when inputs vary. The outcome is a single, intelligible record that cuts through silos and makes the full scope of asset data usable.
This ability to unify data is especially critical as utilities modernize their core ERP systems, since each migration path—brownfield, greenfield, bluefield—brings its own challenges. In every case, AI is no longer separate but part of the fabric of modernization. DTO can fit into all three strategies by embedding accurate asset data and configuration control into the process, reinforcing the clean-core foundation that cloud ERP depends on.
Utilities have been cautious in how they adopt AI, and that caution is well placed. Storing important asset data in the cloud heightens cybersecurity concerns, and rushing adoption can lock organizations into mistakes that are hard to undo. Similar fears met the arrival of ERP systems, though the outcome was an expanded industry and a workforce with new capabilities.
Deliberate adoption lets utilities test, learn, and manage risks while positioning themselves to scale as the technology matures. By narrowing the initial use of AI and putting strong guardrails in place, utilities can gain the advantages of automated data ingestion and orchestration while limiting their exposure.
Still, AI tools evolve more rapidly than most projects. Models improve every few months, and each new release brings stronger capabilities. Planning for AI adoption means planning for constant iteration. DTO is designed with that pace in mind, giving organizations a foundation that can absorb new versions and continue to improve outcomes.
The risks of building and operating complex assets have always been both financial and human, and they only grow when data is incomplete.
DTO addresses this by embedding governance into the lifecycle and producing a reliable record of what was actually constructed. That foundation helps utilities control costs and safeguard operations today, and as AI becomes inseparable from how infrastructure is designed and run, that same foundation can enable future progress.
Jeff Briner is a Principal in PricewaterhouseCoopers' (PwC's) SAP practice, where he leads the Digital Supply Chain Capability.