From optional to essential: the changing landscape of quality control in geo data collection

Declan Vanderhor, Director & Founder, TabLogs

Nothing is more certain than change. As technological advances continue to reshape how geo data is collected, quality control in managing that data matters more than ever: without it, unreliable data can seriously damage both project efficiency and project outcomes.

Having founded TabLogs more than nine years ago with a mission to help improve the way consultancies log boreholes and manage their data, I have since had a front-row seat watching the shift in quality control practices within geo data collection.

What was once considered an optional step in the data management process, often dismissed by project managers as a routine task, has undeniably evolved into an indispensable aspect of geotechnical engineering. Recognition of the critical role played by meticulous data collection and management has transformed it from a sideline consideration into a core pillar of successful project execution.

In earlier days, the importance of this process may have been underestimated, but as the industry faces increasingly complex challenges it has become clear that the quality of geotechnical data directly influences project outcomes and overall success. This shift in perspective has led organisations, project managers, and stakeholders alike to acknowledge the intrinsic value of strong quality control practices in geo data collection, reporting and management. 

I believe this change is not an isolated event but a dynamic shift driven by a variety of factors. In this article I want to dive deep into what these factors are and reinforce them by highlighting a few examples of where I have witnessed this happening across the industry.

The growing complexity of data sharing standards

One of the most significant trends I have observed is the growing complexity of data sharing standards and data schemas within the geotechnical engineering community. In the past, data interoperability was often considered optional, leading to inefficiencies and communication bottlenecks. However, today, industry players are embracing standardised formats to ensure seamless communication and collaboration across diverse projects.

Many organisations have recognised the importance of adopting and adhering to these data standards. Formats such as AGS and DIGGS, for example, have become linchpins in facilitating the exchange of borehole data. Their adoption has proven transformative, enabling geotechnical engineers to share vital information effortlessly across projects and organisations.
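To make the exchange mechanics concrete, below is a minimal sketch, not TabLogs code, of reading an AGS4 file in Python. AGS4 files are plain text built from double-quoted, comma-separated rows whose first field declares the row's role, which is exactly what makes automated exchange straightforward. The file name and the LOCA group lookup are illustrative.

```python
import csv
from collections import defaultdict

def read_ags4(path):
    """Parse an AGS4 file into {group_name: list of row dicts}.

    Each line is a set of double-quoted, comma-separated fields, and
    the first field declares the line's role (GROUP, HEADING, UNIT,
    TYPE, or DATA).
    """
    groups = defaultdict(list)
    headings = {}
    current = None
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue
            marker, fields = row[0], row[1:]
            if marker == "GROUP":
                current = fields[0]
            elif marker == "HEADING":
                headings[current] = fields
            elif marker == "DATA":
                groups[current].append(dict(zip(headings[current], fields)))
            # UNIT and TYPE rows are skipped in this sketch
    return dict(groups)

# Example: list location IDs from the LOCA group of a hypothetical file
if __name__ == "__main__":
    data = read_ags4("site_investigation.ags")
    for loca in data.get("LOCA", []):
        print(loca.get("LOCA_ID"))
```

In practice, maintained tooling such as the open-source python-AGS4 package also checks files against the official data dictionary, which a hand-rolled reader like this does not.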

This shift towards standardisation has not only streamlined communication and enhanced collaboration; it has also fostered a more transparent environment and opened new possibilities for innovation and for comprehensive, integrated solutions.

The increasing reliance on internal databases

Another pivotal shift revolves around the increasing reliance on internal databases by geotechnical consultancies and project owners’ teams. The advent of technical tools and software programs like TabLogs has empowered companies to build and maintain clean, usable databases on national and international scales. These databases vastly improve desktop investigations and pre-feasibility assessments, drastically reduce the margin of error associated with manual data entry, and make subsurface investigations more efficient, hinting at the paradigm shift we are beginning to see across the industry.
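To illustrate the kind of quality control that replaces error-prone manual vetting, here is a hedged sketch of validating borehole strata records before they enter such a database. The field names and rules are hypothetical examples, not TabLogs’ actual schema.

```python
from dataclasses import dataclass

@dataclass
class Stratum:
    borehole_id: str
    depth_top_m: float
    depth_base_m: float
    description: str

def validate_strata(strata: list[Stratum]) -> list[str]:
    """Return human-readable QC errors for one borehole's strata.

    Illustrative rules: depths non-negative, each stratum's base below
    its top, descriptions present, and consecutive strata contiguous
    (no gaps or overlaps in the logged column).
    """
    errors = []
    ordered = sorted(strata, key=lambda s: s.depth_top_m)
    for s in ordered:
        if s.depth_top_m < 0:
            errors.append(f"{s.borehole_id}: negative top depth {s.depth_top_m}")
        if s.depth_base_m <= s.depth_top_m:
            errors.append(f"{s.borehole_id}: base {s.depth_base_m} m not below top {s.depth_top_m} m")
        if not s.description.strip():
            errors.append(f"{s.borehole_id}: missing soil/rock description")
    for above, below in zip(ordered, ordered[1:]):
        if below.depth_top_m != above.depth_base_m:
            errors.append(
                f"{above.borehole_id}: gap or overlap between "
                f"{above.depth_base_m} m and {below.depth_top_m} m"
            )
    return errors

# Example: the gap between 2.0 m and 2.5 m is flagged before insertion
records = [
    Stratum("BH01", 0.0, 2.0, "Soft grey CLAY"),
    Stratum("BH01", 2.5, 6.0, "Dense SAND"),
]
for err in validate_strata(records):
    print(err)
```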

The rising prominence of advanced modelling tools

Additionally, the integration of advanced modelling techniques such as the Finite Element Method (FEM) and digital twins into geotechnical engineering practice has become much more prevalent. The success of these modelling tools, however, depends on the availability of detailed and accurate geodata: organisations are recognising that high-quality, standardised data must feed these systems for their outputs to be meaningful.
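As a simple illustration of that dependency, a pre-model gate can reject boreholes that lack the minimum inputs a ground model needs. The required fields below are hypothetical; real requirements depend on the analysis being run.

```python
# Hypothetical minimum inputs a simple ground model might require
# per borehole; real requirements depend on the analysis.
REQUIRED_FIELDS = {"ground_level_m", "water_level_m", "strata", "spt_results"}

def model_ready(boreholes: dict[str, dict]) -> tuple[list[str], list[str]]:
    """Split borehole IDs into model-ready and rejected sets."""
    ready, rejected = [], []
    for bh_id, record in boreholes.items():
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            rejected.append(f"{bh_id}: missing {sorted(missing)}")
        else:
            ready.append(bh_id)
    return ready, rejected

ready, rejected = model_ready({
    "BH01": {"ground_level_m": 34.2, "water_level_m": 31.0,
             "strata": [...], "spt_results": [...]},
    "BH02": {"ground_level_m": 35.1},  # incomplete: no strata or tests
})
print(ready)     # ['BH01']
print(rejected)  # BH02 flagged with its missing fields
```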

Conclusion

In essence, the geotechnical landscape is experiencing a shift where the once optional aspects of quality control in geo data collection are now indispensable.

This transformative journey is not without its challenges. While industry players are recognising the importance of standardisation, there is still work to be done in establishing universal protocols that cater to the diverse needs of different projects. However, initiatives are underway, driven by collaborations between software providers, industry associations, and leading geotechnical experts, to address these challenges and create a more harmonised and interoperable ecosystem.

In the future of subsurface investigations, quality control is not optional: it is essential. By embracing these changes, we pave the way for a more connected, efficient, and resilient geotechnical engineering community, ready to tackle the challenges of tomorrow head-on.
