- To streamline and improve the quality of the existing drill core logging, sampling and dispatch process
- To empower the logging geologists to work autonomously, ensuring ownership and accountability for logging-related data management tasks
- To reduce downstream data management workload
- To increase digital data capture at the source, improving the timeliness of data delivery and enhancing data integrity
- To improve the data management skills of all site team members.
A mid-tier gold producer in Australia with multiple operations identified issues in its existing data capture workflows that made geologists' work inefficient. Data was captured in Excel and manipulated ad hoc, leading to inconsistencies in the quality and accuracy of data across sites.
Proposed Sampling Process
- Import data into LogChief from DataShed as per logging requirements (including any existing field data entry such as Core Recovery, MagSus etc.)
- Core logging (Lithology, Alteration, and Veins etc.) completed within LogChief
- Sampling intervals then populated using Core Recovery intervals or manually created via the LogChief Multi-row creation function
- Sample logging completed, including VG, HG, VS, ME, PF options
- QAQC elements added via the LogChief extensions database
- The next sequential Sample ID procured and confirmed in LogChief, at which point the logged samples and created QAQC samples are assigned updated Sample IDs
- Data verified within LogChief then synchronised to DataShed.
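The flow of the steps above can be sketched in plain Python. This is an illustrative model only: LogChief and DataShed are commercial products and their internal APIs are not public, so every name here (the `Sample` class, the QAQC frequency, the Sample ID format) is a hypothetical stand-in. It shows the ordering that matters in the process: sampling intervals are populated from core recovery runs first, QAQC samples are inserted next, and sequential Sample IDs are applied last so that logged and QAQC samples alike receive them.

```python
# Hypothetical sketch of the sampling workflow described above.
# All class names, QAQC codes, and the ID format are illustrative assumptions,
# not the LogChief/DataShed data model.
from dataclasses import dataclass

@dataclass
class Sample:
    hole_id: str
    m_from: float
    m_to: float
    kind: str = "CORE"        # CORE, or an assumed QAQC code such as STD
    sample_id: str = ""       # assigned last, once IDs are procured

def intervals_from_recovery(hole_id, recovery_runs):
    """Populate sampling intervals directly from core recovery intervals."""
    return [Sample(hole_id, start, end) for start, end in recovery_runs]

def insert_qaqc(samples, every=5):
    """Insert a QAQC standard after every `every` primary samples
    (the insertion rate is an assumption for illustration)."""
    out = []
    for i, s in enumerate(samples, start=1):
        out.append(s)
        if i % every == 0:
            out.append(Sample(s.hole_id, s.m_to, s.m_to, kind="STD"))
    return out

def apply_sample_ids(samples, next_id):
    """Apply the next sequential Sample IDs to logged and QAQC samples alike."""
    for offset, s in enumerate(samples):
        s.sample_id = f"S{next_id + offset:06d}"
    return samples

# Usage: six recovery runs, a standard every five samples, IDs from S000100.
runs = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6)]
batch = apply_sample_ids(insert_qaqc(intervals_from_recovery("DH001", runs)), 100)
```

Keeping ID assignment as the final step mirrors the process above, where the next sequential Sample ID is only confirmed after logging and QAQC creation are complete.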
The client has been using this solution effectively since 2018 and continues to extend its data capture to include ROM management and material movement (truck spotters, etc.). Modifications to suit each site are made easily, with full audit and QC. pXRF and other data are imported through LogChief into their DataShed system. The recognised results have been significant improvements in both the quality of logged data and the efficiency of geologists in the field.