4.1. Feature Alignment during SfM
Manual intervention and changes to the Metashape default settings were sporadically needed during SfM processing. The observed alignment failures can be subdivided into two groups. The first contains cases without sufficient image overlap, which can be corrected by revisiting the photographic acquisition step for additional data. Imaging the samples in a second side-view orientation provides additional overlapping angles and solved most of these cases.
The second group failed to align for various technical reasons, related to processing settings and/or to changes in acquisition parameters. To increase processing rates, for instance, Metashape features several algorithms that pre-select image pairs for matching. While these decrease processing times under normal use, they may cause alignment to fail when processing small, texture-similar samples. By disabling the generic and reference pre-selection modules, all images are compared against the entire data set at full resolution (rather than through downscaled, order-wise comparison), resulting in improved alignment for this specific use case [40].
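For reference, these settings can also be applied programmatically. The following is a minimal sketch using the Metashape Professional Python API (parameter names follow recent 1.x/2.x releases and should be checked against the installed version):

```python
import Metashape

# Assumes the script is run from Metashape's built-in Python console,
# with the active chunk containing the loaded sample images.
chunk = Metashape.app.document.chunk

# Match photos with both pre-selection modules disabled, so that every
# image is compared against the entire data set. downscale=1 ("High"
# accuracy) matches at the original image resolution.
chunk.matchPhotos(
    downscale=1,
    generic_preselection=False,
    reference_preselection=False,
)
chunk.alignCameras()
```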
In other cases, image selections had to be re-aligned manually in a two-step alignment process. Here, the selective alignment of GCP-containing images proved useful as an alignment anchor/tie point for the remaining images, allowing images with different sets of acquisition parameters (e.g., different focal lengths) to be aligned. The use of calibration aids during the photography step of one of the orientations further improved the alignment of samples. The extra key points, which are directly linked to a pose, prevented misalignments arising from texture-similar (i.e., mostly mud-dominated) samples. The alignment of smaller samples benefited the most from such corrections, which may be related to the lower number of key points associated with them.
4.2. Volume Assessment and Error-Contributing Factors
Differences between the immersion in fluid and photogrammetric methods may arise from several factors. However, volumetric errors do not appear related to the use of a particular software suite, with both open-source (e.g., Blender) and commercial (e.g., Metashape) packages yielding near-identical outcomes (i.e., identical to at least the third digit). Nor do differences arise from the implementation of different GCP workflows, which assign marker coordinates prior to the processing stage. While having no impact on accuracy or precision, the use of ArUco markers in the Python-based workflow does significantly reduce processing time by making manual marker calibration and assignment redundant, in contrast to the Metashape 12-bit marker workflow, where manual intervention was a necessity.
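To illustrate, the automated marker detection underpinning such a workflow can be reduced to a few OpenCV calls. The sketch below assumes OpenCV ≥ 4.7 with the aruco module; the dictionary choice and file name are placeholders:

```python
import cv2

# Load an acquisition image and detect the ArUco markers in it.
image = cv2.imread("core_photo.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# The dictionary must match the printed markers (4x4, 50 IDs assumed here).
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())
corners, ids, _ = detector.detectMarkers(gray)

# Each marker yields four sub-pixel corner coordinates, which can then be
# assigned to the SfM project as GCP image observations.
if ids is not None:
    for marker_id, marker_corners in zip(ids.flatten(), corners):
        print(marker_id, marker_corners.reshape(4, 2))
```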
Disabling several GCPs during pre-processing led to more significant differences in volumes and spatial accuracy. A low number of GCPs placed along a single line on one side of the sample, for example, resulted in anomalous DCM volumes deviating either positively or negatively (Table 2). As soon as a minimum threshold of GCPs in object-centric placement is passed, the net accuracy and precision gain of introducing more GCPs diminishes, as seen by the RMSE and volume differences between DH4-568, DH4-568a, b, and d. Similar findings have been observed during supra-metre scale studies (e.g., [46,47]), which indicate that horizontal and vertical accuracy increases as the GCP count increases and the distribution is optimised.
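For clarity, the GCP RMSE referred to throughout is the root-mean-square of the 3D marker residuals. A minimal sketch of the computation (the array layout and the millimetre example values are our assumptions):

```python
import numpy as np

def gcp_rmse(reference_xyz, estimated_xyz):
    """Root-mean-square error over 3D GCP residuals (same units as input)."""
    residuals = np.asarray(estimated_xyz) - np.asarray(reference_xyz)
    return np.sqrt(np.mean(np.sum(residuals**2, axis=1)))

# Example: three GCPs, each with a 0.1 mm residual along one axis.
ref = [[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]]
est = [[0.1, 0.0, 0.0], [10.0, -0.1, 0.0], [0.0, 10.0, 0.1]]
print(f"RMSE: {gcp_rmse(ref, est):.3f} mm")  # -> RMSE: 0.100 mm
```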
More crucial than the RMSE of the GCPs is the (hypothesised) equivalence between geotechnical and photogrammetric volume estimates. With mean relative volume differences between both methods at per mille levels, photogrammetry appears suitable as a cost-effective alternative to traditional, sample-altering methods like the immersion in fluid procedure. The near-outlier DH4-591 does indeed exceed the 1% mark, with almost 3% deviation, or slightly less than 2 cm³ in absolute volume (versus a sample volume of approximately 70 cm³). As both procedures determine volume indirectly, the difference may result from errors in either method, as well as from the samples physically changing over time. The extended time between cold storage, photogrammetric acquisition, and the immersion in fluid tests may have led to swelling and/or shrinkage, caused by drying and/or wetting resulting from changes in environmental conditions. This phenomenon has previously been reported during a permeability assessment on samples from the same borehole [32]. Furthermore, samples may have been damaged during, for example, storage, transport, and immersion, with each step increasing the likelihood thereof. Although masses measured prior to photogrammetry and the immersion in fluid tests indicate no significant differences, different volumetric differences are observed during post-waxing determination of the volumes. The latter narrows the volumetric difference down to 1.48% (i.e., half the original difference), with re-assessment of the post-waxing volumes (DH4-591w) through the photogrammetric and immersion in fluid methods yielding 73.2 cm³ (+1.9 cm³) and 74.3 cm³ (+4.48 cm³), respectively.
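As a worked check on these figures (using the post-waxing volumes quoted above; taking the immersion in fluid volume as the reference is our convention):

```python
# Post-waxing volumes of DH4-591w in cm^3, from the text above.
v_photogrammetry = 73.2
v_immersion = 74.3

# Relative difference with the immersion-in-fluid volume as reference.
rel_diff = (v_immersion - v_photogrammetry) / v_immersion * 100
print(f"{rel_diff:.2f}%")  # -> 1.48%
```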
Upon retrieval, re-examination of the wax-coated sample revealed that parts of the sample had been insufficiently waxed, directly exposing the drill core to fluids during immersion. In the presence of a (micro-)fracture network, this may be sufficient for fluids to enter the drill core sample and impact the outcome of the measurement. Though likely only a minor factor, migration of wax into the network may have further contributed to the observed difference, a possibility indirectly supported by the changes in pre-/post-waxing volume difference attributed to the wax coating itself.
The overall equivalence derived from all eight samples remains statistically significant, with the mean deviation between both techniques coming down to 0.059%. Furthermore, the method appears indifferent to the symmetry and size of samples, making small-scale, close-up photogrammetry suitable for both volume and bulk density calculations. Finally, the immersion in fluid method requires substantially more physical handling and processing of the sample than the photogrammetric method, increasing the likelihood of (accidental) damage and human error. For larger numbers of samples, this physical aspect further favours the cost- and time-benefits made possible through the (semi-)automation offered by the photogrammetric method.
Thus, this work shows that the same principles used for bulk density analysis and quantification of, e.g., soil aggregates (e.g., [48]) can indeed be used to accurately determine bulk densities of drill core and other geological samples. The non-destructive, non-sample-altering nature of the approach is a major advantage, and shows that the photogrammetric method is a suitable option for the calibration of the Longyearbyen CO2 Lab density wireline logs. Having been stored in a semi-continuously frozen setting since being drilled, the properties of the well-cemented drill core samples obtained from the Longyearbyen CO2 Lab are unlikely to have changed substantially over time, and are likely closest to the in situ conditions. As such, the use of photogrammetry-derived bulk densities is likely to be extended to selected intervals covered by the available wireline logging, providing a cost- and time-effective alternative to traditional methods and an efficient means of obtaining quantitative bulk density data in-house using off-the-shelf components.
4.3. 3D Image Analysis and Characterisation
We have shown that, like virtual outcrop models (VOMs), DCMs can be used to generate pseudo-cross-sectional profiles that are amenable to qualitative and quantitative, image-based analyses. Digital characterisation techniques originally developed for VOMs are therefore applicable to DCMs as well, allowing for the easy extraction, elucidation, and transfer of geological information (e.g., [3,49]).
DCM profiles are readily generated through the “unwrapping” of core textures projected onto a cylindrical mesh, resulting in 2D orthographic images of the core sides (i.e., top, side, base). Though this procedure is relatively straightforward, artefacts arise in the pseudo-cross-section when the projection mesh and DCM are too far apart (Figure 7). This divergence in position mostly stems from differences in shape, and can be minimised by better tuning the shape of the projection mesh to the DCM. However, the latter alters the perspective of the unwrapped texture, and masking non-parallel sections may therefore yield better results, at least where the circumference profile is concerned.
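Conceptually, the side-view unwrapping maps each point near the cylindrical projection mesh from Cartesian to cylindrical coordinates. A minimal sketch of this mapping (assuming a core model aligned with the z-axis and points stored as an N × 3 NumPy array; the full texture projection in Metashape or Blender is more involved):

```python
import numpy as np

def unwrap_side_view(vertices, radius):
    """Map (x, y, z) points of a z-axis-aligned core onto a flat
    (arc length, depth) plane, as in side-view unwrapping."""
    x, y, z = vertices[:, 0], vertices[:, 1], vertices[:, 2]
    theta = np.arctan2(y, x)   # angle around the core axis
    u = radius * theta         # horizontal axis: arc length
    v = z                      # vertical axis: position along the axis
    return np.column_stack([u, v])

# Example: four points on a 5 cm diameter core at two depths (metres).
pts = np.array([[0.025, 0.0, 0.00], [0.0, 0.025, 0.00],
                [-0.025, 0.0, 0.05], [0.0, -0.025, 0.05]])
print(unwrap_side_view(pts, radius=0.025))
```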
In Figure 7, for instance, the aforementioned mis-projections are visible at the top and bottom edges, where jagging, repetitions, and other projection anomalies occur. If unnoticed, such anomalies are likely to interfere with correct interpretation. This (current) limitation should especially be kept in mind when the profiles are used in conjunction with automated structural and material analyses (e.g., automated image processing). Similar anomalies arise from the stitching of multiple fragments, as indicated in Figure 7. Here, they likely result from the poorly lit stitching space between individual core fragments, a side effect of reconstructing composite cores from fragments.
The ability to reconstruct cores from individual core fragments allows highly fractured intervals to be reconstructed as composite DCMs (Figure 8), which can theoretically be extended (digitally) to encompass all available core material for a given borehole. The two procedures summarised here differ strongly in efficiency and automation potential. Currently, closely aligning fragments during the photographic acquisition step is found to be more effective, especially considering that manual reconstruction and alignment of segments using 3D software is far more time-consuming. A significant increase in efficiency is expected when implementing novel alignment methods and techniques within the latter. The potential of object-reassembly algorithms (e.g., [50,51]) underlines this expectation, and is likely to result in a fully automated add-on to the current workflow. Until then, object reassembly effectively takes place during the photographic acquisition step. By imaging the interfaces of connected segments, enough key points and overlap between fragments are provided to allow SfM processing to reconstruct composite models within an accurate coordinate system. Through the latter procedure, the 487.45–487.53 m MD interval of DH4 was reconstructed. This interval was selected due to a clear sinusoidal structural pattern visible in acoustic televiewer data (Figure 9), potentially allowing the in situ core orientation and depth to be determined if also observed in the corresponding pseudo-cross-sections of the DCM.
Unwrapped core textures can be treated as directly correlatable to optical televiewer data from the same depth. As a result, related features may be directly compared to borehole-wall imagery and used to correct both the orientation and depth of the core. During such a re-alignment, it is important to note that discrepancies between logged core intervals and televiewer data may exist, resulting from cable stretch, irregular tool movement (which may further adversely impact the data), misalignment, and loss of drill core samples [52]. In addition, a limited feature and signal count in televiewer data is expected when dealing with flat-lying, homogeneous units.
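One way to automate the core-to-televiewer comparison is to exhaustively score azimuthal rolls and depth shifts of the unwrapped texture against the televiewer image. The sketch below is a brute-force illustration (assuming both inputs are grayscale NumPy arrays sampled at the same azimuthal and depth resolution; real data would warrant normalisation and a coarse-to-fine search):

```python
import numpy as np

def best_alignment(unwrapped, televiewer):
    """Find the depth shift (rows) and azimuthal roll (columns) that
    maximise the Pearson correlation between an unwrapped core texture
    and a televiewer image of identical resolution."""
    rows, cols = unwrapped.shape
    best = (-np.inf, 0, 0)
    for depth in range(televiewer.shape[0] - rows + 1):
        window = televiewer[depth:depth + rows]
        for roll in range(cols):  # cyclic rotation around the core axis
            rolled = np.roll(unwrapped, roll, axis=1)
            score = np.corrcoef(rolled.ravel(), window.ravel())[0, 1]
            if score > best[0]:
                best = (score, depth, roll)
    return best  # (score, depth offset in pixels, rotation in pixels)
```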
Such a setting is partly encountered in our case study targeting the shale-rich, Late Jurassic Agardhfjellet Formation, which makes up most of the interval for which televiewer data is available. The initial comparison between (boxed) drill cores and televiewer data indicated the damage that sampling, poor handling, and degradation can lead to. Most anomalies found in the televiewer data pertain to core intervals that were previously heavily sampled, or that were fully pulverised as a result of wear and tear during operational activities. Furthermore, discrepancies were observed between (boxed) core samples and the televiewer data, with mismatching structures (e.g., presence/absence of fractures) documented at similar depths. In general, these findings are quite likely the result of less-than-optimal televiewer data, and the impact of features such as breakouts and faulty tool orientation. Additionally, units featuring clearly dipping marker beds (e.g., coal seams in the Helvetiafjellet Formation) lie beyond the televiewer-logged interval, preventing comparison. The applicability of the proposed protocol is therefore likely to increase in more favourable settings with steeper depositional dips and greater geological heterogeneity. In such cases, orientations may be deduced from structural and sedimentary facies alone.
Nonetheless, the feasibility of the technique (even under far-from-optimal conditions) has been successfully demonstrated through the analysis of sample DH4-487.5. Besides confirming orientation and depth, the alignment also shows which fractures are present in situ and which are likely drilling-induced. This provides a valuable aid during fracture characterisation, in addition to being an efficient way of visualising stratigraphic-sedimentary details.
The combination of digital reconstruction and positioning allows the in situ orientation to be easily extrapolated to a larger section of the core, which in turn may be used to determine the geometrical relationships among structures, tectonic stress, and kinematic history [53]. As neither proprietary hardware nor software is needed, the photogrammetry-based procedure lowers the requirements for doing so compared to other standard, widely used core-orienting techniques [53,54]. As a follow-up (currently) beyond the capabilities of the Longyearbyen CO2 Lab, it would be interesting to compare optical televiewer data with the reconstructed, directly correlatable televiewer analogue derived from unwrapped cores.
4.4. Future Applications
The proposed workflow allows for the straightforward application of SfM photogrammetry to geological hand specimens. Beyond drill core samples, potential uses are envisaged in particular for the characterisation and (digital) documentation of delicate and dangerous (e.g., radioactive) materials prior to destructive testing, for in-field bulk density assessments, and for the establishment of readily accessible DCM repositories (including geoscientific metadata) that enhance scientific repeatability. As such, we foresee potential for this method in the digitisation of ice and permafrost cores, in addition to soil, mineral, and rock samples.
The digitisation and (digital) storage of drill core samples is complementary to existing physical and photographic sample archives, providing the capability to remotely access and analyse 3D models. However, as with physical storage, several recommendations must be followed to prevent the loss of data, to ensure access to the archive, and to allow results to be validated. Due to the similarity in object types and scales, we base these recommendations on the EAC guidelines for best practices in European archaeological archiving [55]:
All scientific digitisation projects must result in a secure, stable, ordered, and accessible archive, featuring digital backup strategies and updated, (semi-)permanent storage solutions.
Standards and procedures for the creation, selection, management, compilation and transfer of the archive must be agreed upon in the design stage, and each procedure must be fully documented.
The entire archive must be compiled in such a way as to ensure the preservation of relationships between elements and to facilitate access to all parts in the future. This also includes the linked storage of related and derived data, such as interpretations and subsequent processing.
Where possible, physical and digital sample storage should be co-located, or at least linked through association, to prevent red tape from hindering access.
Finally, a digitisation project is only completed after the archive has been transferred to a recognised repository, and is fully accessible for consultation.
Beyond the archival value, digital sample repositories allow for the use of big data tools (e.g., machine learning), which may extend the use of digital samples to methods that are currently limited to 2D images (e.g., optical mineralogical characterisation [56,57]), turning such repositories into actual digital laboratories.
To lay the foundation for an accessible digital drill core repository on Svalbard, all generated models have been integrated into the Svalbox database, a showcase geoscientific database for high Arctic teaching and research [58] that conforms to the recommendations outlined above. Other examples of such geoscientific databases include SafariDB [59] and eRocK [60], which may be expanded to include digital rock samples as well.