Scanned Image Data from 3d-printed Specimens Using Fused Deposition Modeling

This dataset provides high-resolution 2D scans of 3D-printed test objects (dog-bone specimens derived from EN ISO 527-2:2012). The specimens are scanned at resolutions from 600 dpi to 4800 dpi using a Konica Minolta bizhub 42 and a Canon LiDE 210 scanner. The specimens are created to research the influence of the infill-pattern orientation and the print orientation on the geometrical fidelity and the structural strength. They are printed on a MakerBot Replicator 2X 3D-printer using yellow (ABS 1.75 mm Yellow, REC, Moscow, Russia) and purple ABS plastic (ABS 1.75 mm Pink Lion&Fox, Hamburg, Germany). The dataset consists of at least one scan per specimen together with the measured dimensional characteristics; the software for this measurement is created and described within this work. Specimens from this dataset are scanned either on blank white paper or on white paper with blue millimetre marking. The printing experiment contains a number of failed prints; specimens that did not fulfil the expected geometry are scanned separately and are of lower quality due to the inability to scan objects with a non-flat surface. For a number of printed specimens, sensor data is acquired during the printing process. The dataset consists of 193 specimen scans in PNG format of 127 objects, each with unadjusted raw graphical data and a corresponding annotated, post-processed image. Annotated data includes the detected object, its geometrical characteristics and file information. Computer-extracted geometrical information is supplied for the images where automated geometrical feature extraction is possible.


Introduction
Additive Manufacturing (AM) or 3D printing [1] is the method of creating physical objects from digital models, usually by layer-wise fabrication. This term comprises various technologies used to create the physical objects [2,3], ranging from Laminated Object Manufacturing (LOM), Selective Laser Sintering (SLS) or Selective Laser Melting (SLM) and Stereolithography (SLA) to Fused Deposition Modeling (FDM) (or Fused Filament Fabrication (FFF) [4]). Objects can be created using a range of materials like plastics (thermoplastics or photopolymers), ceramics, waxes, metals and alloys, dependent upon the underlying technology. Due to the nature of the fabrication process, the digital Computer Aided Design (CAD) model must be transformed to a machine code file [5]. For this step the digital model is transformed into an exchange format like StereoLithography (file format) (STL) or Additive Manufacturing File Format (AMF) [6], which is then processed by a software called a slicer. The slicing software creates layers or cross-sections through the object under the influence of user-selectable parameters, which are then traced by the machine code. The quality of the resulting object is dependent upon the quality of the slicer [7]. In FDM [8] a thermoplastic like acrylonitrile butadiene styrene (ABS) or polylactic acid (PLA) is heated above the glass-transition temperature within the extruder to a semi-molten state and then pushed onto the build plate or previous layers of the printed object, where the extrudate solidifies due to the reduced temperature [9]. This research is conducted to test the hypotheses that the build orientation and the infill pattern of a part influence the mechanical and geometrical (see Section 1.1) properties of said part. For this research we have created a set of infill patterns ranging from 0 to 90 degrees (0, 5, 10, 30, 45, 60 and 90 degrees, see Figure 1), where the angle indicates the orientation of the strands within the specimen against the X-axis, which is 
the front of the build plate, see Figure 1. The experiment is designed to print the varying infill patterns with and without orientation of the object in alignment with the infill pattern. The specimens are printed with a layer height of 0.3 mm and a two-layer design. The second layer is oriented either identical to the first layer or mirrored to the first layer. For this experiment, four groups are thus created for each of the infill patterns: object aligned or not aligned with the infill pattern, each with an identical or a mirrored second layer. Figures 2 and 3 display the placement of the specimen on the printing bed. In Figure 2 a specimen with a 45 degree infill pattern is displayed; the infill pattern is indicated by the red stripes within the specimen. This specimen is rotated at 45 degrees against the X-axis of the printer. The infill pattern is oriented along the Y-axis of the 3D-printer. In Figure 3 the specimen with the same 45 degree infill pattern is depicted. In this configuration, the specimen is aligned with its longest side to the X-axis of the 3D-printer. In this case the infill pattern is not aligned with either the X-axis or the Y-axis. Further experiments are conducted on sensor data acquisition during the printing process for state detection [10], for which the 3D-printer (MakerBot Replicator 2X) is equipped with sensor nodes registering ambient (e.g., temperature, air pressure and magnetic fields) and inherent data (e.g., vibration).

For the seven infill patterns, four groups, and a minimum of 3 prints per group this yields an expected sample size of 7 × 4 × 3 = 84. The experiment found that some models resulted in misprints and flawed objects that are partially unscannable (especially the 5 and 10 degree infill patterns). For full coverage the following objects are missing:
• 1 × 45 degrees norm orient, omitted due to machine error
• 3 × 10 degrees flip orient, printed but of unusable quality due to printing errors

Accuracy
The accuracy and geometrical fidelity of 3D-printed objects has been researched in many works over more than 20 years [11,12], due to the necessity to produce objects that match their digital models closely for the use as prototypes (Rapid Prototyping, (RP) [13,14]), consumer products (Rapid Manufacturing, (RM) [15]) or tools (Rapid Tooling, (RT) [16]).
Dimitrov et al. [17] conducted a study on the accuracy of the Powder bed and inkjet head 3D printing (3DP) process with a benchmark model. Among the three influencing factors for the accuracy are the selected axis and the material involved.
Turner and Gold [18] provide a review on Fused Deposition Modeling (FDM) with a discussion on the available process parameters and the resulting accuracy and resolution.
Boschetto and Bottini [19] develop a geometrical model for the prediction of the accuracy in the Fused Deposition Modeling (FDM) process. Based on process parameters, they predict the accuracy within 0.1 mm for 92% of their specimens in a case study. Armillotta [20] discusses the surface quality of Fused Deposition Modeling (FDM) printed objects. The author utilises a non-contacting scanner with a resolution of 0.03 mm for the assessment of the surface quality. Furthermore, the work delivers a set of guidelines for the FDM process in respect to the achievable surface quality.
Equabal et al. [21] present a Fuzzy classifier and neural-net implementations for the prediction of the accuracy within the Fused Deposition Modeling (FDM) process under varying process parameters. They achieve a mean absolute relative error of 5.5% for the predictor based on Fuzzy logic.
Sahu et al. [22] also predict the precision of FDM manufactured parts using a Fuzzy prediction, but with different input parameters (Signal to noise ratio of the width, length and height).
Katatny et al. [23] present a study on the dimensional accuracy of Fused Deposition Modeling (FDM) manufactured objects for the use as medical models. The authors captured the geometrical data with a 3D laser scanner at a resolution of 0.2 mm in the vertical direction. In this work a standard deviation of 0.177 mm is calculated for a model of a mandible acquired from Computed Tomography (CT) data.
To counter expected deviations of the object from the model, Tong et al. [24] propose the adaption of slice files. For this adaption the authors present a mathematical error model for the Fused Deposition Modeling (FDM) process and compare the adaption of slice files to the adaption of StereoLithography (file format) (STL) files. Due to machine restrictions the corrections in either the slice file or the STL file are comparable, i.e., the control accuracy of the 3D-printer is not sufficient to distinguish between the two correction methods.
Boschetto and Bottini [25] discuss the implications of Additive Manufacturing (AM) methods on the process of design. For this discussion they utilise digitally acquired images to compare to model files.
Garg et al. [26] present a study on the comparison of surface roughness of chemically treated and untreated specimens manufactured using FDM. They conclude that for minimal dimensional deviation from the model the objects should be manufactured either parallel or perpendicular to the main axis of the part and the 3D-printer axis.
From the literature the following taxonomy (Table 1) can be constructed, based on the utilised techniques for accuracy measurement and applicability restrictions or generalisations. From the literature it is evident that either manual measurements, optical analysis, 3D laser scanning or coordinate measuring machines are applied for the geometrical analysis of 3D-printed objects. Methods to assess the surface roughness of 3D-printed objects are specific to the applied technology, as the traces of the manufacturing are expressed significantly differently for each technology. For example, with Fused Deposition Modeling (FDM), the object is created by extruding filament bead-wise along the machine path, thus leaving bead-like artefacts on the surface. With Selective Laser Melting (SLM), a manufactured object does not express such bead-like structures, as the material is molten by the laser in a different pattern, with partial remelting of previous material.

Materials and Methods
The image data acquisition is performed using a Konica Minolta bizhub 42 and a Canon LiDE 210 scanner. The bizhub is capable of producing lossless images up to a resolution of 600 dpi (pixels (px) per inch) as Tagged Image File Format (TIFF) [https://partners.adobe.com/public/developer/en/tiff/TIFF6.pdf] files. The LiDE 210 device is capable of producing images up to a resolution of 4800 dpi in a format depending upon the acquisition software (TIFF is used in this experiment). For these available resolutions, the image size, average file size (for the Portable Network Graphics (File Format) (PNG) format, see [34]) and the theoretical maximum resolution are listed in Table 2. The theoretical maximum resolution is calculated from the scan resolution.

The specimens are scanned on either blank white paper or paper with blue millimetre marking, affixed with scotch tape to prevent misalignment during the scanning procedure. The image data, see Figure 4, is then cropped for the individual specimens using the GNU Image Manipulation Program (GIMP, Version 2.8.16). The individual specimen image data is then stored in the TIFF file format with Lempel-Ziv-Welch (Algorithm) (LZW) compression [35] for smaller file sizes, see Figure 5. In the following step the TIFF image is converted into the lossless PNG format using imagemagick (Version 6.9.3-0) and optimised using optipng (Version 0.7.5, parameters "-fix -o 5") for further reduction in file size.
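The unit relation behind the theoretical maximum resolution can be sketched as follows. This is an illustrative assumption rather than the authors' exact formula: at a scan resolution of d dpi, one pixel covers 25.4/d mm, which bounds the smallest physically resolvable feature.

```python
# Physical size of one pixel at a given scan resolution (illustrative sketch,
# not the exact formula from the original work).
MM_PER_INCH = 25.4

def px_size_mm(dpi: int) -> float:
    """Edge length of one pixel in millimetres at the given resolution."""
    return MM_PER_INCH / dpi

for dpi in (600, 1200, 2400, 4800):
    print(f"{dpi} dpi -> {px_size_mm(dpi):.5f} mm per px")
```

At 4800 dpi one pixel covers roughly 0.005 mm, which explains why sub-0.1 mm dimensional deviations remain measurable in the scans.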
The software to extract the geometrical information from the scanned data is written in Python (Version 2.7.11) using the OpenCV framework (Version 2.4.12.2) for image processing. The algorithm to extract the geometrical information is described as follows:

1. Crop the original form to contain each individual printed object.
2. For each cropped area of interest around the object, perform the detection and measurement steps (a)-(i) below.

For the corner detection two approaches are used, as the specimens are not equipped with accurate corners but have rounded corners due to the nature of the manufacturing technique. The first corner detection utilises extensions of the vertical and horizontal borders and defines the corner as the intersection of these, see Corner A in Figure 6. The second approach is to detect the nearest point on the outline of the specimen to the respective corner of the image frame, e.g., the top-left corner of the specimen is the point on the outline of the specimen that is closest to the top-left corner of the image (position X = 0 and Y = 0), see Corner B in Figure 6. The significant points and measurement identifiers are depicted in Figure 7.

An overlay-enriched image is created by the software that includes information on the filename, the measured distances (length of the object measured from the top corners and along the longest axis of the object; width of the object at the left and right side; width of the object in the middle), its orientation, the angle of the enclosing ellipse and the deduced infill pattern in degrees. Furthermore, this overlay image highlights the detected object and places a bounding box as well as a box through the corners of the object as an overlay. Further information is extracted and stored in a text file where each line is associated with a datum. These data are described in Section 3. See Figure 8 for an example of the result of the software processing with the overlaid information on the original image data.
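The second corner-detection approach can be sketched in a few lines. This is a minimal illustration with NumPy; the function name and the toy outline are hypothetical, not taken from the dataset software:

```python
import numpy as np

def nearest_outline_point(outline: np.ndarray, corner: tuple) -> np.ndarray:
    """Return the outline point closest to a given image-frame corner.

    outline: (N, 2) array of (x, y) points on the detected specimen contour.
    corner:  (x, y) of the image frame corner, e.g. (0, 0) for top-left.
    """
    distances = np.linalg.norm(outline - np.asarray(corner), axis=1)
    return outline[np.argmin(distances)]

# Toy outline of a rounded top-left corner of a specimen
pts = np.array([[10, 40], [12, 20], [22, 12], [40, 10]])
print(nearest_outline_point(pts, (0, 0)))  # -> [12 20]
```

Because the printed corners are rounded, this "Corner B" point generally differs from the "Corner A" intersection of the extended borders; comparing both gives a measure of the corner rounding.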
The experiment uses a model derived from EN ISO 527-2:2012 [36] for structural testing of plastic-based specimens, with the deviation that the object thickness is reduced to 2 layers of 0.3 mm each. See Figure 9 for a reference of the object geometry. The unprocessed image data is provided as an example in Figure 10 for specimen 50. Although the experiment is conducted on flat, single and dual layer specimens, which are not representative of real-world objects, it displays erroneous and expected behaviour in the deposition of thermoplastic material. The material deposition structure is governed by the choice of material, the selection of parameters for the execution and the quality of the 3D-printer in use.

The dataset is generated from an experiment (to be published separately) on the structural stability of various infill patterns and build orientations, which evaluates flexural stress for FDM printed specimens using acrylonitrile butadiene styrene (ABS) plastics. In combination with this analysis the dataset can help to research the relationship between visually apparent structures and the stress quality of objects. The data can also be used to visually analyse patterns and structures indicating flawed execution, in order to ascertain the quality of the executing 3D-printer and to develop more accurate models of deposition strategies. From the visual data, systematic shrinkage can be researched under the influence of the varying infill patterns. For the analysis of the impact of the infill and build orientation on the geometrical fidelity, we refer to the publication by Baumann et al. [37]. Furthermore, the scanned-image based and software supported geometrical analysis of the specimens is applicable for the rapid measurement of specimens for testing according to the EN ISO 527-2:2012 [36] standard.

Error Estimation
From the theoretical px lengths for each of the resolutions provided in Table 3 below, the following error estimation for the proposed and applied method can be derived. In Figure 11 an example of this error measurement is depicted. In this figure, the reference lines are analysed with the GIMP software; the dotted lines stem from the software. The pixels of the reference lines are not sharp and lead to measurement errors. Such measurement errors do also occur on the corners and borders of the specimen. As the pixels (px) in the digital image tend to bleed and the contours of the features are unsharp, an uncertainty for the measurements is inherent. To calculate the uncertainty of the method, measurements are taken of known distances of 1 cm and 5 mm. The measurements are taken at two positions, with the first position placed above the actual feature so that it reflects the maximum distance. The second position is taken below the feature so that it reflects the minimum distance. The measurements are then compared to the theoretical values for these distances as listed in Table 3, third column. In this column the pixels (px) for a distance of 1 cm are listed for the respective resolution.
In Table 4 the equivalencies of the digital units (px) to the real-world units (mm and cm, respectively) are listed. The second column indicates the equivalent of 1 px in mm and the third column indicates the equivalent of 1 cm in pixels (px). In Table 3 the following abbreviations for the columns are in use:
• max and min: the maximum and minimum measured distances for the 1 cm reference in pixels (px)
• pos. diff and neg. diff: the positive and negative difference to the theoretical value for the reference distance as indicated in Table 4
• pos. diff % and neg. diff %: the percentage difference to the theoretical values
• pos. diff real and neg. diff real: the real-world differences in mm to the theoretical value
In Table 5 the average percentage and real errors for the averaged measurements of the reference length are listed per resolution.
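The reference-based error figures described above can be sketched as follows. The function name and the sample measurements are illustrative assumptions, not values taken from Table 3:

```python
# Sketch of the error-estimation method: compare max/min pixel measurements
# of a known reference length against its theoretical pixel value.
MM_PER_INCH = 25.4

def reference_errors(measured_max_px, measured_min_px, ref_mm, dpi):
    """Positive/negative differences of measured vs. theoretical px lengths,
    as percentages and as real-world distances in mm (illustrative sketch)."""
    theoretical_px = ref_mm / MM_PER_INCH * dpi
    pos = measured_max_px - theoretical_px   # overshoot above the feature
    neg = theoretical_px - measured_min_px   # undershoot below the feature
    return {
        "pos.diff %": 100.0 * pos / theoretical_px,
        "neg.diff %": 100.0 * neg / theoretical_px,
        "pos.diff real (mm)": pos * MM_PER_INCH / dpi,
        "neg.diff real (mm)": neg * MM_PER_INCH / dpi,
    }

# Hypothetical 1 cm reference scanned at 1200 dpi (theoretical ~472.44 px)
print(reference_errors(476, 469, 10.0, 1200))
```

Averaging such per-reference differences over all measurements yields the per-resolution error figures of the kind reported in Table 5.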

On the Data Acquisition Device and Data Acquisition
The image data is acquired using a Canon LiDE 210 optical flatbed scanner, for which the specification is available at https://www.usa.canon.com/internet/portal/us/home/support/details/scanners/photo-scanner/canoscan-lide-210. This scanner has an optical resolution of 4800 × 4800 dpi and offers an interpolation mode of up to 19,200 × 19,200 dpi. Only optically available resolutions are used for the data acquisition. The scanning unit moves from the front of the device to the backside of the device; the front of the device is identified by the location of the interface buttons. In the experiment this translates to a movement from the top of the scanned page to the bottom. The scanning unit has an integrated light source below the contact image sensor (CIS), leading to a narrow shadow line above the scanned objects, see Figure 12 for a schematic view of the scanning device and the specimen placement. Automatic image enhancement techniques and filters are disabled for the scanning procedure.

Dataset Description
The dataset is split into four parts:

• Part A contains the original scanned A4 papers with the specimens affixed.
• Part B contains the cropped and extracted, unaltered scanned data for each individual specimen.
• Part C contains the augmented image data for each individual specimen as provided by the analysis software.
• Part D contains the data files for each individual specimen as provided by the analysis software.
The files are identified following a schema in which PAGE_NUM indicates the page identifier the specimen is placed on, SPECIMEN_NUM indicates the individual specimen's number, and FILE_TYPE indicates whether this is an image (indicated by PNG) or a data file (indicated by LOG). RES indicates the respective resolution in dpi and can be either 600, 1200, 2400 or 4800. Data from part C follows a different naming schema to distinguish the augmented from the raw image data. Image data in part C is named: p<PAGE_NUM>-<SPECIMEN_NUM>-opt-res.PNG. The geometrical data extracted from the image data and stored in the respective data files is described as follows; each line also contains an example output from the analysis software for specimen 67.
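A parser for such file names can be sketched as follows. Only the components PAGE_NUM, SPECIMEN_NUM, RES and FILE_TYPE are taken from the text; the way they combine in the regular expression is an assumption for illustration, not the dataset's exact general schema:

```python
import re

# Hypothetical layout p<PAGE_NUM>-<SPECIMEN_NUM>-<RES>.<FILE_TYPE>, assumed
# for illustration; the dataset's actual schema is given in the description.
NAME_RE = re.compile(
    r"p(?P<page>\w+)-(?P<specimen>\d+)-(?P<res>600|1200|2400|4800)"
    r"\.(?P<type>PNG|LOG)"
)

def parse_name(name: str):
    """Split a file name into its schema components, or None if no match."""
    m = NAME_RE.fullmatch(name)
    return m.groupdict() if m else None

print(parse_name("p28-8-1200.PNG"))
```

Restricting the `res` group to the four valid resolutions rejects malformed names early instead of silently accepting arbitrary numbers.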
Height - Height of the scanned image in pixel (px) (1648)
25. calc_dist_left_side - Calculated distance between the points defined by avg_left_side_A_x, avg_left_side_A_y and avg_left_side_B_x, avg_left_side_B_y in pixel (px) (981.34574)
26. calc_dist_left_side_cm - Calculated distance between the points defined by avg_left_side_A_x, avg_left_side_A_y and avg_left_side_B_x, avg_left_side_B_y in cm (2.07718)
27. avg_left_side_A_x - Analogue to avg_right_side_A_x but for the left side of the specimen (690.45745)
28. avg_left_side_A_y - Analogue to avg_right_side_A_y but for the left side of the specimen (1399.79787)
29. avg_left_side_B_x - Analogue to avg_right_side_B_x but for the left side of the specimen (606.27723)
30. avg_left_side_B_y - Analogue to avg_right_side_B_y but for the left side of the specimen (422.06931)
31. line_1_left_side - Analogue to line_1_right_side but for the left side of the specimen (−0.04133 + 1428.33131)
32. line_2_left_side - Analogue to line_2_right_side but for the left side of the specimen (−0.04113 + 447.00702)
33. distA_left_side - Analogue to distA_right_side but for the left side of the specimen (980.37059)
34. distA_left_side_cm - Analogue to distA_right_side_cm but for the left side of the specimen (2.07512)
35. distB_left_side - Analogue to distB_right_side but for the left side of the specimen (980.36215)
36. distB_left_side_cm - Analogue to distB_right_side_cm but for the left side of the specimen (2.07510)
37. distAB_left_side_avg - Analogue to distAB_right_side_avg but for the left side of the specimen (980.36637)
38. distAB_left_side_avg_cm - Analogue to distAB_right_side_avg_cm but for the left side of the specimen (2.07511)
39. calc_dist_length - Analogue to calc_dist_right_side but for the length of the specimen (7029.23297)
40. calc_dist_length_cm - Analogue to calc_dist_right_side_cm but for the length of the specimen (14.87854)
41. avg_length_A_x - Analogue to avg_right_side_A_x but for the length of the specimen (170.87179)
42. avg_length_A_y - Analogue to avg_right_side_A_y but for the length of the specimen (955.38462)
43. avg_length_B_x - Analogue to avg_right_side_B_x but for the length of the specimen (7192.50000)
44. avg_length_B_y - Analogue to avg_right_side_B_y but for the length of the specimen (628.50000)
45. line_1_length - Analogue to line_1_right_side but for the length of the specimen (−0.87051 + 1104.13027)
46. line_2_length - Analogue to line_2_right_side but for the length of the specimen (25.57616 + −183328.02318)
47. distA_length - Analogue to distA_right_side but for the length of the specimen (6963.98400)
48. distA_length_cm - Analogue to distA_right_side_cm but for the length of the specimen (14.74043)
49. distB_length - Analogue to distB_right_side but for the length of the specimen (11217.47002)
50. distB_length_cm - Analogue to distB_right_side_cm but for the length of the specimen (23.74364)
51. distAB_length_avg - Analogue to distAB_right_side_avg but for the length of the specimen (7029.23297)
52. distAB_length_avg_cm - Analogue to distAB_right_side_avg_cm but for the length of the specimen (14.87854)
53. calc_dist_center - Analogue to calc_dist_right_side but for the centre width of the specimen (541.39512)
54. calc_dist_center_cm - Analogue to calc_dist_right_side_cm but for the centre width of the specimen (1.14595)
55. avg_center_A_x - Analogue to avg_right_side_A_x but for the centre width of the specimen (3675.02713)
56. avg_center_A_y - Analogue to avg_right_side_A_y but for the centre width of the specimen (1034.01938)
57. avg_center_B_x - Analogue to avg_right_side_B_x but for the centre width of the specimen (3783.48889)
58. avg_center_B_y - Analogue to avg_right_side_B_y but for the centre width of the specimen (503.60000)

The following table (Table 6) lists all available objects contained in part B. In part B there are 33 images with a resolution of 600 dpi, 116 at 1200 dpi, 35 at 2400 dpi and 6 images at 4800 dpi for a total of 193 images.
The average filesize and image properties are listed in the table below (Table 7):

Summary
The dataset is compiled during research on the influence of object orientation and infill orientation in FDM 3D-printing on the structural and geometrical quality of objects. The research is focused on the measurement and analysis of structural implications of varying infill orientations, for which flexural testing is performed; this testing is the focus of a separate publication. For the quality assessment the geometrical fidelity of the 3D-printer is analysed, for which this dataset is used. The dataset is compiled over a period of three weeks in an office environment to reflect the use case of home-office usage. The dataset is of value for performing further geometrical analysis on the specimens, as well as for analysing error patterns and error modes with their physical reflection in FDM 3D-printing. The dataset is also of use as an educational resource due to its high quality, allowing students to see the influence of the movement and structural parts of a 3D-printer on the surface quality of 3D-printed objects.

Usage Notes
The dataset layout is described in Sections 3 and 7. The provided data can be used as examples to study the effects of the layer deposition and its inherent flaws, such as smeared beads. The relevant geometrical information is available in the respective text files for the scanned specimens as described in the scheme. The data is released under the CC-BY license and can be used according to its terms.

Concluding Remarks
The experiment conducted on the mechanical properties of varying infill patterns is still in progress, and the experiment on the geometrical properties will be published in an article titled "Geometrical Fidelity of Consumer Grade 3D Printers" in Computer Aided Design & Applications, 2017 [37]. The authors are of the opinion that the underlying dataset described in this work is beneficial to other researchers and warrants publication of the dataset itself. The dataset can be used to study movement and material deposition of FDM 3D-printers and their common faults and errors. From the dataset the repeatability of identical 3D-printed models can be studied. Furthermore, we think that the image data is valuable for teaching purposes on the FDM 3D-printing process.

Dataset Availability
The dataset has not been previously published in any other location than with this article. In the dataset the following items (part A) are present as indicated in Table 8, with the page number indicated in the first column and the lowest and highest item numbers in the following columns. Missing pages and pages indicated with letters are due to enumeration mistakes on the paper form; missing parts are due to 3D-printing errors rendering the objects unsuitable for scanning.

Figure 2. Specimen with 45 degree infill pattern in rotated position on the printing bed.

Figure 3. Specimen with 45 degree infill pattern in oriented position on the printing bed.

Figure 4. Uncropped and unaltered raw image data acquired with the Canon LiDE 210 at 1200 dpi, containing specimens 8, 9, 10, 11, 12, 13 and 14 from page 28; filesize is 431.1 MiB in Tagged Image File Format (TIFF). Image dimensions are 10,224 pixel (px) in width and 14,055 pixel (px) in height. The image is converted to Portable Network Graphics (PNG) for display within this document and scaled to 1024 pixel (px) width for display.

Figure 5. Cropped specimen (8) stored in Tagged Image File Format (TIFF) with Lempel-Ziv-Welch (LZW) compression at 1200 dpi; filesize is 26.3 MiB. Image size is 7360 pixel (px) in width and 1794 pixel (px) in height. The image is converted to Portable Network Graphics (PNG) for display within this document and scaled to 1024 pixel (px) width for display.

(a) Transform the Red-Green-Blue (Color Coding) (RGB) image data to Hue-Saturation-Value (Color Coding) (HSV) for more resistant colour based object detection
(b) Identify image background and measurement mesh (static) and subtract from image
(c) Binarize image by thresholding with most common colour in image
(d) Utilise OpenCV blob detection algorithm on result and select largest blob as candidate for object detection
(e) Detect corners of object detection candidate and transform to array of line segments
(f) Close holes within the maximum border segment
(g) Create bounding-box around candidate object and compare to expected result:
    i. if object candidate is verified then:
    ii. Scan left side for corner top-left (Point A)
    iii. Scan left side for corner bottom-left (Point B)
    iv. Scan right side for corner top-right (Point C)
    v. Scan right side for corner bottom-right (Point D)
    vi. Calculate distance between Point A and Point C (Distance Top) and angle against horizontal for AC
    vii. Calculate distance between Point B and Point D (Distance Bottom) and angle against horizontal for BD
    viii. Calculate distance between Point A and Point B (Distance Left) and angle against horizontal for AB
    ix. Calculate distance between Point C and Point D (Distance Right) and angle against horizontal for CD
    x. Determine average X position of upper border near object centre
    xi. Determine average X position of lower border near object centre
    xii. Calculate average distance between upper and lower border near object centre (Middle Width)
    xiii. Calculate area surrounded by detected border divided by area of bounding box
(h) Create overlay information for original image (intended for human usage)
(i) Store data in database for later retrieval
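The binarization and bounding-box steps (c) and (g) can be approximated in a few lines. This simplified stand-in uses plain NumPy instead of the OpenCV blob detector used by the original software; it assumes a single object on a near-white background, and the threshold of 200 is an assumption:

```python
import numpy as np

def bounding_box_of_object(image_rgb: np.ndarray):
    """Simplified stand-in for steps (c) and (g): binarize by separating the
    coloured object from the near-white background, then place a bounding
    box (x, y, width, height) around the foreground pixels."""
    # (c) binarize: background pixels are near-white (all channels high),
    # object pixels are coloured (at least one channel low)
    mask = image_rgb.min(axis=2) < 200
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    # (g) bounding box around the detected foreground
    return xs.min(), ys.min(), xs.max() - xs.min() + 1, ys.max() - ys.min() + 1

# Synthetic test image: yellow specimen on a white page
img = np.full((100, 200, 3), 255, np.uint8)   # white background
img[30:70, 40:160] = (230, 220, 0)            # RGB yellow rectangle
print(bounding_box_of_object(img))            # -> (40, 30, 120, 40)
```

The real pipeline additionally selects the largest blob and verifies the candidate against the expected specimen geometry before measuring.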

Figure 6. Schematic view of the corner detection.

Figure 7. Description of significant points within the scanned image data for reference; specimen 4 depicted. Image is scaled to 1024 pixel (px) width for display.

Figure 8. Overlay image data for specimen 155 as a result of the software processing. Image in Portable Network Graphics (PNG) format with a filesize of 11.5 MiB and image dimensions of 7535 px width and 1716 px height. Image is scaled to 1024 px width for display.

Figure 9.

Figure 10. Cropped scanned image data for specimen 50 on paper with blue millimetre marking. Image is scaled to 1024 pixel (px) width for display.

Figure 11. Measurement uncertainty apparent in GIMP for line thickness analysis.

Figure 12. Schematic view of the scanning device and specimen placement.
13. avg_right_side_A_x - Average X position for point A for the distance calculation (6718.12857)
14. avg_right_side_A_y - Average Y position for point A for the distance calculation (1122.64286)
15. avg_right_side_B_x - Average X position for point B for the distance calculation (6722.41860)
16. avg_right_side_B_y - Average Y position for point B for the distance calculation (137.97674)
17. line_1_right_side - Definition of a line through the positions of elements detected on the border of the right top side in the form of f(x) = K × x + l. Gradient and y-intercept (−0.04906 + 1452.20479)
18. line_2_right_side - Definition of a line through the positions of elements detected on the border of the right bottom side in the form of f(x) = K × x + l. Gradient and y-intercept (−0.04322 + 428.53713)
19. distA_right_side - Calculated distance of a line perpendicular to line_1_right_side and its intersection with line_2_right_side in pixel (px) (983.28993)
20. distA_right_side_cm - Calculated distance of a line perpendicular to line_1_right_side and its intersection with line_2_right_side in cm (2.08130)
21. distB_right_side - Calculated distance of a line perpendicular to line_2_right_side and its intersection with line_1_right_side in pixel (px) (983.57903)
22. distB_right_side_cm - Calculated distance of a line perpendicular to line_2_right_side and its intersection with line_1_right_side in cm (2.08191)
23. distAB_right_side_avg - Average of distA_right_side and distB_right_side in pixel (px) (983.43448)
24. distAB_right_side_avg_cm - Average of distA_right_side_cm and distB_right_side_cm in cm (2.08160)

Table 1. Taxonomy of Accuracy Measurement in Literature.

Table 2. Average Image Properties for the Varying Resolutions.

Table 3. Measured Errors for references in various resolutions.

Table 4. Equivalencies of digital and real world units for different resolutions.

Table 5. Average Errors for references in various resolutions.

11. calc_dist_right_side - Calculated distance at the right side (width) in pixel (px) (984.67546)
12. calc_dist_right_side_cm - Calculated distance at the right side (width) in cm (2.08423)

Table 6. Overview of contained specimen scans.

Table 7. Average filesize in MiB, width and height in px of images in part B.

Table 8. Available pages and the respective specimens contained within.