A Guide to Getting the Best Performance with Large Datasets


Sharing Deliverables

● Many people use Dropbox or AWS to share small-to-medium size projects.
● Physically shipping drives containing projects to co-workers and clients is still a common method of transfer.
● To move processed point cloud data between software packages, *.E57 and *.LAS are the most commonly used standardized formats. They are recommended because they are binary formats, which generally keeps file sizes smaller. (The first sketch after this section shows one way to sanity-check such a deliverable.)
● Modern web-based viewers allow geospatial professionals to share their collected and processed datasets with clients. Trimble Clarity allows for visualization of georeferenced scan data without the need for an expensive, powerful computer. See the Clarity examples below.

Archiving

● Archive enough project data to ensure you have what you need; do not archive redundant edited versions or completed processing that can be recreated quickly.
● A minimum data backup approach for Trimble RealWorks could be to store the *.tzf files and the associated *.RMX registration parameter file for each *.tzf, but you will lose any segmentation or registration grouping that was performed in the original project. (The second sketch after this section automates this minimal backup.)
● Evaluate how long you should keep archived data.
  o Will there be follow-on work that requires the raw data?
  o Are there contractual obligations for making the data available for a period of time?
● Use local storage for ongoing work and archive finished projects for the long term (petabytes may be needed).
● Archive drives don't need to be as fast as the local drives used for processing. Use your fastest drives for temporary storage while working on the project.
● Archiving to a NAS/local server or to a remote server provides better security against data loss. Physically separating the different archived copies of the project guards against loss from fires, floods, and other damaging events.
● Try to archive the data in a way that will allow for efficient searching and retrieval later; it doesn't help to have copies of your data that you cannot find. (The third sketch after this section builds a simple searchable manifest.)
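Before shipping a *.LAS deliverable, it is worth confirming that the file opens and contains what you expect. Below is a minimal sketch using the open-source laspy library (our choice for illustration, not a Trimble tool; the file name survey.las is hypothetical) to report the point count and extents of a LAS file:

    import laspy  # pip install laspy

    # Open a LAS deliverable and summarize it before sharing.
    # "survey.las" is a hypothetical file name.
    las = laspy.read("survey.las")

    print(f"LAS version: {las.header.version}")
    print(f"Point count: {las.header.point_count:,}")
    print(f"Min X/Y/Z:   {las.header.mins}")
    print(f"Max X/Y/Z:   {las.header.maxs}")

A quick check like this catches truncated copies and empty exports before a client or co-worker downloads gigabytes of data.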
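The minimal RealWorks backup described above can be scripted. The sketch below assumes the pairing convention the whitepaper describes (one *.RMX registration parameter file per *.tzf); the source and archive paths are hypothetical and should be adapted to your own project layout:

    import shutil
    from pathlib import Path

    source = Path("D:/Projects/SiteScan")   # hypothetical project folder
    archive = Path("N:/Archive/SiteScan")   # hypothetical NAS destination

    for tzf in source.rglob("*.tzf"):
        rmx = tzf.with_suffix(".rmx")
        if not rmx.exists():
            print(f"warning: {tzf.name} has no .rmx registration file; "
                  "its registration parameters will not be archived")
        for f in (tzf, rmx):
            if f.exists():
                # Preserve the project's folder structure in the archive.
                dest = archive / f.relative_to(source)
                dest.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(f, dest)  # copy2 also preserves timestamps
                print(f"archived {f.relative_to(source)}")

Remember the trade-off stated above: this keeps the raw scans and registration parameters, but any segmentation or registration grouping done in the original project is lost.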
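To keep archives searchable and to verify physically separated copies against each other, a simple manifest with checksums goes a long way. This is a generic sketch using only the Python standard library, not a RealWorks feature; the archive path is hypothetical:

    import hashlib
    import json
    import time
    from pathlib import Path

    archive = Path("N:/Archive/SiteScan")  # hypothetical archive folder

    def sha256(path: Path, chunk: int = 1 << 20) -> str:
        """Stream the file so multi-gigabyte scans don't exhaust memory."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    manifest = {
        "project": archive.name,
        "archived": time.strftime("%Y-%m-%d"),
        "files": [
            {"path": str(p.relative_to(archive)),
             "bytes": p.stat().st_size,
             "sha256": sha256(p)}
            for p in sorted(archive.rglob("*"))
            if p.is_file() and p.name != "manifest.json"
        ],
    }
    (archive / "manifest.json").write_text(json.dumps(manifest, indent=2))

Searching the manifest files is far faster than crawling slow archive drives, and re-running the checksums against a second copy confirms that neither copy has silently degraded.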
