Version 5: 2025-09-26, 07:15
Version 4: 2025-04-14, 08:08
Version 3: 2025-04-14, 08:08
Version 2: 2025-02-13, 09:11
Version 1: 2025-02-13, 07:30
Journal contribution posted on 2025-09-26, 07:15, authored by Jin Yan, Jun Zhang, TianSheng Xu, Qing Yang, Jing Gao, Guanghong Gong.
<ul>
<li><b>gen_coords.jl</b>: Pre-generates planar coordinate points for a series of projections and stores the results in the <code>coords/</code> directory.</li>
<li><b>gen_opt_coords.jl</b>: Uses Particle Swarm Optimization to apply non-proportional scaling to a series of projections, obtaining scaling factors that yield lower-distortion projections. Using these scaling factors, planar coordinate points are pre-generated and stored in the <code>coords-opt/</code> directory.</li>
<li><b>opt-results.txt</b>: Contains the optimized scaling factors.<br><b>Example:</b>
<table>
<tr><th>Projection Name</th><th>Equal-Area Projection</th><th>Distortion Before Optimization</th><th>Distortion After Optimization</th><th>X Scaling Factor</th><th>Y Scaling Factor</th></tr>
<tr><td>moll</td><td>true</td><td>0.3856590420303463</td><td>0.3716426296273295</td><td>0.9059392884196235</td><td>1.1038267274449063</td></tr>
</table>
</li>
</ul>
<p>Each of the <code>coords/</code> and <code>coords-opt/</code> directories provides three sample projections (Mollweide, Mayr, and Equal Earth).</p>
<ul>
<li><b>fusion.jl</b>: Performs pairwise fusion of map projections from the <code>coords/</code> and <code>coords-opt/</code> directories, using six different fusion methods and a wide range of fusion coefficients. The script also supports early termination when Type II anomalies are detected (a hedged sketch of the arithmetic variant is given after the next paragraph).</li>
</ul>
<p>For the 220 projections in the <code>coords/</code> and <code>coords-opt/</code> directories (due to storage limitations, only the three map projections mentioned above are provided for each directory as examples), over 2 million new projections can be generated.</p>
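<p>As a rough illustration only (not the authors' implementation), the sketch below shows one plausible form of the weighted arithmetic fusion and of the non-proportional X/Y scaling reported in <b>opt-results.txt</b>. It is written in Python rather than Julia, for consistency with the training scripts described further down; the file names, file format, and the exact fusion formula are assumptions, and the six fusion methods and distortion metrics are not reproduced here.</p>
<pre><code>import numpy as np

def apply_scaling(xy, sx, sy):
    """Non-proportional scaling of projected coordinates, as produced by
    gen_opt_coords.jl: x and y are stretched by independent factors
    (example factors taken from the opt-results.txt row shown above)."""
    return xy * np.array([sx, sy])

def arithmetic_fusion(xy_a, xy_b, w):
    """One plausible form of the 'Arithmetic' fusion method: a weighted
    mean of two projections' planar coordinates sampled on the same
    geographic grid. The weighting w corresponds to the 'Weighting'
    column of fusion-Arithmetic-output.txt; the exact formula used by
    fusion.jl may differ."""
    return w * xy_a + (1.0 - w) * xy_b

# Hypothetical usage (file names and storage format are assumptions):
# xy_moll     = np.loadtxt("coords/moll.txt")      # (N, 2) planar points
# xy_eqearth  = np.loadtxt("coords/eqearth.txt")
# xy_moll_opt = apply_scaling(xy_moll, sx=0.9059392884196235, sy=1.1038267274449063)
# fused       = arithmetic_fusion(xy_moll, xy_eqearth, w=0.2)
</code></pre>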
<ul>
<li><b>fusion-Arithmetic-output.txt</b>: Contains the arithmetic fusion results. (Results from the other five fusion methods can also be obtained using <b>fusion.jl</b>.)<br><b>Example:</b>
<table>
<tr><th>Source Map Projection 1</th><th>Source Map Projection 2</th><th>Fusion Method</th><th>Weighting</th><th>Area Distortion</th><th>Angle Distortion</th><th>Overall Distortion</th></tr>
<tr><td>moll</td><td>eqearth</td><td>Arithmetic</td><td>-0.3</td><td colspan="3">Type II anomaly detected</td></tr>
<tr><td>moll</td><td>eqearth</td><td>Arithmetic</td><td>-0.2</td><td colspan="3">Type II anomaly detected</td></tr>
<tr><td>moll</td><td>eqearth</td><td>Arithmetic</td><td>-0.1</td><td colspan="3">Type II anomaly detected</td></tr>
<tr><td>moll</td><td>eqearth</td><td>Arithmetic</td><td>0.1</td><td>0.007493778579334968</td><td>0.35797298302218555</td><td>0.35805141151962805</td></tr>
<tr><td>moll</td><td>eqearth</td><td>Arithmetic</td><td>0.2</td><td>0.012212275287131306</td><td>0.3545642029519205</td><td>0.3547744546646775</td></tr>
<tr><td>moll</td><td>eqearth</td><td>Arithmetic</td><td>0.3</td><td>0.015239993432593657</td><td>0.3530991232662611</td><td>0.35342785437940194</td></tr>
</table>
</li>
</ul>
<ul>
<li><b>train_model.py</b>: Trains an EfficientNetV2-S classifier on the imagery dataset. The script splits the training data into training and validation subsets, monitors validation accuracy, and saves both the final model weights and the best-performing checkpoint (a hedged sketch of this pipeline is included at the end of this description).</li>
<li><b>best_model.pt</b>: Stores the weights of the best-performing model across training epochs, selected by validation accuracy. This checkpoint is recommended for evaluation and downstream use.</li>
<li><b>evaluate_model.py</b>: Loads a trained model (typically <b>best_model.pt</b>), runs inference on the test set, generates a confusion matrix and classification report, and copies misclassified samples into organized folders for further inspection (see the second sketch at the end of this description).</li>
<li><b>train-set.zip</b>: Archive containing ~3,000 labeled images organized into class subfolders. Extract the contents into a <code>train-set/</code> directory before use. This dataset is intended for model training and validation.</li>
<li><b>test-set.zip</b>: Archive containing ~1,000 labeled images organized into class subfolders. Extract the contents into a <code>test-set/</code> directory before use. This dataset is reserved exclusively for final model evaluation.</li>
<li><b>misclassified/</b>: Automatically generated during evaluation; stores copies of misclassified test samples in subfolders named <code>trueClass-to-predClass</code>.</li>
</ul>
<p>Due to storage limitations, the full dataset (127.1 GB of imagery) is not provided here. Instead, only the <code>train-set/</code> and <code>test-set/</code> directories are made available to support training and evaluation.</p>
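<p>The following is a minimal sketch of the training setup described for <b>train_model.py</b>, assuming a standard PyTorch/torchvision pipeline over the class-subfolder layout of <code>train-set/</code>. The 80/20 split, image size, batch size, learning rate, epoch count, and the <code>final_model.pt</code> file name are assumptions; the actual script may differ in augmentation, hyperparameters, and logging.</p>
<pre><code>import torch
from torch import nn
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, models, transforms

# Assumed layout: train-set/&lt;class_name&gt;/&lt;image&gt; as extracted from train-set.zip.
transform = transforms.Compose([
    transforms.Resize((384, 384)),  # image size is an assumption
    transforms.ToTensor(),
])
full_train = datasets.ImageFolder("train-set", transform=transform)

# Split into training and validation subsets (80/20 split is an assumption).
n_val = int(0.2 * len(full_train))
train_ds, val_ds = random_split(full_train, [len(full_train) - n_val, n_val])
train_loader = DataLoader(train_ds, batch_size=32, shuffle=True)
val_loader = DataLoader(val_ds, batch_size=32)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.efficientnet_v2_s(weights=models.EfficientNet_V2_S_Weights.DEFAULT)
model.classifier[1] = nn.Linear(model.classifier[1].in_features, len(full_train.classes))
model = model.to(device)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

best_acc = 0.0
for epoch in range(10):  # epoch count is an assumption
    model.train()
    for x, y in train_loader:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()

    # Monitor validation accuracy and keep the best-performing checkpoint.
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():
        for x, y in val_loader:
            x, y = x.to(device), y.to(device)
            preds = model(x).argmax(dim=1)
            correct += (preds == y).sum().item()
            total += y.numel()
    acc = correct / total
    if acc > best_acc:
        best_acc = acc
        torch.save(model.state_dict(), "best_model.pt")

torch.save(model.state_dict(), "final_model.pt")  # final weights; file name assumed
</code></pre>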
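<p>Similarly, a hedged sketch of what <b>evaluate_model.py</b> is described as doing: load <b>best_model.pt</b>, predict on <code>test-set/</code>, build a confusion matrix and classification report, and copy misclassified images into <code>misclassified/trueClass-to-predClass/</code> folders. The use of scikit-learn and the exact folder handling are assumptions.</p>
<pre><code>import shutil
from pathlib import Path

import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms
from sklearn.metrics import classification_report, confusion_matrix

device = "cuda" if torch.cuda.is_available() else "cpu"
transform = transforms.Compose([transforms.Resize((384, 384)), transforms.ToTensor()])
test_ds = datasets.ImageFolder("test-set", transform=transform)
test_loader = DataLoader(test_ds, batch_size=32)  # no shuffle, so order matches test_ds.samples

# Rebuild the architecture and load the best checkpoint.
model = models.efficientnet_v2_s()
model.classifier[1] = nn.Linear(model.classifier[1].in_features, len(test_ds.classes))
model.load_state_dict(torch.load("best_model.pt", map_location=device))
model = model.to(device).eval()

y_true, y_pred = [], []
with torch.no_grad():
    for x, y in test_loader:
        preds = model(x.to(device)).argmax(dim=1).cpu()
        y_true.extend(y.tolist())
        y_pred.extend(preds.tolist())

print(confusion_matrix(y_true, y_pred))
print(classification_report(y_true, y_pred, target_names=test_ds.classes))

# Copy misclassified samples into misclassified/trueClass-to-predClass/ folders.
for (path, true_idx), pred_idx in zip(test_ds.samples, y_pred):
    if pred_idx != true_idx:
        out_dir = Path("misclassified") / f"{test_ds.classes[true_idx]}-to-{test_ds.classes[pred_idx]}"
        out_dir.mkdir(parents=True, exist_ok=True)
        shutil.copy(path, out_dir)
</code></pre>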