**Interactive World War 2 Map**

!!! WARNING The author of this page does not condone the denial of any Nazi crimes nor the glorification of any aspect of the Third Reich.

[There is also a full screen version](map.html)

!!! Tip The data and information shown here are not accurate and only useful for visualization purposes.

# General Information

The map provides three types of data:

- Front line locations from 1939 to 1945 for the Western, Eastern, and North African fronts.
- Historical maps with searchable location names.
- Biographical data from a few memoirs and other sources.

The following sections detail the sources used to acquire this data as well as the processing necessary to display it. Please note that the data has not been verified and is not scientifically validated.

# Front Lines

While there are many maps with front line information available, I was not able to find data in the form of geographical coordinates.

## Sources

Since front lines were available for only a few points in time, the map shows the last one before the displayed date. The following sections list, for each theatre of operations, the source of the data and how it is processed to generate geographical coordinates.

### Eastern Front

For the late war, the Chief of Staff of the United States Army created [semi-monthly maps](https://en.wikisource.org/wiki/Atlas_of_the_World_Battle_Fronts_in_Semimonthly_Phases_to_August_15_1945). Finding maps for the early part of the German-Soviet War was more difficult, but there are some in the same style, e.g. for the [early](https://commons.wikimedia.org/wiki/File:Eastern_Front_1941-06_to_1941-12.png) part of the war. Processing these maps into polygons over latitude-longitude coordinates was more involved; please check the chapter below for the details.

### Africa

A collection of useful maps can be found [here](https://medium.com/@Inflab/southern-front-maps-of-world-war-ii-ffbd40467bc3). I decided to simplify things by assuming that the front always runs along a constant latitude. This is certainly not true but suffices as an approximation. I then used coastline and country GeoJSON data to explicitly create the front line polygons.
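Under the constant-latitude assumption, building the Africa polygons boils down to clipping a coastline or country polygon against a horizontal line. A minimal sketch of that single clipping step (Sutherland-Hodgman against one horizontal edge; the square "outline" and the front latitude are made-up toy data, not the real GeoJSON):

```python
def clip_below_latitude(polygon, front_lat):
    """Clip a closed [lat, lng] polygon to the half-plane lat <= front_lat."""
    out = []
    n = len(polygon)
    for i in range(n):
        a, b = polygon[i], polygon[(i + 1) % n]
        a_in, b_in = a[0] <= front_lat, b[0] <= front_lat
        if a_in:
            out.append(a)
        if a_in != b_in:
            # edge crosses the front latitude: interpolate the longitude
            t = (front_lat - a[0]) / (b[0] - a[0])
            out.append([front_lat, a[1] + t * (b[1] - a[1])])
    return out

# toy "country outline" spanning latitudes 20..30
square = [[20, 10], [20, 40], [30, 40], [30, 10]]
south_of_front = clip_below_latitude(square, 25.0)  # part held by one side
```

A geometry library such as _shapely_ would achieve the same via `Polygon.intersection` with a half-plane rectangle; the pure-Python version above just makes the idea explicit.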
### Early War - Western Front and War in Norway

The biggest challenge when creating maps for the part of the war from 1939 through early 1941 is finding country border information for this time. Luckily, the late war maps for the eastern front had these. I then used them to create front line data for a few select timestamps.

## Technical Details

The methods described below are implemented in Python using OpenCV (the code can be found in the Git repo).

### Processing the Eastern Front Maps

#### Early War

For the early war maps I manually created contours for the timestamps available. Given the discussions on Wikipedia, the accuracy of these maps has to be questioned. There is also the peculiarity that the last map, for early 1943, does not show the recapture of Charkow after the _Third Battle of Kharkov_ (which is why I have left out this timestamp).

Assigning latitude-longitude information to each pixel in the image (called unprojecting from now on) is difficult here because the projection method isn't specified. For all cities shown on the map I looked up the coordinates to create a list of known $[pixel, coordinate]$ pairs. I used _scipy's_ _griddata_ method to interpolate this sparse data onto a grid and then the _interpn_ method to inter-/extrapolate coordinates for all pixel values (possibly outside the bounds of the sparse known data). As a fallback I also employed a _Lasso_ regressor from _scikit-learn_ trained on the known locations. The results are acceptable for purely visualization purposes but significantly lack quality in extreme regions like the Caucasus or the Caspian Sea (with errors of up to $1°$ in latitude/longitude).

#### Late War

Since the number of maps available here is much larger ($> 40$), I decided to use an automated process:

1. Use thresholding and morphological transformations to select the area the Allied powers occupied. This already creates a small error.
2. 
The projection method here is known, so the only task is to determine the projection parameters. This can be done when, for each map, the pixel locations of Berlin and at least one other city are known. With this information unprojecting is already possible.

The results obtained with this method were far from acceptable, with shifts of more than 40 km in most regions. One reason for this is that the photos are skewed. I therefore decided to use an optimization scheme to improve the unprojected coordinates. This is done by again selecting the pixel locations of a few (15) cities for which coordinates are also known. I then used a model of the form

$$ \bar{U}(p) = T_{c_2}(U(T_{c_1}(p))) $$

which applies an affine transformation both before and after the normal unprojection $U$. Here $T_{c_i}(x) = c_i \begin{pmatrix} x \\ 1 \end{pmatrix}$ with $c_i \in \mathbb{R}^{2 \times 3}$ and $x \in \mathbb{R}^2$ in homogeneous coordinates. This gives a total of 12 degrees of freedom, which are optimized with _scipy_ using the _SLSQP_ optimization algorithm.

#### Accuracy

Please note that the front lines shown on the map are far from accurate. There are many reasons for that:

- The resolution of the map images. For the set of late war maps, a typical pixel in the image covers an area of 4 km by 2.5 km. This is compounded by the scanning process and later by the technique used to identify the front line (which creates small shifts in the image).
- Potential inaccuracies in the maps themselves; the Wikipedia pages for the early war maps have some info on that.

### Format

The data format for the maps is a simple json file with the following structure:

~~~~
{
    "date": "1940-12-10",
    "charts": [
        {
            "fronts": ["Africa", "West", "East", "North", "South"],
            "outer": [[lat, lng], ...],
            "inner": [
                [[lat, lng], ...],
                ...
            ]
        },
        ...
    ]
}
~~~~

The fronts field can technically have multiple values, but this isn't used yet. Each front is made up of charts; for example, the western front could have one chart for France as well as one for the UK.
Each such chart has an outer polygon and multiple inner ones, which are used to specify pockets such as Wjasma or Stalingrad, or simply large lakes.

# Historical Maps

The German "Übersichtskarte von Nord-, Mittel- & Osteuropa 1:300.000 (1893 - 1945)" maps (found with a nice web view [here](https://www.landkartenarchiv.de/deutschland_uebersichtskartevonmitteleuropa.php) and for download [here](http://igrek.amzp.pl/mapindex.php?cat=ME300)) are useful because many location names used in books can only be found on them and not in any modern online map.

## Tile Layer

The map library used can display not only OSM tiles (the images of the map) but also custom ones, so I decided to combine the historical maps into one unified view. The process involved:

1. Finding the pixel coordinates of the 4 corners, for which the geographical coordinates are known.
2. These will not form a perfect rectangle, so a mapping from an arbitrary quadrilateral to the geographical rectangle is needed. _scikit-image_ provides a nice class for this.
3. _Mercantile_ is a useful library to interact with the tiles of a Web Mercator projection. Using it, iterate over all tiles of the map and crop out the corresponding part of each tile.
4. This is done on the highest zoom level (I used 14, but this is arbitrary and depends on the space constraints), followed by combining these tiles into the lower zoom levels.

Since I don't have a web server to host these tiles, they are not visible by default. If you are interested in them, contact me :)

The results are in general good but not spectacular. One way to check this is to compare city locations on the historical maps vs. OSM maps. Another is to compare the _Deutsches Heeresgitter_ (DHG) lines between two adjacent maps. I also added an option to display the DHG everywhere analytically to check the accuracy quickly (some of the older maps do not use this grid but a different one).

## Searchable Location Names

The next step was to make the location names from these maps searchable.
This proved more complicated than anticipated because OCR software has trouble with these maps. So my process is to:

1. Use the TensorFlow implementation of EAST found [here](https://github.com/argman/EAST) to determine text locations in the map images.
2. For each such location, use Tesseract to perform OCR on the bounding box.

The results are poor in general, with some positive outliers. For example, the name Stalingrad is not recognized; instead _LINGRAD_ is found. More work is needed here. For now the search function uses the [Edit Distance](https://en.wikipedia.org/wiki/Edit_distance) to make it usable.

### Format

The format of the json data is an array of _objects_, each of which is just a bounding box followed by a name:

~~~~
[
    [
        [
            [top, left],
            [bottom, right],
        ],
        ["name"]
    ]
]
~~~~

# Biographical Data

The biographical data is mostly taken from books written by participants of the war. These books differ wildly in the accuracy with which they give dates and locations.

I always used the location names given in the source and not the modern ones (which can easily be determined). Most of the time I was able to match the locations with high certainty, but in general there is a degree of uncertainty here. For example: Hans Jürgen Hartmann gives "Kurenye" as the location of his unit near the Don in August 1942. The closest match I could find on OpenStreetMap is [Куренное](https://ru.wikipedia.org/wiki/%D0%9A%D1%83%D1%80%D0%B5%D0%BD%D0%BD%D0%BE%D0%B5), which is a reasonable assumption given the [context](http://www.294id.de/infanterieregiment-514.html) but could very well be incorrect.

All authors often, and for good reasons, skipped many stops on their long distance transport routes. This created hard-to-read visualizations because of crossing and overlapping lines. For this reason I invented intermediate stops on their train journeys. These are always plausible in that they are towns on the railway lines between the start and end of their journeys.
But of course I do not know which exact route they took among the many possible alternatives. To make this clear I have slightly faded out these segments on the map.

!!! At the end of each chapter the book or other sources will be listed.

### Format

The json data is split into two files. The _places.json_ file specifies geographical coordinates for places, in the format:

~~~~
{
    "Chemnitz": {
        "coordinates": [50.833333, 12.916667]
    },
    ...
}
~~~~

The names are mostly cities but sometimes also arbitrary locations. The second file contains the actual route the person took:

~~~~
[
    {
        "date": "18.07.1942",
        "place": "Chemnitz",
        "annotations": "..."
    },
    ...
]
~~~~

Where the annotations field is a combination of: k=kämpfend (fighting), f=fliehend (fleeing), v=verwundet (wounded), ?=erfunden (invented; implies o and z), u=Urlaub (on leave), o=Ort geraten (place guessed), z=Zeit geraten (time guessed), t=Transport oder Training (transport or training), g=gefangen (captured)

## Hans Jürgen Hartmann

Extracting the dates and locations was extremely easy since the book is written as a diary. The only obstacle was the time between September 1942 and November 1943, when the author was not in combat but in either military hospitals or in training. During this time most dates were invented by me.

!!! Hartmann, Hans-Jürgen, et al. Zwischen Nichts und Niemandsland: Tagebuch eines deutschen Soldaten aus dem Zweiten Weltkrieg. Books on Demand, 2017.

## Günter Koschorrek

For the most part, dates and locations could easily be extracted. After the retreat from Rytschow over the Don in December 1942, it can no longer be reconstructed where the author was wounded, because the area is today part of the Tsimlyansk Reservoir. The first part of the retreat in early 1944, from Nikopol to Wosnessensk, lacks any places or dates in the book.

!!! Koschorrek, Günter K. Vergiß die Zeit der Dornen nicht: Ein Soldat der 24. Panzerdivision erlebt die sowjetische Front und den Kampf um Stalingrad. Weltbild, 2008.
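For illustration, the two files described under Format above can be joined into a single route with resolved coordinates and parsed dates. A minimal sketch (the entries are the samples from the format examples; the `resolve` helper and the `"t"` annotation value are my own illustrative assumptions, not the actual implementation):

```python
from datetime import datetime

# stand-ins for the contents of places.json and a route file
places = {"Chemnitz": {"coordinates": [50.833333, 12.916667]}}
route = [{"date": "18.07.1942", "place": "Chemnitz", "annotations": "t"}]

def resolve(route, places):
    """Attach coordinates and a parsed date to every stop of a route."""
    stops = []
    for stop in route:
        stops.append({
            # dates use day.month.year, e.g. "18.07.1942"
            "when": datetime.strptime(stop["date"], "%d.%m.%Y"),
            "latlng": places[stop["place"]]["coordinates"],
            "annotations": stop["annotations"],
        })
    return stops

stops = resolve(route, places)
```

Anything a route references must have an entry in _places.json_; a lookup like this fails loudly (with a `KeyError`) on a missing place, which is useful when adding new routes.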
## Guy Sajer

Extracting the information is much more challenging here because the author rarely gives dates and often makes it clear that he does not remember the exact names of villages.

### Chapters 2 & 3

In chapters 2 and 3 the author is in the region of Charkow in early 1943. He mentions the following useful information:

- One day before he reaches Charkow he is informed about the capitulation of the German 6th Army in Stalingrad.
- He reached Charkow before it was re-captured by the Soviet forces on the 15th of February 1943.
- In the first paragraph of the second chapter he mentions that it is approximately late February or early March.
- The author says that he saw dug-in German troops on the bank of the Don.
- At the beginning of the third chapter, *Vom Don nach Charkow. Erster Rückzug. Die Schlacht am Donez* (*From the Don to Charkow. First retreat. The battle at the Donets*), the author mentions that the snow is melting and the temperature is +5 °C.

Historically, these are the facts:

- The German 6th Army capitulated in Stalingrad on the 2nd of February 1943.
- Soviet forces captured Charkow on the 15th of February 1943.
- German forces re-captured Charkow on the 15th of March.
- The *Ostrogozhsk–Rossosh Offensive* started on the 13th of January and removed German forces from most parts of the Don south of Woronesch.
- In the *Voronezh–Kastornoye Operation*, Woronesch was re-captured by Soviet forces on the 25th of January after it was [encircled](https://de.wikipedia.org/wiki/Woronesch-Charkiwer_Operation#/media/File:Ww2_map25_Dec42_Feb43.jpg).

Due to the mismatch between the historical facts and the statements by the author, I have decided to completely ignore these two chapters and place the author in Charkow the entire time (which is certainly impossible). The user _B Hellqvist_ has created an excellent analysis of the problems [here](https://forum.axishistory.com/viewtopic.php?p=1234357&sid=8aa33fa243abaa1638ac865cbf28d042#p1234357).
To archive this great analysis, I have copied his text [here](data/persons/sajer/B_Hellqvist_Analysis.txt) and the images [here](data/persons/sajer/April1944.jpg) and [here](data/persons/sajer/scan0020.jpg).

!!! Sajer, Guy. Der vergessene Soldat. Helios, 2016.

## Hans von Luck

In general the author mentions enough dates to reconstruct an accurate timeline. Exceptions include the advance towards Moscow in 1941, the drive to Berlin in early 1942, and the 1942 retreat from the Siwa Oasis to Tunis.

!!! Luck, Hans von. Mit Rommel an der Front. Weltbild, 2010.

## E Company, 506th Infantry Regiment

This biographical data is not for a single person and was also not extracted from a book; it relies on information from the Internet.

!!! [This timeline](https://www.tripline.net/trip/In_the_Footsteps_of_Easy_Company-17344703415310029BDAF1ADC2609F2E) was helpful.