On Mass Deletions in OGF

This morning, I posted this in response to a query on the User Diaries at OpenGeofiction:

General advice for all users:

When deleting large numbers of objects, please be careful. This is not a use-case that the OSM software is designed to handle (think about it – mass deletions are NOT common on OSM). Divide up your deletions to cover small numbers of objects (<1000) and small areas (so if something goes wrong you don’t mess up large areas).

I called attention to the post in the Discord channel (“OGFC”), and then decided to rant onward and provide some additional clarification and thoughts. I’ve distilled those, without much editing, below.

To the above, I would add that in my own practice, I take another step with respect to deletions: I almost never delete relations. You’ll notice that JOSM actually tries to warn you that “this is rarely necessary” when you try to delete a relation, and I think that’s a solid point. What you can do with relations you don’t need any more is repurpose them. That means you remove all of the relation’s members and tags, and then reuse the empty relation the next time you need a new relation for something.

user: What would be the potential harm of deleting relations?

Well, I suspect that sometimes a deleted relation can end up as a “ghost” object, or a “stranded” object, in the render database. Meaning it’s truly been deleted from the API database (the main rails port) but that deletion fails to transfer correctly to the render database (which stores things in a different format, and doesn’t actually have “relations” at all, but rather transforms a relation into a special type of “closed way,” I think).

By repurposing a relation instead of deleting it, you guarantee the render database will have to reclassify the linked object(s).

If you just delete it, the render database might not handle the disappeared relation correctly, and just leave the no-longer-linked objects lying around.

This is just speculation, because I don’t really understand it well – it’s just based on observed behavior.
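
For what it’s worth, here’s the kind of check I have in mind. In an osm2pgsql-loaded render database (the standard setup behind an openstreetmap-carto render), a multipolygon or boundary relation typically shows up as a row in planet_osm_polygon with a negative osm_id – the negated relation id. This is only a hedged sketch; the database name “gis” and the relation id are placeholders.

psql -d gis -c "SELECT osm_id, name, boundary FROM planet_osm_polygon WHERE osm_id = -1234567;"

If the relation has been deleted upstream in the apidb but a row like that still comes back from the render database, that’s the kind of “ghost” I mean.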

In general, I think that the OSM software is not designed to handle mass deletions. That’s the key point.

Because in OSM, who goes around deleting large numbers of objects? The real world doesn’t work that way.

You re-tag things, yes. You might move things slightly, adjusting position to improve accuracy. But things in OSM don’t “disappear.”

Or if they do, they do so in very, very small sets. A building gets knocked down. A road gets torn out.

One object at a time.

So it seems very plausible to me that the OSM software actually hasn’t been designed to handle mass deletions, and hasn’t been tested for integrity in dealing with mass deletions.

For years now, I’ve been telling people, when you decide to rearrange your territory (and what geofictician doesn’t sometimes decide to rearrange their territory?!) … it’s better to move things than to delete/re-create things.

This prevents “ghosts” in the render.

In the long run, I suspect that we’ll have to just “reboot” the render now and then. But I’m not going to do it very often. (I mean “reboot” as in delete and re-create from zero, not “reboot” as in turn the machine off and on again – that happens every night).
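
To be clear about what that would mean in practice, a render “reboot” would look roughly like the sketch below – a full re-import of the render database from a fresh extract, assuming an osm2pgsql-loaded database (again, the standard setup behind an openstreetmap-carto render). This is a hedged outline, not the actual OGF procedure; the database name, style file, extract filename, and mod_tile path are common defaults used as placeholders.

sudo systemctl stop renderd
osm2pgsql --create --slim --hstore --database gis \
  --style openstreetmap-carto.style ogf-planet.osm.pbf
# tell mod_tile that existing tiles are now older than the data
sudo touch /var/lib/mod_tile/planet-import-complete
sudo systemctl start renderd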

I’d welcome comments on this rambling, vaguely rant-like discourse, for those who are knowledgeable about how the OSM apidb-to-renderdb pipeline works.

Music to make deletions to: 10cm, “오늘밤에.”

New server, same old city, but less old than before

Having worked furiously all summer at taking over the hosting for OpenGeofiction, I have now finally reached a point where I feel like investing some time in the creative side – and actually working on the map again for a while.

I have added quite a bit of new area to the city of Ohunkagan. I’ve been saying that, within my “historical approach,” I’ve brought it up to 1920, but I think there’s more I want to do before I call it officially caught up to 1920. So let’s call it 1917 or so. Also, some of the surrounding communities are still back before 1900 – e.g. Prairie Forge, Iyotanhaha, Riverton. They need to catch up, too. So here’s the new “work-in-progress” gif.

picture

Here is the current snapshot.

picture

[Technical note: screenshot taken at this URL (for future screenshots to match).]

Here’s the wider area snapshot.

picture

[Technical note: screenshot taken at this URL (for future screenshots to match).]

Music to map on a new server by: Lianne La Havas, “Forget.”

OGF Live

The OGFMirror became opengeofiction.net over the weekend.

This is huge news.

So far the site is working okay. Not great, just okay. There are some issues.

  • The ActionController::InvalidAuthenticityToken error is frequent and painful for users.
  • Possibly related to the above – users keep getting forcibly logged out, over and over.
  • The render is still playing catch-up on the whole world map, and seems to be lagging around 2 hours behind for high zooms.
  • Incoming email is completely broken – possibly due to errors in the DNS tables at the opengeofiction.net registrar (and we’re waiting on the old host to fix this).
  • Outgoing email is problematic for a substantial portion of users due to over-aggressive anti-spam efforts by several major email providers, including Apple (iCloud) and Microsoft (Hotmail, Outlook, Live). I’m not even sure how to begin fixing this. I’ve implemented DKIM, which might help, but that also relies on fixing the DNS errors mentioned above, which are not currently being fixed. I’ve also looked into a blacklisting of my email server by spamhaus.org and discovered it is due to my email server sharing the same IP range with a Nigerian Prince or somesuch in the server farm where it lives. (A quick way to sanity-check the published DNS records is sketched just after this list.)
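
As promised above, here’s a quick sanity check of whether the DKIM and SPF records are actually visible in public DNS. It’s a hedged sketch: the DKIM selector “default” is just a guess, not necessarily the selector the mail server is configured with.

dig +short TXT default._domainkey.opengeofiction.net
dig +short TXT opengeofiction.net
# the second query should return, among other things, an SPF record like "v=spf1 ..."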

Work continues. Meanwhile, to all the users of OGF: “Happy mapping!”

Music to map happily to: Cake, “I Will Survive.”

OGFMirror

I’m super bad about posting to this blog. That’s partly because I feel a strong desire to report some actual, positive progress, which I haven’t felt able to do.

I have been very busy with HRATE technicalities. I am building – very, very slowly – a “mirror” for opengeofiction.net (OGF). I think that if this is successful, the owner of that site, who has expressed interest in “letting go” of having to continue to maintain it, will allow the mirror to take over for the site, and the transition to a new hosting environment will be complete.

Someday, I intend to write up, in elaborate, technical detail, this process of setting up a mirror. But in broad outlines, here is what it involves (has involved, will involve).

  • Build a new Ubuntu 20.04 LTS server. This leads to lots of incompatibilities farther down the line, because the existing OGF server runs an older version. Install the basics – apache, postgresql, etc.
  • Install an OSM rails port on the server.
  • Migrate the OGF data to this server. This was very, very hard – because the OGF data (in either .osm.pbf format, or in pg_dump format) proved to contain inconsistencies (data corruption). Some missing current nodes and ways had to be restored manually (text-editing .osm, i.e. .xml, files). This ended up being a two-week-long process. (A sketch of how to detect this kind of problem appears after this list.)
  • Set up incoming replication from the source apidb (OGF) to the new mirror (currently being called ogfdev).
  • Set up outgoing replication for the new ogfdev instance (to drive the render, overpass, etc.).
  • Set up a new primary render. This had some sub-parts.
    • Coastlines. This proved very difficult, because as far as I can figure out, the osmcoastline tool used to create the coastline shapefiles is broken on Ubuntu 20.04. An older version must be used. My current workaround: I’m actually running coastlines on an older server. I import a coastline-containing pbf file to the older server, run the osmcoastline tool there, and post the shapefiles for consumption on the render server. (This workaround is sketched after this list.)
    • I made a decision to run the renders on a different server than the apidb. I think this might involve a bit more expense, short term, but it makes the whole set of processes more scalable, long term. My experience with Arhet is that the render requires scaling sooner / more frequently than the apidb, as the user base grows. Installing the render software (mod_tile and “renderd”) proved difficult. It turns out that there are some lacunae and downright incorrect steps in the documented installation sequences on github.
    • Set up incoming replication from the ogfdev database to the render database.
    • There are substantial differences in recent versions of the openstreetmap-carto style – specifically, the special shapefiles are no longer stored as data files in a data folder in the render directory. Instead, the shapefiles are loaded into the database. Because non-standard shapefiles are used, this means rewriting the load procedures (python scripts) – the standard approach is to just grab the files for “Earth” (because who would run osm for some other planet?!), and that file-grabbing is hard-coded in the procedure. (A sketch of loading a custom shapefile directly is given after this list.)
  • Set up a new topo render. The topo render was shut down on OGF, so this will be the only working version. Unfortunately, I ran into a similar problem with some of the topo pre-processing as I ran into with osmcoastline, above. I suspect for the same reason – something in one of the dependencies they both have. So the topo pre-processing (turning the .hgt files into a contour database) is also being run on a separate, Ubuntu 18.04 server (just like the coastlines).
  • Set up appropriate changes and customizations for the front-facing rails port (osm website). This involves importing user data (done) but also user diaries (not done). These require ad hoc SQL coding that gives me flashbacks to my job as a DBA in the 2000s. Another unfinished piece – internationalization. The current ogfmirror website looks okay, but only in English. Switch to another language, and it all reverts to OSM boilerplate. Why is internationalization done so badly on production software of this kind? I see no easy solution except manually editing each language’s .yml file in turn (OSM has 100+ languages). Or building my own damn application to achieve that result.
  • Set up overpass and overpass-turbo. Overpass installs relatively painlessly, but I’m having trouble getting incoming replication to work correctly. overpass-turbo was quite difficult – the current version on github is flat-out broken, and so an older version (commit) must be compiled and installed. Further, the compilation and configuration process overwrites some of the parameter files, so those files have to be modified after running the first steps of configuration, but before the last part. This is the step I am on right now.
  • Set up nominatim? – nice to have, but not urgent. Anyway, nominatim doesn’t work on the existing OGF website.
  • Implement some of the custom tools that are available on the OGF website: the “scale helper,” the “coastline helper,”…
  • What else? This is a work in progress…
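
About the data corruption found during the migration: I ended up fixing it by hand-editing XML, but a tool like osmium’s check-refs subcommand is the sort of thing that flags the problem – ways referencing nodes that no longer exist, relations referencing missing members. This wasn’t necessarily my exact workflow, and the filename is a placeholder.

# -r also checks relation members; -i prints the ids of the missing objects
osmium check-refs -r -i ogf-planet.osm.pbf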
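
About the coastline workaround: it looks roughly like this. It’s a sketch under assumptions – filenames, the exported layer, and the destination path are placeholders, and the exact osmcoastline options may differ from what I actually run.

# on the older (Ubuntu 18.04) server, where osmcoastline still behaves:
osmcoastline --verbose --output-polygons=both -o coastlines.db ogf-coastline.osm.pbf
# export the water polygons layer to a shapefile and push it to the render server
ogr2ogr -f "ESRI Shapefile" water_polygons.shp coastlines.db water_polygons
scp water_polygons.* render-server:/path/to/openstreetmap-carto/data/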
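
About the shapefile-loading rewrite: instead of letting openstreetmap-carto’s external-data loading script (get-external-data.py in recent versions) fetch the real Earth’s files, a custom shapefile can be loaded straight into the render database with ogr2ogr. Again a hedged sketch – the table name, projection, and database name are illustrative.

ogr2ogr -f PostgreSQL PG:"dbname=gis" water_polygons.shp \
  -nln water_polygons -nlt PROMOTE_TO_MULTI -t_srs EPSG:3857 -overwrite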

So I’ve been busy. Here is a link to the site. Bear in mind, if you are reading this in the future, the link may not show you what I’m currently writing about, but rather some future iteration of it.

https://ogfmirror.com

I’m still working on some of those last steps. Open to hearing what else needs to be done.

Music to hack to: K-os, “Hallelujah.”

Font Fail

As usual, I have neglected this blog. So what’s been going on?

I have taken some steps to migrate one of my major geofictions – The Ardisphere – from OGF to my self-hosted OGFish clone, Arhet. The reason for this is that OGF seems increasingly rudderless and destined to eventually crash and burn, and I am emulating the proverbial rat on the sinking ship. I still hugely value the community there. But the backups have become unreliable, the topo layer (of which I was one of the main and most expert users) has been indefinitely disabled, and conceptual space for innovation remains unavailable.

One small problem that I’ve run up against in migrating The Ardisphere to Arhet: I discovered that Korean characters were not being supported correctly by the main Arhet map render, called arhet-carto. This is a problem because the Ardisphere is a multilingual polity, and Korean (dubbed Gohangukian) is one of the major languages in use, second only to the country’s lingua franca, Spanish (dubbed Castellanese). I spent nearly two days trying to repair this Korean font problem. I think I have been successful. I had to manually re-install the Google noto set of fonts – noto is notorious (get it?) for being the most exhaustive font collection freely available. I don’t get why the original install failed to get everything – I suspect it’s an Ubuntu (linux) package maintenance problem, rather than anything directly related to the render engine (called renderd, and discussed in other, long-ago entries on this sparsely-edited blog).
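
For the curious, the manual fix was along these lines – a hedged sketch, since the exact package set I ended up installing may have differed. The idea: pull the Noto packages from the Ubuntu repositories (the CJK package is the one that covers properly-composed hangul), rebuild the font cache, and restart the render daemon.

sudo apt-get install fonts-noto fonts-noto-cjk fonts-noto-unhinted
sudo fc-cache -fv
fc-list | grep -i "noto sans cjk"   # verify the CJK faces are now visible
sudo systemctl restart renderd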

Here (below) are before-and-after screenshot details of a specific city name that showed the problem: Villa Constitución (헌법시) is the capital and largest city in The Ardisphere. Ignore the weird border-artifacts behind the name on these map fragments – the city is in limbo, right now, as I was re-creating it and it got stuck in an unfinished state.

Before – you can see the Korean (hangul) is “scattered”:

picture

After – now the hangul is properly-composited:

picture

You can see The Ardisphere on Arhet here – and note that within the Arhet webpage you can switch layers to OGF and see it there too. Same country, different planets!

Music to fiddle fonts by: Attack Attack! “Brachyura Bombshell”

The Terrible mysql Crash of 2021

I still don’t know how it happened. I somewhat suspect I got hacked, somehow … I found strange and unexpected Chinese IP addresses in my mysql error log. But I don’t understand the mysql back end or its administration well enough to know for sure what was going on.
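
For what it’s worth, the “evidence” amounted to things like the following – a hedged sketch using the default Ubuntu log location, which may not match every setup.

# pull out the client IPs that appear in failed/aborted connection messages
sudo grep -E "Access denied|Aborted connection" /var/log/mysql/error.log \
  | grep -oE "[0-9]{1,3}(\.[0-9]{1,3}){3}" | sort | uniq -c | sort -rn | head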

I was able to restore a full-server backup to a new server instance, and have re-enabled the mysql-driven websites (my 2 blogs, my wiki, etc.) on the new instance. Meanwhile, I somewhat stupidly reactivated the non-mysql website (the geofictician OSM-style mapping site, the so-called “rails port”) on the old server instance. The consequence of that is that I am now stuck with a two-server configuration where I had a single server configuration before. I think in the long run I’ll want to isolate ALL my mysql-based sites to a single server, and ALL my non-mysql-based sites to another single server. That’s going to take a lot of shuffling things around, which is not trivial.

For now, this blog (and my other blog) seems healthy and up and running again.

There may be more downtime ahead as I try to reconfigure things more logically, however.

Music to do sysadmin drudgery by: Talking Heads, “Found A Job.”

Blocks and more blocks

I keep on with my many small steps approach to Ohunkagan. I feel it becoming mythical in my mind and memory. I imagine the corners where scenes take place in unwritten novels, the streetcars characters ride to well-mapped destinations.
I’ve updated the “work-in-progress” gif.

picture


Here is the current snapshot; let’s call it 1907.

picture

[Technical note: screenshot taken at this URL (for future screenshots to match).]

Here’s the wider area snapshot.

picture

[Technical note: screenshot taken at this URL (for future screenshots to match).]

Music to map by: Arvo Pärt, “Tabula rasa.”

More Ohunk Than Evar

I keep doing small edits for Ohunkagan Metropolitan Area. I’m up to the year 1904 or so.

I made this cool gif of the progress so far:

picture

Here’s the transit network, on the same frame:

picture

Here is a wider area view – I’m going to start a time series of screenshots for these, too, to show the growth of the metropolitan area.


picture

I have been placing lots of industry and factories and such. I’m most proud of the rail-car factory, here.

Music to map by: HAUJOBB, “Dead Market.”

Round and round

I’m sorry I neglected this blog for the last two months. It wasn’t because I stopped geofiction activities – I just kind of forgot to update anything here. In fact, I’ve been staying busy with various geofiction projects.

I ran across a small, free website that someone made, which transforms a flat map of an imaginary planet into a globe that you can rotate with the mouse, or that can be used to generate a “spinning world” gif. It’s called maptoglobe.com.

I decided I wanted to make one for my planet, Arhet – just out of curiosity. This did have a few minor technical challenges. First, I had to “knit” together the tile images for Arhet. I found a nice utility that does this, an application called tile-stitch by Eric Fischer. It can be found on github. Except for one small problem, I just followed the documentation provided in the github README. That one problem: to get it to work on my machine, I needed to modify the code in the stitch.c file to include the full path to the geotiff headers. So…

Original code:

...
#include <geotiffio.h>
#include <xtiffio.h>
...

My version:

...
#include </usr/include/geotiff/geotiffio.h>
#include </usr/include/geotiff/xtiffio.h>
...

Once that was set up, I simply extracted the tiles at zoom level 5 from the Arhet2-carto render using the tile-stitch utility, with this command:
./stitch -o arhet5.png -- -85.05 -179.99 85.05 179.99 5 https://tiles01.rent-a-planet.com/arhet2-carto/{z}/{x}/{y}.png

That got the whole planet into a square .png file, which I called arhet5.png.

The next problem is that the maptoglobe website requires the map image to be in an equirectangular projection. But the tiles for Arhet are in the modified Mercator projection used by almost all online “slippy maps,” classified as EPSG:3857.

So the arhet5.png file was in the wrong projection. I found out I could use another utility that I already had, the gdal library, to do this job. I ran the following commands.
/usr/bin/gdal_translate -of Gtiff -co "tfw=yes" -a_ullr -20037508.3427892 20036051.9193368 20037508.3427892 -20036051.9193368 -a_srs "EPSG:3857" "arhet5.png" "arhet5_tfw.tiff"

/usr/bin/gdalwarp -s_srs EPSG:3857 -t_srs EPSG:4326 -ts 6400 3200 "arhet5_tfw.tiff" "arhet5.tif"

These commands produced a .tif file in the right projection, 6400 x 3200 pixels. I then opened this file and resaved it as .png again (because this is a more compact format, and therefore uploadable to maptoglobe.com – which has a maximum file size limit).
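
In retrospect, the open-and-resave step could probably have been done with gdal_translate as well – a hedged one-liner, with the output filename just an example:

/usr/bin/gdal_translate -of PNG "arhet5.tif" "arhet5_equirect.png"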

I then uploaded that .png file to the maptoglobe site, and it allowed me to save the resulting “globe” – it’s accessible here. Further, I was able to make this nice little spinning planet gif:

That’s the planet Arhet, as it currently stands – note that most of the mapping there is not my own, but the work of the various other Arhet members who have joined me in my experiment.

That worked out so well that I did the same thing for my own private planet, Rahet (note that the names Arhet and Rahet are obviously related; Rahet came first, and when I decided to change the project and invite other participants, I renamed the old Rahet as Arhet, and then later resurrected the old Rahet as a separate project).

Here is the link for Rahet on the maptoglobe site, and here is the spinning planet gif:

So those are pretty cool. Remember that the original “slippy maps” (HRATEs) of these two projects are on the map portion of this website, here and here.

Music to make HRATEs to: 하선호 (Sandy), “Love Me More”