OGF Live

The OGFMirror became opengeofiction.net over the weekend.

This is huge news.

So far the site is working okay. Not great, just okay. There are some issues.

  • The ActionController::InvalidAuthenticityToken error is frequent and painful for users.
  • Possibly related to the above – users keep getting forcibly logged out, over and over.
  • The render is still playing catch-up on the whole world map, and seems to be lagging around 2 hours behind for high zooms.
  • Incoming email is completely broken – possibly due to errors in the DNS tables at the opengeofiction.net registrar (and we’re waiting on the old host to fix this).
  • Outgoing email is problematic for a substantial portion of users due to over-aggressive anti-spam efforts by several major email providers, including Apple (iCloud) and Microsoft (Hotmail, Outlook, Live). I’m not even sure how to begin fixing this. I’ve implemented DKIM, which might help, but it also relies on fixing the DNS errors that are not currently being fixed. I’ve looked into a blacklisting of my email server by spamhaus.org and discovered it is due to my email server sharing the same IP range with a Nigerian Prince or somesuch in the server farm where it lives.
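For reference, the DKIM piece amounts to publishing a public key in DNS. As a rough sketch (assuming an opendkim-style setup; the selector name “mail” is arbitrary), generating the key and the corresponding TXT record looks like this, and the contents of the resulting mail.txt are what need to land in the zone at the registrar:

opendkim-genkey -d opengeofiction.net -s mail
# writes mail.private (the signing key) and mail.txt, a TXT record of the form:
# mail._domainkey IN TXT "v=DKIM1; k=rsa; p=<public key>"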

Work continues. Meanwhile, to all the users of OGF: “Happy mapping!”

Music to map happily to: Cake, “I Will Survive.”

OGFMirror

I’m super bad about posting to this blog. That’s partly because I feel a strong desire to report some actual, positive progress, which I haven’t felt able to do.

I have been very busy with HRATE technicalities. I am building – very, very slowly – a “mirror” for opengeofiction.net (OGF). I think if this is successful, then the owner of that site, who has expressed interest in “letting go” of having to continue to maintain it, will allow the mirror to take over for the site, completing a transition to a new hosting environment.

Someday, I intend to write up, in elaborate, technical detail, this process of setting up a mirror. But in broad outlines, here is what it involves (has involved, will involve).

  • Build a new Ubuntu 20.04 LTS server. This leads to lots of incompatibilities farther down the line, because the existing OGF server runs an older release. Install the basics – apache, postgresql, etc.
  • Install an OSM rails port on the server.
  • Migrate the OGF data to this server. This was very, very hard – because the OGF data (in either .osm.pbf format or in pg_dump format) proved to contain inconsistencies (data corruption). Some missing current nodes and ways had to be restored manually (text-editing the .osm files, which are just XML). This ended up being a two-week-long process.
  • Set up incoming replication from the source apidb (OGF) to the new mirror (currently being called ogfdev).
  • Set up outgoing replication for the new ogfdev instance (to drive the render, overpass, etc.).
  • Set up a new primary render. This had some sub-parts.
    • Coastlines. This proved very difficult, because as far as I can figure out, the osmcoastline tool used to create the coastline shapefiles is broken on Ubuntu 20.04. An older version must be used. My current workaround: I’m actually running coastlines on an older server. I import a coastline-containing pbf file to the older server, run the osmcoastline tool, and post the shapefiles for consumption on the render server (a sketch of this workflow follows this list).
    • I decided to run the renders on a different server from the apidb. I think this might involve a bit more expense short term, but it makes the whole set of processes more scalable long term. My experience with Arhet is that the render requires scaling sooner / more frequently than the apidb as the user base grows. Installing the render software (mod_tile and “renderd”) proved difficult. It turns out that there are some lacunae and downright incorrect steps in the documented installation sequences on github.
    • Set up incoming replication from the ogfdev database to the render database (the apply half of that loop is sketched after this list).
    • There are substantial differences in recent versions of the openstreetmap-carto style – specifically, the special shapefiles are no longer stored as data files in a data folder in the render directory. Instead, the shapefiles are loaded into the database. Because non-standard shapefiles are used, this means rewriting the load procedure (a python script) – the standard approach just grabs the files for “Earth” (because who would run osm for some other planet?!), and that file-grabbing is hard-coded in the procedure.
  • Set up a new topo render. The topo render was shut down on OGF, so this will be the only working version. Unfortunately, I ran into a similar problem with some of the topo pre-processing to the one I ran into with osmcoastline, above – I suspect for the same reason: something in one of the dependencies they both share. So the topo pre-processing (turning the .hgt files into a contour database) is also being run on a separate Ubuntu 18.04 server (just like the coastlines).
  • Set up appropriate changes and customizations for the front-facing rails port (the osm website). This involves importing user data (done) but also user diaries (not done). These require ad hoc SQL coding that gives me flashbacks to my job as a DBA in the 2000s. Another unfinished piece – internationalization. The current ogfmirror website looks okay, but only in English. Switch to another language, and it all reverts to OSM boilerplate. Why is internationalization done so badly on production software of this kind? I see no easy solution except manually editing each language’s .yml file in turn (OSM has 100+ languages). Or building my own damn application to achieve that result.
  • Set up overpass and overpass-turbo. Overpass installs relatively painlessly, but I’m having trouble getting incoming replication to work correctly. overpass-turbo was quite difficult – the current version on github is flat-out broken, so an older version (commit) must be compiled and installed. Further, the compilation and configuration process overwrites some of the parameter files, so those files have to be modified after running the first steps of configuration, but before the last part. This is the step I am on right now.
  • Set up nominatim? Nice to have, but not urgent. Anyway, nominatim doesn’t work on the existing OGF website.
  • Implement some of the custom tools that are available on the OGF website: the “scale helper,” the “coastline helper,”…
  • What else? This is a work in progress…
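For the coastline workaround mentioned above, the core of what runs on the Ubuntu 18.04 machine is something like the following (the file names are placeholders, and the openstreetmap-carto style also expects the polygons reprojected to EPSG:3857 and split into a grid, which is a further step not shown here):

osmcoastline -o coastlines.db ogf-planet.osm.pbf
# extract the land polygons layer from the spatialite output as a shapefile
ogr2ogr -f "ESRI Shapefile" land_polygons.shp coastlines.db land_polygons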
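And for the render-side replication mentioned above, the apply half of the loop is plain osm2pgsql in append mode. The database name (gis) and the change file name are placeholders, and the other flags need to match whatever was used for the original import; the diff-fetching half, which pulls the .osc change files from ogfdev, is a separate script:

osm2pgsql --append --slim -d gis changes.osc.gz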

So I’ve been busy. Here is a link to the site. Bear in mind, if you are reading this in the future, the link may not show you what I’m currently writing about, but rather some future iteration of it.

https://ogfmirror.com

I’m still working on some of those last steps. Open to hearing what else needs to be done.

Music to hack to: K-os, “Hallelujah.”

Font Fail

As usual, I have neglected this blog. So what’s been going on?

I have taken some steps to migrate one of my major geofictions – The Ardisphere – from OGF to my self-hosted OGFish clone, Arhet. The reason for this is that OGF seems increasingly rudderless and destined to eventually crash and burn, and I am emulating the proverbial rat deserting the sinking ship. I still hugely value the community there. But the backups have become unreliable, the topo layer (of which I was one of the main and most expert users) has been indefinitely disabled, and conceptual space for innovation remains unavailable.

One small problem I ran up against in migrating The Ardisphere to Arhet: Korean characters were not being rendered correctly by the main Arhet map render, called arhet-carto. This is a problem because the Ardisphere is a multilingual polity, and Korean (dubbed Gohangukian) is one of the major languages in use, second only to the country’s lingua franca, Spanish (dubbed Castellanese). I spent nearly two days trying to repair this Korean font problem. I think I have been successful. I had to manually re-install the Google noto set of fonts – noto is notorious (get it?) for being the most exhaustive font collection freely available. I don’t get why the original install failed to get everything – I suspect it’s an Ubuntu (linux) package maintenance problem, rather than anything directly related to the render engine (called renderd, and discussed in other, long-ago entries on this sparsely-edited blog).
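For anyone hitting the same thing, the fix boiled down to making sure the full noto coverage (including the CJK faces) was actually installed on the render box, and then kicking renderd so it picks the fonts up. Roughly (the package names are the standard Ubuntu ones; the renderd service name depends on how it was installed, and the affected tiles still need re-rendering afterwards):

sudo apt install --reinstall fonts-noto-core fonts-noto-cjk fonts-noto-cjk-extra
fc-cache -f -v
fc-list | grep -i "noto sans cjk"   # verify fontconfig can now see the CJK faces
sudo systemctl restart renderd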

Here (below) are before-and-after screenshot details of a specific city name that showed the problem: Villa Constitución (헌법시) is the capital and largest city in The Ardisphere. Ignore the weird border-artifacts behind the name on these map fragments – the city is in limbo, right now, as I was re-creating it and it got stuck in an unfinished state.

Before – you can see the Korean (hangul) is “scattered”:

picture

After – now the hangul is properly-composited:

picture

You can see The Ardisphere on Arhet here – and note that within the Arhet webpage you can switch layers to OGF and see it there too. Same country, different planets!

Music to fiddle fonts by: Attack Attack! “Brachyura Bombshell”

The Terrible mysql Crash of 2021

I still don’t know how it happened. I somewhat suspect I got hacked, somehow … I found strange and unexpected Chinese IP addresses in my mysql error log. But I don’t understand the mysql back end or its administration well enough to know for sure what was going on.

I was able to restore a full-server backup to a new server instance, and have re-enabled the mysql-driven websites (my 2 blogs, my wiki, etc.) on the new instance. Meanwhile, I somewhat stupidly reactivated the non-mysql website (the geofictician OSM-style mapping site, the so-called “rails port”) on the old server instance. The consequence of that is that I am now stuck with a two-server configuration where I had a single-server configuration before. I think in the long run I’ll want to isolate ALL my mysql-based sites to a single server, and ALL my non-mysql-based sites to another single server. That’s going to take a lot of shuffling things around, which is not trivial.

For now, this blog (and my other blog) seems healthy and up and running again.

There may be more downtime ahead as I try to reconfigure things more logically, however.

Music to do sysadmin drudgery by: Talking Heads, “Found A Job.”

Round and round

I’m sorry I neglected this blog for the last two months. It wasn’t because I stopped geofiction activities – I just kind of forgot to update anything here. In fact, I’ve been staying busy with various geofiction projects.

I ran across a small, free website someone made that transforms a flat map of an imaginary planet into a globe that you can rotate with the mouse, or that can be used to generate a “spinning world” gif. It’s called maptoglobe.com.

I decided I wanted to make one for my planet, Arhet – just out of curiosity. This did have a few minor technical challenges. First, I had to “knit” together the tile images for Arhet. I found a nice utility that does this, an application called tile-stitch by Eric Fischer. It can be found on github. Except for one small problem, I just followed the documentation provided in the github README. That one problem: to get it to work on my machine, I needed to modify the code in the stitch.c file to include the full path to the geotiff headers. So…

Original code:

...
#include <geotiffio.h>
#include <xtiffio.h>
...

My version:

...
#include </usr/include/geotiff/geotiffio.h>
#include </usr/include/geotiff/xtiffio.h>
...

Once that was set up, I simply extracted the tiles at zoom level 5 from the Arhet2-carto render using the tile-stitch utility, with this command:
./stitch -o arhet5.png -- -85.05 -179.99 85.05 179.99 5 https://tiles01.rent-a-planet.com/arhet2-carto/{z}/{x}/{y}.png

That got the whole planet into a square .png file, which I called arhet5.png.

The next problem is that the maptoglobe website requires the map image to be in an equirectangular projection. But the tiles for Arhet are in the modified mercator projection used by almost all online “slippy maps,” classified as EPSG:3857.

So the arhet5.png file was in the wrong projection. I found out I could use another utility I already had, the gdal library, to do this job. I ran the following commands:
/usr/bin/gdal_translate -of Gtiff -co "tfw=yes" -a_ullr -20037508.3427892 20036051.9193368 20037508.3427892 -20036051.9193368 -a_srs "EPSG:3857" "arhet5.png" "arhet5_tfw.tiff"

/usr/bin/gdalwarp -s_srs EPSG:3857 -t_srs EPSG:4326 -ts 6400 3200 "arhet5_tfw.tiff" "arhet5.tif"

These produced a .tif file in the right projection, 6400 x 3200 pixels. I then opened this file and re-saved it as .png again (because that’s a more compact format, and therefore uploadable to maptoglobe.com – which has a maximum file size limit).
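(In retrospect, that last re-save could probably also be done with gdal_translate’s PNG driver instead of an image editor; the output file name here is just illustrative:)

/usr/bin/gdal_translate -of PNG "arhet5.tif" "arhet5_equirect.png"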

I then uploaded that .png file to the maptoglobe site, and it allowed me to save the resulting “globe” – it’s accessible here. Further, I was able to make this nice little spinning planet gif:

That’s the planet Arhet, as it currently stands – note that most of the mapping there is not my own, but the work of the various other Arhet members who have joined me in my experiment.

That worked out so well that I did the same thing for my own private planet, Rahet (note that the names Arhet and Rahet are obviously related; Rahet came first, and when I decided to change the project and invite other participants, I renamed the old Rahet as Arhet, and then later resurrected the old Rahet as a separate project again).

Here is the link for Rahet on the maptoglobe site, and here is the spinning planet gif:

So those are pretty cool. Remember that the original “slippy maps” (HRATEs) of these two projects are on the map portion of this website, here and here.

Music to make HRATEs to: 하선호 (Sandy), “Love Me More”

Messing with Maperitive

For a long time I avoided Maperitive – because when I tried it several years ago it repeatedly crashed my computer. So I thought of it as bad software.

Recently, I decided to give it another try. This is related to wanting more tools to develop and understand custom renders, as part of my efforts to expand functionality on my own map server.

So just this morning I downloaded Maperitive, got a successful install, and played with it a little bit. I made a google maps style view of my city Ohunkagan.

I tried to make a detailed view of the OGF country called Egani (because my efforts with Maperitive just happened to match up with that mapper’s request for some technical help).

That’s a pretty detailed map – you will have to download it and zoom around to see the detail.

Maperitive is powerful, but it’s got a pretty steep learning curve, too. I’ll mess with it some more, as it appears to have a stable version that runs on Linux now.

Here is a map of my village of Goodgrove, on Arhet (my own map server).

Music to view maps by: Muse, “Map of the problematique.”

HRATE

In response to several queries from other geoficticians about how to set up a map server (e.g. Arhet or Ogieff), I decided to consolidate some documentation about how I got Arhet to work. It is here:

http://wiki.geofictician.net/wiki/index.php/HRATE

It is very much a work-in-progress. As I think of information to add, or decide to add more detail, I’ll fill it out. At some point, I’ll be trying to do a “re-build” of the Arhet stack on a new server (because other things running on its current server have a higher need for reliability), and when I do that, it will give me a chance to review and better understand the process.

Music to write documentation by: Erik Satie, “Gnossienne No. 1, 2, 3.”

Git topo

I finally got tired of dealing with Windows 10 drama, and decided to rebuild my preferred Ubuntu Linux desktop, as I’d been using in Korea before moving away last July.

I’ve made good progress on that, and have JOSM up and working again, and all that. But I became aware, as I was migrating my data and files, that I have a lot of files I would rather not lose, especially related to my geofiction. I need some systematic means of keeping stuff backed up.

I handled the issue of backup and redundancy for my creative writing years ago, when I started storing all my drafts and notes in google docs. It’s convenient, too, because I can get to my writing no matter where I am.

But I have no such system for all my .osm files for the geofiction. Especially important are the .osm files I use for drawing the topo layer, since those are never uploaded anywhere except temporarily at the time of an update.

I suppose I could just copy the files. But I decided I needed to store them in some kind of version-controlled space. About two years ago, I’d had them in a git repository, but it was just copied out to an extra hard drive. I used git for some other stuff I used to do, so it wasn’t that hard to figure out.

I decided this time to try something different – I made a repository on github and decided to put my topo .osm files there. If I get in the habit of regularly updating the git repository, I’ll always have those topo files, no matter what happens to my computer or where I am. Further, if ever I go in the direction of wanting to collaborate on drawing topo files, this will make it really easy (assuming the other person is up to dealing with checking things out of a git repository).
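The setup itself is nothing fancy. Assuming a repository already created on github (the name topo-osm and the local path below are just placeholders), it’s the usual sequence:

cd ~/geofiction/topo              # wherever the topo .osm files live
git init
git add *.osm
git commit -m "initial import of topo .osm files"
git remote add origin git@github.com:myusername/topo-osm.git
git push -u origin master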

If ever there will be a truly collaborative geofiction “planet” with a master topo layer, this might be a way to maintain that information, since practically speaking it can’t and shouldn’t be uploaded to the map server. Just an experiment, I guess, and meanwhile I’ll have a reliable backup of my work.

Music to map by: 선미, “가시나.”

Some weeks…

And then, some weeks, I don’t get much done.

I started working on trying to customize my Rails Port (the main “copy” of the OpenStreetMap slippy map), and got very bogged down in the fact that the OpenStreetMap Rails Port is highly complex software written in a language and using an architecture unfamiliar to me: the infamous “Ruby on Rails.”

I dislike the way the actual name “OpenStreetMap” is hard-coded throughout all the little modules. It seems like a poor application design practice, especially for an open-source project. One area where the name proliferates is in all the internationalization files. So I started wondering how hard it might be to make all these internationalization files more “generic.” The answer: pretty hard, at least for me.

I’ve wandered off down a digressive passage where I’m learning about software internationalization under the Ruby on Rails paradigm, but I’m undecided how I want to handle this. Do I want to try to solve it the “right way”? Or just kludge it (most likely by deleting all the internationalization files except perhaps English, Spanish, and Korean)?
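If I go the kludge route, it would at least be short. Assuming the rails port keeps its locale files flat under config/locales (one .yml per language, which is roughly how I remember it), something like this would prune everything except the three languages I care about:

cd config/locales
find . -maxdepth 1 -name '*.yml' ! -name 'en.yml' ! -name 'es.yml' ! -name 'ko.yml' -delete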

Meanwhile, I have also been pulled away by some non-computer, non-geofiction projects.

So… not much to report, this week – nothing mapped, nothing coded, nothing configured.

Music to map by: Sergei Rachmaninoff, “Piano Concerto No. 2.”