2 ways to your own offline Wikipedia

… for whatever reason you might need that ;)

For both ways you will need the dumps; you can get them here:
http://dumps.wikimedia.org/dewiki/latest/
The main dump is pages-articles.xml.bz2. If you want the categories as well, you also need category.sql.gz and categorylinks.sql.gz.
The pages file is quite huge and will probably take about 4 hours to download, depending on your connection.
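
If you want to script the download instead of clicking through the listing, here is a minimal Python sketch. The dewiki-latest-* filenames are assumptions based on the usual layout of that dump directory, so check them against the listing first:

    # Minimal sketch: download the three dump files from the directory above.
    # The dewiki-latest-* filenames are assumptions; verify them in the listing.
    import urllib.request

    BASE = "http://dumps.wikimedia.org/dewiki/latest/"
    FILES = [
        "dewiki-latest-pages-articles.xml.bz2",
        "dewiki-latest-category.sql.gz",
        "dewiki-latest-categorylinks.sql.gz",
    ]

    for name in FILES:
        print("downloading", name, "...")
        # Stream to disk in 1 MiB chunks; the pages file is several GB.
        with urllib.request.urlopen(BASE + name) as src, open(name, "wb") as dst:
            while True:
                chunk = src.read(1 << 20)
                if not chunk:
                    break
                dst.write(chunk)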

You will also need a fresh and running installation of the MediaWiki software.
Install it, and here we go:
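
As a rough preview of the import step: MediaWiki ships an importDump.php maintenance script for loading XML dumps, and the .sql.gz files go straight into the wiki database. The install path and MySQL credentials below are placeholders:

    # Rough sketch of the import step. MEDIAWIKI_DIR and the MySQL
    # credentials are placeholders; importDump.php ships with MediaWiki.
    import subprocess

    MEDIAWIKI_DIR = "/var/www/mediawiki"  # assumption: your install path

    # Feed the decompressed article dump to MediaWiki's importer.
    subprocess.run(
        "bzcat dewiki-latest-pages-articles.xml.bz2"
        " | php " + MEDIAWIKI_DIR + "/maintenance/importDump.php",
        shell=True, check=True,
    )

    # Load the category tables straight into the wiki database
    # (database name and user are placeholders).
    for sql in ("dewiki-latest-category.sql.gz", "dewiki-latest-categorylinks.sql.gz"):
        subprocess.run("zcat " + sql + " | mysql -u wikiuser -p wikidb",
                       shell=True, check=True)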
Continue reading “2 ways to your own offline Wikipedia”

Migrate to MongoLab

Recently I ran into RAM trouble on my vserver; for some reason I kept encountering the evil:

Cannot allocate memory at ...

So at first I suspected MongoDB of using up loads of memory, as top seemed to show.

But after some research I learned that MongoDB only -seems- to use a lot of memory: it memory-maps its data files, so the virtual/mapped size looks huge while the resident size stays small.
See here and here and here.
The actual usage was around 20 MB of RAM, so MongoDB was innocent.
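
You can check this yourself: MongoDB's serverStatus command reports resident vs. virtual/mapped memory. A quick sketch with pymongo (the connection string is a placeholder for a local mongod):

    # Check MongoDB's real memory footprint via the serverStatus command.
    # The connection string is a placeholder for a local mongod.
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    mem = client.admin.command("serverStatus")["mem"]

    # "resident" is the RAM actually held; "virtual"/"mapped" look big
    # only because mongod memory-maps its data files.
    print("resident:", mem["resident"], "MB")
    print("virtual: ", mem["virtual"], "MB")
    print("mapped:  ", mem.get("mapped", "n/a"), "MB")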

The true RAM monsters turned out to be some Apache and PHP-FPM zombies, but that's another story.

While still suspecting MongoDB, I thought about outsourcing the database, and I found a free and sufficient offer at MongoLab.
My interest was piqued, so I gave it a try.
The free plan is limited to 240 MB of storage, and since my app is just a small counter, that should last for some time.
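
The switch itself is mostly just swapping the connection URI. Here is a sketch with pymongo; the host, port, credentials, and database name are made-up placeholders in the usual mongodb:// format MongoLab hands you:

    # Hedged sketch: pointing a pymongo app at a hosted MongoLab database
    # instead of the local mongod. Host, port, credentials, and db name
    # below are placeholders.
    from pymongo import MongoClient

    # before: local instance
    # client = MongoClient("mongodb://localhost:27017")

    # after: hosted instance (placeholder URI)
    client = MongoClient("mongodb://user:password@ds012345.mongolab.com:12345/mycounter")
    db = client["mycounter"]

    # e.g. the small counter this app stores:
    db.counters.update_one({"_id": "hits"}, {"$inc": {"value": 1}}, upsert=True)
    print(db.counters.find_one({"_id": "hits"}))

Moving the existing data over is typically done with mongodump against the old instance and mongorestore against the new one.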
Continue reading “Migrate to MongoLab”