apache cgi fork bomb

Like Russell Coker, I had to deal with an apache OOM crashing my main Xen instance yesterday.

In my case, it was due to the Czech IP 88.102.6.114, which decided it would be a good idea to download the entire output of viewvc, recursively, and apparently also in parallel. After 2 hours and some 40 thousand hits, this eventually started enough instances of viewvc and svn to use up 512 MB of RAM and OOM the system.

I don't have a satisfactory fix yet. Surely viewvc shouldn't let itself be run dozens of times simultaneously. Apache doesn't seem to provide a way to limit the total number of CGIs it can launch at once, except by limiting the total number of connections. Apache's RLimitNPROC option sadly only applies the limit after starting a CGI, rather than applying it to apache itself. Something that could control the number of CGIs started on a per-directory and per-vhost basis would be ideal.

For now, I used ulimit -u when starting apache, which prevents many viewvc processes from running at once, but allows anyone who DOSes viewvc to also DOS all other CGIs on the system.
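
For reference, that amounts to something like this in apache's init script (the limit of 50 is an arbitrary number I made up; tune to taste):

    # limit the number of processes apache's user can run,
    # so a CGI stampede can't OOM the whole system
    ulimit -u 50
    # ... then start apache as usual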

discussion

Posted
alioth ikiwiki users

If you use ikiwiki on alioth and didn't just get a mail about it, please contact me.

Posted
abram's

This evening I was going to help Megan and Eric work on their new barn, but they weren't around, so I redirected over to Abram's Falls and hiked down to the big falls.

(Picture from here.)

Standing under Abram's is the best, most therapeutic shower anywhere.

discussion

Posted
watching paper cranes on mbone

I was just thinking about this. 1995, in the mac lab. The mbone window using tons of bandwidth to show video of someone's paper crane mobile. Mostly hidden in the background, but occasionally watched as an antidote to the stress-making car-crash sounds from 30 other macs as we all learned how to make them crash while programming them in assembly.

Anyone else remember those paper cranes on the mbone?

Posted
video games: competition, classicism, emulation

Biella blogged about King of Kong, a movie about classic video games. I also enjoyed it, although I felt the rivalry and competition cheapened it (while also certainly giving it more of a story to tell).

I wish I could enjoy competition the way Bubulle so eloquently describes. For me, though, getting to the end of a game is more like reaching the end of a long hike: whether you beat someone to the end or not doesn't matter; it's the personal experience of getting there that counts. Anyway..

The reason I'm really responding to Biella's post is because of the question she asks:

But what I also found amazing was how the movie conveyed the persistence of the (older) game. They live on in the lives of individuals and collectives, despite the rise of a whole, new class of games that are much more popular today. I am not sure how much longer they will live on, or if the movie was also inadvertently portraying the rise and slow decline of an era that will, in another 50 years, become part of the archive of dead history.

My guess is that the classic games will persist past the lifetime of their hardware, and that they'll keep on appealing to at least a small percentage of new gamers. Actually, I think they're more popular now than they were fifteen years ago, partly because the commercial gaming industry has moved further away from what the old games are really good at, and partly because emulation is so good that they're available to everyone.

And maybe even partly because to really appreciate Kenta Cho's games or many of the other new non-commercial games that Miriam keeps finding and pushing into Debian, you need to know the classics. Even if you do suck at galaga, and even if the only version you've really played is xgalaga, which lacks ship capture. :-)

I never played games in machines that ate quarters as a kid, because I didn't have many quarters, and it didn't seem to be a good value. I put one of the few quarters I've ever spent into a video game machine earlier this summer, in the corner of the Musée Mécanique that's dedicated to sorta-old video machines (as opposed to ancient, lovingly restored arcade games). It was a metal slug machine. I've beaten metal slug under emulation, but only at the expense of many emulated quarters. Another quarter went into a vintage game of pong. I enjoyed playing them both on real hardware, but emulation was good enough for both of these and for many other games in between.

Posted
yay, rising appalachia

I finally got to hear Rising Appalachia just now (at Java J's). They're about to play a song they wrote last night called "the Shouting Sprout", about this nice little organic food store of the same name hidden in the middle of the strip mall fast food wasteland of Bristol's exit 7.

I can happily recommend both the band and the store, which I discovered myself on Friday. :-)

Posted
foo feed

I've added a foo feed to my blog. Stuff posted there won't show up in my main blog feed, so it probably won't go across your eyeballs. I've not been blogging a lot lately, because it often seems not worth wasting bandwidth for random, half-thought-out, or unimportant stuff, and often not worth the effort to try to synthesise something interesting out of it all. I'll probably use the foo feed for such things, as well as for replies to blog posts that don't add a lot of substance. Just grist for the google mill.

Maybe that'll mean it becomes the interesting feed that everyone reads..

(PS, am I the only DD on Planet Debian who occasionally gets the heebie-jeebies about my every post scrolling across uncountable computer screens as part of that screensaver? Hi, you. Yes, you, the one reading this in a large-font crawl across your screen right now, who has no idea who I am, or even that there's a person typing this.)

Posted
everyone should program

Funny, I use vimoutliner just like Matthew Palmer, and like him I also think that everyone should learn to program. If you don't know how to program, you can't truly use a computer. You can only fiddle with constructs someone else has made for you on a computer.

Somehow I doubt either of these ideas will catch on, but it's nice that I'm not alone in my heresies.

Posted
CHON

I've been reading The Omnivore's Dilemma, by Michael Pollan. The stuff about corn is interesting.

I was going to talk about how this relates to CHON-food and Pohl's Gateway. After all, both books feature industrialising the surface area of whole states to produce food. Though one is SF and one isn't. But I'm too tired.

Posted
seth on diablo

Driving up Diablo, we're always in awe of the bikers going up it. When we're not thinking they're insane. :-)

Seth biked up. Awesome, and happy b-day!

Posted
on debian directories in upstream tarballs

From time to time the question comes up on Debian mailing lists of whether debian directories in upstream tarballs are harmful, and what to do about them. I tend to hide from such threads, since all upstream tarballs I create contain debian directories.

But I realised tonight that the common wisdom from those discussions is wrong, or at least incomplete. Here's why.

First, it's important to notice that this is not a problem in the rpm world. Putting an rpm spec file in an upstream tarball is considered good practice. There's no downside; if a packager needs to change how the package is put together, they can replace the spec file. And users can run rpmbuild on the tarball and get an rpm spit out. Handy.
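
If I remember rpmbuild's tarball mode right, it's just:

    # build binary rpms straight from an upstream tarball
    # that contains a spec file
    rpmbuild -tb foo-1.0.tar.gz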

So, then, why is the equivalent practice a problem in the debian world? Three intertwined reasons:

  1. The tools assume all files in debian/ are good. There's no way to make debhelper not install a debian/init script, or any similar file.

    Note that this problem only affects helper tools like debhelper; the core deb building tools only care about a few files that are in every debian directory. Effectively, debhelper and (possibly) cdbs are the only tools with the problem currently.

  2. The source package format doesn't support removing files from the upstream tarball. This is a pretty silly limitation that any source format better than the current one should overcome. If wig and pen were ever implemented, it should support some means of deleting files.

  3. Package maintainers fear that they can't get upstream to make changes, or not fast enough, so worry about getting into a mess with files in an upstream debian/ that can't easily be fixed.

I feel that if any of these three problems is solved, an upstream debian directory is ok. #3 can be solved to some extent for some packages, though these solutions will always be the least satisfying.

It's tempting to try to solve #1 in debhelper. I could add an --ignore switch, to make debhelper commands not act on a given file in the debian directory. (Update: done in version 5.0.57!)
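
Usage ends up looking something like this in debian/rules (dh_installinit and debian/init being just one example of a file you'd want skipped):

    # don't act on the debian/init that upstream shipped
    dh_installinit --ignore=debian/init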

The best place to solve it, though, would be in the source package format. I'd be happy to see wig and pen implemented, but would be happier to see us moving toward using a distributed revision control system as a source package format. Consider a source format that replaces .diff.gz with .git.tar.gz.
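
To illustrate, with a made-up package, the uploaded source would then consist of something like:

    foo_1.0-1.dsc
    foo_1.0.orig.tar.gz
    foo_1.0-1.git.tar.gz   # in place of foo_1.0-1.diff.gz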

discussion

Posted
git archive as distro package format

The tricky part of using a git (or other rcs) archive as a distribution source package format is handling pristine upstream tarballs.

One approach would be to try to create a git archive that didn't include objects present in the upstream tarball. Then, to unpack the source package, you'd unpack the upstream tarball, convert the files in it into git objects, and add them into the .git directory. This way a pristine upstream tarball is available for verification, and, like the current Debian source format, the source needn't be re-uploaded for every package release.

This seems like it might be possible to implement, but you'd need to know quite a lot about git internals to remove the redundant objects from the git repo and regenerate them from the tarball.
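
The regeneration half at least seems tractable; a rough sketch, with made-up paths, and ignoring complications like packed objects (only blobs need regenerating; trees and commits would stay in the repo):

    # unpack the pristine upstream tarball
    tar xzf foo_1.0.orig.tar.gz
    # turn each file back into a blob object in the package's repo
    cd foo-1.0
    find . -type f -print0 | GIT_DIR=../foo/.git xargs -0 git hash-object -w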


Another approach would be to keep the pristine upstream tarball in the git archive, and then the source package would consist entirely of the git archive. This doesn't have the same nice minimal bandwidth upload behavior -- unless you can git push your changes to do the upload.

Storing a lot of upstream tarballs in git wouldn't be efficient, but I've written pristine-tar to take care of that. :-)

Posted
pristine tarball generator

Keeping pristine upstream tarballs around is a pain. Wouldn't it be nice to be able to keep them in revision control? Except, it would use far too much disk...

Here's a solution. It generates a binary delta between the pristine upstream tarball and a tarball created using files checked out of the repository. The delta should be quite small, and any checkout of the repository that has the same file contents as that used to create the delta can be used to regenerate the pristine tarball.
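
Conceptually it's something like this (not the actual implementation, and glossing over the gzip layer, which has to be recorded separately since the delta is between the uncompressed tars):

    # tar up the files checked out of revision control
    tar cf checkout.tar -C working-copy .
    # record a delta from that tar to the gunzipped pristine tar
    xdelta delta checkout.tar pristine.tar pristine.delta
    # later, any checkout with identical file contents can regenerate it
    xdelta patch pristine.delta checkout.tar pristine.tar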

Example (which would presumably be more fun and faster if I used git):

joey@kodama:~package/uqm-voice> svn switch svn+ssh://uqm.debian.net/srv/svn/uqm/tags/uqm-voice/upstream_version_0.3
D    debian
A    comm/blackur/black041.ogg
U    comm/shofixt/shofi040.ogg
U    comm/starbas/starb182.ogg
D    comm/slyland/slyla030.ogg
U    comm/chmmr/chmmr035.ogg
U    comm/supox/supox031.ogg
Updated to revision 223.
joey@kodama:~package/uqm-voice> pristine-tar extract ~/uqm-voice.delta ../uqm-voice_0.3.orig.tar.gz

The generated tarball is bit-identical to the 19 MB upstream tarball, though you have to gunzip both to check this, since the gzip compression differs. The delta file is all of 41k.
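
The check is along these lines (the second filename is made up):

    # compare the uncompressed tars; only the gzip wrappers differ
    zcat uqm-voice_0.3.orig.tar.gz | md5sum
    zcat regenerated.tar.gz | md5sum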

The uqm-voice.delta file was created earlier as follows:

joey@kodama:~> pristine-tar stash lib/debian/unstable/uqm-voice_0.3.orig.tar.gz uqm-voice.delta

I've uploaded pristine-tar to incoming.

Posted