Poisonous Potatoes

This weekend the Independent ran an article about an environmental pressure group, GM Watch, publishing a report claiming that genetically modified (GM) potatoes are unsafe for human consumption. On further inspection the report was actually published by GM Free Cymru, which I think puts it fairly in my patch. Why do I care? I live in Wales and I used to be a molecular biologist.

The headline of the Independent article, “Suppressed report shows cancer link to GM potatoes”, would lead one to think that this report shows GM potatoes cause cancer in rats. It doesn’t and they don’t.

So what does the report show? The first thing to mention is that this isn’t a full scientific paper. A great deal of detail is missing from the description of the experiment, but even without those details it is possible to draw some conclusions, although they differ markedly from the ones drawn by the Independent. To make us even more confident in the results, the report leads off with:

“The analysis of the relevant part of the Institute of Nutrition Report showed that the studies were not carried out according to the accepted protocols for the biomedical assessment of GM food and feed (1). Many of the conclusions drawn by the authors do not correspond to the obtained data and therefore they are incorrect.”

Three groups of 10 Wistar rats were fed different diets for six months. These groups were:

  • Control – normal feed.
  • RB – normal feed plus boiled Russet Burbank potato.
  • RB-GM – normal feed plus boiled genetically modified Russet Burbank potato.

After six months the animals were killed so they could be autopsied. Both the RB and RB-GM groups had lost weight relative to the Control group, with the RB-GM group having lost the most. Both the RB and RB-GM groups showed pathological changes in their livers. The RB-GM group is described as having the worst changes, but the report does not explain how that determination was made.

Taking this at face value, it clearly shows that Russet Burbank potatoes and genetically modified Russet Burbank potatoes are poisonous to rats (and by extension to humans). There is one small problem with this: the Russet Burbank has been in commercial cultivation since 1871 and is one of the most widely grown varieties of potato on the planet. If it were as poisonous as the report makes out, people should be dropping dead of potato poisoning left, right and centre.
So why are these rats showing pathology? My suspicion (there aren’t enough details of the protocol given to know for sure) is that the various diets were either not calorie controlled (i.e. normal diet plus loads of potato), or were calorie controlled without compensating for nutrient changes in the diet (i.e. reduce the amount of cereal/grain to keep the calories constant, but now the rats are missing out on lots of vital vitamins and minerals that are not present in the potato).

Which brings us on to the bit about cancer. Nowhere in the report is cancer mentioned. It doesn’t even mention the word tumour. So I can only conclude that the Independent made that bit up out of whole cloth.

In short, both the Independent article and the report it cites are trash from start to finish.

For a more reasonable take on these particular GM potatoes one can read the report from the Canadian Food Inspection Agency which licensed these genetically modified potatoes for sale in 1996. You’ll note that this report mentions toxicology studies in mice which showed no adverse effects. I’ve emailed the agency to see if it’s possible to get the details of these experiments.

Solaris 10 and Telnet

Yes, it’s still turned on by default, at least in Solaris 10 update 2. That’ll teach me not to portscan new *nix installs where I’m not completely familiar with the OS defaults.
Seriously though, is there a good reason for telnet to be on by default in this day and age?
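
For anyone else caught out, Solaris 10 manages services through SMF, so turning telnet off is quick. A sketch (check the exact FMRI with svcs on your own install before trusting mine):

# Confirm the telnet service is running
svcs -a | grep telnet

# Disable it (the standard FMRI on Solaris 10)
svcadm disable svc:/network/telnet:default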

TLS and UML = PITA

So it’s Sunday and I decide to run apt-get on my Debian box which hosts this here blog and the peapod project page. MySQL 5 gets an update. apt-get stops it, apt-get starts it, and it dies on its arse.

Feb 11 15:54:02 hlynes mysqld[707]: 070211 15:54:02 [Note] /usr/sbin/mysqld: Shutdown complete
Feb 11 15:54:02 hlynes mysqld[707]:
Feb 11 15:54:03 hlynes mysqld_safe[6754]: ended
Feb 11 15:55:46 hlynes init: Trying to re-exec init
Feb 11 16:01:45 hlynes mysqld_safe[10043]: mysqld got signal 11;
Feb 11 16:01:45 hlynes mysqld_safe[10043]: This could be because you hit a bug. It is also possible that this binary
Feb 11 16:01:45 hlynes mysqld_safe[10043]: or one of the libraries it was linked against is corrupt, improperly built,
Feb 11 16:01:45 hlynes mysqld_safe[10043]: or misconfigured. This error can also be caused by malfunctioning hardware.
Feb 11 16:01:45 hlynes mysqld_safe[10043]: We will try our best to scrape up some info that will hopefully help diagnose
Feb 11 16:01:45 hlynes mysqld_safe[10043]: the problem, but since we have already crashed, something is definitely wrong
Feb 11 16:01:45 hlynes mysqld_safe[10043]: and this may fail.

After much Googling it turns out to be down to the fact that this is a Bytemark virtual machine running under UML. Apparently the TLS library which is part of NPTL does some very strange things with memory that work just fine on a normal kernel, but not on one running under UML.

Anyway, the workaround is to move /lib/tls out of the way. Apparently this can fox Apache too.
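
In practice that amounts to something like the following (assuming your glibc keeps its NPTL libraries in /lib/tls, as Debian’s did for me):

# Move the NPTL/TLS libraries aside so glibc falls back to the
# old LinuxThreads implementation, which behaves itself under UML
mv /lib/tls /lib/tls.disabled

# Restart whatever was crashing
/etc/init.d/mysql restart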

Lisp Breaks My Brain-Meats

I started reading Structure and Interpretation of Computer Programs over the weekend. I’ve only got as far as the first set of examples and already my head hurts.

Write a procedure that takes three arguments and returns the sum of the squares of the two larger numbers. Without iteration, because they haven’t shown you how to do loops in Scheme at this point in the text.

It’s both wonderful and depressing that by the end of page 2 I’ve already hit a gaping chasm in my programming skills. I will prevail.

Incidentally, this is not a request for people to post the solution. Anyone doing so will suffer the wrath of my patented ‘slap-in-the-face-over-XML-RPC’ system.

Wikis, Wikis Everywhere

One of the first things I did when I arrived at my current job was to whack a copy of MediaWiki on one of my servers so that I had somewhere to write ad-hoc documentation. As time has passed we have built up quite a few docs and have had to lock down the wiki, as some of the documents are externally viewable.

It has become apparent that we could do with moving to a wiki that is better suited to our current usage. My list of wiki requirements is:

  • Be able to re-use Apache authentication, since we already have working mod_auth_pam setups talking to central LDAP.
  • Different parts of the wiki editable/viewable by different users.
  • Pages can be public or only viewable by logged in users.
  • Able to use multiple different auth mechanisms concurrently, e.g. Apache, PAM, wiki login.
  • Themes/stylesheets for different sections of the wiki
  • File upload/attachments. I don’t want people to have to ftp documents and link to them manually.

So far the contenders are Midgard, MoinMoin and TWiki. I’m leaning towards Moin because it’s written in Python. What I’d really like to hear is comments from people who have similar requirements. Which wiki did you go with, and what were your experiences?

Thanks

Backups Part 3: Rotating and Culling Backups

I’ve now got backup scripts happily creating copies of all my Subversion repositories and MySQL databases every 24 hours. This is great, but it means you end up with an awful lot of backups. I really don’t need a backup from every day going back forever, but it is nice to have snapshots going back into history in case something subtle has gone wrong.

What I’d really like is to copy the oldest backups into a different directory every seven days, then delete everything in the main directory that is older than seven days. Of course I’ll then end up with piles of backups building up in the weekly directory, so I’d like to go through the weekly directory every month, copy the oldest backups into yet another directory, and delete all the weekly backups that are more than one month old.

To do this I give you the snappily named rotate-backups

rotate-backups           rotates backups in a given directory weekly or monthly

-b                       directory to rotate backups in
-f                       file containing list of backup dirs
-t                       time period (weekly, monthly)

The config file is just a newline-separated list of directories. To make it work I put a script in /etc/cron.weekly like:

rotate-backups -t weekly -f /etc/rotate-backups

and one in cron.monthly:

rotate-backups -t monthly -f /etc/rotate-backups
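
The /etc/rotate-backups file itself contains nothing more than the directories to rotate, one per line. These paths are just examples:

/var/backups/mysql
/var/backups/svn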

The rotation script assumes that backups created on the same day are from the same backup run. It copies the ‘oldest’ backups by copying every file from the same day as the oldest file in the directory. This way it doesn’t have to know anything about what you are backing up or what your naming conventions are.

It also culls old backups relative to the date of the latest file in the directory. This means that if you stop taking backups the script won’t keep deleting files until you have none left.
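
To make the promote-and-cull logic concrete, here’s a rough sketch of the idea in shell. This is my assumed reconstruction, not the actual rotate-backups source, and it assumes GNU date plus hypothetical paths:

#!/bin/sh
# Sketch only: promote the oldest day's files to the weekly directory,
# then cull anything more than KEEP days older than the newest file.
BACKUP_DIR=/var/backups/mysql    # hypothetical path
WEEKLY_DIR=/var/backups/weekly   # hypothetical path
KEEP=7

# Day (YYYY-MM-DD) of the oldest and newest files in the directory
oldest_file=$(ls -tr "$BACKUP_DIR" | head -n 1)
newest_file=$(ls -t "$BACKUP_DIR" | head -n 1)
oldest_day=$(date -r "$BACKUP_DIR/$oldest_file" +%Y-%m-%d)
newest_day=$(date -r "$BACKUP_DIR/$newest_file" +%Y-%m-%d)

# Promote: copy every file from the same day as the oldest file
for f in "$BACKUP_DIR"/*; do
    [ "$(date -r "$f" +%Y-%m-%d)" = "$oldest_day" ] && cp -p "$f" "$WEEKLY_DIR/"
done

# Cull relative to the newest file, not to today, so that if backups
# stop arriving the script also stops deleting
cutoff=$(date -d "$newest_day -$KEEP days" +%s)
for f in "$BACKUP_DIR"/*; do
    [ "$(date -r "$f" +%s)" -lt "$cutoff" ] && rm -f "$f"
done

Run something like that weekly with KEEP=7 on the daily directory, and monthly with KEEP=31 on the weekly one, and you get the two-tier rotation described above.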

Backups – Part 2: Subversion

We run a Subversion service for a number of different research groups. Generally we create a separate repository for each group. Obviously looking after this data is important; it’s not a good idea to lose months of someone’s work.

Fortunately backing up a Subversion repository is pretty simple. Subversion ships with a utility called svnadmin, one of whose functions is to dump a repository to a file.

As the repository owner do:

svnadmin dump /path/to/repository > subversion_dump_file

I have a directory full of Subversion repositories, so what I really want is a script that will find all the repositories in that directory and dump them with sensible filenames. With my usual lack of imagination I’ve called this script svn_backup. It runs svnadmin verify against each entry in the directory it’s given; any that turn out to be Subversion repositories get dumped with svnadmin dump. There’s a sketch of the idea after the usage below.

$ ./svn_backup
svn_backup  -s subversion_dir [-b backup_dir] [ -o logfile]
        script to backup subversion repositories

        -s      directory containing subversion repositories
        -b      directory to dump backups to. defaults to /tmp
        -o      file to output names of backups to
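
As promised, here is a rough shell sketch of what svn_backup does. This is an assumed reconstruction rather than the published script (which lives in the repository linked at the end):

#!/bin/sh
# Sketch only: dump anything under SVN_DIR that svnadmin verify
# accepts as a repository.
SVN_DIR=/var/www/subversion      # -s in the real script
BACKUP_DIR=/var/backups/svn      # -b
LOGFILE=/tmp/svn.log             # -o

for repo in "$SVN_DIR"/*; do
    if svnadmin verify "$repo" >/dev/null 2>&1; then
        name=$(basename "$repo")
        dump="$BACKUP_DIR/${name}_$(date +%Y%m%d).svndump"
        svnadmin dump "$repo" > "$dump" 2>/dev/null
        echo "$dump" >> "$LOGFILE"   # feeds the weekly S3 upload
    fi
done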

So I now have an entry in cron.daily like:

svn_backup -s /var/www/subversion -o /tmp/svn.log -b /var/backups/svn

The reason I write the backups to a log file is that it allows me to run a script once a week that copies the latest backups to Amazon’s S3 storage system.

The scripts:
svn_backup
s3svnbackup.py

Backups – Part 1: MySQL

Like most people I’ve got a number of MySQL servers with different databases running things like wikis and other web apps. MySQL ships with a handy little tool called mysqldump which can be used to dump (as the name suggests) a MySQL DB to a text file containing the SQL commands necessary to re-create it.

The first thing I like to do is to create a backup user that only has enough privileges to do the backup.

GRANT LOCK TABLES,SELECT,RELOAD,SHOW DATABASES ON *.* TO 'backup'@'localhost' IDENTIFIED BY 'mypasswd';
FLUSH PRIVILEGES;

With this done you should be able to do something like:

mysqldump -hlocalhost -ubackup -pmypasswd --databases mysql > test_backup.mysql

With that in place it was an easy task to write a script that reads a config file for the backup login info and the options to pass to mysqldump. This script has a number of benefits over sticking a mysqldump command straight into cron:

  1. It stores its settings in an external config file, so you can use the same script in several places.
  2. It backs up each database on the server into a separate dump file.
  3. The backup options are in the config file, so you can back up different servers with different tweaks, e.g. locking the DB for consistency.

A line in the config file has the form host:user:password:mysqldump-options and looks like:

localhost:backup:mypasswd:--opt --add-drop-table

I add a quick script to /etc/cron.daily like:

mysql_backup -b /var/backups/mysql -f /etc/mysql_backup
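
As a rough illustration of what that script does, here’s an assumed sketch rather than the actual mysql_backup source, with the option handling simplified to fixed variables:

#!/bin/sh
# Sketch only: for each host in the config, dump every database
# into its own file. Config lines: host:user:password:options
CONFIG=/etc/mysql_backup
BACKUP_DIR=/var/backups/mysql

while IFS=: read -r host user pass opts; do
    # List the databases on this server; -N suppresses the header row.
    # The SHOW DATABASES privilege granted to the backup user above
    # is what makes this possible.
    for db in $(mysql -h"$host" -u"$user" -p"$pass" -N -e 'SHOW DATABASES'); do
        mysqldump -h"$host" -u"$user" -p"$pass" $opts "$db" \
            > "$BACKUP_DIR/${host}_${db}_$(date +%Y%m%d).sql"
    done
done < "$CONFIG"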

I can now sleep a bit easier. This isn’t the only way to back up MySQL; proper DBAs will know all about replication and various other tricks.

Next time: subversion repositories.

Backup Scripts

Over the last couple of weeks I’ve been chipping away at the problem of our department having no backups whatsoever. We’re a small department with few machines and a fairly small amount of data, so I’ve decided that systems like Bacula and Amanda are overkill for our situation.

I’ve written a set of small scripts to handle our most pressing backup needs. Over the next few posts I’ll describe how I’ve backed things up and the scripts and tools I’ve used to do it. None of this is rocket science, but if it saves even one person one hour of work it’ll have been worth writing down.

All the scripts in the next few posts can be found in the WeSC subversion repository.

Part 1: MySQL

Part 2: Subversion

Part 3: Rotating and Culling Backups