From The Infosphere, the Futurama Wiki

This is SvipBot, the bot of Svip; hence its unoriginal name.

Currently, the bot runs a very simple routine system. It runs every third hour during the day and picks up the latest changes from Recent Changes; if no changes have been made, it takes 5 random pages instead. So, as of right now, to make it do stuff, edit pages.

In the future, User:SvipBot/tasks should be editable so that specific tasks can be assigned for the bot to do on its next run, but that is only speculation.

Routine run

Currently, this is what happens (without any parameters) when the bot is run. In the future, this routine will only run if no tasks are available.

Step 1
The bot will first check for a botlock file. This file indicates that the bot is supposedly already running. Should the bot have failed for any reason during the previous run, the file would not have been deleted, and thus the bot dares not run again (for as long as the problem persists).
Then the bot will gather the time stamp from the previous run (so it only gets changes made since then).
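Step 1 could be sketched roughly like this. The file names here are hypothetical; the bot's actual paths are not documented on this page:

```python
import os

# Hypothetical file names; the bot's real paths are not documented here.
LOCK_FILE = "botlock"
STAMP_FILE = "laststamp"

def acquire_lock():
    """Refuse to run if a previous run left the lock file behind."""
    if os.path.exists(LOCK_FILE):
        raise RuntimeError("botlock present; previous run crashed or is still going")
    open(LOCK_FILE, "w").close()

def read_last_timestamp(default="1970-01-01T00:00:00Z"):
    """Return the time stamp written by the previous run, or a default."""
    if os.path.exists(STAMP_FILE):
        with open(STAMP_FILE) as f:
            return f.read().strip()
    return default
```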
Step 2
Obtain pages to edit
If there are changes since its previous run, all the changes (and thus each changed page) will be processed.
If there are no changes, 5 random pages will be chosen instead.
In the future, it should be possible to specify one or more categories on the tasks page for the bot to process in addition to these tasks.
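Step 2 amounts to two API queries and a simple preference rule. This is a minimal sketch; the endpoint URL is an assumption, and the bot's actual request code is not shown on this page:

```python
from urllib.parse import urlencode

# Assumed API endpoint; the bot's actual configuration is not documented here.
API = "https://theinfosphere.org/api.php"

def recent_changes_query(since):
    """Build an API query for changes made since the last run's time stamp."""
    return API + "?" + urlencode({
        "action": "query", "list": "recentchanges",
        "rcstart": since, "rcdir": "newer",
        "rcprop": "title", "format": "json",
    })

def random_pages_query(count=5):
    """Build an API query for a handful of random main-namespace pages."""
    return API + "?" + urlencode({
        "action": "query", "list": "random",
        "rnnamespace": 0, "rnlimit": count, "format": "json",
    })

def pages_to_edit(changed_titles, random_titles):
    """Prefer pages changed since the last run; otherwise fall back to random ones."""
    return changed_titles if changed_titles else random_titles
```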
Step 3
Run each page.
Then the script simply runs through all the pages it has now been handed.
  1. First it rearranges and fixes the page's categories.
  2. Then it fixes up its quotes (to fit the standard of using {{q}}).
    • Then it fixes 'appearance/reference' links, such as {{elink}}. Basically, it attempts to use {{e}} and {{f}} instead. The 'noicon' versions are instead turned into raw wikicode; e.g. '{{elink/noicon|1ACV01|Space Pilot 3000}}' would become '"Space Pilot 3000"'.
    • In addition, it also takes care of episode, comic and film links that appear raw in a page without the appropriate italics for films or quotation marks for episodes and comics.
  3. Then comes the general clean-up, which is a set of minor tasks.
    1. Tidying up headlines (adding spaces around the titles to pad them from the '==').
    2. Creating a blank line (if none is there already) before a headline to give some room when editing.
    3. Removing triple or more line breaks, which usually create a big gap in articles, and replacing them with a double line break to create the smallest gap.
    4. Then it fixes dates appearing in articles (this is not entirely bulletproof yet, but has worked neatly so far) by converting them to our agreed standard of 'DD Month, YYYY', e.g. 1 January, 2000.
    5. Then it changes 'Image:' to 'File:' in accordance with the new MediaWiki style.
      • In the future, it will also remove underscores from wiki-links, as they are not needed and just ugly; but this could be its own small run, as it is not unique to file links.
    6. And then it handles the appearance list by applying {{appear-begin}} around the list if it is longer than 15 elements and does not already have such a capsule.
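A few of the general clean-up steps above can be sketched with regular expressions. This is a minimal illustration of three of them, not the bot's actual code:

```python
import re

def general_cleanup(text):
    """Illustrates three of the minor clean-up steps: headline padding,
    collapsing excess line breaks, and the Image: -> File: rename."""
    # Pad headline titles with spaces: '==Title==' -> '== Title =='.
    text = re.sub(r"^(=+) *(.*?) *(=+) *$", r"\1 \2 \3",
                  text, flags=re.MULTILINE)
    # Replace triple or more line breaks with a double line break.
    text = re.sub(r"\n{3,}", "\n\n", text)
    # Change 'Image:' to 'File:' in accordance with the newer MediaWiki style.
    text = text.replace("[[Image:", "[[File:")
    return text
```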
Step 4
Close off
Then the bot writes the new time stamp to be used by the next run, writes its log, removes the botlock, and is done.
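The close-off step could look something like this. Again, the file names are hypothetical stand-ins for whatever the bot actually uses:

```python
import os
import time

# Hypothetical file names; the bot's real paths are not documented here.
LOCK_FILE = "botlock"
STAMP_FILE = "laststamp"
LOG_FILE = "bot.log"

def close_off(log_lines, now=None):
    """Write the time stamp for the next run, append the log, remove the lock."""
    now = now or time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime())
    with open(STAMP_FILE, "w") as f:
        f.write(now)
    with open(LOG_FILE, "a") as f:
        for line in log_lines:
            f.write(line + "\n")
    if os.path.exists(LOCK_FILE):
        os.remove(LOCK_FILE)
```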

The description below is outdated
Everything said about the bot below on this page is outdated. The bot works very differently from when this was written. I will get around to updating it, eventually.

Technical information

The bot is written in the Python programming language. It uses the MediaWiki API for obtaining information and data, but has to rely on /index.php?action=raw (note: clicking this link in a browser will prompt you to download a file) to obtain raw versions of pages.
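Fetching a page's raw wikitext boils down to building an index.php URL with action=raw. A minimal sketch, assuming the wiki's index.php lives at the usual place:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

def raw_page_url(base, title):
    """Build the index.php?action=raw URL for a page's wikitext source."""
    # 'base' is assumed to be the wiki's index.php, e.g.
    # 'https://theinfosphere.org/index.php'.
    return base + "?" + urlencode({"title": title, "action": "raw"})

def fetch_raw(base, title):
    """Fetch the raw wikitext of a page (requires network access)."""
    with urlopen(raw_page_url(base, title)) as resp:
        return resp.read().decode("utf-8")
```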

Currently, the bot is manually run, performs its tasks, and closes.

Before the bot gets to work, it obtains data from previous runs, in order to avoid going over too much of the same information. It then logs in to obtain a session token and other data needed to post. The bot performs the redirect task by getting, from the API, a list of pages without redirects in the (main) namespace. For each creation, the bot has to make two requests: one to obtain the edit token, and another to make the creation. If the first request suggests there already is an article with the title it intends to redirect from, the bot continues on to the next article; if not, it creates the article.
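The two-request creation flow can be sketched as follows. Here api_post is a stand-in for a function that POSTs a parameter dictionary to api.php and returns the decoded JSON response; the parameter names (intoken=edit via prop=info) follow the MediaWiki API of that era, not necessarily the bot's exact code:

```python
def create_redirect(api_post, title, target):
    """Create a redirect page using the two-request flow described above.

    Returns True if the page was created, False if it already existed.
    """
    # Request 1: fetch page info together with an edit token.
    info = api_post({"action": "query", "prop": "info", "intoken": "edit",
                     "titles": title, "format": "json"})
    page = next(iter(info["query"]["pages"].values()))
    if "missing" not in page:
        return False  # an article by that name already exists; skip it
    # Request 2: create the redirect using the token from request 1.
    api_post({"action": "edit", "title": title,
              "text": "#REDIRECT [[%s]]" % target,
              "createonly": True, "token": page["edittoken"],
              "format": "json"})
    return True
```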

The e/c fixing task is done by obtaining a list of articles that include the templates. For each, the bot obtains the content using index.php?action=raw and modifies it using regular expressions and its lists of episodes/films/comics. When modified, the bot edits the respective pages, again by performing two requests: one to obtain an edit token, and the next to make the actual edit.
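The regular-expression rewriting for the {{elink}} case described earlier on this page could look roughly like this. This is a simplified sketch, not the bot's actual expressions:

```python
import re

def fix_elinks(text):
    """Simplified version of the template fix-ups: {{elink|...}} becomes
    {{e|...}}, and the noicon variant becomes plain quoted wikitext."""
    # '{{elink/noicon|1ACV01|Space Pilot 3000}}' -> '"Space Pilot 3000"'
    text = re.sub(r"\{\{elink/noicon\|[^|}]*\|([^|}]*)\}\}", r'"\1"', text)
    # '{{elink|...}}' -> '{{e|...}}'
    text = re.sub(r"\{\{elink\|", "{{e|", text)
    return text
```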

For reasons currently related to some problems I have with Python, the bot ignores pages with special characters in their titles, such as Bender Bending Rodríguez.