Author Topic: Random wisdom collected from various places  (Read 228946 times)

Offline zombiehoffa

  • Junior Indexer
  • **
  • Posts: 17
  • Helpful: +5/-0
Random wisdom collected from various places
« on: 2013-05-21, 06:42:50 pm »
In no particular order, here is some collected wisdom.
  • The FAQ is your friend; you can find it here: https://github.com/nZEDb/nZEDb/wiki/FAQ's
  • You need python-mysqldb to run the threaded scripts. This is new for you newznab converts.
  • mysqldump is also your friend. It's a good idea to back up your database before you tinker with it. You can do that by modifying this command (see the restore example after this list):
    mysqldump -u USER -pPASSWORD nzedb | gzip > /mnt/md0/dbbackup/nzedb_data.sql.gz
  • You can automate the above with cron. To do this on Ubuntu (and likely the other distros, but I only tested it on Ubuntu), type crontab -e and then add something like the following:
    14 02 * * * /usr/bin/mysqldump -u root -pPASSWORD nzedb | gzip > /mnt/md0/dbbackup/nzedb_`date +"\%Y-\%m-\%d"`.sql.gz
    This will put gzipped db backups into /mnt/md0/dbbackup/ every night at 2:14 am. You will want to periodically clean that dir out or risk filling your drive up (see the cleanup example after this list).
  • The following SQL commands will make nZEDb post-process the respective categories again. I believe this is no longer needed because of reset_postprocessing.php in ~/misc/testing/DB_scripts/ and is included only for information.
    To make nZEDb post-process again, run these for movies, books, and music respectively:
    update releases set imdbID = NULL;
    update releases set bookinfoID = NULL;
    update releases set musicinfoID = NULL;
  • To change the backfill days for all backfill-enabled groups at once, you can run the following SQL command (the example sets it to 1700 days back):
    update groups set backfill_target = 1700 where backfill = 1;
  • If you want to put your tmpunrar dir into RAM to save your disks millions of writes, you can use the following in /etc/fstab (adjust the path for your installation and the size for your RAM; this example uses 256 MB, and I believe it could be much less, maybe as low as 90M). See the mount example after this list:
    none /var/www/nZEDb/nzbfiles/tmpunrar   tmpfs  nodev,nodiratime,noexec,nosuid,size=256M 0 0
  • Running patchmysql.php is your friend. If you did a git pull at any point and the scripts stopped working, or are behaving oddly, try running patchmysql.php (and follow the directions it gives you).
  • If you want to activate all your groups without having to click "activate" on each of them, you can do it from SQL with: update groups set active = 1;
  • If you want to allow all your groups to be backfilled without having to click backfill on each of them, you can do it from SQL with: update groups set backfill = 1;
  • For all you aspiring contributors out there, this is probably a good read: https://help.github.com/articles/using-pull-requests (and https://github.com/nZEDb/nZEDb/wiki/Contributing-Code-For-Beginners-to-Git(hub) [added by archer])
  • Don't let your in-process stuff grow too much. The pp script has a sort in it, and that kills your db if you have lots of unprocessed stuff (it's sorting it all every time it queries for work). If you're one of the unlucky souls (like me) that didn't set the max postprocess variable and ended up with 500-600k releases sitting and waiting to process, you can deal with this by bumping max add pp per run to something large (say 1000), bumping the time between queries to something large (say 600), and turning off backfill and update binaries until you've crunched your in-process pp backlog down to something reasonable.
  • Precompiled oldschool ffmpeg here: http://dl.dropboxusercontent.com/u/24633983/ffmpeg/index.html This is useful for you Debian-based distro users (i.e. Ubuntu, Mint, etc.). That one uses the libav ffmpeg (long story, google it if you're interested).
  • 'for ((n=0;n<50;n++)); do php postprocess.php nfo true; done' Change the "50" to whatever you want; in its current form it will run postprocess.php for nfo's 50 times. This is for those of you, like myself, who have a lot of nfo's to process but don't necessarily want to change your settings, or can't log in to the webpage because it's on your LAN and you haven't set up a VPN tunnel yet, but do have a port forwarded for SSH. Take note that this will delete the stuff in your tmpdir a lot, so you may not want to run tmux/screen (and the corresponding post processing) while powering through your nfo's.
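A note on restoring from the mysqldump backup above: a minimal sketch, assuming the same USER/PASSWORD placeholders and backup path as in that example (adjust to your own setup):
    gunzip < /mnt/md0/dbbackup/nzedb_data.sql.gz | mysql -u USER -pPASSWORD nzedb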
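For the backup cleanup mentioned above, a sketch of a cron entry that prunes old dumps; the path matches the backup example, and the 14-day retention and Sunday 2:30 am schedule are just my assumptions, adjust to taste:
    30 02 * * 0 /usr/bin/find /mnt/md0/dbbackup/ -name "nzedb_*.sql.gz" -mtime +14 -delete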
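For the tmpfs tip, you don't need to reboot after editing /etc/fstab; something like this should create the mount point and mount it straight from fstab (same example path as above, adjust for your install):
    sudo mkdir -p /var/www/nZEDb/nzbfiles/tmpunrar
    sudo mount /var/www/nZEDb/nzbfiles/tmpunrar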
« Last Edit: 2014-12-03, 09:19:36 pm by archer »

Offline jonnyboy

  • Epic Indexer
  • *****
  • Posts: 1046
  • Helpful: +93/-1
  • Lazzy Trucker
Re: Random wisdom collected from various places
« Reply #1 on: 2013-05-21, 07:21:41 pm »
or php misc/testing/DB_scripts/reset_postprocessing.php
to reset all categories, or each one individually

Offline Killerc

  • Junior Indexer
  • **
  • Posts: 45
  • Helpful: +2/-0
  • deathadder on IRC
Re: Random wisdom collected from various places
« Reply #2 on: 2013-05-23, 12:04:30 pm »
I hadn't thought at all about point number 7. Out of curiosity, did it speed things up any that you noticed? But simply reducing the writes is a brilliant idea on its own.

The rest I've slowly worked out on my own, but this makes a great read for a new user, nicely consolidated in one place. Assuming they read this thread ;)

Good work.

Offline zombiehoffa

  • Junior Indexer
  • **
  • Posts: 17
  • Helpful: +5/-0
Re: Random wisdom collected from various places
« Reply #3 on: 2013-05-23, 03:28:00 pm »
Honestly, jonnyboy dropped that nugget of wisdom on my head back in the tmux for nn+ days and it seemed like such a good idea that I've just done it ever since. I don't know if it speeds up nZEDb because I've never run nZEDb without it. I can say it sped up the old program a lot.

Offline ThePeePs

  • Overlord
  • ******
  • Posts: 44
  • Helpful: +7/-0
  • Hardware mod'er and p/t coder
    • nZEDb by ThePeePs
Re: Random wisdom collected from various places
« Reply #4 on: 2013-08-09, 08:55:27 am »
I just did point number 7 (also didn't think of it) for the tmpunrar dir, and I have seen a speed up on everything.

Thanks for the tip!  8)

Offline jonnyboy

  • Epic Indexer
  • *****
  • Posts: 1046
  • Helpful: +93/-1
  • Lazzy Trucker
Re: Random wisdom collected from various places
« Reply #5 on: 2013-08-09, 11:47:17 am »
You can change #7 to allow any user to mount it. The example is on the tmux-edit page.
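The gist is adding the user mount option, so the line from #7 ends up looking something like this (check the tmux-edit page for the exact version; same example path and size as #7):
none /var/www/nZEDb/nzbfiles/tmpunrar   tmpfs  nodev,nodiratime,noexec,nosuid,user,size=256M 0 0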

ruhllatio

  • Guest
Re: Random wisdom collected from various places
« Reply #6 on: 2013-12-18, 04:47:54 am »
Wanted to add that if you're like me, you only activated groups for updating binaries (not backfill) to get up and running. Now you want to enable backfill for each group you activated, and no others, without clicking each GD one in the admin site. Just run:

Code: [Select]
UPDATE groups set backfill = 1 WHERE active = 1;
Then use the other commands the OP listed to change your target days.

Offline prometheus247

  • Newbie
  • *
  • Posts: 3
  • Helpful: +0/-0
Re: Random wisdom collected from various places
« Reply #7 on: 2014-08-04, 01:13:47 am »
Hello, the FAQ link in #1 isn't working.
The current correct link: https://github.com/nZEDb/nZEDb/wiki/FAQ's

Offline R19Heit

  • Newbie
  • *
  • Posts: 1
  • Helpful: +0/-0
Random wisdom collected from various places
« Reply #8 on: 2014-12-11, 12:51:56 pm »
The front page shows the feed with three options: Recent, Most Commented and Random.

How can I change the default behaviour to show Random or Most Commented articles?

Offline mordor

  • Prolific Indexer
  • ****
  • Posts: 153
  • Helpful: +7/-0
    • autonzb