
Setup a Swift cluster on beta-cluster to match production
Closed, Resolved (Public)

Description

The thumbnails / files are served by an unpuppetized instance, deployment-upload, which runs an nginx proxy and uses scripts in /data/project/scripts.

We should create a Swift cluster in beta, import the files from /data/project/upload7, and point MediaWiki to the Swift cluster.

Related Objects

Event Timeline

There are a very large number of changes, so older changes are hidden.
hashar set Security to None.

I have created a root tracking task for object storage on labs: T114998. Continuous-Integration-Infrastructure would definitely use a Swift store.

hashar changed the task status from Open to Stalled. Oct 8 2015, 10:27 AM
hashar removed Andrew as the assignee of this task. Nov 9 2015, 1:10 PM

Unassigning, since this task is stalled pending blocking task T114998.

It isn't blocked on T114998, since that's for labs generally and would be set up totally differently from production, while this should be set up *on VMs* and configured exactly the same as production.

hashar changed the task status from Stalled to Open. Jan 20 2016, 2:14 PM

@fgiunchedi is working on it, hence the task is no longer stalled.

I've set up a Swift cluster with backends deployment-ms-be01 / deployment-ms-be02 and the frontend at deployment-ms-fe01. Each backend is an xlarge instance, and the replication factor is 2.

filippo@deployment-puppetmaster:~$ swift list
test-upload
filippo@deployment-puppetmaster:~$ swift list test-upload | head -10
var/cache/apt/archives/arcconf_7.31.18856-1_amd64.deb
var/cache/apt/archives/augeas-lenses_1.2.0-0ubuntu1.1_all.deb
var/cache/apt/archives/bind9-host_1%3a9.9.5.dfsg-3ubuntu0.7_amd64.deb
var/cache/apt/archives/binutils_2.24-5ubuntu3.1_amd64.deb
var/cache/apt/archives/conntrack_1%3a1.4.1-1ubuntu1_amd64.deb
var/cache/apt/archives/daemon_0.6.4-1_amd64.deb
var/cache/apt/archives/debdeploy-common_0.0.9-1~trusty1_all.deb
var/cache/apt/archives/debdeploy-minion_0.0.9-1~trusty1_all.deb
var/cache/apt/archives/diamond_3.5-5+trusty1_all.deb
var/cache/apt/archives/dnsutils_1%3a9.9.5.dfsg-3ubuntu0.7_amd64.deb
filippo@deployment-ms-fe01:~$ sudo swift-recon -d --human-readable
===============================================================================
--> Starting reconnaissance on 2 hosts
===============================================================================
[2016-01-20 14:24:15] Checking disk usage now
Distribution Graph:
  0%    2 *********************************************************************
Disk usage: space used: 213 MB of 214 GB
Disk usage: space free: 214 GB of 214 GB
Disk usage: lowest: 0.1%, highest: 0.1%, avg: 0.0995588055691%
===============================================================================

next steps:

  • import existing data on nfs into swift
  • point deployment-mediawiki to deployment swift
  • for increased resiliency, move ms-be01 or ms-be02 to a host other than labvirt1010

Do we know how difficult it would be to run the import from NFS to Swift? Does it need some sort of scripting, or...?

@Krenair yeah, first adding the right filebackend to filebackend-labs.php and then running maintenance/copyFileBackend.php from core will likely do the trick.

Also, this task completely fell off my radar, sorry about that! Happy to help with what's left, though.
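As a rough illustration of the first step (not the actual change: the backend name, auth endpoint and credentials below are placeholders, and real credentials would live in PrivateSettings.php), the Swift entry added to filebackend-labs.php could look something like this:

$wgFileBackends[] = [
    'name'         => 'local-swift',        // hypothetical backend name
    'class'        => 'SwiftFileBackend',   // MediaWiki core's Swift backend
    'lockManager'  => 'nullLockManager',
    'swiftAuthUrl' => 'http://deployment-ms-fe01.deployment-prep.eqiad.wmflabs/auth/v1.0', // assumed auth endpoint
    'swiftUser'    => 'mw:media',           // placeholder credentials; the real
    'swiftKey'     => 'REDACTED',           // values belong in PrivateSettings.php
];

maintenance/copyFileBackend.php then copies between two registered backends, or between the members of a FileBackendMultiWrite backend such as local-multiwrite, as in the invocations further down.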

From some past discussions (and maybe it is recorded on a task): we will want to clean up the mass of crap that is in upload7. Especially the temp directory, most probably thumbs, and anything that got magically imported via InstantCommons. That's a large chunk of data that is not worth migrating.

There are a few hints of me deleting random files from time to time at https://tools.wmflabs.org/sal/releng?p=0&q=upload7&d= e.g.:

sudo find /data/project/upload7/*/*/temp -type f -delete

sudo rm /data/project/upload7/*/*/thumb/*

> From some past discussions (and maybe it is recorded on a task): we will want to clean up the mass of crap that is in upload7. Especially the temp directory, most probably thumbs, and anything that got magically imported via InstantCommons. That's a large chunk of data that is not worth migrating.
>
> There are a few hints of me deleting random files from time to time at https://tools.wmflabs.org/sal/releng?p=0&q=upload7&d= e.g.:
>
> sudo find /data/project/upload7/*/*/temp -type f -delete
>
> sudo rm /data/project/upload7/*/*/thumb/*

krenair@deployment-upload:~$ du -csh /data/project/upload7/math
16M	/data/project/upload7/math
16M	total
krenair@deployment-upload:~$ du -csh /data/project/upload7/*/*/temp
18M	/data/project/upload7/wikipedia/commons/temp
40K	/data/project/upload7/wikipedia/de/temp
8.0K	/data/project/upload7/wikipedia/en/temp
18M	total

And all these old files from June/July 2012:

krenair@deployment-upload:~$ find /data/project/upload7/*/*/lockdir -type f
/data/project/upload7/wikipedia/commons/lockdir/.nfs000000000010c1820000294d
/data/project/upload7/wikipedia/commons/lockdir/.nfs000000000010c0ef0000294c
/data/project/upload7/wikipedia/commons/lockdir/.nfs000000000010cdc900024fa2
/data/project/upload7/wikipedia/commons/lockdir/.nfs000000000010c2310000295a
/data/project/upload7/wikipedia/commons/lockdir/.nfs000000000010c18f00002951
/data/project/upload7/wikipedia/de/lockdir/.nfs000000000011c1eb00029253
/data/project/upload7/wikipedia/en/lockdir/.nfs000000000011a5450002b2c1
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0f000026c07
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c2a500023d3c
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011cf0600024f99
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0e800026c02
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c00800023978
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0fb0000281c
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0ec00026c05
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c3f60000472c
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0f200026c0a
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c2df00024e17
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0e100026bf2
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0f7000027df
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000010025900021cf5
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011ca4600024f94
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0d300026b90
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c08200000de2
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c105000027f3
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c10b00026c30
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c27500029439
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c10300002827
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c11c0000003c
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c03c00025aa6
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c4ff000008eb
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c14d00029ec8
/data/project/upload7/wikipedia/simple/lockdir/.nfs0000000000090d6400016bbf
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000010029f000243f0
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0f9000027cf
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0f5000027bd
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0fd000027d2
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c10d00026c2f
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0df00002784
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0cb00026a8a
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000010025c00021d1d
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c1e900029436
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c4b600023d86
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0ff00026c3b
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c36100023d40
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000010025300021cc1
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c3fc0002b296
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c12300029432
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0ea00026c04
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0ee00026c0f
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c3ff0002b201
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c1070000283b
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c1010000282c
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011cf5a00000906
/data/project/upload7/wikipedia/simple/lockdir/.nfs0000000000090d6c00016c4c
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0e300026bfa
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c4030002b2b3
/data/project/upload7/wikipedia/simple/lockdir/.nfs00000000001002cf000007b3
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0e500026bf7
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000010025800021cfa
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c27900023d3a
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0db00002765
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c2d50002b1f4
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0dd00026b93
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c96200021a55
/data/project/upload7/wikipedia/simple/lockdir/.nfs00000000001002d000024e8b
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0d700026b59
/data/project/upload7/wikipedia/simple/lockdir/.nfs00000000001002ce000007a9
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000010025a00021d00
/data/project/upload7/wikipedia/simple/lockdir/.nfs000000000011c0d900026b5d
> • for increased resiliency, move ms-be01 or ms-be02 to a host other than labvirt1010

Given that we currently run all text through one single Varnish instance and all uploads through another, I don't think we care much about resiliency in beta.

So, we need to:

  1. clean up those files above
  2. migrate all the remaining files by adding the swift filebackend to filebackend-labs.php and running maintenance/copyFileBackend.php
  3. make deployment-ms-fe01 replace deployment-upload as the backend for upload in Varnish?
  4. shut down deployment-upload

Is that right @fgiunchedi ?

Sounds right to me @AlexMonk-WMF; MediaWiki will of course also need to be switched to deployment-ms-fe01, together with Varnish.

I have been purging a few files with pywikibot.

http://commons.wikimedia.beta.wmflabs.org/wiki/Special:MediaStatistics is also quite useful. 87% of files (or 26GB) are .tiff / .tif pages.

I have cleaned up the temp files that @AlexMonk-WMF mentioned and we are down to roughly 3 GB, from roughly 25 GB.

$ du -c -m -d1 /data/project/upload7
1       ./wiktionary
2       ./score
544     ./private
9       ./wikisource
1       ./wikiversity
2671    ./wikipedia
1       ./scripts
16      ./math
1       ./wikibooks
1       ./wikinews
1       ./wikivoyage
1       ./wikiquote
3242    .
3242    total
$

Looks more manageable!

Mentioned in SAL [2016-07-09T00:46:13Z] <Krenair> T64835: Live-hacked some temporary swift config in

Mentioned in SAL [2016-07-09T00:46:24Z] <Krenair> T64835: foreachwikiindblist "% all-labs.dblist - private.dblist" extensions/WikimediaMaintenance/filebackend/setZoneAccess.php --backend=local-multiwrite

Mentioned in SAL [2016-07-09T00:46:28Z] <Krenair> T64835: mwscript extensions/WikimediaMaintenance/filebackend/setZoneAccess.php zerowiki --backend=local-multiwrite --private

> @Krenair yeah, first adding the right filebackend to filebackend-labs.php and then running maintenance/copyFileBackend.php from core will likely do the trick.
>
> Also, this task completely fell off my radar, sorry about that! Happy to help with what's left, though.

I'm attempting to follow https://www.mediawiki.org/wiki/Thread:User_talk:Aaron_Schulz/Using_copyFileBackend.php_to_copy_from_old_file_repo_to_swift_cloud/reply but it only ever copies 0 files:

krenair@deployment-tin:/srv/mediawiki-staging$ mwscript copyFileBackend.php commonswiki --src local-backend --dst local-multiwrite --containers local-public --subdir 0
Doing container 'local-public', directory '0'...
	Copying file(s)...
	Copied 0 file(s).
Finished container 'local-public', directory '0'.
Done.

Same for all other subdirs

I found out that deployment-tin does not have the NFS share /data/project/upload7 (on purpose, apparently), so I guess the script cannot find any files.

Try from one of the app servers, such as deployment-mediawiki01?

Ah of course. Yep, it's now running in screen on deployment-mediawiki03.

Everything's in kind of a mess while I try this: I've got live hacks to the config code on deployment-tin, a live hack (so puppet is disabled) to make deployment-ms-fe01:/usr/local/lib/python2.7/dist-packages/wmf/rewrite.py rewrite to beta URLs instead of production ones, and I'm stuck on this HTTP 401 issue: http://upload.beta.wmflabs.org/wikipedia/commons/8/81/Icon_tools.png
https://wikitech.wikimedia.org/wiki/Swift/How_To seems hopelessly outdated

I still haven't got around to figuring out the temp URL thing, the Cirrus key (?), or commonswiki's ForeignFileRepo, or to working out whether we need to keep that GWToolset config in filebackend-labs.php.

I've briefly looked at it, and it seems the commons container is now being accessed in its non-sharded form? From /var/log/swift/proxy-access.log:

Jul 11 09:36:19 deployment-ms-fe01 proxy-server: x.x.x.x 10.68.18.109 11/Jul/2016/09/36/19 GET /v1/AUTH_mw/wikipedia-commons-local-public/8/81/Icon_tools.png HTTP/1.0 499 - Mozilla/5.0%20%28X11%3B%20Linux%20x86_64%29%20AppleWebKit/537.36%20%28KHTML%2C%20like%20Gecko%29%20Chrome/51.0.2704.106%20Safari/537.36 - - 131 - tx1050a5de31584461b9086-0057836893 - 0.0317 - - 1468229779.385159016 1468229779.416816950

Anyway, let me know when you are online; I can help debug further too.
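For context on the sharding mismatch: in production the large wikis' containers are sharded by the first two hex digits of the file path hash (e.g. 8/81/Icon_tools.png lives in wikipedia-commons-local-public.81, and later in this task the fixed file is uploaded to wikipedia-commons-local-public.44), so rewrite.py and the MediaWiki backend definition both need to agree on that layout. A minimal sketch of the relevant backend setting, assuming production-style shard values:

$wgFileBackends[] = [
    'name'               => 'local-swift',      // hypothetical, as in the earlier sketch
    'class'              => 'SwiftFileBackend',
    // ... auth settings as in the earlier sketch ...
    'shardViaHashLevels' => [
        'local-public' => [ 'levels' => 2, 'base' => 16, 'repeat' => 1 ],
    ],
];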

Change 298297 had a related patch set uploaded (by Alex Monk):
deployment-prep: Point upload cache at swift, fix rewrite.py to use beta.wmflabs.org domains

https://gerrit.wikimedia.org/r/298297

Change 298299 had a related patch set uploaded (by Alex Monk):
[labs/deployment-prep] Switch file backends to swift

https://gerrit.wikimedia.org/r/298299

krenair@deployment-upload:~$ lsb_release -a
No LSB modules are available.
Distributor ID:	Ubuntu
Description:	Ubuntu 12.04.5 LTS
Release:	12.04
Codename:	precise
krenair@deployment-upload:~$ uname -a
Linux deployment-upload 3.2.0-101-virtual #141-Ubuntu SMP Thu Mar 10 22:12:23 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
krenair@deployment-upload:~$ uptime
 21:00:22 up 100 days, 23:50,  1 user,  load average: 0.00, 0.01, 0.05
krenair@deployment-upload:~$ sudo halt
W: molly-guard: SSH session detected!
Please type in hostname of the machine to halt: deployment-upload

Broadcast message from krenair@deployment-upload
	(/dev/pts/2) at 21:01 ...

The system is going down for halt NOW!

Change 298299 merged by jenkins-bot:
[labs/deployment-prep] Switch file backends to swift

https://gerrit.wikimedia.org/r/298299

My NFS import ended up working a bit like this:

subdirs=`echo "0 1 2 3 4 5 6 7 8 9 a b c d e f archive/0 archive/1 archive/2 archive/3 archive/4 archive/5 archive/6 archive/7 archive/8 archive/9 archive/a archive/b archive/c archive/d archive/e archive/f" | tr -s " " "\012"`
for subdir in $subdirs; do
	foreachwikiindblist "all-labs" copyFileBackend.php --src local-backend --dst local-multiwrite --containers local-public --subdir $subdir;
done

subdirs=`echo "0 1 2 3 4 5 6 7 8 9 a b c d e f g h i j k l m n o p q r s t u v w x y z" | tr -s " " "\012"`
for subdir in $subdirs; do
	foreachwikiindblist "all-labs" copyFileBackend.php --src local-backend --dst local-multiwrite --containers local-deleted --subdir $subdir;
done

@fgiunchedi: Okay, so I think I need your help to finish this off completely.

  • Where are the temp URL and cirrus keys?
  • Can I put that puppet change in puppet SWAT?

Change 298393 had a related patch set uploaded (by Alex Monk):
[labs/deployment-prep] Point RedisLockManager to actual redis servers

https://gerrit.wikimedia.org/r/298393

Change 298393 merged by jenkins-bot:
[labs/deployment-prep] Point RedisLockManager to actual redis servers

https://gerrit.wikimedia.org/r/298393

Mentioned in SAL [2016-07-11T23:24:36Z] <Krenair> Unmounted /data/project (NFS) on all active hosts (mediawiki0[1-3], jobrunner01, tmh01), leaving just deployment-upload (shutoff, to schedule for deletion soon) - T64835

Change 298404 had a related patch set uploaded (by Alex Monk):
Remove old pre-Swift directory variables referencing upload7

https://gerrit.wikimedia.org/r/298404

While dealing with the commit above for T129586 I noticed we probably need to rescue these from NFS:

$wgCaptchaDirectory = '/data/project/upload7/private/captcha/random';
$wgCaptchaDirectory = '/data/project/upload7/private/captcha';
$wgMathDirectory   = '/data/project/upload7/math';
$wgScoreDirectory = '/data/project/upload7/score';

Along with this image that probably broke due to my messing around with the config while it was being uploaded: http://commons.wikimedia.beta.wmflabs.org/wiki/File:Test-1468270938.124573.png

(This change makes https://gerrit.wikimedia.org/r/#/c/298397/ into something that seems sane to do now by the way :))

> @fgiunchedi: Okay, so I think I need your help to finish this off completely.
>
>   • Where are the temp URL and cirrus keys?

The temp URL key is set in PrivateSettings.php, and will need to be set on the swift account at runtime as outlined in http://docs.openstack.org/liberty/config-reference/content/object-storage-tempurl.html

The Cirrus key can live in labs' private (or secret?) git repo.

>   • Can I put that puppet change in puppet SWAT?

Sure, which change?
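For reference, the wiring described above could look roughly like this. It is only a sketch: the variable name is hypothetical, and the same key also has to be set as Temp-URL-Key metadata on the Swift account (the swift post command in the next comment) for temp URLs to validate:

// PrivateSettings.php (hypothetical variable name)
$wmgSwiftTempUrlKey = 'REDACTED';

// filebackend-labs.php: hand the key to the Swift backend definition
$wgFileBackends[] = [
    'name'            => 'local-swift',        // hypothetical, as in the earlier sketch
    'class'           => 'SwiftFileBackend',
    'lockManager'     => 'nullLockManager',
    'swiftAuthUrl'    => 'http://deployment-ms-fe01.deployment-prep.eqiad.wmflabs/auth/v1.0',
    'swiftUser'       => 'mw:media',
    'swiftKey'        => 'REDACTED',
    'swiftTempUrlKey' => $wmgSwiftTempUrlKey,
];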

>> @fgiunchedi: Okay, so I think I need your help to finish this off completely.
>>
>>   • Where are the temp URL and cirrus keys?
>
> The temp URL key is set in PrivateSettings.php, and will need to be set on the swift account at runtime as outlined in http://docs.openstack.org/liberty/config-reference/content/object-storage-tempurl.html
>
> The Cirrus key can live in labs' private (or secret?) git repo.

root@deployment-ms-fe01:/etc/swift# . account_AUTH_mw.env
root@deployment-ms-fe01:/etc/swift# swift post -m "Temp-URL-Key:redacted"
root@deployment-ms-fe01:/etc/swift#

Think I also sorted out the Cirrus key, hopefully.

>>   • Can I put that puppet change in puppet SWAT?
>
> Sure, which change?

https://gerrit.wikimedia.org/r/298297 - I've already scheduled it.

Change 298297 merged by Filippo Giunchedi:
deployment-prep: Point upload cache at swift, fix rewrite.py to use beta.wmflabs.org domains

https://gerrit.wikimedia.org/r/298297

> Along with this image that probably broke due to my messing around with the config while it was being uploaded: http://commons.wikimedia.beta.wmflabs.org/wiki/File:Test-1468270938.124573.png

I hard-rebooted deployment-upload (turns out nova didn't like me running halt instead of powering off via nova?) to regain NFS access, then copied the file over:

scp deployment-upload:/data/project/upload7/wikipedia/commons/4/44/Test-1468270938.124573.png Test-1468270938.124573.png
scp Test-1468270938.124573.png deployment-ms-fe01:Test-1468270938.124573.png

Then, logged into deployment-ms-fe01:

sudo -i; cd /etc/swift; . account_AUTH_mw.env; cd ~krenair
swift upload wikipedia-commons-local-public.44 Test-1468270938.124573.png --object-name 4/44/Test-1468270938.124573.png

and the file was fixed.

> While dealing with the commit above for T129586 I noticed we probably need to rescue these from NFS:
>
> $wgCaptchaDirectory = '/data/project/upload7/private/captcha/random';
> $wgCaptchaDirectory = '/data/project/upload7/private/captcha';
> $wgMathDirectory   = '/data/project/upload7/math';
> $wgScoreDirectory = '/data/project/upload7/score';

This one is more difficult. These are all using the global-multiwrite backend, which uses the global-data wiki ID, which of course my foreachwiki setZoneAccess etc. run wouldn't have created anything for.
Compare P3375 with this:

root@deployment-ms-fe01:/etc/swift# swift list | grep -v local | grep -v timeline-render
test-upload
root@deployment-ms-fe01:/etc/swift#

@aaron, please could you comment on how those containers listed in P3375 are supposed to be created and configured?

Change 298812 had a related patch set uploaded (by Alex Monk):
[labs/deployment-prep] Remove old pre-Swift directory variables referencing /data/project/upload7

https://gerrit.wikimedia.org/r/298812

Change 298812 merged by jenkins-bot:
[labs/deployment-prep] Remove old pre-Swift directory variables referencing /data/project/upload7

https://gerrit.wikimedia.org/r/298812

@fgiunchedi got me more information about those swift containers (P3419) and I managed to recreate them in mwscript eval.php commonswiki:

$backend = FileBackendGroup::singleton()->get( 'global-multiwrite' );

$container = 'captcha-render';
$dir = $backend->getContainerStoragePath( $container );
$status = $backend->prepare( array( 'dir' => $dir ) );
$status->merge( $backend->publish( array( 'dir' => $dir, 'access' => true ) ) );
var_dump( $status->isOK() );

$container = 'math-render';
$dir = $backend->getContainerStoragePath( $container );
$status = $backend->prepare( array( 'dir' => $dir ) );
$status->merge( $backend->publish( array( 'dir' => $dir, 'access' => true ) ) );
var_dump( $status->isOK() );

$container = 'score-render';
$dir = $backend->getContainerStoragePath( $container );
$status = $backend->prepare( array( 'dir' => $dir ) );
$status->merge( $backend->publish( array( 'dir' => $dir, 'access' => true ) ) );
var_dump( $status->isOK() );

$backend = FileBackendGroup::singleton()->get( 'local-multiwrite' );
$container = 'gwtoolset-metadata';
$dir = $backend->getContainerStoragePath( $container );
$status = $backend->prepare( array( 'dir' => $dir, 'noAccess' => true, 'noListing' => true ) );
$status->merge( $backend->secure( array( 'dir' => $dir, 'noAccess' => true, 'noListing' => true ) ) );
var_dump( $status->isOK() );

That leaves us with these:

I also set up NFS on deployment-ms-fe01 and started importing from there directly:

  • root@deployment-ms-fe01:/data/project/upload7/private/captcha# swift upload global-data-captcha-render *
  • root@deployment-ms-fe01:/data/project/upload7/score# swift upload global-data-score-render *
  • TODO: We need to find out how the Math and GWToolset containers are structured so I can upload everything into the correct position.

The root container is actually easy, assuming the file list in T130709 is still complete:

root@deployment-ms-fe01:~# swift post root -r '.r:*'
root@deployment-ms-fe01:~# swift stat root
       Account: AUTH_mw
     Container: root
       Objects: 0
         Bytes: 0
      Read ACL: .r:*
     Write ACL:
       Sync To:
      Sync Key:
 Accept-Ranges: bytes
   X-Timestamp: 1468437003.52441
    X-Trans-Id: tx605a6cc76ffa466aa788c-0057869213
  Content-Type: text/plain; charset=utf-8
root@deployment-ms-fe01:~# mkdir root
root@deployment-ms-fe01:~# cd root
root@deployment-ms-fe01:~/root# wget https://upload.wikimedia.org/{crossdomain.xml,favicon.ico,index.html,robots.txt}
[...]
Downloaded: 4 files, 10K in 0s (22.0 MB/s)
root@deployment-ms-fe01:~/root# ls
crossdomain.xml  favicon.ico  index.html  robots.txt
root@deployment-ms-fe01:~/root# swift upload root *
robots.txt
index.html
favicon.ico
crossdomain.xml
root@deployment-ms-fe01:~/root# swift stat root
       Account: AUTH_mw
     Container: root
       Objects: 4
         Bytes: 10421
      Read ACL: .r:*
     Write ACL:
       Sync To:
      Sync Key:
 Accept-Ranges: bytes
   X-Timestamp: 1468437003.52441
    X-Trans-Id: txb63b3fac4e7d4c5a90cde-0057869352
  Content-Type: text/plain; charset=utf-8
  • root@deployment-ms-fe01:/data/project/upload7/private/gwtoolset/wikipedia/commons/commonswiki-gwtoolset-metadata# swift upload wikipedia-commons-gwtoolset-metadata *
  • Math is more complex, so I enabled NFS on deployment-tin and added this to filebackend-labs.php:
$wgFileBackends[] = [
    'name'           => 'math-nfs',
    'class'          => 'FSFileBackend',
    'lockManager'    => 'nullLockManager',
    'fileMode'       => 0777,
    'containerPaths' => [ 'math-render' => '/data/project/upload7/math' ],
];

and did this:

subdirs=`echo "0 1 2 3 4 5 6 7 8 9 a b c d e f" | tr -s " " "\012"`
for subdir in $subdirs; do
	mwscript copyFileBackend.php commonswiki --src math-nfs --dst global-multiwrite --containers math-render --subdir $subdir;
done

I think that's everything rescued from NFS upload7 now...

AlexMonk-WMF added a subscriber: ArielGlenn.

Thanks to @fgiunchedi for setting up the instances and to @ArielGlenn for getting a few last pieces of information about the production setup for me. I'm going to call this task done.

root@deployment-ms-fe01:~# swift-recon -d --human-readable
===============================================================================
--> Starting reconnaissance on 2 hosts
===============================================================================
[2016-07-13 20:36:28] Checking disk usage now
Distribution Graph:
  4%    2 *********************************************************************
Disk usage: space used: 9 GB of 214 GB
Disk usage: space free: 205 GB of 214 GB
Disk usage: lowest: 4.26%, highest: 4.26%, avg: 4.25607787311%
===============================================================================

Woohoo, that's awesome @AlexMonk-WMF! Thanks for working on this and completing it!

Change 299123 had a related patch set uploaded (by Yuvipanda):
labs: Remove nfs for deployment-prep \o/

https://gerrit.wikimedia.org/r/299123

Change 298404 abandoned by Alex Monk:
Remove old pre-Swift directory variables referencing upload7

Reason:
I322fdd50

https://gerrit.wikimedia.org/r/298404