changeset 4043:165de6b06dd3

<mrhmouse> rm -rf node
author HackBot
date Fri, 22 Nov 2013 15:39:18 +0000
parents ff55d699808e
children 9eae806931c6
files node/node-v0.10.22-linux-x86/ChangeLog node/node-v0.10.22-linux-x86/LICENSE node/node-v0.10.22-linux-x86/README.md node/node-v0.10.22-linux-x86/bin/node node/node-v0.10.22-linux-x86/bin/npm node/node-v0.10.22-linux-x86/lib/dtrace/node.d node/node-v0.10.22-linux-x86/lib/node_modules/npm/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/.tern-project node/node-v0.10.22-linux-x86/lib/node_modules/npm/AUTHORS node/node-v0.10.22-linux-x86/lib/node_modules/npm/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/Makefile node/node-v0.10.22-linux-x86/lib/node_modules/npm/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/bin/node-gyp-bin/node-gyp node/node-v0.10.22-linux-x86/lib/node_modules/npm/bin/node-gyp-bin/node-gyp.cmd node/node-v0.10.22-linux-x86/lib/node_modules/npm/bin/npm node/node-v0.10.22-linux-x86/lib/node_modules/npm/bin/npm-cli.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/bin/npm.cmd node/node-v0.10.22-linux-x86/lib/node_modules/npm/bin/read-package-json.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/cli.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/configure node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-bin.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-bugs.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-commands.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-config.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-deprecate.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-docs.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-edit.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-explore.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-help-search.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-init.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-install.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-link.md 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-load.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-ls.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-outdated.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-owner.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-pack.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-prefix.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-prune.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-publish.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-rebuild.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-restart.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-root.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-run-script.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-search.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-shrinkwrap.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-start.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-stop.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-submodule.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-tag.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-test.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-uninstall.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-unpublish.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-update.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-version.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-view.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-whoami.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/repo.md 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-adduser.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-bin.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-bugs.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-build.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-bundle.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-cache.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-completion.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-config.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-dedupe.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-deprecate.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-docs.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-edit.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-explore.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-help-search.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-help.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-init.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-install.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-link.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-ls.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-outdated.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-owner.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-pack.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-prefix.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-prune.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-publish.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-rebuild.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-restart.md 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-rm.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-root.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-run-script.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-search.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-shrinkwrap.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-star.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-stars.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-start.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-stop.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-submodule.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-tag.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-test.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-uninstall.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-unpublish.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-update.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-version.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-view.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-whoami.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/repo.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/files/npm-folders.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/files/npmrc.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/files/package.json.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-coding-style.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-config.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-developers.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-disputes.md 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-faq.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-index.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-registry.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-scripts.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/removing-npm.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/semver.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/README.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-bin.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-bugs.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-commands.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-config.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-deprecate.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-docs.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-edit.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-explore.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-help-search.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-init.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-install.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-link.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-load.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-ls.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-outdated.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-owner.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-pack.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-prefix.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-prune.html 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-publish.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-rebuild.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-restart.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-root.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-run-script.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-search.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-shrinkwrap.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-start.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-stop.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-submodule.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-tag.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-test.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-uninstall.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-unpublish.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-update.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-version.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-view.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-whoami.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/repo.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-adduser.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-bin.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-bugs.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-build.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-bundle.html 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-cache.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-completion.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-config.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-dedupe.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-deprecate.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-docs.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-edit.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-explore.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-help-search.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-help.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-init.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-install.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-link.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-ls.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-outdated.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-owner.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-pack.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-prefix.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-prune.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-publish.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-rebuild.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-restart.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-rm.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-root.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-run-script.html 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-search.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-shrinkwrap.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-star.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-stars.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-start.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-stop.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-submodule.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-tag.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-test.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-uninstall.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-unpublish.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-update.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-version.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-view.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-whoami.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/repo.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/files/npm-folders.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/files/npm-global.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/files/npm-json.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/files/npmrc.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/files/package.json.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/index.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-coding-style.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-config.html 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-developers.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-disputes.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-faq.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-index.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-registry.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-scripts.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/removing-npm.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/semver.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/docfoot-script.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/docfoot.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/dochead.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/favicon.ico node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/index.html node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/static/style.css node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/static/webfonts/23242D_3_0.eot node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/static/webfonts/23242D_3_0.ttf node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/static/webfonts/23242D_3_0.woff node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/adduser.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/bin.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/bugs.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/build.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/cache.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/completion.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/config.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/dedupe.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/deprecate.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/docs.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/edit.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/explore.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/faq.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/get.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/help-search.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/help.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/init.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/install.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/link.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/ls.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/npm.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/outdated.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/owner.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/pack.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/prefix.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/prune.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/publish.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/rebuild.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/repo.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/restart.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/root.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/run-script.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/search.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/set.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/shrinkwrap.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/star.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/stars.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/start.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/stop.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/submodule.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/substack.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/tag.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/test.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/unbuild.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/uninstall.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/unpublish.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/update.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/completion.sh node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/completion/file-completion.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/completion/installed-deep.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/completion/installed-shallow.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/error-handler.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/fetch.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/find-prefix.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/gently-rm.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/is-git-url.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/lifecycle.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/link.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/tar.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/version.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/view.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/visnup.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/whoami.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/xmas.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/make.bat node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-README.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-adduser.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-bin.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-bugs.1 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-build.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-bundle.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-cache.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-completion.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-config.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-dedupe.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-deprecate.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-docs.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-edit.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-explore.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-help-search.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-help.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-init.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-install.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-link.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-ls.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-outdated.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-owner.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-pack.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-prefix.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-prune.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-publish.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-rebuild.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-restart.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-rm.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-root.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-run-script.1 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-search.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-shrinkwrap.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-star.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-stars.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-start.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-stop.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-submodule.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-tag.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-test.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-uninstall.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-unpublish.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-update.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-version.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-view.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-whoami.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/repo.1 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-bin.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-bugs.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-commands.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-config.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-deprecate.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-docs.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-edit.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-explore.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-help-search.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-init.3 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-install.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-link.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-load.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-ls.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-outdated.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-owner.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-pack.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-prefix.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-prune.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-publish.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-rebuild.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-restart.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-root.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-run-script.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-search.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-shrinkwrap.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-start.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-stop.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-submodule.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-tag.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-test.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-uninstall.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-unpublish.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-update.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-version.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-view.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-whoami.3 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/repo.3 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man5/npm-folders.5 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man5/npm-global.5 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man5/npm-json.5 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man5/npmrc.5 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man5/package.json.5 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-coding-style.7 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-config.7 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-developers.7 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-disputes.7 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-faq.7 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-index.7 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-registry.7 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-scripts.7 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/removing-npm.7 node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/semver.7 node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/abbrev/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/abbrev/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/abbrev/lib/abbrev.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/abbrev/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/color-spaces.pl node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/examples/beep/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/examples/clear/index.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/examples/cursorPosition.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/examples/progress/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/examples/starwars.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/lib/ansi.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/lib/newlines.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/archy/README.markdown node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/archy/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/archy/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/LICENCE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/bench/block-stream-pause.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/bench/block-stream.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/bench/dropper-pause.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/bench/dropper.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/block-stream.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/child-process-close/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/child-process-close/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/child-process-close/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chmodr/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chmodr/README.md 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chmodr/chmodr.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chmodr/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chownr/LICENCE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chownr/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chownr/chownr.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chownr/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/cmd-shim/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/cmd-shim/.travis.yml node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/cmd-shim/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/cmd-shim/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/cmd-shim/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/cmd-shim/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/editor/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/editor/README.markdown node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/editor/example/beep.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/editor/example/edit.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/editor/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/editor/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/LICENCE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/example/bundle.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/example/dir-tar.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/example/dir.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/example/example.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/example/ig-tar.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/example/tar.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/fstream-npm.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/example/basic.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/ignore.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/.travis.yml node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/examples/filter-pipe.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/examples/pipe.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/examples/reader.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/examples/symlink-write.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/fstream.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/abstract.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/collect.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/dir-reader.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/dir-writer.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/file-reader.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/file-writer.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/get-type.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/link-reader.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/link-writer.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/proxy-reader.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/proxy-writer.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/reader.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/socket-reader.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/writer.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/History.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/Makefile node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/Readme.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/package.json 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/test.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-username-repo/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-username-repo/.travis.yml node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-username-repo/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-username-repo/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-username-repo/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-username-repo/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/.travis.yml node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/examples/g.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/examples/usr-local.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/glob.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/graceful-fs/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/graceful-fs/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/graceful-fs/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/graceful-fs/graceful-fs.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/graceful-fs/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/graceful-fs/polyfills.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/inherits/LICENSE 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/inherits/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/inherits/inherits.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/inherits/inherits_browser.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/inherits/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/inherits/test.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ini/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ini/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ini/ini.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ini/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/default-input.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/example.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/init-package-json.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/npm-init/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/npm-init/init-input.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/npm-init/init.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/npm-init/package.json 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/substack-input.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/promzard.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lockfile/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lockfile/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lockfile/lockfile.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lockfile/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/AUTHORS node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/bench.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/lib/lru-cache.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/minimatch.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/node_modules/sigmund/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/node_modules/sigmund/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/node_modules/sigmund/bench.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/node_modules/sigmund/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/node_modules/sigmund/sigmund.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/.travis.yml node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/README.markdown node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/examples/pow.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/.jshintrc node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/addon.gypi node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/bin/node-gyp.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/AUTHORS node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/DEPS node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/MANIFEST node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/OWNERS node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/PRESUBMIT.py 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/codereview.settings node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/data/win/large-pdb-shim.cc node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/gyp node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/gyp.bat node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/gyp_dummy.c node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/gyptest.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSNew.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSProject.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSSettings.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSSettings_test.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSToolFile.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSUserFile.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSUtil.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSVersion.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/SCons.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/__init__.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/common.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/common_test.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/easy_xml.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/easy_xml_test.py 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/__init__.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/android.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/dump_dependency_json.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/eclipse.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/gypd.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/gypsh.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/make.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/msvs.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/msvs_test.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/ninja.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/ninja_test.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/scons.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/xcode.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/input.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/mac_tool.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/msvs_emulation.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/ninja_syntax.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/sun_tool.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/win_tool.py 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/xcode_emulation.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/xcodeproj_file.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/xml_fix.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylintrc node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/samples/samples node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/samples/samples.bat node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/setup.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/README node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/Xcode/README node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/Xcode/Specifications/gyp.pbfilespec node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/Xcode/Specifications/gyp.xclangspec node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/README node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/gyp-tests.el node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/gyp.el node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/run-unit-tests.sh node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/testdata/media.gyp node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/testdata/media.gyp.fontified node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/graphviz.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/pretty_gyp.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/pretty_sln.py 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/pretty_vcproj.py node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/build.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/clean.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/configure.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/install.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/list.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/node-gyp.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/rebuild.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/remove.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/bin/nopt.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/examples/my-program.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/lib/nopt.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/adduser.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/get.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/publish.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/request.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/star.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/stars.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/tag.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/unpublish.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/upload.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/couch-login/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/couch-login/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/couch-login/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/couch-login/couch-login.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/couch-login/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-user-validate/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-user-validate/.travis.yml node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-user-validate/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-user-validate/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-user-validate/npm-user-validate.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-user-validate/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/config-defs.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/LICENCE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/node_modules/proto-list/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/node_modules/proto-list/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/node_modules/proto-list/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/node_modules/proto-list/proto-list.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/readme.markdown node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/npmconf.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmlog/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmlog/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmlog/example.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmlog/log.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmlog/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/once/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/once/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/once/once.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/once/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/opener/LICENSE.txt node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/opener/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/opener/opener.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/opener/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/osenv/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/osenv/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/osenv/osenv.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/osenv/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-installed/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-installed/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-installed/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-installed/read-installed.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/.travis.yml node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/AUTHORS 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/lib/core_module_names.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/lib/extract_description.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/lib/fixer.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/lib/normalize.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/lib/typos.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/read-json.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/LICENCE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/example/example.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/lib/read.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/node_modules/mute-stream/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/node_modules/mute-stream/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/node_modules/mute-stream/mute.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/node_modules/mute-stream/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/rs.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/lib/copy.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/lib/debug.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/lib/getSafe.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/aws-sign/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/aws-sign/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/aws-sign/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/aws-sign/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/jar.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/tests/run.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/tests/test-cookie.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/tests/test-cookiejar.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/forever-agent/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/forever-agent/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/forever-agent/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/forever-agent/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/License node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/Readme.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/lib/form_data.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/component.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/lib/async.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/License node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/Makefile 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/Readme.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/lib/combined_stream.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/License node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/Makefile node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/Readme.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/lib/delayed_stream.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/.travis.yml node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/Makefile node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/README.md 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/example/usage.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/images/hawk.png node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/images/logo.png node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/browser.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/client.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/crypto.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/server.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/utils.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/.travis.yml node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/Makefile node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/images/boom.png node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/lib/index.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/.travis.yml node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/Makefile node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/lib/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/.travis.yml node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/Makefile node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/images/hoek.png node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/index.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/lib/escape.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/lib/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/.travis.yml node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/Makefile node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/examples/offset.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/examples/time.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/lib/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/.dir-locals.el node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/LICENSE 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/http_signing.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/parser.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/signer.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/util.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/verify.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/errors.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/reader.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/types.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/writer.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/index.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/tst/ber/reader.test.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/tst/ber/writer.test.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/assert-plus/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/assert-plus/assert.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/assert-plus/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/CHANGELOG node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/README node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/README.old node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/ctf.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/ctio.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/ctype.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/man/man3ctype/ctio.3ctype node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/package.json 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tools/jsl.conf node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tools/jsstyle node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/float.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/int.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/psinfo.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/struct.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.fail.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.float.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.int.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.psinfo.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.struct.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.typedef.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/typedef.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/float/tst.rfloat.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/float/tst.wfloat.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/int/tst.64.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/int/tst.rint.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/int/tst.wbounds.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/int/tst.wint.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/uint/tst.64.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/uint/tst.roundtrip.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/uint/tst.ruint.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/uint/tst.wuint.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.basicr.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.basicw.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.char.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.endian.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.oldwrite.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.readSize.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.structw.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.writeStruct.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/stringify.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/test.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/mime.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/test.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/types/mime.types node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/types/node.types 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/LICENSE.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/bench.gnu node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/bench.sh node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/benchmark-native.c node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/benchmark.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/component.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/uuid.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/test.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/qs/.gitmodules node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/qs/.npmignore 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/qs/Readme.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/qs/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/qs/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/tunnel-agent/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/tunnel-agent/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/tunnel-agent/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/tunnel-agent/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/request.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/googledoodle.jpg node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/run.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/server.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/squid.conf node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/ca.cnf node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/ca.crl node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/ca.crt node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/ca.csr node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/ca.key node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/ca.srl node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/server.cnf node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/server.crt 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/server.csr node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/server.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/server.key node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/npm-ca.crt node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/test.crt node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/test.key node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-agentOptions.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-basic-auth.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-body.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-defaults.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-digest-auth.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-emptyBody.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-errors.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-follow-all-303.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-follow-all.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-form.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-hawk.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-headers.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-http-signature.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-httpModule.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-https-strict.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-https.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-isUrl.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-localAddress.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-oauth.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-onelineproxy.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-params.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-piped-redirect.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-pipes.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-pool.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-protocol-changing-redirect.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-proxy.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-qs.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-redirect.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-s3.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-timeout.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-toJSON.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-tunnel.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/unicycle.jpg node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/License node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/Makefile node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/Readme.md 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/equation.gif node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/example/dns.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/lib/retry.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/lib/retry_operation.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/rimraf/AUTHORS node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/rimraf/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/rimraf/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/rimraf/bin.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/rimraf/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/rimraf/rimraf.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/Makefile node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/bin/semver node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/foot.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/head.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/semver.browser.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/semver.browser.js.gz node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/semver.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/semver.min.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/semver.min.js.gz node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/duplex.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/examples/CAPSLOCKTYPER.JS node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/examples/typer-fsr.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/examples/typer.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/float.patch node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/fs.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_duplex.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_passthrough.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_readable.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_transform.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_writable.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/package.json 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/passthrough.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/readable.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/transform.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/writable.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/zlib.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/index.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/lib/async-map-ordered.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/lib/async-map.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/lib/bind-actor.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/lib/chain.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/lib/slide.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/.npmignore node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/.travis.yml node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/LICENCE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/examples/extracter.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/examples/reader.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/buffer-entry.js 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/entry-writer.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/entry.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/extended-header-writer.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/extended-header.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/extract.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/global-header-writer.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/header.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/pack.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/parse.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/tar.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/uid-number/LICENCE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/uid-number/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/uid-number/get-uid-gid.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/uid-number/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/uid-number/uid-number.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/which/LICENSE node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/which/README.md node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/which/bin/which node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/which/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/which/which.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/package.json node/node-v0.10.22-linux-x86/lib/node_modules/npm/scripts/clean-old.sh node/node-v0.10.22-linux-x86/lib/node_modules/npm/scripts/doc-build.sh 
node/node-v0.10.22-linux-x86/lib/node_modules/npm/scripts/index-build.js node/node-v0.10.22-linux-x86/lib/node_modules/npm/scripts/install.sh node/node-v0.10.22-linux-x86/lib/node_modules/npm/scripts/release.sh node/node-v0.10.22-linux-x86/lib/node_modules/npm/scripts/relocate.sh node/node-v0.10.22-linux-x86/share/man/man1/node.1
diffstat 1092 files changed, 0 insertions(+), 137788 deletions(-)
--- a/node/node-v0.10.22-linux-x86/ChangeLog	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4601 +0,0 @@
-2013.11.12, Version 0.10.22 (Stable)
-
-* npm: Upgrade to 1.3.14
-
-* uv: Upgrade to v0.10.19
-
-* child_process: don't assert on stale file descriptor events (Fedor Indutny)
-
-* darwin: Fix "Not Responding" in Mavericks activity monitor (Fedor Indutny)
-
-* debugger: Fix bug in sb() with unnamed script (Maxim Bogushevich)
-
-* repl: do not insert duplicates into completions (Maciej Małecki)
-
-* src: Fix memory leak on closed handles (Timothy J Fontaine)
-
-* tls: prevent stalls by using read(0) (Fedor Indutny)
-
-* v8: use correct timezone information on Solaris (Maciej Małecki)
-
-
-2013.10.18, Version 0.10.21 (Stable), e2da042844a830fafb8031f6c477eb4f96195210
-
-* uv: Upgrade to v0.10.18
-
-* crypto: clear errors from verify failure (Timothy J Fontaine)
-
-* dtrace: interpret two byte strings (Dave Pacheco)
-
-* fs: fix fs.truncate() file content zeroing bug (Ben Noordhuis)
-
-* http: provide backpressure for pipeline flood (isaacs)
-
-* tls: fix premature connection termination (Ben Noordhuis)
-
-
-2013.09.30, Version 0.10.20 (Stable), d7234c8d50a1af73f60d2d3c0cc7eed17429a481
-
-* tls: fix sporadic hang and partial reads (Fedor Indutny)
-  - fixes "npm ERR! cb() never called!"
-
-
-2013.09.24, Version 0.10.19 (Stable), 6b5e6a5a3ec8d994c9aab3b800b9edbf1b287904
-
-* uv: Upgrade to v0.10.17
-
-* npm: upgrade to 1.3.11
-
-* readline: handle input starting with control chars (Eric Schrock)
-
-* configure: add mips-float-abi (soft, hard) option (Andrei Sedoi)
-
-* stream: objectMode transforms allow falsey values (isaacs)
-
-* tls: prevent duplicate values returned from read (Nathan Rajlich)
-
-* tls: NPN protocols are now local to connections (Fedor Indutny)
-
-
-2013.09.04, Version 0.10.18 (Stable), 67a1f0c52e0708e2596f3f2134b8386d6112561e
-
-* uv: Upgrade to v0.10.15
-
-* stream: Don't crash on unset _events property (isaacs)
-
-* stream: Pass 'buffer' encoding with decoded writable chunks (isaacs)
-
-
-2013.08.21, Version 0.10.17 (Stable), 469a4a5091a677df62be319675056b869c31b35c
-
-* uv: Upgrade to v0.10.14
-
-* http_parser: Do not accept PUN/GEM methods as PUT/GET (Chris Dickinson)
-
-* tls: fix assertion when ssl is destroyed at read (Fedor Indutny)
-
-* stream: Throw on 'error' if listeners removed (isaacs)
-
-* dgram: fix assertion on bad send() arguments (Ben Noordhuis)
-
-* readline: pause stdin before turning off terminal raw mode (Daniel Chatfield)
-
-
-2013.08.16, Version 0.10.16 (Stable), 50b4c905a4425430ae54db4906f88982309e128d
-
-* v8: back-port fix for CVE-2013-2882
-
-* npm: Upgrade to 1.3.8
-
-* crypto: fix assert() on malformed hex input (Ben Noordhuis)
-
-* crypto: fix memory leak in randomBytes() error path (Ben Noordhuis)
-
-* events: fix memory leak, don't leak event names (Ben Noordhuis)
-
-* http: Handle hex/base64 encodings properly (isaacs)
-
-* http: improve chunked res.write(buf) performance (Ben Noordhuis)
-
-* stream: Fix double pipe error emit (Eran Hammer)
-
-
-2013.07.25, Version 0.10.15 (Stable)
-
-* src: fix process.getuid() return value (Ben Noordhuis)
-
-
-2013.07.25, Version 0.10.14 (Stable), fdf57f811f9683a4ec49a74dc7226517e32e6c9d
-
-* uv: Upgrade to v0.10.13
-
-* npm: Upgrade to v1.3.5
-
-* os: Don't report negative times in cpu info (Ben Noordhuis)
-
-* fs: Handle large UID and GID (Ben Noordhuis)
-
-* url: Fix edge-case when protocol is non-lowercase (Shuan Wang)
-
-* doc: Streams API Doc Rewrite (isaacs)
-
-* node: call MakeDomainCallback in all domain cases (Trevor Norris)
-
-* crypto: fix memory leak in LoadPKCS12 (Fedor Indutny)
-
-
-2013.07.09, Version 0.10.13 (Stable), e32660a984427d46af6a144983cf7b8045b7299c
-
-* uv: Upgrade to v0.10.12
-
-* npm: Upgrade to 1.3.2
-
-* windows: get proper errno (Ben Noordhuis)
-
-* tls: only wait for finish if we haven't seen it (Timothy J Fontaine)
-
-* http: Dump response when request is aborted (isaacs)
-
-* http: use an unref'd timer to fix delay in exit (Peter Rust)
-
-* zlib: level can be negative (Brian White)
-
-* zlib: allow zero values for level and strategy (Brian White)
-
-* buffer: add comment explaining buffer alignment (Ben Noordhuis)
-
-* string_bytes: properly detect 64bit (Timothy J Fontaine)
-
-* src: fix memory leak in UsingDomains() (Ben Noordhuis)
-
-
-2013.06.18, Version 0.10.12 (Stable), a088cf4f930d3928c97d239adf950ab43e7794aa
-
-* npm: Upgrade to 1.2.32
-
-* readline: make `ctrl + L` clear the screen (Yuan Chuan)
-
-* v8: add setVariableValue debugger command (Ben Noordhuis)
-
-* net: Do not destroy socket mid-write (isaacs)
-
-* v8: fix build for mips32r2 architecture (Andrei Sedoi)
-
-* configure: fix cross-compilation host_arch_cc() (Andrei Sedoi)
-
-
-2013.06.13, Version 0.10.11 (Stable), d9d5bc465450ae5d60da32e9ffcf71c2767f1fad
-
-* uv: upgrade to 0.10.11
-
-* npm: Upgrade to 1.2.30
-
-* openssl: add missing configuration pieces for MIPS (Andrei Sedoi)
-
-* Revert "http: remove bodyHead from 'upgrade' events" (isaacs)
-
-* v8: fix pointer arithmetic undefined behavior (Trevor Norris)
-
-* crypto: fix utf8/utf-8 encoding check (Ben Noordhuis)
-
-* net: Fix busy loop on POLLERR|POLLHUP on older linux kernels (Ben Noordhuis, isaacs)
-
-
-
-2013.06.04, Version 0.10.10 (Stable), 25e51c396aa23018603baae2b1d9390f5d9db496
-
-* uv: Upgrade to 0.10.10
-
-* npm: Upgrade to 1.2.25
-
-* url: Properly parse certain oddly formed urls (isaacs)
-
-* stream: unshift('') is a noop (isaacs)
-
-
-2013.05.30, Version 0.10.9 (Stable), 878ffdbe6a8eac918ef3a7f13925681c3778060b
-
-* npm: Upgrade to 1.2.24
-
-* uv: Upgrade to v0.10.9
-
-* repl: fix JSON.parse error check (Brian White)
-
-* tls: proper .destroySoon (Fedor Indutny)
-
-* tls: invoke write cb only after opposite read end (Fedor Indutny)
-
-* tls: ignore .shutdown() syscall error (Fedor Indutny)
-
-
-2013.05.24, Version 0.10.8 (Stable), 30d9e9fdd9d4c33d3d95a129d021cd8b5b91eddb
-
-* v8: update to 3.14.5.9
-
-* uv: upgrade to 0.10.8
-
-* npm: Upgrade to 1.2.23
-
-* http: remove bodyHead from 'upgrade' events (Nathan Zadoks)
-
-* http: Return true on empty writes, not false (isaacs)
-
-* http: save roundtrips, convert buffers to strings (Ben Noordhuis)
-
-* configure: respect the --dest-os flag consistently (Nathan Rajlich)
-
-* buffer: throw when writing beyond buffer (Trevor Norris)
-
-* crypto: Clear error after DiffieHellman key errors (isaacs)
-
-* string_bytes: strip padding from base64 strings (Trevor Norris)
-
-
-2013.05.17, Version 0.10.7 (Stable), d2fdae197ac542f686ee06835d1153dd43b862e5
-
-* uv: upgrade to v0.10.7
-
-* npm: Upgrade to 1.2.21
-
-* crypto: Don't ignore verify encoding argument (isaacs)
-
-* buffer, crypto: fix default encoding regression (Ben Noordhuis)
-
-* timers: fix setInterval() assert (Ben Noordhuis)
-
-
-2013.05.14, Version 0.10.6 (Stable), 5deb1672f2b5794f8be19498a425ea4dc0b0711f
-
-* module: Deprecate require.extensions (isaacs)
-
-* stream: make Readable.wrap support objectMode, empty streams (Daniel Moore)
-
-* child_process: fix handle delivery (Ben Noordhuis)
-
-* crypto: Fix performance regression (isaacs)
-
-* src: DRY string encoding/decoding (isaacs)
-
-
-2013.04.23, Version 0.10.5 (Stable), deeaf8fab978e3cadb364e46fb32dafdebe5f095
-
-* uv: Upgrade to 0.10.5 (isaacs)
-
-* build: added support for Visual Studio 2012 (Miroslav Bajtoš)
-
-* http: Don't try to destroy nonexistent sockets (isaacs)
-
-* crypto: LazyTransform on properties, not methods (isaacs)
-
-* assert: put info in err.message, not err.name (Ryan Doenges)
-
-* dgram: fix no address bind() (Ben Noordhuis)
-
-* handle_wrap: fix NULL pointer dereference (Ben Noordhuis)
-
-* os: fix unlikely buffer overflow in os.type() (Ben Noordhuis)
-
-* stream: Fix unshift() race conditions (isaacs)
-
-
-2013.04.11, Version 0.10.4 (Stable), 9712aa9f76073c30850b20a188b1ed12ffb74d17
-
-* uv: Upgrade to 0.10.4
-
-* npm: Upgrade to 1.2.18
-
-* v8: Avoid excessive memory growth in JSON.parse (Fedor Indutny)
-
-* child_process, cluster: fix O(n*m) scan of cmd string (Ben Noordhuis)
-
-* net: fix socket.bytesWritten Buffers support (Fedor Indutny)
-
-* buffer: fix offset checks (Łukasz Walukiewicz)
-
-* stream: call write cb before finish event (isaacs)
-
-* http: Support write(data, 'hex') (isaacs)
-
-* crypto: dh secret should be left-padded (Fedor Indutny)
-
-* process: expose NODE_MODULE_VERSION in process.versions (Rod Vagg)
-
-* crypto: fix constructor call in crypto streams (Andreas Madsen)
-
-* net: account for encoding in .byteLength (Fedor Indutny)
-
-* net: fix buffer iteration in bytesWritten (Fedor Indutny)
-
-* crypto: zero is not an error if writing 0 bytes (Fedor Indutny)
-
-* tls: Re-enable check of CN-ID in cert verification (Tobias Müllerleile)
-
-
-2013.04.03, Version 0.10.3 (Stable), d4982f6f5e4a9a703127489a553b8d782997ea43
-
-* npm: Upgrade to 1.2.17
-
-* child_process: acknowledge sent handles (Fedor Indutny)
-
-* etw: update prototypes to match dtrace provider (Timothy J Fontaine)
-
-* dtrace: pass more arguments to probes (Dave Pacheco)
-
-* build: allow building with dtrace on osx (Dave Pacheco)
-
-* http: Remove legacy ECONNRESET workaround code (isaacs)
-
-* http: Ensure socket cleanup on client response end (isaacs)
-
-* tls: Destroy socket when encrypted side closes (isaacs)
-
-* repl: isSyntaxError() catches "strict mode" errors (Nathan Rajlich)
-
-* crypto: Pass options to ctor calls (isaacs)
-
-* src: tie process.versions.uv to uv_version_string() (Ben Noordhuis)
-
-
-2013.03.28, Version 0.10.2 (Stable)
-
-* npm: Upgrade to 1.2.15
-
-* uv: Upgrade to 0.10.3
-
-* tls: handle SSL_ERROR_ZERO_RETURN (Fedor Indutny)
-
-* tls: handle errors before calling C++ methods (Fedor Indutny)
-
-* tls: remove harmful unnecessary bounds checking (Marcel Laverdet)
-
-* crypto: make getCiphers() return non-SSL ciphers (Ben Noordhuis)
-
-* crypto: check randomBytes() size argument (Ben Noordhuis)
-
-* timers: do not calculate Timeout._when property (Alexey Kupershtokh)
-
-* timers: fix off-by-one ms error (Alexey Kupershtokh)
-
-* timers: handle signed int32 overflow in enroll() (Fedor Indutny)
-
-* stream: Fix stall in Transform under very specific conditions (Gil Pedersen)
-
-* stream: Handle late 'readable' event listeners (isaacs)
-
-* stream: Fix early end in Writables on zero-length writes (isaacs)
-
-* domain: fix domain callback from MakeCallback (Trevor Norris)
-
-* child_process: don't emit same handle twice (Ben Noordhuis)
-
-* child_process: fix sending utf-8 to child process (Ben Noordhuis)
-
-
-2013.03.21, Version 0.10.1 (Stable), c274d1643589bf104122674a8c3fd147527a667d
-
-* npm: upgrade to 1.2.15
-
-* crypto: Improve performance of non-stream APIs (Fedor Indutny)
-
-* tls: always reset this.ssl.error after handling (Fedor Indutny)
-
-* tls: Prevent mid-stream hangs (Fedor Indutny, isaacs)
-
-* net: improve arbitrary tcp socket support (Ben Noordhuis)
-
-* net: handle 'finish' event only after 'connect' (Fedor Indutny)
-
-* http: Don't hot-path end() for large buffers (isaacs)
-
-* fs: Missing cb errors are deprecated, not a throw (isaacs)
-
-* fs: make write/appendFileSync correctly set file mode (Raymond Feng)
-
-* stream: Return self from readable.wrap (isaacs)
-
-* stream: Never call decoder.end() multiple times (Gil Pedersen)
-
-* windows: enable watching signals with process.on('SIGXYZ') (Bert Belder)
-
-* node: revert removal of MakeCallback (Trevor Norris)
-
-* node: Unwrap without aborting in handle fd getter (isaacs)
-
-
-2013.03.11, Version 0.10.0 (Stable), 163ca274230fce536afe76c64676c332693ad7c1
-
-* npm: Upgrade to 1.2.14
-
-* core: Append filename properly in dlopen on windows (isaacs)
-
-* zlib: Manage flush flags appropriately (isaacs)
-
-* domains: Handle errors thrown in nested error handlers (isaacs)
-
-* buffer: Strip high bits when converting to ascii (Ben Noordhuis)
-
-* win/msi: Enable modify and repair (Bert Belder)
-
-* win/msi: Add feature selection for various node parts (Bert Belder)
-
-* win/msi: use consistent registry key paths (Bert Belder)
-
-* child_process: support sending dgram socket (Andreas Madsen)
-
-* fs: Raise EISDIR on Windows when calling fs.read/write on a dir (isaacs)
-
-* unix: fix strict aliasing warnings, macro-ify functions (Ben Noordhuis)
-
-* unix: honor UV_THREADPOOL_SIZE environment var (Ben Noordhuis)
-
-* win/tty: fix typo in color attributes enumeration (Bert Belder)
-
-* win/tty: don't touch insert mode or quick edit mode (Bert Belder)
-
-
-2013.03.06, Version 0.9.12 (Unstable), 0debf5a82934da805592b6496756cdf27c993abc
-
-* stream: Allow strings in Readable.push/unshift (isaacs)
-
-* stream: Remove bufferSize option (isaacs)
-
-* stream: Increase highWaterMark on large reads (isaacs)
-
-* stream: _write: takes an encoding argument (isaacs)
-
-* stream: _transform: remove output() method, provide encoding (isaacs)
-
-* stream: Don't require read(0) to emit 'readable' event (isaacs)
-
-* node: Add --throw-deprecation (isaacs)
-
-* http: fix multiple timeout events (Eugene Girshov)
-
-* http: More useful setTimeout API on server (isaacs)
-
-* net: use close callback, not process.nextTick (Ben Noordhuis)
-
-* net: Provide better error when writing after FIN (isaacs)
-
-* dns: Support NAPTR queries (Pavel Lang)
-
-* dns: fix ReferenceError in resolve() error path (Xidorn Quan)
-
-* child_process: handle ENOENT correctly on Windows (Scott Blomquist)
-
-* cluster: Rename destroy() to kill(signal=SIGTERM) (isaacs)
-
-* build: define nightly tag external to build system (Timothy J Fontaine)
-
-* build: make msi build work when spaces are present in the path (Bert Belder)
-
-* build: fix msi build issue with WiX 3.7/3.8 (Raymond Feng)
-
-* repl: make compatible with domains (Dave Olszewski)
-
-* events: Code cleanup and performance improvements (Trevor Norris)
-
-
-2013.03.01, Version 0.9.11 (Unstable), 83392403b7a9b7782b37c17688938c75010f81ba
-
-* V8: downgrade to 3.14.5
-
-* openssl: update to 1.0.1e
-
-* darwin: Make process.title work properly (Ben Noordhuis)
-
-* fs: Support mode/flag options to read/append/writeFile (isaacs)
-
-* stream: _read() no longer takes a callback (isaacs)
-
-* stream: Add stream.unshift(chunk) (isaacs)
-
-* stream: remove lowWaterMark feature (isaacs)
-
-* net: omit superfluous 'connect' event (Ben Noordhuis)
-
-* build, windows: disable SEH (Ben Noordhuis)
-
-* core: remove errno global (Ben Noordhuis)
-
-* core: Remove the nextTick for running the main file (isaacs)
-
-* core: Mark exit() calls with status codes (isaacs)
-
-* core: Fix debug signal handler race condition lock (isaacs)
-
-* crypto: clear error stack (Ben Noordhuis)
-
-* test: optionally set common.PORT via env variable (Timothy J Fontaine)
-
-* path: Throw TypeError on non-string args to path.resolve/join (isaacs, Arianit Uka)
-
-* crypto: fix uninitialized memory access in openssl (Ben Noordhuis)
-
-
-2013.02.19, Version 0.9.10 (Unstable)
-
-* V8: Upgrade to 3.15.11.15
-
-* npm: Upgrade to 1.2.12
-
-* fs: Change default WriteStream config, increase perf (isaacs)
-
-* process: streamlining tick callback logic (Trevor Norris)
-
-* stream_wrap, udp_wrap: add read-only fd property (Ben Noordhuis)
-
-* buffer: accept negative indices in Buffer#slice() (Ben Noordhuis)
-
-* tls: Cycle data when underlying socket drains (isaacs)
-
-* stream: read(0) should not always trigger _read(n,cb) (isaacs)
-
-* stream: Empty strings/buffers do not signal EOF any longer (isaacs)
-
-* crypto: improve cipher/decipher error messages (Ben Noordhuis)
-
-* net: Respect the 'readable' flag on sockets (isaacs)
-
-* net: don't suppress ECONNRESET (Ben Noordhuis)
-
-* typed arrays: copy Buffer in typed array constructor (Ben Noordhuis)
-
-* typed arrays: make DataView throw on non-ArrayBuffer (Ben Noordhuis)
-
-* windows: MSI installer enhancements (Scott Blomquist, Jim Schubert)
-
-
-2013.02.07, Version 0.9.9 (Unstable), 4b9f0d190cd6b22853caeb0e07145a98ce1d1d7f
-
-* tls: port CryptoStream to streams2 (Fedor Indutny)
-
-* typed arrays: only share ArrayBuffer backing store (Ben Noordhuis)
-
-* stream: make Writable#end() accept a callback function (Nathan Rajlich)
-
-* buffer: optimize 'hex' handling (Ben Noordhuis)
-
-* dns, cares: don't filter NOTIMP, REFUSED, SERVFAIL (Ben Noordhuis)
-
-* readline: treat bare \r as a line ending (isaacs)
-
-* readline: make \r\n emit one 'line' event (Ben Noordhuis)
-
-* cluster: support datagram sockets (Bert Belder)
-
-* stream: Correct Transform class backpressure (isaacs)
-
-* addon: Pass module object to NODE_MODULE init function (isaacs, Rod Vagg)
-
-* buffer: slow buffer copy compatibility fix (Trevor Norris)
-
-* Add bytesWritten to tls.CryptoStream (Andy Burke)
-
-
-2013.01.24, Version 0.9.8 (Unstable), 5f2f8400f665dc32c3e10e7d31d53d756ded9156
-
-* npm: Upgrade to v1.2.3
-
-* V8: Upgrade to 3.15.11.10
-
-* streams: Support objects other than Buffers (Jake Verbaten)
-
-* buffer: remove float write range checks (Trevor Norris)
-
-* http: close connection on 304/204 responses with chunked encoding (Ben Noordhuis)
-
-* build: fix build with dtrace support on FreeBSD (Fedor Indutny)
-
-* console: Support formatting options in trace() (isaacs)
-
-* domain: empty stack on all exceptions (Dave Olszewski)
-
-* unix, windows: make uv_*_bind() error codes consistent (Andrius Bentkus)
-
-* linux: add futimes() fallback (Ben Noordhuis)
-
-
-2013.01.18, Version 0.9.7 (Unstable), 9e7bebeb8305edd55735a95955a98fdbe47572e5
-
-* V8: Upgrade to 3.15.11.7
-
-* npm: Upgrade to 1.2.2
-
-* punycode: Upgrade to 1.2.0 (Mathias Bynens)
-
-* repl: make built-in modules available by default (Felix Böhm)
-
-* windows: add support for '_Total' perf counters (Scott Blomquist)
-
-* cluster: make --prof work for workers (Ben Noordhuis)
-
-* child_process: do not keep list of sent sockets (Fedor Indutny)
-
-* tls: Follow RFC6125 more strictly (Fedor Indutny)
-
-* buffer: floating point read/write improvements (Trevor Norris)
-
-* TypedArrays: Improve dataview perf without endian param (Dean McNamee)
-
-* module: assert require() called with a non-empty string (Felix Böhm, James Campos)
-
-* stdio: Set readable/writable flags properly (isaacs)
-
-* stream: Properly handle large reads from push-streams (isaacs)
-
-
-2013.01.11, Version 0.9.6 (Unstable), 9313fdc71ca8335d5e3a391c103230ee6219b3e2
-
-* V8: update to 3.15.11.5
-
-* node: remove ev-emul.h (Ben Noordhuis)
-
-* path: make basename and extname ignore trailing slashes (Bert Belder)
-
-* typed arrays: fix sunos signed/unsigned char issue (Ben Noordhuis)
-
-* child_process: Fix {stdio:'inherit'} regression (Ben Noordhuis)
-
-* child_process: Fix pipe() from child stdio streams (Maciej Małecki)
-
-* child_process: make fork() execPath configurable (Bradley Meck)
-
-* stream: Add readable.push(chunk) method (isaacs)
-
-* dtrace: x64 ustack helper (Fedor Indutny)
-
-* repl: fix floating point number parsing (Nirk Niggler)
-
-* repl: allow overriding builtins (Ben Noordhuis)
-
-* net: add localAddress and localPort to Socket (James Hight)
-
-* fs: make pool size coincide with ReadStream bufferSize (Shigeki Ohtsu)
-
-* typed arrays: implement load and store swizzling (Dean McNamee)
-
-* windows: fix perfctr crash on XP and 2003 (Scott Blomquist)
-
-* dgram: fix double implicit bind error (Ben Noordhuis)
-
-
-2012.12.30, Version 0.9.5 (Unstable), 01994e8119c24f2284bac0779b32acb49c95bee7
-
-* assert: improve support for new execution contexts (lukebayes)
-
-* domain: use camelCase instead of snake_case (isaacs)
-
-* domain: Do not use uncaughtException handler (isaacs)
-
-* fs: make 'end' work with ReadStream without 'start' (Ben Noordhuis)
-
-* https: optimize createConnection() (Ryunosuke SATO)
-
-* buffer: speed up base64 encoding by 20% (Ben Noordhuis)
-
-* doc: Colorize API stability index headers in docs (Luke Arduini)
-
-* net: socket.readyState corrections (bentaber)
-
-* http: Performance enhancements for http under streams2 (isaacs)
-
-* stream: fix to emit end event on http.ClientResponse (Shigeki Ohtsu)
-
-* stream: fix event handler leak in readstream pipe and unpipe (Andreas Madsen)
-
-* build: Support ./configure --tag switch (Maciej Małecki)
-
-* repl: don't touch `require.cache` (Nathan Rajlich)
-
-* node: Emit 'exit' event when exiting for an uncaught exception (isaacs)
-
-
-2012.12.21, Version 0.9.4 (Unstable), d86d83c75f6343b5368bb7bd328b4466a035e1d4
-
-* streams: Update all streaming interfaces to use new classes (isaacs)
-
-* node: remove idle gc (Ben Noordhuis)
-
-* http: protect against response splitting attacks (Bert Belder)
-
-* fs: Raise error when null bytes detected in paths (isaacs)
-
-* fs: fix 'object is not a function' callback errors (Ben Noordhuis)
-
-* fs: add autoClose=true option to fs.createReadStream (Farid Neshat)
-
-* process: add getgroups(), setgroups(), initgroups() (Ben Noordhuis)
-
-* openssl: optimized asm code on x86 and x64 (Bert Belder)
-
-* crypto: fix leak in GetPeerCertificate (Fedor Indutny)
-
-* add systemtap support (Jan Wynholds)
-
-* windows: add ETW and PerfCounters support (Scott Blomquist)
-
-* windows: fix normalization of UNC paths (Bert Belder)
-
-* crypto: fix ssl error handling (Sergey Kholodilov)
-
-* node: remove eio-emul.h (Ben Noordhuis)
-
-* os: add os.endianness() function (Nathan Rajlich)
-
-* readline: don't emit "line" events with a trailing '\n' char (Nathan Rajlich)
-
-* build: add configure option to generate xcode build files (Timothy J Fontaine)
-
-* build: allow linking against system libuv, cares, http_parser (Stephen Gallagher)
-
-* typed arrays: add slice() support to ArrayBuffer (Anthony Pesch)
-
-* debugger: exit and kill child on SIGTERM or SIGHUP (Fedor Indutny)
-
-* url: url.format escapes delimiters in path and query (J. Lee Coltrane)
-
-
-2012.10.24, Version 0.9.3 (Unstable), 1ed4c6776e4f52956918b70565502e0f8869829d
-
-* V8: Upgrade to 3.13.7.4
-
-* crypto: Default to buffers instead of binary strings (isaacs, Fedor Indutny)
-
-* crypto: add getHashes() and getCiphers() (Ben Noordhuis)
-
-* unix: add custom thread pool, remove libeio (Ben Noordhuis)
-
-* util: make `inspect()` accept an "options" argument (Nathan Rajlich)
-
-* https: fix renegotiation attack protection (Ben Noordhuis)
-
-* cluster: make 'listening' handler see actual port (Aaditya Bhatia)
-
-* windows: use USERPROFILE to get the user's home dir (Bert Belder)
-
-* path: add platform specific path delimiter (Paul Serby)
-
-* http: add response.headersSent property (Pavel Lang)
-
-* child_process: make .fork()'d child auto-exit (Ben Noordhuis)
-
-* events: add 'removeListener' event (Ben Noordhuis)
-
-* string_decoder: Add 'end' method, do base64 properly (isaacs)
-
-* buffer: include encoding value in exception when invalid (Ricky Ng-Adam)
-
-* http: make http.ServerResponse no longer emit 'end' (isaacs)
-
-* streams: fix pipe being destroyed by 'end' from destination (koichik)
-
-
-2012.09.17, Version 0.9.2 (Unstable), 6e2055889091a424fbb5c500bc3ab9c05d1c28b4
-
-* http_parser: upgrade to ad3b631
-
-* openssl: upgrade to 1.0.1c
-
-* darwin: use FSEvents to watch directory changes (Fedor Indutny)
-
-* unix: support missing API on NetBSD (Shigeki Ohtsu)
-
-* unix: fix EMFILE busy loop (Ben Noordhuis)
-
-* windows: un-break writable tty handles (Bert Belder)
-
-* windows: map WSAESHUTDOWN to UV_EPIPE (Bert Belder)
-
-* windows: make spawn with custom environment work again (Bert Belder)
-
-* windows: map ERROR_DIRECTORY to UV_ENOENT (Bert Belder)
-
-* tls, https: validate server certificate by default (Ben Noordhuis)
-
-* tls, https: throw exception on missing key/cert (Ben Noordhuis)
-
-* tls: async session storage (Fedor Indutny)
-
-* installer: don't install header files (Ben Noordhuis)
-
-* buffer: implement Buffer.prototype.toJSON() (Nathan Rajlich)
-
-* buffer: added support for writing NaN and Infinity (koichik)
-
-* http: make http.ServerResponse emit 'end' (Ben Noordhuis)
-
-* build: ./configure --ninja (Ben Noordhuis, Timothy J Fontaine)
-
-* installer: fix --without-npm (Ben Noordhuis)
-
-* cli: make -p equivalent to -pe (Ben Noordhuis)
-
-* url: Go much faster by using Url class (isaacs)
-
-
-2012.08.28, Version 0.9.1 (Unstable), e6ce259d2caf338fec991c2dd447de763ce99ab7
-
-* buffer: Add Buffer.isEncoding(enc) to test for valid encoding values (isaacs)
-
-* Raise UV_ECANCELED on premature close. (Ben Noordhuis)
-
-* Remove c-ares from libuv, move to a top-level node dependency (Bert Belder)
-
-* ref/unref for all HandleWraps, timers, servers, and sockets (Timothy J Fontaine)
-
-* addon: remove node-waf, superseded by node-gyp (Ben Noordhuis)
-
-* child_process: emit error on exec failure (Ben Noordhuis)
-
-* cluster: do not use internal server API (Andreas Madsen)
-
-* constants: add O_DIRECT (Ian Babrou)
-
-* crypto: add sync interface to crypto.pbkdf2() (Ben Noordhuis)
-
-* darwin: emulate fdatasync() (Fedor Indutny)
-
-* dgram: make .bind() always asynchronous (Ben Noordhuis)
-
-* events: Make emitter.listeners() side-effect free (isaacs, Joe Andaverde)
-
-* fs: Throw early on invalid encoding args (isaacs)
-
-* fs: fix naming of truncate/ftruncate functions (isaacs)
-
-* http: bubble up parser errors to ClientRequest (Brian White)
-
-* linux: improve cpuinfo parser on ARM and MIPS (Ben Noordhuis)
-
-* net: add support for IPv6 addresses ending in :: (Josh Erickson)
-
-* net: support Server.listen(Pipe) (Andreas Madsen)
-
-* node: don't scan add-on for "init" symbol (Ben Noordhuis)
-
-* remove process.uvCounters() (Ben Noordhuis)
-
-* repl: console writes to repl rather than process stdio (Nathan Rajlich)
-
-* timers: implement setImmediate (Timothy J Fontaine)
-
-* tls: fix segfault in pummel/test-tls-ci-reneg-attack (Ben Noordhuis)
-
-* tools: Move gyp addon tools to node-gyp (Nathan Rajlich)
-
-* unix: preliminary signal handler support (Ben Noordhuis)
-
-* unix: remove dependency on ev_child (Ben Noordhuis)
-
-* unix: work around darwin bug, don't poll() on pipe (Fedor Indutny)
-
-* util: Formally deprecate util.pump() (Ben Noordhuis)
-
-* windows: make active and closing handle state independent (Bert Belder)
-
-* windows: report spawn errors to the exit callback (Bert Belder)
-
-* windows: signal handling support with uv_signal_t (Bert Belder)
-
-
-2012.07.20, Version 0.9.0 (Unstable), f9b237f478c372fd55e4590d7399dcd8f25f3603
-
-* punycode: update to v1.1.1 (Mathias Bynens)
-
-* c-ares: upgrade to 1.9.0 (Saúl Ibarra Corretgé)
-
-* dns: ignore rogue DNS servers reported by windows (Saúl Ibarra Corretgé)
-
-* unix: speed up uv_async_send() (Ben Noordhuis)
-
-* darwin: get cpu model correctly on mac (Xidorn Quan)
-
-* nextTick: Handle tick callbacks before any other I/O (isaacs)
-
-* Enable color customization of `util.inspect` (Pavel Lang)
-
-* tls: Speed and memory improvements (Fedor Indutny)
-
-* readline: Use one history item for reentered line (Vladimir Beloborodov)
-
-* Fix #3521 Make process.env more like a regular Object (isaacs)
-
-
-2013.06.13, Version 0.8.25 (maintenance), 0b9bdb2bc7e1c872f0ea4713517fda22a4b0b202
-
-* npm: Upgrade to 1.2.30
-
-* child_process: fix handle delivery (Ben Noordhuis)
-
-
-2013.06.04, Version 0.8.24 (maintenance), c1a1ab067721ea17ef7b05ec5c68b01321017f05
-
-* npm: Upgrade to v1.2.24
-
-* url: Properly parse certain oddly formed urls (isaacs)
-
-* http: Don't try to destroy nonexistent sockets (isaacs)
-
-* handle_wrap: fix NULL pointer dereference (Ben Noordhuis)
-
-
-2013.04.09, Version 0.8.23 (maintenance), c67f8d0500fe15637a623eb759d2ad7eb9fb3b0b
-
-* npm: Upgrade to v1.2.18
-
-* http: Avoid EE warning on ECONNREFUSED handling (isaacs)
-
-* tls: Re-enable check of CN-ID in cert verification (Tobias Müllerleile)
-
-* child_process: fix sending utf-8 to child process (Ben Noordhuis)
-
-* crypto: check key type in GetPeerCertificate() (Ben Noordhuis)
-
-* win/openssl: mark assembled object files as seh safe (Bert Belder)
-
-* windows/msi: fix msi build issue with WiX 3.7/3.8 (Raymond Feng)
-
-
-2013.03.07, Version 0.8.22 (Stable), 67a4cb4fe8c2346e30ffb83f7178e205cc2dab33
-
-* npm: Update to 1.2.14
-
-* cluster: propagate bind errors (Ben Noordhuis)
-
-* crypto: don't assert when calling Cipher#final() twice (Ben Noordhuis)
-
-* build, windows: disable SEH (Ben Noordhuis)
-
-
-2013.02.25, Version 0.8.21 (Stable), 530d8c05d4c546146f18e5ba811d7eb3b7b7c0c5
-
-* http: Do not free the wrong parser on socket close (isaacs)
-
-* http: Handle hangup writes more gently (isaacs)
-
-* zlib: fix assert on bad input (Ben Noordhuis)
-
-* test: add TAP output to the test runner (Timothy J Fontaine)
-
-* unix: Handle EINPROGRESS from domain sockets (Ben Noordhuis)
-
-
-2013.02.15, Version 0.8.20 (Stable), e10c75579b536581ddd7ae4e2c3bf8a9d550d343
-
-* npm: Upgrade to v1.2.11
-
-* http: Do not let Agent hand out destroyed sockets (isaacs)
-
-* http: Raise hangup error on destroyed socket write (isaacs)
-
-* http: protect against response splitting attacks (Bert Belder)
-
-
-2013.02.06, Version 0.8.19 (Stable), 53978bdf420622ff0121c63c0338c9e7c2e60869
-
-* npm: Upgrade to v1.2.10
-
-* zlib: pass object size hint to V8 (Ben Noordhuis)
-
-* zlib: reduce memory consumption, release early (Ben Noordhuis)
-
-* buffer: slow buffer copy compatibility fix (Trevor Norris)
-
-* zlib: don't assert on malformed dictionary (Ben Noordhuis)
-
-* zlib: don't assert on missing dictionary (Ben Noordhuis)
-
-* windows: better ipv6 support (Bert Belder)
-
-* windows: add error mappings related to unsupported protocols (Bert Belder)
-
-* windows: map ERROR_DIRECTORY to UV_ENOENT (Bert Belder)
-
-
-2013.01.18, Version 0.8.18 (Stable), 2c4eef0d972838c51999d32c0d251857a713dc18
-
-* npm: Upgrade to v1.2.2
-
-* dns: make error message match errno (Dan Milon)
-
-* tls: follow RFC6125 more strictly (Fedor Indutny)
-
-* buffer: reject negative SlowBuffer offsets (Ben Noordhuis)
-
-* install: add simplejson fallback (Chris Dent)
-
-* http: fix "Cannot call method 'emit' of null" (Ben Noordhuis)
-
-
-2013.01.09, Version 0.8.17 (Stable), c50c33e9397d7a0a8717e8ce7530572907c054ad
-
-* npm: Upgrade to v1.2.0
-  - peerDependencies (Domenic Denicola)
-  - node-gyp v0.8.2 (Nathan Rajlich)
-  - Faster installs from github user/project shorthands (Nathan Zadoks)
-
-* typed arrays: fix 32 bit size/index overflow (Ben Noordhuis)
-
-* http: Improve performance of single-packet responses (Ben Noordhuis)
-
-* install: fix openbsd man page location (Ben Noordhuis)
-
-* http: bubble up parser errors to ClientRequest (Brian White)
-
-
-2012.12.13, Version 0.8.16 (Stable), 1c9c6277d5cfcaaac8569c0c8f7daa64292048a9
-
-* npm: Upgrade to 1.1.69
-
-* fs: fix WriteStream/ReadStream fd leaks (Ben Noordhuis)
-
-* crypto: fix leak in GetPeerCertificate (Fedor Indutny)
-
-* buffer: Don't double-negate numeric buffer arg (Trevor Norris)
-
-* net: More accurate IP address validation and IPv6 dotted notation. (Joshua Erickson)
-
-
-2012.11.26, Version 0.8.15 (Stable), fdf91afb494a7a2fff2913d817f589c191a2c88f
-
-* npm: Upgrade to 1.1.66 (isaacs)
-
-* linux: use /proc/cpuinfo for CPU frequency (Ben Noordhuis)
-
-* windows: map WSAESHUTDOWN to UV_EPIPE (Ben Noordhuis)
-
-* windows: map ERROR_GEN_FAILURE to UV_EIO (Bert Belder)
-
-* unix: do not set environ unless one is provided (Charlie McConnell)
-
-* domains: don't crash if domain is set to null (Bert Belder)
-
-* windows: fix the x64 debug build (Bert Belder)
-
-* net, tls: fix connect() resource leak (Ben Noordhuis)
-
-
-2012.10.25, Version 0.8.14 (Stable), b00527fcf05c3d9fb5d5d790f9472906a59fe218
-
-* events: Don't clobber pre-existing _events obj in EE ctor (isaacs)
-
-
-2012.10.25, Version 0.8.13 (Stable), ff4c974873f9a7cc6a5b042eb9b6389bb8dde6d6
-
-* V8: Upgrade to 3.11.10.25
-
-* npm: Upgrade to 1.1.65
-
-* url: parse hostnames that start with - or _ (Ben Noordhuis)
-
-* repl: Fix Windows 8 terminal issue (Bert Belder)
-
-* typed arrays: use signed char for signed int8s (Aaron Jacobs)
-
-* crypto: fix bugs in DiffieHellman (Ben Noordhuis)
-
-* configure: turn on VFPv3 on ARMv7 (Ben Noordhuis)
-
-* Re-enable OpenSSL UI for entering passphrases via tty (Ben Noordhuis)
-
-* repl: ensure each REPL instance gets its own "context" (Nathan Rajlich)
-
-
-2012.10.12, Version 0.8.12 (Stable), 38c72d4e29574dec5205bcf23c2a85efe65331a4
-
-* npm: Upgrade to 1.1.63
-
-* crypto: Reduce stability index to 2-Unstable (isaacs)
-
-* windows: fix handle leak in uv_fs_utime (Bert Belder)
-
-* windows: fix application crashed popup in debug version (Bert Belder)
-
-* buffer: report proper retained size in profiler (Ben Noordhuis)
-
-* buffer: fix byteLength with UTF-16LE (koichik)
-
-* repl: make "end of input" JSON.parse() errors throw in the REPL (Nathan Rajlich)
-
-* repl: make invalid RegExp modifiers throw in the REPL (Nathan Rajlich)
-
-* http: handle multiple Proxy-Authenticate values (Willi Eggeling)
-
-
-2012.09.27, Version 0.8.11 (Stable), e1f39468fa580c1e4cb15fac621f87944ee625dc
-
-* fs: Fix stat() size reporting for large files (Ben Noordhuis)
-
-
-2012.09.25, Version 0.8.10 (Stable), 0bc273da4fcaa79b209ed755ad249a3e7be626a6
-
-* npm: Upgrade to 1.1.62
-
-* repl: make invalid RegExps throw in the REPL (Nathan Rajlich)
-
-* v8: loosen artificial mmap constraint (Bryan Cantrill)
-
-* process: fix setuid() and setgid() error reporting (Ben Noordhuis)
-
-* domain: Properly exit() on domain disposal (isaacs)
-
-* fs: fix watchFile() missing deletion events (Ben Noordhuis)
-
-* fs: fix assert in fs.watch() (Ben Noordhuis)
-
-* fs: don't segfault on deeply recursive stat() (Ben Noordhuis)
-
-* http: Remove timeout handler when data arrives (Frédéric Germain)
-
-* http: make the client "res" object get the same domain as "req" (Nathan Rajlich)
-
-* windows: don't blow up when an invalid FD is used (Bert Belder)
-
-* unix: map EDQUOT to UV_ENOSPC (Charlie McConnell)
-
-* linux: improve /proc/cpuinfo parser (Ben Noordhuis)
-
-* win/tty: reset background brightness when color is set to default (Bert Belder)
-
-* unix: put child process stdio fds in blocking mode (Ben Noordhuis)
-
-* unix: fix EMFILE busy loop (Ben Noordhuis)
-
-* sunos: don't set TCP_KEEPALIVE (Ben Noordhuis)
-
-* tls: Use slab allocator for memory management (Fedor Indutny)
-
-* openssl: Use optimized assembly code for x86 and x64 (Bert Belder)
-
-
-2012.09.11, Version 0.8.9 (Stable), b88c3902b241cf934e75443b934f2033ad3915b1
-
-* v8: upgrade to 3.11.10.22
-
-* GYP: upgrade to r1477
-
-* npm: Upgrade to 1.1.61
-
-* npm: Don't create world-writable files (isaacs)
-
-* windows: fix single-accept mode for shared server sockets (Bert Belder)
-
-* windows: fix uninitialized memory access in uv_update_time() (Bert Belder)
-
-* windows: don't throw when a signal handler is attached (Bert Belder)
-
-* unix: fix memory leak in udp (Ben Noordhuis)
-
-* unix: map errno ESPIPE (Ben Noordhuis)
-
-* unix, windows: fix memory corruption in fs-poll.c (Ben Noordhuis)
-
-* sunos: fix os.cpus() on x86_64 (Ben Noordhuis)
-
-* child process: fix processes with an IPC channel not emitting 'close' (Bert Belder)
-
-* build: add a "--dest-os" option to force a gyp "flavor" (Nathan Rajlich)
-
-* build: set `process.platform` to "sunos" on SunOS (Nathan Rajlich)
-
-* build: fix `make -j` fails after `make clean` (Bearice Ren)
-
-* build: fix openssl configuration for "arm" builds (Nathan Rajlich)
-
-* tls: support unix domain socket/named pipe in tls.connect (Shigeki Ohtsu)
-
-* https: make https.get() accept a URL (koichik)
-
-* http: respect HTTP/1.0 TE header (Ben Noordhuis)
-
-* crypto, tls: Domainify setSNICallback, pbkdf2, randomBytes (Ben Noordhuis)
-
-* stream.pipe: Don't call destroy() unless it's a function (isaacs)
-
-
-2012.08.22, Version 0.8.8 (Stable), a299c97bbc701f4d460e91214d7bfe7a9589d361
-
-* V8: upgrade to 3.11.10.19
-
-* npm: upgrade to 1.1.59
-
-* windows: fix uninitialized memory access in uv_update_time() (Bert Belder)
-
-* unix, windows: fix memory corruption in fs-poll.c (Ben Noordhuis)
-
-* unix: fix integer overflow in uv_hrtime (Tim Holy)
-
-* sunos: fix uv_cpu_info() on x86_64 (Ben Noordhuis)
-
-* tls: update default cipher list (Ben Noordhuis)
-
-* unix: Fix llvm and older gcc duplicate symbol warnings (Bert Belder)
-
-* fs: fix use after free in stat watcher (Ben Noordhuis)
-
-* build: Fix using manually compiled gcc on OS X (Nathan Rajlich)
-
-* windows: make junctions work again (Bert Belder)
-
-
-2012.08.15, Version 0.8.7 (Stable), f640c5d35cba96634cd8176a525a1d876e361a61
-
-* npm: Upgrade to 1.1.49
-
-* website: download page (Golo Roden)
-
-* crypto: fix uninitialized memory access in openssl (Ben Noordhuis)
-
-* buffer, crypto: fix buffer decoding (Ben Noordhuis)
-
-* build: compile with -fno-tree-vrp when gcc >= 4.0 (Ben Noordhuis)
-
-* tls: handle multiple CN fields when verifying cert (Ben Noordhuis)
-
-* doc: remove unused util from child_process (Kyle Robinson Young)
-
-* build: rework -fvisibility=hidden detection (Ben Noordhuis)
-
-* windows: don't duplicate invalid stdio handles (Bert Belder)
-
-* windows: fix typos in process-stdio.c (Bert Belder)
-
-
-2012.08.07, Version 0.8.6 (Stable), 0544a586ca6b6b900a42e164033dbf350765700a
-
-* npm: Upgrade to v1.1.48
-
-* Add 'make binary' to build binary tarballs for all Unixes (Nathan Rajlich)
-
-* zlib: Emit 'close' on destroy(). (Dominic Tarr)
-
-* child_process: Fix stdout=null when stdio=['pipe'] (Tyler Neylon)
-
-* installer: prevent ETXTBSY errors (Ben Noordhuis)
-
-* installer: honor --without-npm, default install path (Ben Noordhuis)
-
-* net: make pause work with connecting sockets (Bert Belder)
-
-* installer: fix cross-compile installs (Ben Noordhuis)
-
-* net: fix .listen({fd:0}) (Ben Noordhuis)
-
-* windows: map WSANO_DATA to UV_ENOENT (Bert Belder)
-
-
-2012.08.02, Version 0.8.5 (Stable), 9b86a4453f0c76f2707a75c0b2343aba33ec63bc
-
-* node: tag Encode and friends NODE_EXTERN (Ben Noordhuis)
-
-* fs: fix ReadStream / WriteStream missing callback (Gil Pedersen)
-
-* fs: fix readFileSync("/proc/cpuinfo") regression (Ben Noordhuis)
-
-* installer: don't assume bash is installed (Ben Noordhuis)
-
-* Report errors properly from --eval and stdin (isaacs)
-
-* assert: fix throws() throws an error without message property (koichik)
-
-* cluster: fix libuv assert in net.listen() (Ben Noordhuis)
-
-* build: always link sunos builds with libumem (Trent Mick)
-
-* build: improve armv7 / hard-float detection (Adam Malcontenti-Wilson)
-
-* https: Use host header as effective servername (isaacs)
-
-* sunos: work around OS bug to prevent fs.watch() from spinning (Bryan Cantrill)
-
-* linux: fix 'two watchers, one path' segfault (Ben Noordhuis)
-
-* windows: fix memory leaks in many fs functions (Bert Belder)
-
-* windows: don't allow directories to be opened for writing/appending (Bert Belder)
-
-* windows: make fork() work even when not all stdio handles are valid (Bert Belder)
-
-* windows: make unlink() not remove mount points, and improve performance (Bert Belder)
-
-* build: Sign pkg installer for OS X (isaacs)
-
-
-2012.07.25, Version 0.8.4 (Stable), f98562fcd7d1cab573ca4dc1612157d6999befd4
-
-* V8: Upgrade to 3.11.10.17
-
-* npm: Upgrade to 1.1.45
-
-* net: fix Socket({ fd: 42 }) api (Ben Noordhuis)
-
-* readline: Remove event listeners on close (isaacs)
-
-* windows: correctly prep long path for fs.exists(Sync) (Bert Belder)
-
-* debugger: wake up the event loop when a debugger command is dispatched (Peter Rybin)
-
-* tls: verify server's identity (Fedor Indutny)
-
-* net: ignore socket.setTimeout(Infinity or NaN) (Fedor Indutny)
-
-
-2012.07.19, Version 0.8.3 (Stable), 60bf2d6cb33e4ce55604f73889ab840a9de8bdab
-
-* V8: upgrade to 3.11.10.15
-
-* npm: Upgrade to 1.1.43
-
-* net: fix net.Server.listen({fd:x}) error reporting (Ben Noordhuis)
-
-* net: fix bogus errno reporting (Ben Noordhuis)
-
-* build: Move npm shebang logic into an npm script (isaacs)
-
-* build: fix add-on loading on freebsd (Ben Noordhuis)
-
-* build: disable unsafe optimizations (Ben Noordhuis)
-
-* build: fix spurious mksnapshot crashes for good (Ben Noordhuis)
-
-* build: speed up genv8constants (Dave Pacheco)
-
-* fs: make unwatchFile() remove a specific listener (Ben Noordhuis)
-
-* domain: Remove first arg from intercepted fn (Toshihiro Nakamura)
-
-* domain: Fix memory leak on error (isaacs)
-
-* events: Fix memory leak from removeAllListeners (Nathan Rajlich)
-
-* zlib: Fix memory leak in Unzip class. (isaacs)
-
-* crypto: Fix memory leak in DecipherUpdate() (Ben Noordhuis)
-
-
-2012.07.09, Version 0.8.2 (Stable), cc6084b9ac5cf1d4fe5e7165b71e8fc05d11be1f
-
-* npm: Upgrade to 1.1.36
-
-* readline: don't use Function#call() (Nathan Rajlich)
-
-* Code cleanup to pass 'use strict' (Jonas Westerlund)
-
-* module: add filename to require() json errors (TJ Holowaychuk)
-
-* readline: fix for unicode prompts (Tim Macfarlane)
-
-* timers: fix handling of large timeouts (Ben Noordhuis)
-
-* repl: fix passing an empty line inserting "undefined" into the buffer (Nathan Rajlich)
-
-* repl: fix crashes when buffering command (Maciej Małecki)
-
-* build: rename strict_aliasing to node_no_strict_aliasing (Ben Noordhuis)
-
-* build: disable -fstrict-aliasing for any gcc < 4.6.0 (Ben Noordhuis)
-
-* build: detect cc version with -dumpversion (Ben Noordhuis)
-
-* build: handle output of localized gcc or clang (Ben Noordhuis)
-
-* unix: fix memory corruption in freebsd.c (Ben Noordhuis)
-
-* unix: fix 'zero handles, one request' busy loop (Ben Noordhuis)
-
-* unix: fix busy loop on unexpected tcp message (Ben Noordhuis)
-
-* unix: fix EINPROGRESS busy loop (Ben Noordhuis)
-
-
-2012.06.29, Version 0.8.1 (stable), 2134aa3d5c622fc3c3b02ccb713fcde0e0df479a
-
-* V8: upgrade to v3.11.10.12
-
-* npm: upgrade to v1.1.33
-  - Support for parallel use of the cache folder
-  - Retry on registry timeouts or network failures (Trent Mick)
-  - Reduce 'engines' failures to a warning
-  - Use new zsh completion if available (Jeremy Cantrell)
-
-* Fix #3577 Un-break require('sys')
-
-* util: speed up formatting of large arrays/objects (Ben Noordhuis)
-
-* windows: make fs.realpath(Sync) work with UNC paths (Bert Belder)
-
-* build: fix --shared-v8 option (Ben Noordhuis)
-
-* doc: `detached` is a boolean (Andreas Madsen)
-
-* build: use proper python interpreter (Ben Noordhuis)
-
-* build: expand ~ in `./configure --prefix=~/a/b/c` (Ben Noordhuis)
-
-* build: handle CC env var with spaces (Gabriel de Perthuis)
-
-* build: fix V8 build when compiling with gcc 4.5 (Ben Noordhuis)
-
-* build: fix --shared-v8 option (Ben Noordhuis)
-
-* windows msi: Fix icon issue which caused huge file size (Bert Belder)
-
-* unix: assume that dlopen() may clobber dlerror() (Ben Noordhuis)
-
-* sunos: fix memory corruption bugs (Ben Noordhuis)
-
-* windows: better (f)utimes and (f)stat (Bert Belder)
-
-
-2012.06.25, Version 0.8.0 (stable), 8b8a7a7f9b41e74e1e810d0330738ad06fc302ec
-
-* V8: upgrade to v3.11.10.10
-
-* npm: Upgrade to 1.1.32
-
-* Deprecate iowatcher (Ben Noordhuis)
-
-* windows: update icon (Bert Belder)
-
-* http: Hush 'MUST NOT have a body' warnings to debug() (isaacs)
-
-* Move blog.nodejs.org content into repository (isaacs)
-
-* Fix #3503: stdin: resume() on pipe(dest) (isaacs)
-
-* crypto: fix error reporting in SetKey() (Fedor Indutny)
-
-* Add --no-deprecation and --trace-deprecation command-line flags (isaacs)
-
-* fs: fix fs.watchFile() (Ben Noordhuis)
-
-* fs: Fix fs.readfile() on pipes (isaacs)
-
-* Rename GYP variable node_use_system_openssl to be consistent (Ryan Dahl)
-
-
-2012.06.19, Version 0.7.12 (unstable), a72120190a8ffdbcd3d6ad2a2e6ceecd2087111e
-
-* npm: Upgrade to 1.1.30
-	- Improved 'npm init'
-	- Fix the 'cb never called' error from 'outdated' and 'update'
-	- Add --save-bundle|-B config
-	- Fix isaacs/npm#2465: Make npm script and windows shims cygwin-aware
-	- Fix isaacs/npm#2452 Use --save(-dev|-optional) in npm rm
-	- `logstream` option to replace removed `logfd` (Rod Vagg)
-	- Read default descriptions from README.md files
-
-* Shims to support deprecated ev_* and eio_* methods (Ben Noordhuis)
-
-* #3118 net.Socket: Delay pause/resume until after connect (isaacs)
-
-* #3465 Add ./configure --no-ifaddrs flag (isaacs)
-
-* child_process: add .stdin stream to forks (Fedor Indutny)
-
-* build: fix `make install DESTDIR=/path` (Ben Noordhuis)
-
-* tls: fix off-by-one error in renegotiation check (Ben Noordhuis)
-
-* crypto: Fix diffie-hellman key generation UTF-8 errors (Fedor Indutny)
-
-* node: change the constructor name of process from EventEmitter to process (Andreas Madsen)
-
-* net: Prevent property access throws during close (Reid Burke)
-
-* querystring: improved speed and code cleanup (Felix Böhm)
-
-* sunos: fix assertion errors breaking fs.watch() (Fedor Indutny)
-
-* unix: stat: detect sub-second changes (Ben Noordhuis)
-
-* add stat() based file watcher (Ben Noordhuis)
-
-
-2012.06.15, Version 0.7.11 (unstable), 5cfe0b86d5be266ef51bbba369c39e412ee51944
-
-* V8: Upgrade to v3.11.10
-
-* npm: Upgrade to 1.1.26
-
-* doc: Improve cross-linking in API docs markdown (Ben Kelly)
-
-* Fix #3425: removeAllListeners should delete array (Reid Burke)
-
-* cluster: don't silently drop messages when the write queue gets big (Bert Belder)
-
-* Add Buffer.concat method (isaacs)
-
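`Buffer.concat` joins an array of buffers into one. A minimal sketch on a current Node.js runtime (variable names are illustrative):

```javascript
// Buffer.concat joins an array of buffers; the optional second argument
// gives the total length up front so the inputs need not be scanned.
const chunks = [Buffer.from('hello '), Buffer.from('world')];
const joined = Buffer.concat(chunks, 11);
console.log(joined.toString()); // 'hello world'
```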
-* windows: make symlinks tolerant to forward slashes (Bert Belder)
-
-* build: Add node.d and node.1 to installer (isaacs)
-
-* cluster: rename worker.uniqueID to worker.id (Andreas Madsen)
-
-* Windows: Enable ETW events on Windows for existing DTrace probes. (Igor Zinkovsky)
-
-* test: bundle node-weak in test/gc so that it doesn't need to be downloaded (Nathan Rajlich)
-
-* Make many tests pass on Windows (Bert Belder)
-
-* Fix #3388 Support listening on file descriptors (isaacs)
-
-* Fix #3407 Add os.tmpDir() (isaacs)
-
-* Unbreak the snapshotted build on Windows (Bert Belder)
-
-* Clean up child_process.kill throws (Bert Belder)
-
-* crypto: make cipher/decipher accept buffer args (Ben Noordhuis)
-
-
-2012.06.11, Version 0.7.10 (unstable), 12a32a48a30182621b3f8e9b9695d1946b53c131
-
-* Roll V8 back to 3.9.24.31
-
-* build: x64 target should always pass -m64 (Robert Mustacchi)
-
-* add NODE_EXTERN to node::Start (Joel Brandt)
-
-* repl: Warn about running npm commands (isaacs)
-
-* slab_allocator: fix crash in dtor if V8 is dead (Ben Noordhuis)
-
-* slab_allocator: fix leak of Persistent handles (Shigeki Ohtsu)
-
-* windows/msi: add node.js prompt to startmenu (Jeroen Janssen)
-
-* windows/msi: fix adding node to PATH (Jeroen Janssen)
-
-* windows/msi: add start menu links when installing (Jeroen Janssen)
-
-* windows: don't install x64 version into the 'program files (x86)' folder (Matt Gollob)
-
-* domain: Fix #3379 domain.intercept no longer passes error arg to cb (Marc Harter)
-
-* fs: make callbacks run in global context (Ben Noordhuis)
-
-* fs: enable fs.realpath on windows (isaacs)
-
-* child_process: expose UV_PROCESS_DETACHED as options.detached (Charlie McConnell)
-
-* child_process: new stdio API for .spawn() method (Fedor Indutny)
-
-* child_process: spawn().ref() and spawn().unref() (Fedor Indutny)
-
-* Upgrade npm to 1.1.25
-	- Enable npm link on windows
-	- Properly remove sh-shim on Windows
-	- Abstract out registry client and logger
-
-
-2012.05.28, Version 0.7.9 (unstable), 782277f11a753ded831439ed826448c06fc0f356
-
-* Upgrade V8 to 3.11.1
-
-* Upgrade npm to 1.1.23
-
-* uv: rework reference counting scheme (Ben Noordhuis)
-
-* uv: add interface for joining external event loops (Bert Belder)
-
-* repl, readline: Handle Ctrl+Z and SIGCONT better (Nathan Rajlich)
-
-* fs: 64bit offsets for fs calls (Igor Zinkovsky)
-
-* fs: add sync open flags 'rs' and 'rs+' (Kevin Bowman)
-
-* windows: enable creating directory junctions with fs.symlink (Igor Zinkovsky, Bert Belder)
-
-* windows: fix fs.lstat to properly detect symlinks. (Igor Zinkovsky)
-
-* Fix #3270 Escape url.parse delims (isaacs)
-
-* http: make http.get() accept a URL (Adam Malcontenti-Wilson)
-
-* Cleanup vm module memory leakage (Marcel Laverdet)
-
-* Optimize writing strings with Socket.write (Bert Belder)
-
-* add support for CESU-8 and UTF-16LE encodings (koichik)
-
-* path: add path.sep to get the path separator. (Yi, EungJun)
-
-* net, http: add backlog parameter to .listen() (Erik Dubbelboer)
-
-* debugger: support mirroring Date objects (Fedor Indutny)
-
-* addon: add AtExit() function (Ben Noordhuis)
-
-* net: signal localAddress bind failure in connect (Brian Schroeder)
-
-* util: handle non-string return value in .inspect() (Alex Kocharin)
-
-
-2012.04.18, Version 0.7.8 (unstable), c2b47097c0b483552efc1947c6766fa1128600b6
-
-* Upgrade V8 to 3.9.24.9
-
-* Upgrade OpenSSL to 1.0.0f
-
-* Upgrade npm to 1.1.18
-
-* Show licenses in Binary installers
-
-* Domains (isaacs)
-
-* readline: rename "end" to "close" (Nathan Rajlich)
-
-* tcp: make getsockname() return address family as string (Shigeki Ohtsu)
-
-* http, https: fix .setTimeout() (ssuda)
-
-* os: add cross platform EOL character (Mustansir Golawala)
-
-* typed arrays: unexport SizeOfArrayElementForType() (Aaron Jacobs)
-
-* net: honor 'enable' flag in .setNoDelay() (Ben Noordhuis)
-
-* child_process: emit error when .kill fails (Andreas Madsen)
-
-* gyp: fix 'argument list too long' build error (Ben Noordhuis)
-
-* fs.WriteStream: Handle modifications to fs.open (isaacs)
-
-* repl, readline: Handle newlines better (Nathan Rajlich, Nathan Friedly)
-
-* build: target OSX 10.5 when building on darwin (Nathan Rajlich)
-
-* Fix #3052 Handle errors properly in zlib (isaacs)
-
-* build: add support for DTrace and postmortem (Dave Pacheco)
-
-* core: add reusable Slab allocator (Ben Noordhuis)
-
-
-2012.03.30, Version 0.7.7 (unstable), 5cda2542fdb086f9fe5de889bea435a65e377dea
-
-* Upgrade V8 to 3.9.24.7
-
-* Upgrade npm to 1.1.15
-
-* Handle Emoji characters properly (Erik Corry, Bert Belder)
-
-* readline: migrate ansi/vt100 logic from tty to readline (Nathan Rajlich)
-
-* readline: Fix multiline handling (Alex Kocharin)
-
-* add a -i/--interactive flag to force the REPL (Nathan Rajlich)
-
-* debugger: add breakOnException command (Fedor Indutny)
-
-* cluster: kill workers when master dies (Andreas Madsen)
-
-* cluster: add graceful disconnect support (Andreas Madsen)
-
-* child_process: Separate 'close' event from 'exit' (Charlie McConnell)
-
-* typed arrays: add Uint8ClampedArray (Mikael Bourges-Sevenier)
-
-* buffer: Fix byte alignment issues (Ben Noordhuis, Erik Lundin)
-
-* tls: fix CryptoStream.setKeepAlive() (Shigeki Ohtsu)
-
-* Expose http parse error codes (Felix Geisendörfer)
-
-* events: don't delete the listeners array (Ben Noordhuis, Nathan Rajlich)
-
-* process: add process.config to view node's ./configure settings (Nathan Rajlich)
-
-* process: process.execArgv to see node's arguments (Micheil Smith)
-
-* process: fix process.title setter (Ben Noordhuis)
-
-* timers: handle negative or non-numeric timeout values (Ben Noordhuis)
-
-
-2012.03.13, Version 0.7.6 (unstable), f06abda6f58e517349d1b63a2cbf5a8d04a03505
-
-* Upgrade v8 to 3.9.17
-
-* Upgrade npm to 1.1.8
-  - Add support for os/cpu fields in package.json (Adam Blackburn)
-  - Automatically build node-gyp packages containing a binding.gyp
-  - Fix failures unpacking in UNC shares
-  - Never create un-listable directories
-  - Handle cases where an optionalDependency fails to build
-
-* events: newListener emit correct fn when using 'once' (Roly Fentanes)
-
-* url: Ignore empty port component (Łukasz Walukiewicz)
-
-* module: replace 'children' array (isaacs)
-
-* tls: parse multiple values of a key in ssl certificate (Sambasiva Suda)
-
-* cluster: support passing of named pipes (Ben Noordhuis)
-
-* Windows: include syscall in fs errors (Bert Belder)
-
-* http: #2888 Emit end event only once (Igor Zinkovsky)
-
-* readline: add multiline support (Rlidwka)
-
-* process: add `process.hrtime()` (Nathan Rajlich)
-
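`process.hrtime()` gives a high-resolution clock for interval timing. A sketch of the usual pattern (the timed loop is illustrative):

```javascript
// process.hrtime() returns [seconds, nanoseconds]; passing a previous
// reading back in returns the elapsed time relative to that reading.
const start = process.hrtime();
for (let i = 0; i < 1e6; i++) {} // some work to time
const [sec, nsec] = process.hrtime(start);
const elapsedMs = sec * 1e3 + nsec / 1e6;
console.log(`took ${elapsedMs.toFixed(3)} ms`);
```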
-* net, http, https: add localAddress option (Dmitry Nizovtsev)
-
-* addon improvements (Nathan Rajlich)
-
-* build improvements (Ben Noordhuis, Sadique Ali, T.C. Hollingsworth, Nathan Rajlich)
-
-* add support for "SEARCH" request methods (Nathan Rajlich)
-
-* expose the zlib and http_parser version in process.versions (Nathan Rajlich)
-
-
-2012.02.23, Version 0.7.5 (unstable), d384b8b0d2ab7f05465f0a3e15fe20b4e25b5f86
-
-* startup speed improvements (Maciej Małecki)
-
-* crypto: add function getDiffieHellman() (Tomasz Buchert)
-
-* buffer: support decoding of URL-safe base64 (Ben Noordhuis)
-
-* Make QueryString.parse() even faster (Brian White)
-
-* url: decode url entities in auth section (Ben Noordhuis)
-
-* http: support PURGE request method (Ben Noordhuis)
-
-* http: Generate Date headers on responses (Mark Nottingham)
-
-* Fix #2762: Add callback to close function. (Mikeal Rogers)
-
-* dgram: fix out-of-bound memory read (Ben Noordhuis)
-
-* repl: add automatic loading of built-in libs (Brandon Benvie)
-
-* repl: remove double calls where possible (Fedor Indutny)
-
-* Readline improvements. Related: #2737 #2756 (Colton Baker)
-
-* build: disable -fomit-frame-pointer on solaris (Dave Pacheco)
-
-* build: arch detection improvements (Nathan Rajlich)
-
-* build: Make a fat binary for the OS X `make pkg`. (Nathan Rajlich)
-
-* jslint src/ and lib/ on 'make test' (isaacs)
-
-
-2012.02.14, Version 0.7.4 (unstable), de21de920cf93ec40736ada3792a7f85f3eadeda
-
-* Upgrade V8 to 3.9.5
-
-* Upgrade npm to 1.1.1
-
-* build: Detect host_arch better (Karl Skomski)
-
-* debugger: export `debug_port` to `process` (Fedor Indutny)
-
-* api docs: CSS bug fixes (isaacs)
-
-* build: use -fPIC for native addons on UNIX (Nathan Rajlich)
-
-* Re-add top-level v8::Locker (Marcel Laverdet)
-
-* Move images out of the dist tarballs (isaacs)
-
-* libuv: Remove uv_export and uv_import (Ben Noordhuis)
-
-* build: Support x64 build on Windows (Igor Zinkovsky)
-
-
-2012.02.07, Version 0.7.3 (unstable), 99059aad8d654acda4abcfaa68df182b50f2ec90
-
-* Upgrade V8 to 3.9.2
-
-* Revert support for isolates. (Ben Noordhuis)
-
-* cluster: Cleanup docs, event handling, and process.disconnect (Andreas Madsen)
-
-* gyp_addon: link with node.lib on Windows (Nathan Rajlich)
-
-* http: fix case where http-parser is freed twice (koichik)
-
-* Windows: disable RTTI and exceptions (Bert Belder)
-
-
-2012.02.01, Version 0.7.2 (unstable), ec79acb3a6166e30f0bf271fbbfda1fb575b3321
-
-* Update V8 to 3.8.9
-
-* Support for sharing streams across Isolates (Igor Zinkovsky)
-
-* #2636 - Fix case where http_parsers are freed too early (koichik)
-
-* url: Support for IPv6 addresses in URLs (Łukasz Walukiewicz)
-
-* child_process: Add disconnect() method to child processes (Andreas Madsen)
-
-* fs: add O_EXCL support, exclusive open file (Ben Noordhuis)
-
-* fs: more specific error messages (Tj Holowaychuk)
-
-* tty: emit 'unknown' key event if key sequence not found (Dan VerWeire, Nathan Rajlich)
-
-* build: compile release build too if BUILDTYPE=Debug (Ben Noordhuis)
-
-* module: fix --debug-brk on symlinked scripts (Fedor Indutny)
-
-* zlib: fix `Failed to set dictionary` issue (Fedor Indutny)
-
-* waf: predict target arch for OS X (Fedor Indutny)
-
-
-2012.01.23, Version 0.7.1 (unstable), a74354735ab5d5b0fa35a1e4ff7e653757d2069b
-
-* Update V8 to 3.8.8
-
-* Install node-waf by default (Fedor Indutny)
-
-* crypto: Add ability to turn off PKCS padding (Ingmar Runge)
-
-* v8: implement VirtualMemory class on SunOS (Ben Noordhuis)
-
-* Add cluster.setupMaster (Andreas Madsen)
-
-* move `path.exists*` to `fs.exists*` (Maciej Małecki)
-
-* typed arrays: set class name (Ben Noordhuis)
-
-* libuv bug fixes (Igor Zinkovsky, Ben Noordhuis, Dan VerWeire)
-
-
-2012.01.16, Version 0.7.0 (unstable), 9cc55dca6f67a6096c858b841c677b0593404321
-
-* Upgrade V8 to 3.8.6
-
-* Use GYP build system on unix (Ben Noordhuis)
-
-* Experimental isolates support (Ben Noordhuis)
-
-* Improvements to Cluster API (Andreas Madsen)
-
-* Use isolates for internal debugger (Fedor Indutny)
-
-* Bug fixes
-
-
-2012.07.10 Version 0.6.20 (maintenance), 952e513379169ec1b40909d1db056e9bf4294899
-
-* npm: Upgrade to 1.1.37 (isaacs)
-
-* benchmark: Backport improvements made in master (isaacs)
-
-* build: always link with -lz (Trent Mick)
-
-* core: use proper #include directives (Ben Noordhuis)
-
-* cluster: don't silently drop messages when the write queue gets big (Bert Belder)
-
-* windows: don't print error when GetConsoleTitleW returns an empty string (Bert Belder)
-
-
-2012.06.06 Version 0.6.19 (stable), debf552ed2d4a53957446e82ff3c52a8182d5ff4
-
-* npm: upgrade to 1.1.24
-
-* fs: no end emit after createReadStream.pause() (Andreas Madsen)
-
-* vm: cleanup module memory leakage (Marcel Laverdet)
-
-* unix: fix loop starvation under high network load (Ben Noordhuis)
-
-* unix: remove abort() in ev_unref() (Ben Noordhuis)
-
-* windows/tty: never report error after forcibly aborting line-buffered read (Bert Belder)
-
-* windows: skip GetFileAttributes call when opening a file (Bert Belder)
-
-
-2012.05.15 Version 0.6.18 (stable), 4bc1d395de6abed2cf1e4d0b7b3a1480a21c368f
-
-* windows: skip GetFileAttributes call when opening a file (Bert Belder)
-
-* crypto: add PKCS12/PFX support (Sambasiva Suda)
-
-* #3240: child_process: delete NODE_CHANNEL_FD from env in spawn (Ben Noordhuis)
-
-* windows: add test for path.normalize with UNC paths (Bert Belder)
-
-* windows: make path.normalize convert all slashes to backslashes (Bert Belder)
-
-* fs: Automatically close FSWatcher on error (Bert Belder)
-
-* #3258: fs.ReadStream.pause() emits duplicate data event (koichik)
-
-* pipe_wrap: don't assert() on pipe accept errors (Ben Noordhuis)
-
-* Better exception output for module load and process.nextTick (Felix Geisendörfer)
-
-* zlib: fix error reporting (Ben Noordhuis)
-
-* http: Don't destroy on timeout (isaacs)
-
-* #3231: http: Don't try to emit error on a null'ed req object (isaacs)
-
-* #3236: http: Refactor ClientRequest.onSocket (isaacs)
-
-
-2012.05.04 Version 0.6.17 (stable), 4ced23deaf36493f4303a18f6fdce768c58becc0
-
-* Upgrade npm to 1.1.21
-
-* uv: Add support for EROFS errors (Ben Noordhuis, Maciej Małecki)
-
-* uv: Add support for EIO and ENOSPC errors (Fedor Indutny)
-
-* windows: Add support for EXDEV errors (Bert Belder)
-
-* http: Fix client memory leaks (isaacs, Vincent Voyer)
-
-* fs: fix file descriptor leak in sync functions (Ben Noordhuis)
-
-* fs: fix ReadStream / WriteStream double close bug (Ben Noordhuis)
-
-
-2012.04.30 Version 0.6.16 (stable), a1d193963ddc80a27da5da01b59751e14e33d1d6
-
-* Upgrade V8 to 3.6.6.25
-
-* Upgrade npm to 1.1.19
-
-* Windows: add mappings for UV_ENOENT (Bert Belder)
-
-* linux: add IN_MOVE_SELF to inotify event mask (Ben Noordhuis)
-
-* unix: call pipe handle connection cb on accept() error (Ben Noordhuis)
-
-* unix: handle EWOULDBLOCK (Ben Noordhuis)
-
-* map EWOULDBLOCK to UV_EAGAIN (Ben Noordhuis)
-
-* Map ENOMEM to UV_ENOMEM (isaacs)
-
-* Child process: support the `gid` and `uid` options (Bert Belder)
-
-* test: cluster: add worker death event test (Ben Noordhuis)
-
-* typo in node_http_parser (isaacs)
-
-* http_parser: Eat CRLF between requests, even on connection:close. (Ben Noordhuis)
-
-* don't check return value of unsetenv (Ben Noordhuis)
-
-
-2012.04.09 Version 0.6.15 (stable), f160a45b254e591eb33716311c92be533c6d86c4
-
-* Update npm to 1.1.16
-
-* Show licenses in binary installers.
-
-* unix: add uv_fs_read64, uv_fs_write64 and uv_fs_ftruncate64 (Ben Noordhuis)
-
-* add 64bit offset fs functions (Igor Zinkovsky)
-
-* windows: don't report ENOTSOCK when attempting to bind an udp handle twice (Bert Belder)
-
-* windows: backport pipe-connect-to-file fixes from master (Bert Belder)
-
-* windows: never call fs event callbacks after closing the watcher (Bert Belder)
-
-* fs.readFile: don't make the callback before the fd is closed (Bert Belder)
-
-* windows: use 64bit offsets for uv_fs apis (Igor Zinkovsky)
-
-* Fix #2061: segmentation fault on OS X due to stat size mismatch (Ben Noordhuis)
-
-
-2012.03.22 Version 0.6.14 (stable), e513ffef7549a56a5af728e1f0c2c0c8f290518a
-
-* net: don't crash when queued write fails (Igor Zinkovsky)
-
-* sunos: fix EMFILE on process.memoryUsage() (Bryan Cantrill)
-
-* crypto: fix compile-time error with openssl 0.9.7e (Ben Noordhuis)
-
-* unix: ignore ECONNABORTED errors from accept() (Ben Noordhuis)
-
-* Add UV_ENOSPC and mappings to it (Bert Belder)
-
-* http-parser: Fix response body is not read (koichik)
-
-* Upgrade npm to 1.1.12
-  - upgrade node-gyp to 0.3.7
-  - work around AV-locked directories on Windows
-  - Fix isaacs/npm#2293 Don't try to 'uninstall' /
-  - Exclude symbolic links from packages.
-  - Fix isaacs/npm#2275 Spurious 'unresolvable cycle' error.
-  - Exclude/include dot files as if they were normal files
-
-
-2012.03.15 Version 0.6.13 (stable), 9f7f86b534f8556290eb8cad915984ff4ca54996
-
-* Windows: Many libuv test fixes (Bert Belder)
-
-* Windows: avoid uv_guess_handle crash when fd < 0 (Bert Belder)
-
-* Map EBUSY and ENOTEMPTY errors (Bert Belder)
-
-* Windows: include syscall in fs errors (Bert Belder)
-
-* Fix fs.watch ENOSYS on Linux kernel version mismatch (Ben Noordhuis)
-
-* Update npm to 1.1.9
-  - upgrade node-gyp to 0.3.5 (Nathan Rajlich)
-  - Fix isaacs/npm#2249 Add cache-max and cache-min configs
-  - Properly redirect across https/http registry requests
-  - log config usage if undefined key in set function (Kris Windham)
-  - Add support for os/cpu fields in package.json (Adam Blackburn)
-  - Automatically build node-gyp packages containing a binding.gyp
-  - Fix failures unpacking in UNC shares
-  - Never create un-listable directories
-  - Handle cases where an optionalDependency fails to build
-
-
-2012.03.02 Version 0.6.12 (stable), 48a2d34cfe6b7e1c9d15202a4ef5e3c82d1fba35
-
-* Upgrade V8 to 3.6.6.24
-
-* dtrace ustack helper improvements (Dave Pacheco)
-
-* API Documentation refactor (isaacs)
-
-* #2827 net: fix race write() before and after connect() (koichik)
-
-* #2554 #2567 throw if fs args for 'start' or 'end' are strings (AJ ONeal)
-
-* punycode: Update to v1.0.0 (Mathias Bynens)
-
-* Make a fat binary for the OS X pkg (isaacs)
-
-* Fix hang on accessing process.stdin (isaacs)
-
-* repl: make tab completion work on non-objects (Nathan Rajlich)
-
-* Fix fs.watch on OS X (Ben Noordhuis)
-
-* Fix #2515 nested setTimeouts cause premature process exit (Ben Noordhuis)
-
-* windows: fix time conversion in stat (Igor Zinkovsky)
-
-* windows: fs: handle EOF in read (Brandon Philips)
-
-* windows: avoid IOCP short-circuit on non-ifs lsps (Igor Zinkovsky)
-
-* Upgrade npm to 1.1.4 (isaacs)
-  - windows fixes
-  - Bundle nested bundleDependencies properly
-  - install: support --save with url install targets
-  - shrinkwrap: behave properly with url-installed modules
-  - support installing uncompressed tars or single file modules from urls etc.
-  - don't run make clean on rebuild
-  - support HTTPS-over-HTTP proxy tunneling
-
-
-2012.02.17 Version 0.6.11 (stable), 1eb1fe32250fc88cb5b0a97cddf3e02be02e3f4a
-
-* http: allow multiple WebSocket RFC6455 headers (Einar Otto Stangvik)
-
-* http: allow multiple WWW-Authenticate headers (Ben Noordhuis)
-
-* windows: support unicode argv and environment variables (Bert Belder)
-
-* tls: mitigate session renegotiation attacks (Ben Noordhuis)
-
-* tcp, pipe: don't assert on uv_accept() errors (Ben Noordhuis)
-
-* tls: Allow establishing secure connection on the existing socket (koichik)
-
-* dgram: handle close of dgram socket before DNS lookup completes (Seth Fitzsimmons)
-
-* windows: Support half-duplex pipes (Igor Zinkovsky)
-
-* build: disable omit-frame-pointer on solaris systems (Dave Pacheco)
-
-* debugger: fix --debug-brk (Ben Noordhuis)
-
-* net: fix large file downloads failing (koichik)
-
-* fs: fix ReadStream failure to read from existing fd (Christopher Jeffrey)
-
-* net: destroy socket on DNS error (Stefan Rusu)
-
-* dtrace: add missing translator (Dave Pacheco)
-
-* unix: don't flush tty on switch to raw mode (Ben Noordhuis)
-
-* windows: reset brightness when reverting to default text color (Bert Belder)
-
-* npm: update to 1.1.1
-  - Update which, fstream, mkdirp, request, and rimraf
-  - Fix #2123 Set path properly for lifecycle scripts on windows
-  - Mark the root as seen, so we don't recurse into it. Fixes #1838. (Martin Cooper)
-
-
-2012.02.02, Version 0.6.10 (stable), 051908e023f87894fa68f5b64d0b99a19a7db01e
-
-* Update V8 to 3.6.6.20
-
-* Add npm msysgit bash shim to msi installer (isaacs)
-
-* buffers: fix intermittent out of bounds error (Ben Noordhuis)
-
-* buffers: honor length argument in base64 decoder (Ben Noordhuis)
-
-* windows: Fix path.exists regression (Bert Belder)
-
-* Make QueryString.parse run faster (Philip Tellis)
-
-* http: avoid freeing http-parser objects too early (koichik)
-
-* timers: add v0.4 compatibility hack (Ben Noordhuis)
-
-* Proper EPERM error code support (Igor Zinkovsky, Brandon Philips)
-
-* dgram: Implement udp multicast methods on windows (Bert Belder)
-
-
-2012.01.27, Version 0.6.9 (stable), f19e20d33f57c4d2853aaea7d2724d44f3b0012f
-
-* dgram: Bring back missing functionality for Unix (Dan VerWeire, Roman Shtylman, Ben Noordhuis)
-  - Note: Windows UDP support not yet complete.
-
-* http: Fix parser memory leak (koichik)
-
-* zlib: Fix #2365 crashes on invalid input (Nicolas LaCasse)
-
-* module: fix --debug-brk on symlinked scripts (Fedor Indutny)
-
-* Documentation Restyling (Matthew Fitzsimmons)
-
-* Update npm to 1.1.0-3 (isaacs)
-
-* Windows: fix regression in stat() calls to C:\ (Bert Belder)
-
-
-2012.01.19, Version 0.6.8 (stable), d18cebaf8a7ac701dabd71a3aa4eb0571db6a645
-
-* Update V8 to 3.6.6.19
-
-* Numeric key hash collision fix for V8 (Erik Corry, Fedor Indutny)
-
-* Add missing TTY key translations for F1-F5 on Windows (Brandon Benvie)
-
-* path.extname bugfix with . and .. paths (Bert Belder)
-
-* cluster: don't always kill the master on uncaughtException (Ben Noordhuis)
-
-* Update npm to 1.1.0-2 (isaacs)
-
-* typed arrays: set class name (Ben Noordhuis)
-
-* zlib binding cleanup (isaacs, Bert Belder)
-
-* dgram: use slab memory allocator (Michael Bernstein)
-
-* fix segfault #2473
-
-* #2521 60% improvement in fs.stat on Windows (Igor Zinkovsky)
-
-
-2012.01.06, Version 0.6.7 (stable), d5a189acef14a851287ee555f7a39431fe276e1c
-
-* V8 hash collision fix (Breaks MIPS) (Bert Belder, Erik Corry)
-
-* Upgrade V8 to 3.6.6.15
-
-* Upgrade npm to 1.1.0-beta-10 (isaacs)
-
-* many doc updates (Ben Noordhuis, Jeremy Martin, koichik, Dave Irvine,
-  Seong-Rak Choi, Shannen, Adam Malcontenti-Wilson)
-
-* Fix segfault in node_http_parser.cc
-
-* dgram, timers: fix memory leaks (Ben Noordhuis, Yoshihiro Kikuchi)
-
-* repl: fix repl.start not passing the `ignoreUndefined` arg (Damon Oehlman)
-
-* #1980: Socket.pause null reference when called on a closed Stream (koichik)
-
-* #2263: XMLHttpRequest piped in a writable file stream hang (koichik)
-
-* #2069: http resource leak (koichik)
-
-* buffer.readInt global pollution fix (Phil Sung)
-
-* timers: fix performance regression (Ben Noordhuis)
-
-* #2308, #2246: node swallows openssl error on request (koichik)
-
-* #2114: timers: remove _idleTimeout from item in .unenroll() (James Hartig)
-
-* #2379: debugger: Request backtrace w/o refs (Fedor Indutny)
-
-* simple DTrace ustack helper (Dave Pacheco)
-
-* crypto: rewrite HexDecode without snprintf (Roman Shtylman)
-
-* crypto: don't ignore DH init errors (Ben Noordhuis)
-
-
-2011.12.14, Version 0.6.6 (stable), 9a059ea69e1f6ebd8899246682d8ca257610b8ab
-
-* npm update to 1.1.0-beta-4 (Isaac Z. Schlueter)
-
-* cli: fix output of --help (Ben Noordhuis)
-
-* new website
-
-* pause/resume semantics for stdin (Isaac Z. Schlueter)
-
-* Travis CI integration (Maciej Małecki)
-
-* child_process: Fix bug regarding closed stdin (Ben Noordhuis)
-
-* Enable upgrades in MSI. (Igor Zinkovsky)
-
-* net: Fixes memory leak (Ben Noordhuis)
-
-* fs: handle fractional or NaN ReadStream buffer size (Ben Noordhuis)
-
-* crypto: fix memory leaks in PBKDF2 error path (Ben Noordhuis)
-
-
-2011.12.04, Version 0.6.5 (stable), 6cc94db653a2739ab28e33b2d6a63c51bd986a9f
-
-* npm workaround Windows antivirus software (isaacs)
-
-* Upgrade V8 to 3.6.6.11
-
-
-2011.12.02, Version 0.6.4 (stable), 9170077f13e5e5475b23d1d3c2e7f69bfe139727
-
-* doc improvements (Kyle Young, Tim Oxley, Roman Shtylman, Mathias Bynens)
-
-* upgrade bundled npm (Isaac Schlueter)
-
-* polish Windows installer (Igor Zinkovsky, Isaac Schlueter)
-
-* punycode: upgrade to v0.2.1 (Mathias Bynens)
-
-* build: add --without-npm flag to configure script
-
-* sys: deprecate module some more, print stack trace if NODE_DEBUG=sys
-
-* cli: add -p switch, prints result of --eval
-
-* #1997: fix Blowfish ECB encryption and decryption (Ingmar Runge)
-
-* #2223: fix socket 'close' event being emitted twice
-
-* #2224: fix RSS memory usage > 4 GB reporting (Russ Bradberry)
-
-* #2225: fix util.inspect() object stringification bug (Nathan Rajlich)
-
-
-2011.11.25, Version 0.6.3 (stable), b159c6d62e5756d3f8847419d29c6959ea288b56
-
-* #2083 Land NPM in Node. It is included in packages/installers and installed
-  on `make install`.
-
-* #2076 Add logos to windows installer.
-
-* #1711 Correctly handle http requests without headers. (Ben Noordhuis,
-  Felix Geisendörfer)
-
-* TLS: expose more openssl SSL context options and constants. (Ben Noordhuis)
-
-* #2177 Windows: don't kill UDP socket when a packet fails to reach its
-  destination. (Bert Belder)
-
-* Windows: support paths longer than 260 characters. (Igor Zinkovsky)
-
-* Windows: correctly resolve drive-relative paths. (Bert Belder)
-
-* #2166 Don't leave file descriptor open after lchmod. (Isaac Schlueter)
-
-* #2084 Add OS X .pkg build script to make file.
-
-* #2160 Documentation improvements. (Ben Noordhuis)
-
-
-2011.11.18, Version 0.6.2 (stable), a4402f0b2e410b19375a1d5c5fb7fe7f66f3c7f8
-
-* doc improvements (Artur Adib, Trevor Burnham, Ryan Emery, Trent Mick)
-
-* timers: remember extra setTimeout() arguments when timeout==0
-
-* punycode: use Mathias Bynens's punycode library, it's more compliant
-
-* repl: improved tab completion (Ryan Emery)
-
-* buffer: fix range checks in .writeInt() functions (Lukasz Walukiewicz)
-
-* tls: make cipher list configurable
-
-* addons: make Buffer and ObjectWrap visible to Windows add-ons (Bert Belder)
-
-* crypto: add PKCS#1 a.k.a RSA public key verification support
-
-* windows: fix stdout writes when redirected to nul
-
-* sunos: fix build on Solaris and Illumos
-
-* Upgrade V8 to 3.6.6.8
-
-
-2011.11.11, Version 0.6.1 (stable), 170f2addb2dd0c625bc4a6d461e89a31ad68b79b
-
-* doc improvements (Eric Lovett, Ben Noordhuis, Scott Anderson, Yoji SHIDARA)
-
-* crypto: make thread-safe (Ben Noordhuis)
-
-* fix process.kill error object
-
-* debugger: correctly handle source with multi-byte characters (Shigeki Ohtsu)
-
-* make stdout and stderr non-destroyable (Igor Zinkovsky)
-
-* fs: don't close uninitialized fs.watch handle (Ben Noordhuis)
-
-* #2026 fix man page install on BSDs (Ben Noordhuis)
-
-* #2040 fix unrecognized errno assert in uv_err_name
-
-* #2043 fs: mkdir() should call callback if mode is omitted
-
-* #2045 fs: fix fs.realpath on windows to return on error (Benjamin Pasero)
-
-* #2047 minor cluster improvements
-
-* #2052 readline get window columns correctly
-
-* Upgrade V8 to 3.6.6.7
-
-
-2011.11.04, Version 0.6.0 (stable), 865b077819a9271a29f982faaef99dc635b57fbc
-
-* print undefined on undefined values in REPL (Nathan Rajlich)
-
-* doc improvements (koichik, seebees, bnoordhuis,
-  Maciej Małecki, Jacob Kragh)
-
-* support native addon loading in windows (Bert Belder)
-
-* rename getNetworkInterfaces() to networkInterfaces() (bnoordhuis)
-
-* add pending accepts knob for windows (igorzi)
-
-* http.request(url.parse(x)) (seebees)
-
-* #1929 zlib Respond to 'resume' events properly (isaacs)
-
-* stream.pipe: Remove resume and pause events
-
-* test fixes for windows (igorzi)
-
-* build system improvements (bnoordhuis)
-
-* #1936 tls: does not emit 'end' from EncryptedStream (koichik)
-
-* #758 tls: add address(), remoteAddress/remotePort
-
-* #1399 http: emit Error object after .abort() (bnoordhuis)
-
-* #1999 fs: make mkdir() default to 0777 permissions (bnoordhuis)
-
-* #2001 fix pipe error codes
-
-* #2002 Socket.write should reset timeout timer
-
-* stdout and stderr are blocking when associated with a file, too.
-
-* remote debugger support on windows (Bert Belder)
-
-* convenience methods for zlib (Matt Robenolt)
-
-* process.kill support on windows (igorzi)
-
-* process.uptime() support on windows (igorzi)
-
-* Return IPv4 addresses before IPv6 addresses from getaddrinfo
-
-* util.inspect improvements (Nathan Rajlich)
-
-* cluster module api changes
-
-* Downgrade V8 to 3.6.6.6
-
-
-2011.10.21, Version 0.5.10 (unstable), 220e61c1f65bf4db09699fcf6399c0809c0bc446
-
-* Remove cmake build system, support for Cygwin, legacy code base,
-  process.ENV, process.ARGV, process.memoryUsage().vsize, os.openOSHandle
-
-* Documentation improvements (Igor Zinkovsky, Bert Belder, Ilya Dmitrichenko,
-  koichik, Maciej Małecki, Guglielmo Ferri, isaacs)
-
-* Performance improvements (Daniel Ennis, Bert Belder, Ben Noordhuis)
-
-* Long process.title support (Ben Noordhuis)
-
-* net: register net.Server callback only once (Simen Brekken)
-
-* net: fix connect queue bugs (Ben Noordhuis)
-
-* debugger: fix backtrace err handling (Fedor Indutny)
-
-* Use getaddrinfo instead of c-ares for dns.lookup
-
-* Emit 'end' from crypto streams on close
-
-* #1902 buffer: use NO_NULL_TERMINATION flag (koichik)
-
-* #1907 http: Added support for HTTP PATCH verb (Thomas Parslow)
-
-* #1644 add GetCPUInfo on windows (Karl Skomski)
-
-* #1484, #1834, #1482, #771 Don't use a separate context for the repl.
-  (isaacs)
-
-* #1882 zlib Update 'availOutBefore' value, and test (isaacs)
-
-* #1888 child_process.fork: don't modify args (koichik)
-
-* #1516 tls: requestCert unusable with Firefox and Chrome (koichik)
-
-* #1467 tls: The TLS API is inconsistent with the TCP API (koichik)
-
-* #1894 net: fix error handling in listen() (koichik)
-
-* #1860 console.error now goes through uv_tty_t
-
-* Upgrade V8 to 3.7.0
-
-* Upgrade GYP to r1081
-
-
-2011.10.10, Version 0.5.9 (unstable), 3bd9b08fb125b606f97a4079b147accfdeebb07d
-
-* fs.watch interface backed by kqueue, inotify, and ReadDirectoryChangesW
-  (Igor Zinkovsky, Ben Noordhuis)
-
-* add dns.resolveTxt (Christian Tellnes)
-
-* Remove legacy http library (Ben Noordhuis)
-
-* child_process.fork returns and works on Windows. Allows passing handles.
-  (Igor Zinkovsky, Bert Belder)
-
-* #1774 Lint and clean up for --harmony_block_scoping (Tyler Larson, Colton
-  Baker)
-
-* #1813 Fix ctrl+c on Windows (Bert Belder)
-
-* #1844 unbreak --use-legacy (Ben Noordhuis)
-
-* process.stderr now goes through libuv. Both process.stdout and
-  process.stderr are blocking when referencing a TTY.
-
-* net_uv performance improvements (Ben Noordhuis, Bert Belder)
-
-
-2011.09.30, Version 0.5.8 (unstable), 7cc17a0cea1d25188c103745a7d0c24375e3a609
-
-* zlib bindings (isaacs)
-
-* Windows supports TTY ANSI escape codes (Bert Belder)
-
-* Debugger improvements (Fedor Indutny)
-
-* crypto: look up SSL errors with ERR_print_errors() (Ben Noordhuis)
-
-* dns callbacks go through MakeCallback now
-
-* Raise an error when a malformed package.json file is found. (Ben Leslie)
-
-* buffers: handle bad length argument in constructor (Ben Noordhuis)
-
-* #1726, unref process.stdout
-
-* Doc improvements (Ben Noordhuis, Fedor Indutny, koichik)
-
-* Upgrade libuv to fe18438
-
-
-2011.09.16, Version 0.5.7 (unstable), 558241166c4f3c516e5a448e676db0b57119212f
-
-* Upgrade V8 to 3.6.4
-
-* Improve Windows compatibility
-
-* Documentation improvements
-
-* Debugger and REPL improvements (Fedor Indutny)
-
-* Add legacy API support: net.Stream(fd), process.stdout.writable,
-  process.stdout.fd
-
-* Fix mkdir EEXIST handling (isaacs)
-
-* Use net_uv instead of net_legacy for stdio
-
-* Do not load readline from util.inspect
-
-* #1673 Fix bug related to V8 context with accessors (Fedor Indutny)
-
-* #1634 util: Fix inspection for Error (koichik)
-
-* #1645 fs: Add positioned file writing feature to fs.WriteStream (Thomas
-  Shinnick)
-
-* #1637 fs: Unguarded fs.watchFile cache statWatchers checking fixed (Thomas
-  Shinnick)
-
-* #1695 Forward customFds to ChildProcess.spawn
-
-* #1707 Fix hasOwnProperty security problem in querystring (isaacs)
-
-* #1719 Drain OpenSSL error queue
-
-
-2011.09.08, Version 0.5.6 (unstable), b49bec55806574a47403771bce1ee379c2b09ca2
-
-* #345, #1635, #1648 Documentation improvements (Thomas Shinnick,
-  Abimanyu Raja, AJ ONeal, Koichi Kobayashi, Michael Jackson, Logan Smyth,
-  Ben Noordhuis)
-
-* #650 Improve path parsing on windows (Bert Belder)
-
-* #752 Remove headers sent check in OutgoingMessage.getHeader()
-  (Peter Lyons)
-
-* #1236, #1438, #1506, #1513, #1621, #1640, #1647 Libuv-related bugs fixed
-  (Jorge Chamorro Bieling, Peter Bright, Luis Lavena, Igor Zinkovsky)
-
-* #1296, #1612 crypto: Fix BIO's usage. (Koichi Kobayashi)
-
-* #1345 Correctly set socket.remoteAddress with libuv backend (Bert Belder)
-
-* #1429 Don't clobber quick edit mode on windows (Peter Bright)
-
-* #1503 Make libuv backend default on unix, override with `node --use-legacy`
-
-* #1565 Fix fs.stat for paths ending with \ on windows (Igor Zinkovsky)
-
-* #1568 Fix x509 certificate subject parsing (Koichi Kobayashi)
-
-* #1586 Make socket write encoding case-insensitive (Koichi Kobayashi)
-
-* #1591, #1656, #1657 Implement fs in libuv, remove libeio and pthread-win32
-  dependency on windows (Igor Zinkovsky, Ben Noordhuis, Ryan Dahl,
-  Isaac Schlueter)
-
-* #1592 Don't load-time link against CreateSymbolicLink on windows
-  (Peter Bright)
-
-* #1601 Improve API consistency when dealing with the socket underlying a HTTP
-  client request (Mikeal Rogers)
-
-* #1610 Remove DigiNotar CA from trusted list (Isaac Schlueter)
-
-* #1617 Added some win32 os functions (Karl Skomski)
-
-* #1624 avoid buffer overrun with 'binary' encoding (Koichi Kobayashi)
-
-* #1633 make Buffer.write() always set _charsWritten (Koichi Kobayashi)
-
-* #1644 Windows: set executables to be console programs (Peter Bright)
-
-* #1651 improve inspection for sparse array (Koichi Kobayashi)
-
-* #1672 set .code='ECONNRESET' on socket hang up errors (Ben Noordhuis)
-
-* Add test case for foaf+ssl client certificate (Niclas Hoyer)
-
-* Added RPATH environment variable to override run-time library paths
-  (Ashok Mudukutore)
-
-* Added TLS client-side session resumption support (Sean Cunningham)
-
-* Added additional properties to getPeerCertificate (Nathan Rixham,
-  Niclas Hoyer)
-
-* Don't eval repl command twice when an error is thrown (Nathan Rajlich)
-
-* Improve util.isDate() (Nathan Rajlich)
-
-* Improvements in libuv backend and bindings, upgrade libuv to
-  bd6066cb349a9b3a1b0d87b146ddaee06db31d10
-
-* Show warning when using lib/sys.js (Maciej Małecki)
-
-* Support plus sign in url protocol (Maciej Małecki)
-
-* Upgrade V8 to 3.6.2
-
-
-2011.08.26, Version 0.5.5 (unstable), d2d53d4bb262f517a227cc178a1648094ba54c20
-
-* typed arrays, implementation from Plesk
-
-* fix IP multicast on SunOS
-
-* fix DNS lookup order: IPv4 first, IPv6 second (--use-uv only)
-
-* remove support for UNIX datagram sockets (--use-uv only)
-
-* UDP support for Windows (Bert Belder)
-
-* #1572 improve tab completion for objects in the REPL (Nathan Rajlich)
-
-* #1563 fix buffer overflow in child_process module (reported by Dean McNamee)
-
-* #1546 fix performance regression in http module (reported by Brian Geffon)
-
-* #1491 add PBKDF2 crypto support (Glen Low)
-
-* #1447 remove deprecated http.cat() function (Mikeal Rogers)
-
-* #1140 fix incorrect dispatch of vm.runInContext's filename argument
-  (Antranig Basman)
-
-* #1140 document vm.runInContext() and vm.createContext() (Antranig Basman)
-
-* #1428 fix os.freemem() on 64 bits freebsd (Artem Zaytsev)
-
-* #1164 make all DNS lookups async, fixes uncatchable exceptions
-  (Koichi Kobayashi)
-
-* fix incorrect ssl shutdown check (Tom Hughes)
-
-* various cmake fixes (Tom Hughes)
-
-* improved documentation (Koichi Kobayashi, Logan Smyth, Fedor Indutny,
-  Mikeal Rogers, Maciej Małecki, Antranig Basman, Mickaël Delahaye)
-
-* upgrade libuv to commit 835782a
-
-* upgrade V8 to 3.5.8
-
-
-2011.08.12, Version 0.5.4 (unstable), cfba1f59224ff8602c3fe9145181cad4c6df89a9
-
-* libuv/Windows compatibility improvements
-
-* Build on Microsoft Visual Studio via GYP. Use generate-projects.bat to
-  build .sln files. (Peter Bright, Igor Zinkovsky)
-
-* Make Mikeal's HTTP agent client the default. Use old HTTP client with
-  --use-http1
-
-* Fixes https host header default port handling. (Mikeal Rogers)
-
-* #1440 strip byte order marker when loading *.js and *.json files
-  (Ben Noordhuis)
-
-* #1434 Improve util.format() compatibility with browser. (Koichi Kobayashi)
-
-* Provide unchecked uint entry points for integer Buffer.read/writeInt
-  methods. (Robert Mustacchi)
-
-* CMake improvements (Tom Hughes)
-
-* Upgrade V8 to 3.5.4.
-
-
-2011.08.01, Version 0.5.3 (unstable), 4585330afef44ddfb6a4054bd9b0f190b352628b
-
-* Fix crypto encryption/decryption with Base64. (SAWADA Tadashi)
-
-* #243 Add an optional length argument to Buffer.write() (koichik)
-
-* #657 convert nonbuffer data to string in fs.writeFile/Sync
-  (Daniel Pihlström)
-
-* Add process.features, remove process.useUV (Ben Noordhuis)
-
-* #324 Fix crypto hmac to accept binary keys + add test cases from rfc 2202
-  and 4231 (Stefan Bühler)
-
-* Add Socket::bytesRead, Socket::bytesWritten (Alexander Uvarov)
-
-* #572 Don't print result of --eval in CLI (Ben Noordhuis)
-
-* #1223 Fix http.ClientRequest crashes if end() was called twice (koichik)
-
-* #1383 Emit 'close' after all connections have closed (Felix Geisendörfer)
-
-* Add sprintf-like util.format() function (Ben Noordhuis)
-
-* Add support for TLS SNI (Fedor Indutny)
-
-* New http agent implementation. Off by default; the command line flag
-  --use-http2 will enable it. "make test-http2" will run the tests
-  for the new implementation. (Mikeal Rogers)
-
-* Revert AMD compatibility. (isaacs)
-
-* Windows: improvements, child_process support.
-
-* Remove pkg-config file.
-
-* Fix startup time regressions.
-
-* doc improvements
-
-
-2011.07.22, Version 0.5.2 (unstable), 08ffce1a00dde1199174b390a64a90b60768ddf5
-
-* libuv improvements; named pipe support
-
-* #1242 check for SSL_COMP_get_compression_methods() (Ben Noordhuis)
-
-* #1348 remove require.paths (isaacs)
-
-* #1349 Delimit NODE_PATH with ; on Windows (isaacs)
-
-* #1335 Remove EventEmitter from C++
-
-* #1357 Load json files with require() (isaacs)
-
-* #1374 fix setting ServerResponse.statusCode in writeHead (Trent Mick)
-
-* Fixed: GC was being run too often.
-
-* Upgrade V8 to 3.4.14
-
-* doc improvements
-
-
-2011.07.14, Version 0.5.1 (unstable), f8bfa54d0fa509f9242637bef2869a1b1e842ec8
-
-* #1233 Fix os.totalmem on FreeBSD amd64 (Artem Zaytsev)
-
-* #1149 IDNA and Punycode support in url.parse
-  (Jeremy Selier, Ben Noordhuis, isaacs)
-
-* Export $CC and $CXX to uv and V8's build systems
-
-* Include pthread-win32 static libraries in build (Igor Zinkovsky)
-
-* #1199, #1094 Fix fs can't handle large file on 64bit platform (koichik)
-
-* #1281 Make require a public member of module (isaacs)
-
-* #1303 Stream.pipe returns the destination (Elijah Insua)
-
-* #1229 Addons should not -DEV_MULTIPLICITY=0 (Brian White)
-
-* libuv backend improvements
-
-* Upgrade V8 to 3.4.10
-
-
-2011.07.05, Version 0.5.0 (unstable), ae7ed8482ea7e53c59acbdf3cf0e0a0ae9d792cd
-
-* New non-default libuv backend to support IOCP on Windows.
-  Use --use-uv to enable.
-
-* deprecate http.cat
-
-* docs improved.
-
-* add child_process.fork
-
-* add fs.utimes() and fs.futimes() support (Ben Noordhuis)
-
-* add process.uptime() (Tom Hughes)
-
-* add path.relative (Tony Huang)
-
-* add os.getNetworkInterfaces()
-
-* add remoteAddress and remotePort for client TCP connections
-  (Brian White)
-
-* add secureOptions flag, setting ciphers,
-  SSL_OP_CRYPTOPRO_TLSEXT_BUG to TLS (Theo Schlossnagle)
-
-* add process.arch (Nathan Rajlich)
-
-* add reading/writing of floats and doubles from/to buffers (Brian White)
-
-* Allow script to be read from stdin
-
-* #477 add Buffer::fill method to do memset (Konstantin Käfer)
-
-* #573 Diffie-Hellman support to crypto module (Håvard Stranden)
-
-* #695 add 'hex' encoding to buffer (isaacs)
-
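A minimal sketch of the 'hex' buffer encoding from #695, shown with the modern Buffer.from API (the 0.5-era spelling was `new Buffer('deadbeef', 'hex')`):

```javascript
// 'hex' encoding maps two hex characters to one byte, in both directions.
const buf = Buffer.from('deadbeef', 'hex');
console.log(buf.length);          // 4
console.log(buf.toString('hex')); // "deadbeef"
```
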
-* #851 Update how REPLServer uses contexts (Ben Weaver)
-
-* #853 add fs.lchown, fs.lchmod, fs.fchmod, fs.fchown (isaacs)
-
-* #889 Allow to remove all EventEmitter listeners at once
-  (Felix Geisendörfer)
-
-* #926 OpenSSL NPN support (Fedor Indutny)
-
-* #955 Change ^C handling in REPL (isaacs)
-
-* #979 add support for Unix Domain Sockets to HTTP (Mark Cavage)
-
-* #1173 #1170 add AMD, asynchronous module definition (isaacs)
-
-* DTrace probes: support X-Forwarded-For (Dave Pacheco)
-
-
-2011.09.15, Version 0.4.12 (stable), 771ba34ca7b839add2ef96879e1ffc684813cf7c
-
-* Improve docs
-
-* #1563 overflow in ChildProcess custom_fd.
-
-* #1569, parse error on multi-line HTTP headers. (Ben Noordhuis)
-
-* #1586 net: Socket write encoding case sensitivity (koichik)
-
-* #1610 Remove DigiNotar CA from trusted list (isaacs)
-
-* #1624 buffer: Avoid overrun with 'binary' encoding. (koichik)
-
-* #1633 buffer: write() should always set _charsWritten. (koichik)
-
-* #1707 hasOwnProperty usage security hole in querystring (isaacs)
-
-* #1719 Drain OpenSSL error queue
-
-* Fix error reporting in net.Server.listen
-
-
-2011.08.17, Version 0.4.11 (stable), a745d19ce7d1c0e3778371af4f0346be70cf2c8e
-
-* #738 Fix crypto encryption/decryption with Base64. (SAWADA Tadashi)
-
-* #1202 net.createConnection defer DNS lookup error events to next tick
-  (Ben Noordhuis)
-
-* #1374 fix setting ServerResponse.statusCode in writeHead (Trent Mick)
-
-* #1417 Fix http.ClientRequest crashes if end() was called twice
-
-* #1497 querystring: Replace 'in' test with 'hasOwnProperty' (isaacs)
-
-* #1546 http perf improvement
-
-* fix memleak in libeio (Tom Hughes)
-
-* cmake improvements (Tom Hughes)
-
-* node_net.cc: fix incorrect sizeof() (Tom Hughes)
-
-* Windows/cygwin: no more GetConsoleTitleW errors on XP (Bert Belder)
-
-* Doc improvements (koichik, Logan Smyth, Ben Noordhuis, Arnout Kazemier)
-
-
-2011.07.19, Version 0.4.10 (stable), 1b8dd65d6e3b82b6863ef38835cc436c5d30c1d5
-
-* #394 Fix Buffer drops last null character in UTF-8
-
-* #829 Backport r8577 from V8 (Ben Noordhuis)
-
-* #877 Don't wait for HTTP Agent socket pool to establish connections.
-
-* #915 Find kqueue on FreeBSD correctly (Brett Kiefer)
-
-* #1085 HTTP: Fix race in abort/dispatch code (Stefan Rusu)
-
-* #1274 debugger improvement (Yoshihiro Kikuchi)
-
-* #1291 Properly respond to HEAD during end(body) hot path (Reid Burke)
-
-* #1304 TLS: Fix race in abort/connection code (Stefan Rusu)
-
-* #1360 Allow _ in url hostnames.
-
-* Revert 37d529f8 - unbreaks debugger command parsing.
-
-* Bring back global execScript
-
-* Doc improvements
-
-
-2011.06.29, Version 0.4.9 (stable), de44eafd7854d06cd85006f509b7051e8540589b
-
-* Improve documentation
-
-* #1095 error handling bug in stream.pipe() (Felix Geisendörfer)
-
-* #1097 Fix a few leaks in node_crypto.cc (Ben Noordhuis)
-
-* #562 #1078 Parse file:// urls properly (Ryan Petrello)
-
-* #880 Option to disable SSLv2 (Jérémy Lal)
-
-* #1087 Disabling SSL compression is skipped with early OpenSSLs.
-
-* #1144 debugger: don't allow users to input non-valid commands
-  (Siddharth Mahendraker)
-
-* Perf improvement for util.inherits
-
-* #1166 Support for signature verification with RSA/DSA public keys
-  (Mark Cavage)
-
-* #1177 Remove node_modules lookup optimization to better support
-  nested project structures (Mathias Buus)
-
-* #1203 Add missing scope.Close to fs.sendfileSync
-
-* #1187 Support multiple 'link' headers
-
-* #1196 Fix -e/--eval can't load module from node_modules (Koichi Kobayashi)
-
-* Upgrade V8 to 3.1.8.25, upgrade http-parser.
-
-
-2011.05.20, Version 0.4.8 (stable), 7dd22c26e4365698dc3efddf138c4d399cb912c8
-
-* #974 Properly report traceless errors (isaacs)
-
-* #983 Better JSON.parse error detection in REPL (isaacs)
-
-* #836 Agent socket errors bubble up to req only if req exists
-
-* #1041 Fix event listener leak check timing (koichik)
-
-* #1038 Fix dns.resolve() with 'PTR' throws Error: Unknown type "PTR"
-  (koichik)
-
-* #1073 Share SSL context between server connections (Fedor Indutny)
-
-* Disable compression with OpenSSL. Improves memory perf.
-
-* Implement os.totalmem() and os.freemem() for SunOS (Alexandre Marangone)
-
-* Fix a special characters in URL regression (isaacs)
-
-* Fix idle timeouts in HTTPS (Felix Geisendörfer)
-
-* SlowBuffer.write() with 'ucs2' throws ReferenceError. (koichik)
-
-* http.ServerRequest 'close' sometimes gets an error argument
-  (Felix Geisendörfer)
-
-* Doc improvements
-
-* cleartextstream.destroy() should close(2) the socket. Previously was being
-  mapped to a shutdown(2) syscall.
-
-* No longer compile out asserts and debug statements in normal build.
-
-* Debugger improvements.
-
-* Upgrade V8 to 3.1.8.16.
-
-
-2011.04.22, Version 0.4.7 (stable), c85455a954411b38232e79752d4abb61bb75031b
-
-* Don't emit error on ECONNRESET from read() #670
-
-* Fix: Multiple pipes to the same stream were broken #929
-  (Felix Geisendörfer)
-
-* URL parsing/formatting corrections #954 (isaacs)
-
-* make it possible to do repl.start('', stream) (Wade Simmons)
-
-* Add os.loadavg for SunOS (Robert Mustacchi)
-
-* Fix timeouts with floating point numbers #897 (Jorge Chamorro Bieling)
-
-* Improve docs.
-
-
-2011.04.13, Version 0.4.6 (stable), 58002d56bc79410c5ff397fc0e1ffec0665db38a
-
-* Don't error on ENOTCONN from shutdown() #670
-
-* Auto completion of built-in debugger suggests prefix match rather than
-  partial match. (koichik)
-
-* circular reference in vm modules. #822 (Jakub Lekstan)
-
-* http response.readable should be false after 'end' #867 (Abe Fettig)
-
-* Implement os.cpus() and os.uptime() on Solaris (Scott McWhirter)
-
-* fs.ReadStream: Allow omission of end option for range reads #801
-  (Felix Geisendörfer)
-
-* Buffer.write() with UCS-2 should not write partial characters
-  #916 (koichik)
-
-* Pass secureProtocol through on tls.Server creation (Theo Schlossnagle)
-
-* TLS use RC4-SHA by default
-
-* Don't strangely drop out of event loop on HTTPS client uploads #892
-
-* Doc improvements
-
-* Upgrade v8 to 3.1.8.10
-
-
-2011.04.01, Version 0.4.5 (stable), 787a343b588de26784fef97f953420b53a6e1d73
-
-* Fix listener leak in stream.pipe() (Mikeal Rogers)
-
-* Retain buffers in fs.read/write() GH-814 (Jorge Chamorro Bieling)
-
-* TLS performance improvements
-
-* SlowBuffer.prototype.slice bug GH-843
-
-* process.stderr.write should return true
-
-* Immediate pause/resume race condition GH-535 (isaacs)
-
-* Set default host header properly GH-721 (isaacs)
-
-* Upgrade V8 to 3.1.8.8
-
-
-2011.03.26, Version 0.4.4 (stable), 25122b986a90ba0982697b7abcb0158c302a1019
-
-* CryptoStream.end shouldn't throw if not writable GH-820
-
-* Drop out if connection destroyed before connect() GH-819
-
-* expose https.Agent
-
-* Correctly setsid in tty.open GH-815
-
-* Bug fix for failed buffer construction
-
-* Added support for removing .once listeners (GH-806)
-
-* Upgrade V8 to 3.1.8.5
-
-
-2011.03.18, Version 0.4.3 (stable), c095ce1a1b41ca015758a713283bf1f0bd41e4c4
-
-* Don't decrease server connection counter again if destroy() is called more
-  than once GH-431 (Andreas Reich, Anders Conbere)
-
-* Documentation improvements (koichik)
-
-* Fix bug with setMaxListeners GH-682
-
-* Start up memory footprint improvement. (Tom Hughes)
-
-* Solaris improvements.
-
-* Buffer::Length(Buffer*) should not invoke itself recursively GH-759 (Ben
-  Noordhuis)
-
-* TLS: Advertise support for client certs GH-774 (Theo Schlossnagle)
-
-* HTTP Agent bugs: GH-787, GH-784, GH-803.
-
-* Don't call GetMemoryUsage every 5 seconds.
-
-* Upgrade V8 to 3.1.8.3
-
-
-2011.03.02, Version 0.4.2 (stable), 39280e1b5731f3fcd8cc42ad41b86cdfdcb6d58b
-
-* Improve docs.
-
-* Fix process.on edge case with signal event (Alexis Sellier)
-
-* Pragma HTTP header comma separation
-
-* In addition to 'aborted' emit 'close' from incoming requests
-  (Felix Geisendörfer)
-
-* Fix memleak in vm.runInNewContext
-
-* Do not cache modules that throw exceptions (Felix Geisendörfer)
-
-* Build system changes for libnode (Aria Stewart)
-
-* Read up the prototype of the 'env' object. (Nathan Rajlich)
-
-* Add 'close' and 'aborted' events to Agent responses
-
-* http: fix missing 'drain' events (Russell Haering)
-
-* Fix process.stdout.end() throws ENOTSOCK error. (Koichi Kobayashi)
-
-* REPL bug fixes (isaacs)
-
-* node_modules folders should be highest priority (isaacs)
-
-* URL parse more safely (isaacs)
-
-* Expose errno with a string for dns/cares (Felix Geisendörfer)
-
-* Fix tty.setWindowSize
-
-* spawn: setuid after chdir (isaacs)
-
-* SIGUSR1 should break the VM without delay
-
-* Upgrade V8 to 3.1.8.
-
-
-2011.02.19, Version 0.4.1 (stable), e8aef84191bc2c1ba2bcaa54f30aabde7f03769b
-
-* Fixed field merging with progressive fields on writeHead()
-  (TJ Holowaychuk)
-
-* Make the repl respect node_modules folders (isaacs)
-
-* Fix for DNS fail in HTTP request (Richard Rodger)
-
-* Default to port 80 for http.request and http.get.
-
-* Improve V8 support for Cygwin (Bert Belder)
-
-* Fix fs.open param parsing. (Felix Geisendörfer)
-
-* Fixed null signal.
-
-* Fix various HTTP and HTTPS bugs
-
-* cmake improvements (Tom Hughes)
-
-* Fix: TLS sockets should not be writable after 'end'
-
-* Fix os.cpus() on cygwin (Brian White)
-
-* MinGW: OpenSSL support (Bert Belder)
-
-* Upgrade V8 to 3.1.5, libev to 4.4.
-
-
-2011.02.10, Version 0.4.0 (stable), eb155ea6f6a6aa341aa8c731dca8da545c6a4008
-
-* require() improvements (isaacs)
-  - understand package.json (isaacs)
-  - look for 'node_modules' dir
-
-* cmake fixes (Daniel Gröber)
-
-* http: fix buffer writes to outgoing messages (Russell Haering)
-
-* Expose UCS-2 Encoding (Konstantin Käfer)
-
-* Support strings for octal modes (isaacs)
-
-* Support array-ish args to Buffer ctor (isaacs)
-
-* cygwin and mingw improvements (Bert Belder)
-
-* TLS improvements
-
-* Fewer syscalls during require (Bert Belder, isaacs)
-
-* More DTrace probes (Bryan Cantrill, Robert Mustacchi)
-
-* 'pipe' event on pipe() (Mikeal Rogers)
-
-* CRL support in TLS (Theo Schlossnagle)
-
-* HTTP header manipulation methods (Tim Caswell, Charlie Robbins)
-
-* Upgrade V8 to 3.1.2
-
-
-2011.02.04, Version 0.3.8 (unstable), 9493b7563bff31525b4080df5aeef09747782d5e
-
-* Add req.abort() for client side requests.
-
-* Add exception.code for easy testing:
-  Example: if (err.code == 'EADDRINUSE');
-
-* Add process.stderr.
-
-* require.main is the main module. (Isaac Schlueter)
-
-* dgram: setMulticastTTL, setMulticastLoopback and addMembership.
-  (Joe Walnes)
-
-* Fix throttling in TLS connections
-
-* Add socket.bufferSize
-
-* MinGW improvements (Bert Belder)
-
-* Upgrade V8 to 3.1.1
-
-2011.01.27, Version 0.3.7 (unstable), d8579c6afdbe868de6dffa8db78bbe4ba2d03e0e
-
-* Expose agent in http and https client. (Mikeal Rogers)
-
-* Fix bug in http request's end method. (Ali Farhadi)
-
-* MinGW: better net support (Bert Belder)
-
-* fs.open should set FD_CLOEXEC
-
-* DTrace probes (Bryan Cantrill)
-
-* REPL fixes and improvements (isaacs, Bert Belder)
-
-* Fix many bugs with legacy http.Client interface
-
-* Deprecate process.assert. Use require('assert').ok
-
-* Add callback parameter to socket.setTimeout(). (Ali Farhadi)
-
-* Fix bug in http request default encoding (Ali Farhadi)
-
-* require: A module ID with a trailing slash must be a dir.
-  (isaacs)
-
-* Add ext_key_usage to getPeerCertificate (Greg Hughes)
-
-* Error when child_process.exec hits maxBuffer.
-
-* Fix option parsing in tls.connect()
-
-* Upgrade to V8 3.0.10
-
-
-2011.01.21, Version 0.3.6 (unstable), bb3e71466e5240626d9d21cf791fe43e87d90011
-
-* REPL and other improvements on MinGW (Bert Belder)
-
-* listen/bind errors should close net.Server
-
-* New HTTP and HTTPS client APIs
-
-* Upgrade V8 to 3.0.9
-
-
-2011.01.16, Version 0.3.5 (unstable), b622bc6305e3c675e0edfcdbaa387d849ad0bba0
-
-* Built-in debugger improvements.
-
-* Add setsid, setuid, setgid options to child_process.spawn
-  (Isaac Schlueter)
-
-* tty module improvements.
-
-* Upgrade libev to 4.3, libeio to latest, c-ares to 1.7.4
-
-* Allow third party hooks before main module load.
-  (See 496be457b6a2bc5b01ec13644b9c9783976159b2)
-
-* Don't stat() on cached modules. (Felix Geisendörfer)
-
-
-2011.01.08, Version 0.3.4 (unstable), 73f53e12e4a5b9ef7dbb4792bd5f8ad403094441
-
-* Primordial mingw build (Bert Belder)
-
-* HTTPS server
-
-* Built in debugger 'node debug script.js'
-
-* realpath files during module load (Mihai Călin Bazon)
-
-* Rename net.Stream to net.Socket (existing name will continue to be
-  supported)
-
-* Fix process.platform
-
-
-2011.01.02, Version 0.3.3 (unstable), 57544ba1c54c7d0da890317deeb73076350c5647
-
-* TLS improvements.
-
-* url.parse(url, true) defaults query field to {} (Jeremy Martin)
-
-* Upgrade V8 to 3.0.4
-
-* Handle ECONNABORT properly (Theo Schlossnagle)
-
-* Fix memory leaks (Tom Hughes)
-
-* Add os.cpus(), os.freemem(), os.totalmem(), os.loadavg() and other
-  functions for OSX, Linux, and Cygwin. (Brian White)
-
-* Fix REPL syntax error bug (GH-543), improve how REPL commands are
-  evaluated.
-
-* Use process.stdin instead of process.openStdin().
-
-* Disable TLS tests when node doesn't have OpenSSL.
-
-
-2010.12.16, Version 0.3.2 (unstable), 4bb914bde9f3c2d6de00853353b6b8fc9c66143a
-
-* Rip out the old (broken) TLS implementation; introduce a new, tested
-  implementation and API. See docs. HTTPS is not supported in this release.
-
-* Introduce 'os' and 'tty' modules.
-
-* Callback parameters for socket.write() and socket.connect().
-
-* Support CNAME lookups in DNS module. (Ben Noordhuis)
-
-* cmake support (Tom Hughes)
-
-* 'make lint'
-
-* oprofile support (./configure --oprofile)
-
-* Lots of bug fixes, including:
-  - Memory leak in ChildProcess:Spawn(). (Tom Hughes)
-  - buffer.slice(0, 0)
-  - Global variable leaks
-  - clearTimeouts calling multiple times (Michael W)
-  - utils.inspect's detection of circular structures (Tim Cooijmans)
-  - Apple's threaded write()s bug (Jorge Chamorro Bieling)
-  - Make sure raw mode is disabled when exiting a terminal-based REPL.
-    (Brian White)
-
-* Deprecate process.compile, process.ENV
-
-* Upgrade V8 to 3.0.3, upgrade http-parser.
-
-
-2010.11.16, Version 0.3.1 (unstable), ce9a54aa1fbf709dd30316af8a2f14d83150e947
-
-* TLS improvements (Paul Querna)
-  - Centralize error handling in SecureStream
-  - Add SecurePair for handling of a ssl/tls stream.
-
-* New documentation organization (Micheil Smith)
-
-* allowHalfOpen TCP connections disabled by default.
-
-* Add C++ API for constructing fast buffer from string
-
-* Move idle timers into their own module
-
-* Gracefully handle EMFILE and server.maxConnections
-
-* make "node --eval" eval in the global scope.
-  (Jorge Chamorro Bieling)
-
-* Let exit listeners know the exit code (isaacs)
-
-* Handle cyclic links smarter in fs.realpath (isaacs)
-
-* Remove node-repl (just use 'node' without args)
-
-* Rewrite libeio After callback to use req->result instead of req->errorno
-  for error checking (Micheil Smith)
-
-* Remove warning about deprecating 'sys' - too aggressive
-
-* Make writes to process.env update the real environment. (Ben Noordhuis)
-
-* Set FD_CLOEXEC flag on stdio FDs before spawning. (Guillaume Tuton)
-
-* Move ev_loop out of javascript
-
-* Switch \n with \r\n for all strings printed out.
-
-* Added support for cross compilation (Rasmus Andersson)
-
-* Add --profile flag to configure script, enables gprof profiling.
-  (Ben Noordhuis)
-
-* writeFileSync could exhibit pathological behavior when a buffer
-  could not be written to the file in a single write() call.
-
-* new path.join behavior (isaacs)
-  - Express desired path.join behavior in tests.
-  - Update fs.realpath to reflect new path.join behavior
-  - Update url.resolve() to use new path.join behavior.
-
-* API: Move process.binding('evals') to require('vm')
-
-* Fix V8 build on Cygwin (Bert Belder)
-
-* Add ref to buffer during fs.write and fs.read
-
-* Fix segfault on test-crypto
-
-* Upgrade http-parser to latest and V8 to 2.5.3
-
-
-2010.10.23, Version 0.3.0 (unstable), 1582cfebd6719b2d2373547994b3dca5c8c569c0
-
-* Bugfix: Do not spin on accept() with EMFILE
-
-* Improvements to readline.js (Trent Mick, Johan Euphrosine, Brian White)
-
-* Safe constructors (missing 'new' doesn't segfault)
-
-* Fix process.nextTick so thrown errors don't confuse it.
-  (Benjamin Thomas)
-
-* Allow Strings for ports on net.Server.listen (Bradley Meck)
-
-* fs bugfixes (Tj Holowaychuk, Tobie Langel, Marco Rogers, isaacs)
-
-* http bug fixes (Fedor Indutny, Mikeal Rogers)
-
-* Faster buffers; breaks C++ API (Tim-Smart, Stéphan Kochen)
-
-* crypto, tls improvements (Paul Querna)
-
-* Add lfs flags to node addon script
-
-* Simpler querystring parsing; breaks API (Peter Griess)
-
-* HTTP trailers (Mark Nottingham)
-
-* http 100-continue support (Mark Nottingham)
-
-* Module system simplifications (Herbert Vojčík, isaacs, Tim-Smart)
-  - remove require.async
-  - remove registerExtension, add .extensions
-  - expose require.resolve
-  - expose require.cache
-  - require looks in node_modules folders
-
-* Add --eval command line option (TJ Holowaychuk)
-
-* Commas last in sys.inspect
-
-* Constants moved from process object to require('constants')
-
-* Fix parsing of linux memory (Vitali Lovich)
-
-* inspect shows function names (Jorge Chamorro Bieling)
-
-* uncaughtException corner cases (Felix Geisendörfer)
-
-* TCP clients now buffer writes before connection
-
-* Rename sys module to 'util' (Micheil Smith)
-
-* Properly set stdio handlers to blocking on SIGTERM and SIGINT
-  (Tom Hughes)
-
-* Add destroy methods to HTTP messages
-
-* base64 improvements (isaacs, Jorge Chamorro Bieling)
-
-* API for defining REPL commands (Sami Samhuri)
-
-* child_process.exec timeout fix (Aaron Heckmann)
-
-* Upgrade V8 to 2.5.1, Libev to 4.00, libeio, http-parser
-
-
-2010.08.20, Version 0.2.0, 9283e134e558900ba89d9a33c18a9bdedab07cb9
-
-* process.title support for FreeBSD, Macintosh, Linux
-
-* Fix OpenSSL 100% CPU usage on error (Illarionov Oleg)
-
-* Implement net.Server.maxConnections.
-
-* Fix process.platform, add process.version.
-
-* Add --without-snapshot configure option.
-
-* Readline REPL improvements (Trent Mick)
-
-* Bug fixes.
-
-* Upgrade V8 to 2.3.8
-
-
-2010.08.13, Version 0.1.104, b14dd49222687c12f3e8eac597cff4f2674f84e8
-
-* Various bug fixes (console, querystring, require)
-
-* Set cwd for child processes (Bert Belder)
-
-* Tab completion for readline (Trent Mick)
-
-* process.title getter/setter for OSX, Linux, Cygwin.
-  (Rasmus Andersson, Bert Belder)
-
-* Upgrade V8 to 2.3.6
-
-
-2010.08.04, Version 0.1.103, 0b925d075d359d03426f0b32bb58a5e05825b4ea
-
-* Implement keep-alive for http.Client (Mikeal Rogers)
-
-* base64 fixes. (Ben Noordhuis)
-
-* Fix --debug-brk (Danny Coates)
-
-* Don't let path.normalize get above the root. (Isaac Schlueter)
-
-* Allow signals to be used with process.on in addition to
-  process.addListener. (Brian White)
-
-* Globalize the Buffer object
-
-* Use kqueue on recent Macintosh builds
-
-* Fix addrlen for unix_dgram sockets (Benjamin Kramer)
-
-* Fix stats.isDirectory() and friends (Benjamin Kramer)
-
-* Upgrade http-parser, V8 to 2.3.5
-
-
-2010.07.25, Version 0.1.102, 2a4568c85f33869c75ff43ccd30f0ec188b43eab
-
-* base64 encoding for Buffers.
-
-* Buffer support for Cipher, Decipher, Hmac, Sign and Verify
-  (Andrew Naylor)
-
-* Support for reading byte ranges from files using fs.createReadStream.
-  (Chandra Sekar)
-
-* Fix Buffer.toString() on 0-length slices. (Peter Griess)
-
-* Cache modules based on filename rather than ID (Isaac Schlueter)
-
-* querystring improvements (Jan Kassens, Micheil Smith)
-
-* Support DEL in the REPL. (Jérémy Lal)
-
-* Upgrade http-parser, upgrade V8 to 2.3.2
-
-
-2010.07.16, Version 0.1.101, 0174ceb6b24caa0bdfc523934c56af9600fa9b58
-
-* Added env to child_process.exec (Сергей Крыжановский)
-
-* Allow modules to optionally be loaded in separate contexts
-  with env var NODE_MODULE_CONTEXTS=1.
-
-* setTTL and setBroadcast for dgram (Matt Ranney)
-
-* Use execPath for default NODE_PATH, not installPrefix
-  (Isaac Schlueter)
-
-* Support of console.dir + console.assert (Jerome Etienne)
-
-* on() as alias to addListener()
-
-* Use javascript port of Ronn to build docs (Jérémy Lal)
-
-* Upgrade V8 to 2.3.0
-
-
-2010.07.03, Version 0.1.100, a6b8586e947f9c3ced180fe68c233d0c252add8b
-
-* process.execPath (Marshall Culpepper)
-
-* sys.pump (Mikeal Rogers)
-
-* Remove ini and mjsunit libraries.
-
-* Introduce console.log() and friends.
-
-* Switch order of arguments for Buffer.write (Blake Mizerany)
-
-* On overlapping buffers use memmove (Matt Ranney)
-
-* Resolve .local domains with getaddrinfo()
-
-* Upgrade http-parser, V8 to 2.2.21
-
-
-2010.06.21, Version 0.1.99, a620b7298f68f68a855306437a3b60b650d61d78
-
-* Datagram sockets (Paul Querna)
-
-* fs.writeFile could not handle utf8 (Felix Geisendörfer)
-  and now accepts Buffers (Aaron Heckmann)
-
-* Fix crypto memory leaks.
-
-* A replacement for decodeURIComponent that doesn't throw.
-  (Isaac Schlueter)
-
-* Only concatenate some incoming HTTP headers. (Peter Griess)
-
-* Upgrade V8 to 2.2.18
-
-
-2010.06.11, Version 0.1.98, 10d8adb08933d1d4cea60192c2a31c56d896733d
-
-* Port to Windows/Cygwin (Raffaele Sena)
-
-* File descriptor passing on unix sockets. (Peter Griess)
-
-* Simple, builtin readline library. REPL is now entered by
-  executing "node" without arguments.
-
-* Add a parameter to spawn() that sets the child's stdio file
-  descriptors. (Orlando Vazquez)
-
-* Upgrade V8 to 2.2.16, http-parser fixes, upgrade c-ares to 1.7.3.
-
-
-2010.05.29, Version 0.1.97, 0c1aa36835fa6a3557843dcbc6ed6714d353a783
-
-* HTTP throttling: outgoing messages emit 'drain' and write() returns false
-  when send buffer is full.
-
-* API: readFileSync without encoding argument now returns a Buffer
-
-* Improve Buffer C++ API; addons now compile with debugging symbols.
-
-* Improvements to path.extname() and REPL; add fs.chown().
-
-* fs.ReadStream now emits buffers, fs.readFileSync returns buffers.
-
-* Bugfix: parsing HTTP responses to HEAD requests.
-
-* Port to OpenBSD.
-
-* Upgrade V8 to 2.2.12, libeio, http-parser.
-
-
-2010.05.21, Version 0.1.96, 9514a4d5476225e8c8310ce5acae2857033bcaaa
-
-* Thrown errors in http and socket callbacks get bubbled up.
-
-* Add fs.fsync (Andrew Johnston)
-
-* Bugfix: signal unregistering (Jonas Pfenniger)
-
-* Added better error messages for async and sync fs calls with paths
-  (TJ Holowaychuk)
-
-* Support arrays and strings in buffer constructor.
-  (Felix Geisendörfer)
-
-* Fix errno reporting in DNS exceptions.
-
-* Support buffers in fs.WriteStream.write.
-
-* Bugfix: Safely decode utf8 streams that are broken on a multibyte
-  character (http and net). (Felix Geisendörfer)
-
-* Make Buffer's C++ constructor public.
-
-* Deprecate sys.p()
-
-* FIX path.dirname('/tmp') => '/'. (Jonathan Rentzsch)
-
-
-2010.05.13, Version 0.1.95, 0914d33842976c2c870df06573b68f9192a1fb7a
-
-* Change GC idle notify so that it runs alongside setInterval
-
-* Install node_buffer.h on make install
-
-* fs.readFile returns Buffer by default (Tim Caswell)
-
-* Fix error reporting in child_process callbacks
-
-* Better logic for testing if an argument is a port
-
-* Improve error reporting (single line "node.js:176:9" errors)
-
-* Bugfix: Some http responses being truncated (appeared in 0.1.94)
-
-* Fix long standing net idle timeout bugs. Enable 2 minute timeout
-  by default in HTTP servers.
-
-* Add fs.fstat (Ben Noordhuis)
-
-* Upgrade to V8 2.2.9
-
-
-2010.05.06, Version 0.1.94, f711d5343b29d1e72e87107315708e40951a7826
-
-* Look in /usr/local/lib/node for modules, so that there's a way
-  to install modules globally (Isaac Schlueter)
-
-* SSL improvements (Rhys Jones, Paulo Matias)
-
-* Added c-ares headers for linux-arm (Jonathan Knezek)
-
-* Add symbols to release build
-
-* HTTP upgrade improvements, docs (Micheil Smith)
-
-* HTTP server emits 'clientError' instead of printing message
-
-* Bugfix: Don't emit 'error' twice from http.Client
-
-* Bugfix: Ignore SIGPIPE
-
-* Bugfix: destroy() instead of end() http connection at end of
-  pipeline
-
-* Bugfix: http.Client may be prematurely released back to the
-  free pool.  (Thomas Lee)
-
-* Upgrade V8 to 2.2.8
-
-
-2010.04.29, Version 0.1.93, 557ba6bd97bad3afe0f9bd3ac07efac0a39978c1
-
-  * Fixed no 'end' event on long chunked HTTP messages
-    https://github.com/joyent/node/issues/77
-
-  * Remove legacy modules http_old and tcp_old
-
-  * Support DNS MX queries (Jérémy Lal)
-
-  * Fix large socket write (tlb@tlb.org)
-
-  * Fix child process exit codes (Felix Geisendörfer)
-
-  * Allow callers to disable PHP/Rails style parameter munging in
-    querystring.stringify (Thomas Lee)
-
-  * Upgrade V8 to 2.2.6
-
-
-2010.04.23, Version 0.1.92, caa828a242f39b6158084ef4376355161c14fe34
-
-  * OpenSSL support. Still undocumented (see tests). (Rhys Jones)
-
-  * API: Unhandled 'error' events throw.
-
-  * Script class with eval-function-family in binding('evals') plus tests.
-    (Herbert Vojcik)
-
-  * stream.setKeepAlive (Julian Lamb)
-
-  * Bugfix: Force no body on http 204 and 304
-
-  * Upgrade Waf to 1.5.16, V8 to 2.2.4.2
-
-
-2010.04.15, Version 0.1.91, 311d7dee19034ff1c6bc9098c36973b8d687eaba
-
-  * Add incoming.httpVersion
-
-  * Object.prototype problem with C-Ares binding
-
-  * REPL can be run from multiple different streams. (Matt Ranney)
-
-  * After V8 heap is compact, don't use a timer every 2 seconds.
-
-  * Improve nextTick implementation.
-
-  * Add primitive support for Upgrading HTTP connections.
-    (See commit log for docs 760bba5)
-
-  * Add timeout and maxBuffer options to child_process.exec
-
-  * Fix bugs.
-
-  * Upgrade V8 to 2.2.3.1
-
-
-2010.04.09, Version 0.1.90, 07e64d45ffa1856e824c4fa6afd0442ba61d6fd8
-
-  * Merge rewrite of networking system (net2)
-   - New Buffer object for binary data.
-   - Support UNIX sockets, Pipes
-   - Uniform stream API
-   - Currently no SSL
-   - Legacy modules can be accessed at 'http_old' and 'tcp_old'
-
-  * Replace udns with c-ares. (Krishna Rajendran)
-
-  * New documentation system using Markdown and Ronn
-    (Tim Caswell, Micheil Smith)
-
-  * Better idle-time GC
-
-  * Countless small bug fixes.
-
-  * Upgrade V8 to 2.2.X, WAF 1.5.15
-
-
-2010.03.19, Version 0.1.33, 618296ef571e873976f608d91a3d6b9e65fe8284
-
-  * Include lib/ directory in node executable. Compile on demand.
-
-  * evalcx clean ups (Isaac Z. Schlueter, Tim-Smart)
-
-  * Various fixes, clean ups
-
-  * V8 upgraded to 2.1.5
-
-
-2010.03.12, Version 0.1.32, 61c801413544a50000faa7f58376e9b33ba6254f
-
-  * Optimize event emitter for single listener
-
-  * Add process.evalcx, require.registerExtension (Tim Smart)
-
-  * Replace --cflags with --vars
-
-  * Fix bugs in fs.create*Stream (Felix Geisendörfer)
-
-  * Deprecate process.mixin, process.unloop
-
-  * Remove the 'Error: (no message)' exceptions, print stack
-    trace instead
-
-  * INI parser bug fixes (Isaac Schlueter)
-
-  * FreeBSD fixes (Vanilla Hsu)
-
-  * Upgrade to V8 2.1.3, WAF 1.5.14a, libev
-
-
-2010.03.05, Version 0.1.31, 39b63dfe1737d46a8c8818c92773ef181fd174b3
-
-  * API: - Move process.watchFile into fs module
-         - Move process.inherits to sys
-
-  * Improve Solaris port
-
-  * tcp.Connection.prototype.write now returns boolean to indicate if
-    argument was flushed to the kernel buffer.
-
-  * Added fs.link, fs.symlink, fs.readlink, fs.realpath
-    (Rasmus Andersson)
-
-  * Add setgid,getgid (James Duncan)
-
-  * Improve sys.inspect (Benjamin Thomas)
-
-  * Allow passing env to child process (Isaac Schlueter)
-
-  * fs.createWriteStream, fs.createReadStream (Felix Geisendörfer)
-
-  * Add INI parser (Rob Ellis)
-
-  * Bugfix: fs.readFile handling encoding (Jacek Becela)
-
-  * Upgrade V8 to 2.1.2
-
-
-2010.02.22, Version 0.1.30, bb0d1e65e1671aaeb21fac186b066701da0bc33b
-
-  * Major API Changes
-
-    - Promises removed. See
-      http://groups.google.com/group/nodejs/msg/426f3071f3eec16b
-      http://groups.google.com/group/nodejs/msg/df199d233ff17efa
-      The API for fs was
-
-         fs.readdir("/usr").addCallback(function (files) {
-           puts("/usr files: " + files);
-         });
-
-      It is now
-
-         fs.readdir("/usr", function (err, files) {
-           if (err) throw err;
-           puts("/usr files: " + files);
-         });
-
-    - Synchronous fs operations exposed, use with care.
-
-    - tcp.Connection.prototype.readPause() and readResume()
-      renamed to pause() and resume()
-
-    - http.ServerResponse.prototype.sendHeader() renamed to
-      writeHeader(). Now accepts reasonPhrase.
-
-  * Compact garbage on idle.
-
-  * Configurable debug ports, and --debug-brk (Zoran Tomicic)
-
-  * Better command line option parsing (Jeremy Ashkenas)
-
-  * Add fs.chmod (Micheil Smith), fs.lstat (Isaac Z. Schlueter)
-
-  * Fixes to process.mixin (Rasmus Andersson, Benjamin Thomas)
-
-  * Upgrade V8 to 2.1.1
-
-
-2010.02.17, Version 0.1.29, 87d5e5b316a4276bcf881f176971c1a237dcdc7a
-
-  * Major API Changes
-    - Remove 'file' module
-    - require('posix') -----------------> require('fs')
-    - fs.cat ---------------------------> fs.readFile
-    - file.write -----------------------> fs.writeFile
-    - TCP 'receive' event --------------> 'data'
-    - TCP 'eof' event ------------------> 'end'
-    - TCP send() -----------------------> write()
-    - HTTP sendBody() ------------------> write()
-    - HTTP finish() --------------------> close()
-    - HTTP 'body' event ----------------> 'data'
-    - HTTP 'complete' event ------------> 'end'
-    - http.Client.prototype.close() (formerly finish()) no longer
-      takes an argument. Add the 'response' listener manually.
-    - Allow strings for the flag argument to fs.open
-      ("r", "r+", "w", "w+", "a", "a+")
-
-  * Added multiple arg support for sys.puts(), print(), etc.
-    (tj@vision-media.ca)
-
-  * sys.inspect(Date) now shows the date value (Mark Hansen)
-
-  * Calculate page size with getpagesize for armel (Jérémy Lal)
-
-  * Bugfix: stderr flushing.
-
-  * Bugfix: Promise late chain (Yuichiro MASUI)
-
-  * Bugfix: wait() on fired promises
-    (Felix Geisendörfer, Jonas Pfenniger)
-
-  * Bugfix: Use InstanceTemplate() instead of PrototypeTemplate() for
-    accessor methods. Was causing a crash with Eclipse debugger.
-    (Zoran Tomicic)
-
-  * Bugfix: Throw from connection.connect if resolving.
-    (Reported by James Golick)
-
-
-2010.02.09, Version 0.1.28, 49de41ef463292988ddacfb01a20543b963d9669
-
-  * Use Google's jsmin.py which can be used for evil.
-
-  * Add posix.truncate()
-
-  * Throw errors from server.listen()
-
-  * stdio bugfix (test by Mikeal Rogers)
-
-  * Module system refactor (Felix Geisendörfer, Blaine Cook)
-
-  * Add process.setuid(), getuid() (Michael Carter)
-
-  * sys.inspect refactor (Tim Caswell)
-
-  * Multipart library rewrite (isaacs)
-
-
-2010.02.03, Version 0.1.27, 0cfa789cc530848725a8cb5595224e78ae7b9dd0
-
-  * Implemented __dirname (Felix Geisendörfer)
-
-  * Downcase process.ARGV, process.ENV, GLOBAL
-    (now process.argv, process.env, global)
-
-  * Bugfix: Late promise callbacks firing
-    (Felix Geisendörfer, Jonas Pfenniger)
-
-  * Make assert.AssertionError instance of Error
-
-  * Removed inline require call for querystring
-    (self@cloudhead.net)
-
-  * Add support for MX, TXT, and SRV records in DNS module.
-    (Blaine Cook)
-
-  * Bugfix: HTTP client automatically reconnecting
-
-  * Adding OS X .dmg build scripts. (Standa Opichal)
-
-  * Bugfix: ObjectWrap memory leak
-
-  * Bugfix: Multipart handle Content-Type headers with charset
-    (Felix Geisendörfer)
-
-  * Upgrade http-parser to fix header overflow attack.
-
-  * Upgrade V8 to 2.1.0
-
-  * Various other bug fixes, performance improvements.
-
-
-2010.01.20, Version 0.1.26, da00413196e432247346d9e587f8c78ce5ceb087
-
-  * Bugfix, HTTP eof causing crash (Ben Williamson)
-
-  * Better error message on SyntaxError
-
-  * API: Move Promise and EventEmitter into 'events' module
-
-  * API: Add process.nextTick()
-
-  * Allow optional params to setTimeout, setInterval
-    (Micheil Smith)
-
-  * API: change some Promise behavior (Felix Geisendörfer)
-    - Removed Promise.cancel()
-    - Support late callback binding
-    - Make unhandled Promise errors throw an exception
-
-  * Upgrade V8 to 2.0.6.1
-
-  * Solaris port (Erich Ocean)
-
-
-2010.01.09, Version 0.1.25, 39ca93549af91575ca9d4cbafd1e170fbcef3dfa
-
-  * sys.inspect() improvements (Tim Caswell)
-
-  * path module improvements (isaacs, Benjamin Thomas)
-
-  * API: request.uri -> request.url
-    It is no longer an object, but a string. The 'url' module
-    was added to parse that string. That is, node no longer
-    parses the request URL automatically.
-
-       require('url').parse(request.url)
-
-    is roughly equivalent to the old request.uri object.
-    (isaacs)
-
-  * Bugfix: Several libeio related race conditions.
-
-  * Better errors for multipart library (Felix Geisendörfer)
-
-  * Bugfix: Update node-waf version to 1.5.10
-
-  * getmem for freebsd (Vanilla Hsu)
-
-
-2009.12.31, Version 0.1.24, 642c2773a7eb2034f597af1cd404b9e086b59632
-
-  * Bugfix: don't chunk responses to HTTP/1.0 clients, even if
-    they send Connection: Keep-Alive (e.g. wget)
-
-  * Bugfix: libeio race condition
-
-  * Bugfix: Don't segfault on unknown http method
-
-  * Simplify exception reporting
-
-  * Upgrade V8 to 2.0.5.4
-
-
-2009.12.22, Version 0.1.23, f91e347eeeeac1a8bd6a7b462df0321b60f3affc
-
-  * Bugfix: require("../blah") issues (isaacs)
-
-  * Bugfix: posix.cat (Jonas Pfenniger)
-
-  * Do not pause request for multipart parsing (Felix Geisendörfer)
-
-
-2009.12.19, Version 0.1.22, a2d809fe902f6c4102dba8f2e3e9551aad137c0f
-
-  * Bugfix: child modules get wrong id with "index.js" (isaacs)
-
-  * Bugfix: require("../foo") cycles (isaacs)
-
-  * Bugfix: require() should throw error if module does.
-
-  * New URI parser stolen from Narwhal (isaacs)
-
-  * Bugfix: correctly check kqueue and epoll. (Rasmus Andersson)
-
-  * Upgrade WAF to 1.5.10
-
-  * Bugfix: posix.statSync() was crashing
-
-  * Statically define string symbols for performance improvement
-
-  * Bugfix: ARGV[0] weirdness
-
-  * Added superCtor to ctor.super_ instead of superCtor.prototype.
-    (Johan Dahlberg)
-
-  * http-parser supports webdav methods
-
-  * API: http.Client.prototype.request() (Christopher Lenz)
-
-
-2009.12.06, Version 0.1.21, c6affb64f96a403a14d20035e7fbd6d0ce089db5
-
-  * Feature: Add HTTP client TLS support (Rhys Jones)
-
-  * Bugfix: use --jobs=1 with WAF
-
-  * Bugfix: Don't use chunked encoding for 1.0 requests
-
-  * Bugfix: Duplicated headers weren't handled correctly
-
-  * Improve sys.inspect (Xavier Shay)
-
-  * Upgrade v8 to 2.0.3
-
-  * Use CommonJS assert API (Felix Geisendörfer, Karl Guertin)
-
-
-2009.11.28, Version 0.1.20, aa42c6790da8ed2cd2b72051c07f6251fe1724d8
-
-  * Add gnutls version to configure script
-
-  * Add V8 heap info to process.memoryUsage()
-
-  * process.watchFile callback has 2 arguments with the stat object
-    (choonkeat@gmail.com)
-
-
-2009.11.28, Version 0.1.19, 633d6be328708055897b72327b88ac88e158935f
-
-  * Feature: Initial TLS support for TCP servers and clients.
-    (Rhys Jones)
-
-  * Add options to process.watchFile()
-
-  * Add process.umask() (Friedemann Altrock)
-
-  * Bugfix: only detach timers when active.
-
-  * Bugfix: lib/file.js write(), shouldn't always emit errors or success
-    (onne@onnlucky.com)
-
-  * Bugfix: Memory leak in fs.write
-    (Reported by onne@onnlucky.com)
-
-  * Bugfix: Fix regular expressions detecting outgoing message headers.
-    (Reported by Elliott Cable)
-
-  * Improvements to Multipart parser (Felix Geisendörfer)
-
-  * New HTTP parser
-
-  * Upgrade v8 to 2.0.2
-
-
-2009.11.17, Version 0.1.18, 027829d2853a14490e6de9fc5f7094652d045ab8
-
-  * Feature: process.watchFile() process.unwatchFile()
-
-  * Feature: "uncaughtException" event on process
-    (Felix Geisendörfer)
-
-  * Feature: 'drain' event to tcp.Connection
-
-  * Bugfix: Promise.timeout() blocked the event loop
-    (Felix Geisendörfer)
-
-  * Bugfix: sendBody() and chunked utf8 strings
-    (Felix Geisendörfer)
-
-  * Supply the strerror as a second arg to the tcp.Connection close
-    event (Johan Sørensen)
-
-  * Add EventEmitter.removeListener (frodenius@gmail.com)
-
-  * Format JSON for inspecting objects (Felix Geisendörfer)
-
-  * Upgrade libev to latest CVS
-
-
-2009.11.07, Version 0.1.17, d1f69ef35dac810530df8249d523add168e09f03
-
-  * Feature: process.chdir() (Brandon Beacher)
-
-  * Revert http parser upgrade. (b893859c34f05db5c45f416949ebc0eee665cca6)
-    Broke keep-alive.
-
-  * API: rename process.inherits to sys.inherits
-
-
-2009.11.03, Version 0.1.16, 726865af7bbafe58435986f4a193ff11c84e4bfe
-
-  * API: Use CommonJS-style module requiring
-    - require("/sys.js") becomes require("sys")
-    - require("circle.js") becomes require("./circle")
-    - process.path.join() becomes require("path").join()
-    - __module becomes module
-
-  * API: Many namespacing changes
-    - Move node.* into process.*
-    - Move node.dns into module "dns"
-    - Move node.fs into module "posix"
-    - process is no longer the global object. GLOBAL is.
-
-  For more information on the API changes see:
-    http://thread.gmane.org/gmane.comp.lang.javascript.nodejs/6
-    http://thread.gmane.org/gmane.comp.lang.javascript.nodejs/14
-
-  * Feature: process.platform, process.memoryUsage()
-
-  * Feature: promise.cancel() (Felix Geisendörfer)
-
-  * Upgrade V8 to 1.3.18
-
-
-2009.10.28, Version 0.1.15, eca2de73ed786b935507fd1c6faccd8df9938fd3
-
-  * Many build system fixes (esp. for OSX users)
-
-  * Feature: promise.timeout() (Felix Geisendörfer)
-
-  * Feature: Added external interface for signal handlers, process.pid, and
-    process.kill() (Brandon Beacher)
-
-  * API: Rename node.libraryPaths to require.paths
-
-  * Bugfix: 'data' event for stdio should emit a string
-
-  * Large file support
-
-  * Upgrade http_parser
-
-  * Upgrade v8 to 1.3.16
-
-
-2009.10.09, Version 0.1.14, b12c809bb84d1265b6a4d970a5b54ee8a4890513
-
-  * Feature: Improved addon builds with node-waf
-
-  * Feature: node.SignalHandler (Brandon Beacher)
-
-  * Feature: Enable V8 debugging (but still need to make a debugger)
-
-  * API: Rename library /utils.js to /sys.js
-
-  * Clean up Node's build system
-
-  * Don't use parseUri for HTTP server
-
-  * Remove node.pc
-
-  * Don't use /bin/sh to create child process except with exec()
-
-  * API: Add __module to reference current module
-
-  * API: Remove include() add node.mixin()
-
-  * Normalize http headers; "Content-Length" becomes "content-length"
-
-  * Upgrade V8 to 1.3.15
-
-
-2009.09.30, Version 0.1.13, 58493bb05b3da3dc8051fabc0bdea9e575c1a107
-
-  * Feature: Multipart stream parser (Felix Geisendörfer)
-
-  * API: Move node.puts(), node.exec() and others to /utils.js
-
-  * API: Move http, tcp libraries to /http.js and /tcp.js
-
-  * API: Rename node.exit() to process.exit()
-
-  * Bugfix: require() and include() should work in callbacks.
-
-  * Pass the Host header in http.cat calls
-
-  * Add warning when coroutine stack size grows too large.
-
-  * Enhance repl library (Ray Morgan)
-
-  * Bugfix: build script for
-      GCC 4.4 (removed -Werror in V8),
-      on Linux 2.4,
-      and with Python 2.4.4.
-
-  * Add read() and write() to /file.js to read and write
-    whole files at once.
-
-
-2009.09.24, Version 0.1.12, 2f56ccb45e87510de712f56705598b3b4e3548ec
-
-  * Feature: System modules, node.libraryPaths
-
-  * API: Remove "raw" encoding, rename "raws" to "binary".
-
-  * API: Added connection.setNoDelay() to disable Nagle's algorithm.
-
-  * Decrease default TCP server backlog to 128
-
-  * Bugfix: memory leak involving node.fs.* methods.
-
-  * Upgrade v8 to 1.3.13
-
-
-2009.09.18, Version 0.1.11, 5ddc4f5d0c002bac0ae3d62fc0dc58f0d2d83ec4
-
-  * API: default to utf8 encoding for node.fs.cat()
-
-  * API: add node.exec()
-
-  * API: node.fs.read() takes a normal encoding parameter.
-
-  * API: Change arguments of emit(), emitSuccess(), emitError()
-
-  * Bugfix: node.fs.write() was stack-allocating the buffer.
-
-  * Bugfix: ReportException shouldn't forget the top frame.
-
-  * Improve buffering for HTTP outgoing messages
-
-  * Fix and reenable x64 macintosh build.
-
-  * Upgrade v8 to 1.3.11
-
-
-2009.09.11, Version 0.1.10, 12bb0d46ce761e3d00a27170e63b40408c15b558
-
-  * Feature: raw string encoding "raws"
-
-  * Feature: access to environ through "ENV"
-
-  * Feature: add isDirectory, isFile, isSocket, ... methods
-    to stats object.
-
-  * Bugfix: Internally use full paths when loading modules;
-    this fixes a shebang loading problem.
-
-  * Bugfix: Add '--' command line argument for separating v8
-    args from program args.
-
-  * Add man page.
-
-  * Add node-repl
-
-  * Upgrade v8 to 1.3.10
-
-2009.09.05, Version 0.1.9, d029764bb32058389ecb31ed54a5d24d2915ad4c
-
-  * Bugfix: Compile on Snow Leopard.
-
-  * Bugfix: Malformed URIs raising exceptions.
-
-2009.09.04, Version 0.1.8, e6d712a937b61567e81b15085edba863be16ba96
-
-  * Feature: External modules
-
-  * Feature: setTimeout() for node.tcp.Connection
-
-  * Feature: add node.cwd(), node.fs.readdir(), node.fs.mkdir()
-
-  * Bugfix: promise.wait() releasing out of order.
-
-  * Bugfix: Asynchronously do getaddrinfo() on Apple.
-
-  * Disable useless evcom error messages.
-
-  * Better stack traces.
-
-  * Built natively on x64.
-
-  * Upgrade v8 to 1.3.9
-
-2009.08.27, Version 0.1.7, f7acef9acf8ba8433d697ad5ed99d2e857387e4b
-
-  * Feature: global 'process' object. Emits "exit".
-
-  * Feature: promise.wait()
-
-  * Feature: node.stdio
-
-  * Feature: EventEmitters emit "newListener" when listeners are
-    added
-
-  * API: Use flat object instead of array-of-arrays for HTTP
-    headers.
-
-  * API: Remove buffered file object (node.File)
-
-  * API: require(), include() are synchronous. (Uses
-    continuations.)
-
-  * API: Deprecate onLoad and onExit.
-
-  * API: Rename node.Process to node.ChildProcess
-
-  * Refactor node.Process to take advantage of evcom_reader/writer.
-
-  * Upgrade v8 to 1.3.7
-
-2009.08.22, Version 0.1.6, 9c97b1db3099d61cd292aa59ec2227a619f3a7ab
-
-  * Bugfix: Ignore SIGPIPE.
-
-2009.08.21, Version 0.1.5, b0fd3e281cb5f7cd8d3a26bd2b89e1b59998e5ed
-
-  * Bugfix: Buggy connections could crash node.js. Now check
-    connection before sending data every time (Kevin van Zonneveld)
-
-  * Bugfix: stdin fd (0) being ignored by node.File. (Abe Fettig)
-
-  * API: Remove connection.fullClose()
-
-  * API: Return the EventEmitter from addListener for chaining.
-
-  * API: tcp.Connection "disconnect" event renamed to "close"
-
-  * Upgrade evcom
-    Upgrade v8 to 1.3.6
-
-2009.08.13, Version 0.1.4, 0f888ed6de153f68c17005211d7e0f960a5e34f3
-
-  * Major refactor to evcom.
-
-  * Enable test-tcp-many-clients.
-
-  * Add -m32 gcc flag to udns.
-
-  * Add connection.readPause() and connection.readResume()
-    Add IncomingMessage.prototype.pause() and resume().
-
-  * Fix http benchmark. Wasn't correctly dispatching.
-
-  * Bugfix: response.setBodyEncoding("ascii") not working.
-
-  * Bugfix: Negative ints in HTTP's on_body and node.fs.read()
-
-  * Upgrade v8 to 1.3.4
-    Upgrade libev to 3.8
-    Upgrade http_parser to v0.2
-
-2009.08.06, Version 0.1.3, 695f0296e35b30cf8322fd1bd934810403cca9f3
-
-  * Upgrade v8 to 1.3.2
-
-  * Bugfix: node.http.ServerRequest.setBodyEncoding('ascii') not
-    working
-
-  * Bugfix: node.encodeUtf8 was broken. (Connor Dunn)
-
-  * Add ranlib to udns Makefile.
-
-  * Upgrade evcom - fix accepting too many connections issue.
-
-  * Initial support for shebang
-
-  * Add simple command line switches
-
-  * Add node.version API
-
-
-2009.08.01, Version 0.1.2, 025a34244d1cea94d6d40ad7bf92671cb909a96c
-
-  * Add DNS API
-
-  * node.tcp.Server's backlog option is now an argument to listen()
-
-  * Upgrade V8 to 1.3.1
-
-  * Bugfix: Default to chunked for client requests without
-    Content-Length.
-
-  * Bugfix: Line numbers in stack traces.
-
-  * Bugfix: negative integers in raw encoding stream
-
-  * Bugfix: node.fs.File was not passing args to promise callbacks.
-
-
-2009.07.27, Version 0.1.1, 77d407df2826b20e9177c26c0d2bb4481e497937
-
-  * Simplify and clean up ObjectWrap.
-
-  * Upgrade liboi (which is now called evcom)
-    Upgrade libev to 3.7
-    Upgrade V8 to 1.2.14
-
-  * Array.prototype.encodeUtf8 renamed to node.encodeUtf8(array)
-
-  * Move EventEmitter.prototype.emit() completely into C++.
-
-  * Bugfix: Fix memory leak in event emitters.
-    http://groups.google.com/group/nodejs/browse_thread/thread/a8d1dfc2fd57a6d1
-
-  * Bugfix: Had problems reading scripts with non-ascii characters.
-
-  * Bugfix: Fix Detach() in node::Server
-
-  * Bugfix: Sockets not properly reattached if reconnected during
-    disconnect event.
-
-  * Bugfix: Server-side clients not attached between creation and
-    on_connect.
-
-  * Add 'close' event to node.tcp.Server
-
-  * Simplify and clean up http.js. (Takes more advantage of event
-    infrastructure.)
-
-  * Add benchmark scripts. Run with "make benchmark".
-
-
-2009.06.30, Version 0.1.0, 0fe44d52fe75f151bceb59534394658aae6ac328
-
-  * Update documentation, use asciidoc.
-
-  * EventEmitter and Promise interfaces. (Breaks previous API.)
-
-  * Remove node.Process constructor in favor of node.createProcess
-
-  * Add -m32 flags for compiling on x64 platforms.
-    (Thanks to András Bártházi)
-
-  * Upgrade v8 to 1.2.10 and libev to 3.6
-
-  * Bugfix: Timer::RepeatSetter wasn't working.
-
-  * Bugfix: Spawning many processes in a loop
-    (reported by Felix Geisendörfer)
-
-
-2009.06.24, Version 0.0.6, fbe0be19ebfb422d8fa20ea5204c1713e9214d5f
-
-  * Load modules via HTTP URLs (Urban Hafner)
-
-  * Bugfix: Add HTTPConnection->size() and HTTPServer->size()
-
-  * New node.Process API
-
-  * Clean up build tools, use v8's test runner.
-
-  * Use ev_unref() instead of starting/stopping the eio thread
-    pool watcher.
-
-
-2009.06.18, Version 0.0.5, 3a2b41de74b6c343b8464a68eff04c4bfd9aebea
-
-  * Support for IPv6
-
-  * Remove namespace node.constants
-
-  * Upgrade v8 to 1.2.8.1
-
-  * Accept ports as strings in the TCP client and server.
-
-  * Bugfix: HTTP Client race
-
-  * Bugfix: freeaddrinfo() wasn't getting called after
-    getaddrinfo() for TCP servers
-
-  * Add "opening" to TCP client readyState
-
-  * Add remoteAddress to TCP client
-
-  * Add global print() function.
-
-
-2009.06.13, Version 0.0.4, 916b9ca715b229b0703f0ed6c2fc065410fb189c
-
- * Add interrupt() method to server-side HTTP requests.
-
- * Bugfix: onBodyComplete was not getting called on server-side
-   HTTP
-
-
-2009.06.11, Version 0.0.3, 6e0dfe50006ae4f5dac987f055e0c9338662f40a
-
- * Many bug fixes including the problem with http.Client on
-   Macintosh
-
- * Upgrades v8 to 1.2.7
-
- * Adds onExit hook
-
- * Guard against buffer overflow in http parser
-
- * require() and include() now need the ".js" extension
-
- * http.Client uses identity transfer encoding by default.
--- a/node/node-v0.10.22-linux-x86/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,812 +0,0 @@
-Node's license follows:
-
-====
-
-Copyright Joyent, Inc. and other Node contributors. All rights reserved.
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to
-deal in the Software without restriction, including without limitation the
-rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
-sell copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
-IN THE SOFTWARE.
-
-====
-
-This license applies to all parts of Node that are not externally
-maintained libraries. The externally maintained libraries used by Node are:
-
-- V8, located at deps/v8. V8's license follows:
-  """
-    This license applies to all parts of V8 that are not externally
-    maintained libraries.  The externally maintained libraries used by V8
-    are:
-
-      - PCRE test suite, located in
-        test/mjsunit/third_party/regexp-pcre.js.  This is based on the
-        test suite from PCRE-7.3, which is copyrighted by the University
-        of Cambridge and Google, Inc.  The copyright notice and license
-        are embedded in regexp-pcre.js.
-
-      - Layout tests, located in test/mjsunit/third_party.  These are
-        based on layout tests from webkit.org which are copyrighted by
-        Apple Computer, Inc. and released under a 3-clause BSD license.
-
-      - Strongtalk assembler, the basis of the files assembler-arm-inl.h,
-        assembler-arm.cc, assembler-arm.h, assembler-ia32-inl.h,
-        assembler-ia32.cc, assembler-ia32.h, assembler-x64-inl.h,
-        assembler-x64.cc, assembler-x64.h, assembler-mips-inl.h,
-        assembler-mips.cc, assembler-mips.h, assembler.cc and assembler.h.
-        This code is copyrighted by Sun Microsystems Inc. and released
-        under a 3-clause BSD license.
-
-      - Valgrind client API header, located at third_party/valgrind/valgrind.h
-        This is released under the BSD license.
-
-    These libraries have their own licenses; we recommend you read them,
-    as their terms may differ from the terms below.
-
-    Copyright 2006-2012, the V8 project authors. All rights reserved.
-    Redistribution and use in source and binary forms, with or without
-    modification, are permitted provided that the following conditions are
-    met:
-
-        * Redistributions of source code must retain the above copyright
-          notice, this list of conditions and the following disclaimer.
-        * Redistributions in binary form must reproduce the above
-          copyright notice, this list of conditions and the following
-          disclaimer in the documentation and/or other materials provided
-          with the distribution.
-        * Neither the name of Google Inc. nor the names of its
-          contributors may be used to endorse or promote products derived
-          from this software without specific prior written permission.
-
-    THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
-    "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
-    LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
-    A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
-    OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
-    SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
-    LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
-    DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
-    THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-    (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-    OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-  """
-
-- C-Ares, an asynchronous DNS client, located at deps/cares. C-Ares license
-  follows:
-  """
-    /* Copyright 1998 by the Massachusetts Institute of Technology.
-     *
-     * Permission to use, copy, modify, and distribute this
-     * software and its documentation for any purpose and without
-     * fee is hereby granted, provided that the above copyright
-     * notice appear in all copies and that both that copyright
-     * notice and this permission notice appear in supporting
-     * documentation, and that the name of M.I.T. not be used in
-     * advertising or publicity pertaining to distribution of the
-     * software without specific, written prior permission.
-     * M.I.T. makes no representations about the suitability of
-     * this software for any purpose.  It is provided "as is"
-     * without express or implied warranty.
-  """
-
-- OpenSSL located at deps/openssl. OpenSSL is cryptographic software written
-  by Eric Young (eay@cryptsoft.com) to provide SSL/TLS encryption. OpenSSL's
-  license follows:
-  """
-    /* ====================================================================
-     * Copyright (c) 1998-2011 The OpenSSL Project.  All rights reserved.
-     *
-     * Redistribution and use in source and binary forms, with or without
-     * modification, are permitted provided that the following conditions
-     * are met:
-     *
-     * 1. Redistributions of source code must retain the above copyright
-     *    notice, this list of conditions and the following disclaimer.
-     *
-     * 2. Redistributions in binary form must reproduce the above copyright
-     *    notice, this list of conditions and the following disclaimer in
-     *    the documentation and/or other materials provided with the
-     *    distribution.
-     *
-     * 3. All advertising materials mentioning features or use of this
-     *    software must display the following acknowledgment:
-     *    "This product includes software developed by the OpenSSL Project
-     *    for use in the OpenSSL Toolkit. (http://www.openssl.org/)"
-     *
-     * 4. The names "OpenSSL Toolkit" and "OpenSSL Project" must not be used to
-     *    endorse or promote products derived from this software without
-     *    prior written permission. For written permission, please contact
-     *    openssl-core@openssl.org.
-     *
-     * 5. Products derived from this software may not be called "OpenSSL"
-     *    nor may "OpenSSL" appear in their names without prior written
-     *    permission of the OpenSSL Project.
-     *
-     * 6. Redistributions of any form whatsoever must retain the following
-     *    acknowledgment:
-     *    "This product includes software developed by the OpenSSL Project
-     *    for use in the OpenSSL Toolkit (http://www.openssl.org/)"
-     *
-     * THIS SOFTWARE IS PROVIDED BY THE OpenSSL PROJECT ``AS IS'' AND ANY
-     * EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-     * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-     * PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE OpenSSL PROJECT OR
-     * ITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
-     * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
-     * NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
-     * LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
-     * HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
-     * STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
-     * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
-     * OF THE POSSIBILITY OF SUCH DAMAGE.
-     * ====================================================================
-     *
-     * This product includes cryptographic software written by Eric Young
-     * (eay@cryptsoft.com).  This product includes software written by Tim
-     * Hudson (tjh@cryptsoft.com).
-     *
-     */
-  """
-
-- HTTP Parser, located at deps/http_parser. HTTP Parser's license follows:
-  """
-    http_parser.c is based on src/http/ngx_http_parse.c from NGINX copyright
-    Igor Sysoev.
-
-    Additional changes are licensed under the same terms as NGINX and
-    copyright Joyent, Inc. and other Node contributors. All rights reserved.
-
-    Permission is hereby granted, free of charge, to any person obtaining a copy
-    of this software and associated documentation files (the "Software"), to
-    deal in the Software without restriction, including without limitation the
-    rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
-    sell copies of the Software, and to permit persons to whom the Software is
-    furnished to do so, subject to the following conditions:
-
-    The above copyright notice and this permission notice shall be included in
-    all copies or substantial portions of the Software.
-
-    THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-    IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-    FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-    AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-    LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-    FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
-    IN THE SOFTWARE.
-  """
-
-- Closure Linter is located at tools/closure_linter. Closure's license
-  follows:
-  """
-    # Copyright (c) 2007, Google Inc.
-    # All rights reserved.
-    #
-    # Redistribution and use in source and binary forms, with or without
-    # modification, are permitted provided that the following conditions are
-    # met:
-    #
-    #     * Redistributions of source code must retain the above copyright
-    # notice, this list of conditions and the following disclaimer.
-    #     * Redistributions in binary form must reproduce the above
-    # copyright notice, this list of conditions and the following disclaimer
-    # in the documentation and/or other materials provided with the
-    # distribution.
-    #     * Neither the name of Google Inc. nor the names of its
-    # contributors may be used to endorse or promote products derived from
-    # this software without specific prior written permission.
-    #
-    # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
-    # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
-    # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
-    # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
-    # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
-    # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
-    # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
-    # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
-    # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-    # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-    # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-  """
-
-- tools/cpplint.py is a C++ linter. Its license follows:
-  """
-    # Copyright (c) 2009 Google Inc. All rights reserved.
-    #
-    # Redistribution and use in source and binary forms, with or without
-    # modification, are permitted provided that the following conditions are
-    # met:
-    #
-    #    * Redistributions of source code must retain the above copyright
-    # notice, this list of conditions and the following disclaimer.
-    #    * Redistributions in binary form must reproduce the above
-    # copyright notice, this list of conditions and the following disclaimer
-    # in the documentation and/or other materials provided with the
-    # distribution.
-    #    * Neither the name of Google Inc. nor the names of its
-    # contributors may be used to endorse or promote products derived from
-    # this software without specific prior written permission.
-    #
-    # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
-    # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
-    # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
-    # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
-    # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
-    # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
-    # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
-    # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
-    # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-    # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-    # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-  """
-
-- lib/punycode.js is copyright 2011 Mathias Bynens <http://mathiasbynens.be/>
-  and released under the MIT license.
-  """
-    * Punycode.js <http://mths.be/punycode>
-    * Copyright 2011 Mathias Bynens <http://mathiasbynens.be/>
-    * Available under MIT license <http://mths.be/mit>
-  """
-
-- tools/gyp. GYP is a meta-build system. GYP's license follows:
-  """
-    Copyright (c) 2009 Google Inc. All rights reserved.
-
-    Redistribution and use in source and binary forms, with or without
-    modification, are permitted provided that the following conditions are
-    met:
-
-       * Redistributions of source code must retain the above copyright
-    notice, this list of conditions and the following disclaimer.
-       * Redistributions in binary form must reproduce the above
-    copyright notice, this list of conditions and the following disclaimer
-    in the documentation and/or other materials provided with the
-    distribution.
-       * Neither the name of Google Inc. nor the names of its
-    contributors may be used to endorse or promote products derived from
-    this software without specific prior written permission.
-
-    THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
-    "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
-    LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
-    A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
-    OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
-    SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
-    LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
-    DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
-    THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-    (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-    OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-  """
-
-- Zlib at deps/zlib. zlib's license follows:
-  """
-    /* zlib.h -- interface of the 'zlib' general purpose compression library
-      version 1.2.4, March 14th, 2010
-
-      Copyright (C) 1995-2010 Jean-loup Gailly and Mark Adler
-
-      This software is provided 'as-is', without any express or implied
-      warranty.  In no event will the authors be held liable for any damages
-      arising from the use of this software.
-
-      Permission is granted to anyone to use this software for any purpose,
-      including commercial applications, and to alter it and redistribute it
-      freely, subject to the following restrictions:
-
-      1. The origin of this software must not be misrepresented; you must not
-         claim that you wrote the original software. If you use this software
-         in a product, an acknowledgment in the product documentation would be
-         appreciated but is not required.
-      2. Altered source versions must be plainly marked as such, and must not be
-         misrepresented as being the original software.
-      3. This notice may not be removed or altered from any source distribution.
-
-      Jean-loup Gailly
-      Mark Adler
-
-    */
-  """
-
-- npm is a package manager program located at deps/npm.
-  npm's license follows:
-  """
-    Copyright (c) Isaac Z. Schlueter
-    All rights reserved.
-
-    npm is released under the Artistic 2.0 License.
-    The text of the License follows:
-
-
-    --------
-
-
-    The Artistic License 2.0
-
-    Copyright (c) 2000-2006, The Perl Foundation.
-
-    Everyone is permitted to copy and distribute verbatim copies
-    of this license document, but changing it is not allowed.
-
-    Preamble
-
-    This license establishes the terms under which a given free software
-    Package may be copied, modified, distributed, and/or redistributed.
-    The intent is that the Copyright Holder maintains some artistic
-    control over the development of that Package while still keeping the
-    Package available as open source and free software.
-
-    You are always permitted to make arrangements wholly outside of this
-    license directly with the Copyright Holder of a given Package.  If the
-    terms of this license do not permit the full use that you propose to
-    make of the Package, you should contact the Copyright Holder and seek
-    a different licensing arrangement.
-
-    Definitions
-
-        "Copyright Holder" means the individual(s) or organization(s)
-        named in the copyright notice for the entire Package.
-
-        "Contributor" means any party that has contributed code or other
-        material to the Package, in accordance with the Copyright Holder's
-        procedures.
-
-        "You" and "your" means any person who would like to copy,
-        distribute, or modify the Package.
-
-        "Package" means the collection of files distributed by the
-        Copyright Holder, and derivatives of that collection and/or of
-        those files. A given Package may consist of either the Standard
-        Version, or a Modified Version.
-
-        "Distribute" means providing a copy of the Package or making it
-        accessible to anyone else, or in the case of a company or
-        organization, to others outside of your company or organization.
-
-        "Distributor Fee" means any fee that you charge for Distributing
-        this Package or providing support for this Package to another
-        party.  It does not mean licensing fees.
-
-        "Standard Version" refers to the Package if it has not been
-        modified, or has been modified only in ways explicitly requested
-        by the Copyright Holder.
-
-        "Modified Version" means the Package, if it has been changed, and
-        such changes were not explicitly requested by the Copyright
-        Holder.
-
-        "Original License" means this Artistic License as Distributed with
-        the Standard Version of the Package, in its current version or as
-        it may be modified by The Perl Foundation in the future.
-
-        "Source" form means the source code, documentation source, and
-        configuration files for the Package.
-
-        "Compiled" form means the compiled bytecode, object code, binary,
-        or any other form resulting from mechanical transformation or
-        translation of the Source form.
-
-
-    Permission for Use and Modification Without Distribution
-
-    (1)  You are permitted to use the Standard Version and create and use
-    Modified Versions for any purpose without restriction, provided that
-    you do not Distribute the Modified Version.
-
-
-    Permissions for Redistribution of the Standard Version
-
-    (2)  You may Distribute verbatim copies of the Source form of the
-    Standard Version of this Package in any medium without restriction,
-    either gratis or for a Distributor Fee, provided that you duplicate
-    all of the original copyright notices and associated disclaimers.  At
-    your discretion, such verbatim copies may or may not include a
-    Compiled form of the Package.
-
-    (3)  You may apply any bug fixes, portability changes, and other
-    modifications made available from the Copyright Holder.  The resulting
-    Package will still be considered the Standard Version, and as such
-    will be subject to the Original License.
-
-
-    Distribution of Modified Versions of the Package as Source
-
-    (4)  You may Distribute your Modified Version as Source (either gratis
-    or for a Distributor Fee, and with or without a Compiled form of the
-    Modified Version) provided that you clearly document how it differs
-    from the Standard Version, including, but not limited to, documenting
-    any non-standard features, executables, or modules, and provided that
-    you do at least ONE of the following:
-
-        (a)  make the Modified Version available to the Copyright Holder
-        of the Standard Version, under the Original License, so that the
-        Copyright Holder may include your modifications in the Standard
-        Version.
-
-        (b)  ensure that installation of your Modified Version does not
-        prevent the user installing or running the Standard Version. In
-        addition, the Modified Version must bear a name that is different
-        from the name of the Standard Version.
-
-        (c)  allow anyone who receives a copy of the Modified Version to
-        make the Source form of the Modified Version available to others
-        under
-
-            (i)  the Original License or
-
-            (ii)  a license that permits the licensee to freely copy,
-            modify and redistribute the Modified Version using the same
-            licensing terms that apply to the copy that the licensee
-            received, and requires that the Source form of the Modified
-            Version, and of any works derived from it, be made freely
-            available in that license fees are prohibited but Distributor
-            Fees are allowed.
-
-
-    Distribution of Compiled Forms of the Standard Version
-    or Modified Versions without the Source
-
-    (5)  You may Distribute Compiled forms of the Standard Version without
-    the Source, provided that you include complete instructions on how to
-    get the Source of the Standard Version.  Such instructions must be
-    valid at the time of your distribution.  If these instructions, at any
-    time while you are carrying out such distribution, become invalid, you
-    must provide new instructions on demand or cease further distribution.
-    If you provide valid instructions or cease distribution within thirty
-    days after you become aware that the instructions are invalid, then
-    you do not forfeit any of your rights under this license.
-
-    (6)  You may Distribute a Modified Version in Compiled form without
-    the Source, provided that you comply with Section 4 with respect to
-    the Source of the Modified Version.
-
-
-    Aggregating or Linking the Package
-
-    (7)  You may aggregate the Package (either the Standard Version or
-    Modified Version) with other packages and Distribute the resulting
-    aggregation provided that you do not charge a licensing fee for the
-    Package.  Distributor Fees are permitted, and licensing fees for other
-    components in the aggregation are permitted. The terms of this license
-    apply to the use and Distribution of the Standard or Modified Versions
-    as included in the aggregation.
-
-    (8) You are permitted to link Modified and Standard Versions with
-    other works, to embed the Package in a larger work of your own, or to
-    build stand-alone binary or bytecode versions of applications that
-    include the Package, and Distribute the result without restriction,
-    provided the result does not expose a direct interface to the Package.
-
-
-    Items That are Not Considered Part of a Modified Version
-
-    (9) Works (including, but not limited to, modules and scripts) that
-    merely extend or make use of the Package, do not, by themselves, cause
-    the Package to be a Modified Version.  In addition, such works are not
-    considered parts of the Package itself, and are not subject to the
-    terms of this license.
-
-
-    General Provisions
-
-    (10)  Any use, modification, and distribution of the Standard or
-    Modified Versions is governed by this Artistic License. By using,
-    modifying or distributing the Package, you accept this license. Do not
-    use, modify, or distribute the Package, if you do not accept this
-    license.
-
-    (11)  If your Modified Version has been derived from a Modified
-    Version made by someone other than you, you are nevertheless required
-    to ensure that your Modified Version complies with the requirements of
-    this license.
-
-    (12)  This license does not grant you the right to use any trademark,
-    service mark, tradename, or logo of the Copyright Holder.
-
-    (13)  This license includes the non-exclusive, worldwide,
-    free-of-charge patent license to make, have made, use, offer to sell,
-    sell, import and otherwise transfer the Package with respect to any
-    patent claims licensable by the Copyright Holder that are necessarily
-    infringed by the Package. If you institute patent litigation
-    (including a cross-claim or counterclaim) against any party alleging
-    that the Package constitutes direct or contributory patent
-    infringement, then this Artistic License to you shall terminate on the
-    date that such litigation is filed.
-
-    (14)  Disclaimer of Warranty:
-    THE PACKAGE IS PROVIDED BY THE COPYRIGHT HOLDER AND CONTRIBUTORS "AS
-    IS" AND WITHOUT ANY EXPRESS OR IMPLIED WARRANTIES. THE IMPLIED
-    WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR
-    NON-INFRINGEMENT ARE DISCLAIMED TO THE EXTENT PERMITTED BY YOUR LOCAL
-    LAW. UNLESS REQUIRED BY LAW, NO COPYRIGHT HOLDER OR CONTRIBUTOR WILL
-    BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, OR CONSEQUENTIAL
-    DAMAGES ARISING IN ANY WAY OUT OF THE USE OF THE PACKAGE, EVEN IF
-    ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-
-
-    --------
-
-
-    "Node.js" and "node" trademark Joyent, Inc. npm is not officially
-    part of the Node.js project, and is neither owned by nor
-    officially affiliated with Joyent, Inc.
-
-    Packages published in the npm registry (other than the Software and
-    its included dependencies) are not part of npm itself, are the sole
-    property of their respective maintainers, and are not covered by
-    this license.
-
-    "npm Logo" created by Mathias Pettersson and Brian Hammond,
-    used with permission.
-
-    "Gubblebum Blocky" font
-    Copyright (c) by Tjarda Koster, http://jelloween.deviantart.com
-    included for use in the npm website and documentation,
-    used with permission.
-
-    This program uses several Node modules contained in the node_modules/
-    subdirectory, according to the terms of their respective licenses.
-  """
-
-- tools/doc/node_modules/marked. Marked is a Markdown parser. Marked's
-  license follows:
-  """
-    Copyright (c) 2011-2012, Christopher Jeffrey (https://github.com/chjj/)
-
-    Permission is hereby granted, free of charge, to any person obtaining a copy
-    of this software and associated documentation files (the "Software"), to deal
-    in the Software without restriction, including without limitation the rights
-    to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-    copies of the Software, and to permit persons to whom the Software is
-    furnished to do so, subject to the following conditions:
-
-    The above copyright notice and this permission notice shall be included in
-    all copies or substantial portions of the Software.
-
-    THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-    IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-    FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-    AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-    LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-    OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-    THE SOFTWARE.
-  """
-
-- test/gc/node_modules/weak. Node-weak is a node.js addon that provides garbage
-  collector notifications. Node-weak's license follows:
-  """
-    Copyright (c) 2011, Ben Noordhuis <info@bnoordhuis.nl>
-
-    Permission to use, copy, modify, and/or distribute this software for any
-    purpose with or without fee is hereby granted, provided that the above
-    copyright notice and this permission notice appear in all copies.
-
-    THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
-    WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
-    MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
-    ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
-    WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
-    ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
-    OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
-  """
-
-- src/ngx-queue.h. ngx-queue.h is taken from the nginx source tree. nginx's
-  license follows:
-  """
-    Copyright (C) 2002-2012 Igor Sysoev
-    Copyright (C) 2011,2012 Nginx, Inc.
-
-    Redistribution and use in source and binary forms, with or without
-    modification, are permitted provided that the following conditions
-    are met:
-    1. Redistributions of source code must retain the above copyright
-       notice, this list of conditions and the following disclaimer.
-    2. Redistributions in binary form must reproduce the above copyright
-       notice, this list of conditions and the following disclaimer in the
-       documentation and/or other materials provided with the distribution.
-
-    THIS SOFTWARE IS PROVIDED BY AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-    ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-    IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
-    ARE DISCLAIMED.  IN NO EVENT SHALL AUTHOR OR CONTRIBUTORS BE LIABLE
-    FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
-    DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
-    OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
-    HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
-    LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
-    OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
-    SUCH DAMAGE.
-  """
-
-- wrk is located at tools/wrk. wrk's license follows:
-  """
-
-                                     Apache License
-                               Version 2.0, January 2004
-                            http://www.apache.org/licenses/
-
-       TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-       1. Definitions.
-
-          "License" shall mean the terms and conditions for use, reproduction,
-          and distribution as defined by Sections 1 through 9 of this document.
-
-          "Licensor" shall mean the copyright owner or entity authorized by
-          the copyright owner that is granting the License.
-
-          "Legal Entity" shall mean the union of the acting entity and all
-          other entities that control, are controlled by, or are under common
-          control with that entity. For the purposes of this definition,
-          "control" means (i) the power, direct or indirect, to cause the
-          direction or management of such entity, whether by contract or
-          otherwise, or (ii) ownership of fifty percent (50%) or more of the
-          outstanding shares, or (iii) beneficial ownership of such entity.
-
-          "You" (or "Your") shall mean an individual or Legal Entity
-          exercising permissions granted by this License.
-
-          "Source" form shall mean the preferred form for making modifications,
-          including but not limited to software source code, documentation
-          source, and configuration files.
-
-          "Object" form shall mean any form resulting from mechanical
-          transformation or translation of a Source form, including but
-          not limited to compiled object code, generated documentation,
-          and conversions to other media types.
-
-          "Work" shall mean the work of authorship, whether in Source or
-          Object form, made available under the License, as indicated by a
-          copyright notice that is included in or attached to the work
-          (an example is provided in the Appendix below).
-
-          "Derivative Works" shall mean any work, whether in Source or Object
-          form, that is based on (or derived from) the Work and for which the
-          editorial revisions, annotations, elaborations, or other modifications
-          represent, as a whole, an original work of authorship. For the purposes
-          of this License, Derivative Works shall not include works that remain
-          separable from, or merely link (or bind by name) to the interfaces of,
-          the Work and Derivative Works thereof.
-
-          "Contribution" shall mean any work of authorship, including
-          the original version of the Work and any modifications or additions
-          to that Work or Derivative Works thereof, that is intentionally
-          submitted to Licensor for inclusion in the Work by the copyright owner
-          or by an individual or Legal Entity authorized to submit on behalf of
-          the copyright owner. For the purposes of this definition, "submitted"
-          means any form of electronic, verbal, or written communication sent
-          to the Licensor or its representatives, including but not limited to
-          communication on electronic mailing lists, source code control systems,
-          and issue tracking systems that are managed by, or on behalf of, the
-          Licensor for the purpose of discussing and improving the Work, but
-          excluding communication that is conspicuously marked or otherwise
-          designated in writing by the copyright owner as "Not a Contribution."
-
-          "Contributor" shall mean Licensor and any individual or Legal Entity
-          on behalf of whom a Contribution has been received by Licensor and
-          subsequently incorporated within the Work.
-
-       2. Grant of Copyright License. Subject to the terms and conditions of
-          this License, each Contributor hereby grants to You a perpetual,
-          worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-          copyright license to reproduce, prepare Derivative Works of,
-          publicly display, publicly perform, sublicense, and distribute the
-          Work and such Derivative Works in Source or Object form.
-
-       3. Grant of Patent License. Subject to the terms and conditions of
-          this License, each Contributor hereby grants to You a perpetual,
-          worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-          (except as stated in this section) patent license to make, have made,
-          use, offer to sell, sell, import, and otherwise transfer the Work,
-          where such license applies only to those patent claims licensable
-          by such Contributor that are necessarily infringed by their
-          Contribution(s) alone or by combination of their Contribution(s)
-          with the Work to which such Contribution(s) was submitted. If You
-          institute patent litigation against any entity (including a
-          cross-claim or counterclaim in a lawsuit) alleging that the Work
-          or a Contribution incorporated within the Work constitutes direct
-          or contributory patent infringement, then any patent licenses
-          granted to You under this License for that Work shall terminate
-          as of the date such litigation is filed.
-
-       4. Redistribution. You may reproduce and distribute copies of the
-          Work or Derivative Works thereof in any medium, with or without
-          modifications, and in Source or Object form, provided that You
-          meet the following conditions:
-
-          (a) You must give any other recipients of the Work or
-              Derivative Works a copy of this License; and
-
-          (b) You must cause any modified files to carry prominent notices
-              stating that You changed the files; and
-
-          (c) You must retain, in the Source form of any Derivative Works
-              that You distribute, all copyright, patent, trademark, and
-              attribution notices from the Source form of the Work,
-              excluding those notices that do not pertain to any part of
-              the Derivative Works; and
-
-          (d) If the Work includes a "NOTICE" text file as part of its
-              distribution, then any Derivative Works that You distribute must
-              include a readable copy of the attribution notices contained
-              within such NOTICE file, excluding those notices that do not
-              pertain to any part of the Derivative Works, in at least one
-              of the following places: within a NOTICE text file distributed
-              as part of the Derivative Works; within the Source form or
-              documentation, if provided along with the Derivative Works; or,
-              within a display generated by the Derivative Works, if and
-              wherever such third-party notices normally appear. The contents
-              of the NOTICE file are for informational purposes only and
-              do not modify the License. You may add Your own attribution
-              notices within Derivative Works that You distribute, alongside
-              or as an addendum to the NOTICE text from the Work, provided
-              that such additional attribution notices cannot be construed
-              as modifying the License.
-
-          You may add Your own copyright statement to Your modifications and
-          may provide additional or different license terms and conditions
-          for use, reproduction, or distribution of Your modifications, or
-          for any such Derivative Works as a whole, provided Your use,
-          reproduction, and distribution of the Work otherwise complies with
-          the conditions stated in this License.
-
-       5. Submission of Contributions. Unless You explicitly state otherwise,
-          any Contribution intentionally submitted for inclusion in the Work
-          by You to the Licensor shall be under the terms and conditions of
-          this License, without any additional terms or conditions.
-          Notwithstanding the above, nothing herein shall supersede or modify
-          the terms of any separate license agreement you may have executed
-          with Licensor regarding such Contributions.
-
-       6. Trademarks. This License does not grant permission to use the trade
-          names, trademarks, service marks, or product names of the Licensor,
-          except as required for reasonable and customary use in describing the
-          origin of the Work and reproducing the content of the NOTICE file.
-
-       7. Disclaimer of Warranty. Unless required by applicable law or
-          agreed to in writing, Licensor provides the Work (and each
-          Contributor provides its Contributions) on an "AS IS" BASIS,
-          WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
-          implied, including, without limitation, any warranties or conditions
-          of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
-          PARTICULAR PURPOSE. You are solely responsible for determining the
-          appropriateness of using or redistributing the Work and assume any
-          risks associated with Your exercise of permissions under this License.
-
-       8. Limitation of Liability. In no event and under no legal theory,
-          whether in tort (including negligence), contract, or otherwise,
-          unless required by applicable law (such as deliberate and grossly
-          negligent acts) or agreed to in writing, shall any Contributor be
-          liable to You for damages, including any direct, indirect, special,
-          incidental, or consequential damages of any character arising as a
-          result of this License or out of the use or inability to use the
-          Work (including but not limited to damages for loss of goodwill,
-          work stoppage, computer failure or malfunction, or any and all
-          other commercial damages or losses), even if such Contributor
-          has been advised of the possibility of such damages.
-
-       9. Accepting Warranty or Additional Liability. While redistributing
-          the Work or Derivative Works thereof, You may choose to offer,
-          and charge a fee for, acceptance of support, warranty, indemnity,
-          or other liability obligations and/or rights consistent with this
-          License. However, in accepting such obligations, You may act only
-          on Your own behalf and on Your sole responsibility, not on behalf
-          of any other Contributor, and only if You agree to indemnify,
-          defend, and hold each Contributor harmless for any liability
-          incurred by, or claims asserted against, such Contributor by reason
-          of your accepting any such warranty or additional liability.
-
-       END OF TERMS AND CONDITIONS
-  """
--- a/node/node-v0.10.22-linux-x86/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,74 +0,0 @@
-Evented I/O for V8 javascript. [![Build Status](https://secure.travis-ci.org/joyent/node.png)](http://travis-ci.org/joyent/node)
-===
-
-### To build:
-
-Prerequisites (Unix only):
-
-    * GCC 4.2 or newer
-    * Python 2.6 or 2.7
-    * GNU Make 3.81 or newer
-    * libexecinfo (FreeBSD and OpenBSD only)
-
-Unix/Macintosh:
-
-    ./configure
-    make
-    make install
-
-If your python binary is in a non-standard location or has a
-non-standard name, run the following instead:
-
-    export PYTHON=/path/to/python
-    $PYTHON ./configure
-    make
-    make install
-
-Windows:
-
-    vcbuild.bat
-
-You can download pre-built binaries for various operating systems from
-[http://nodejs.org/download/](http://nodejs.org/download/).  The Windows
-and OS X installers will prompt you for the location to install to.
-The tarballs are self-contained; you can extract them to a local directory
-with:
-
-    tar xzf /path/to/node-<version>-<platform>-<arch>.tar.gz
-
-Or system-wide with:
-
-    cd /usr/local && tar --strip-components 1 -xzf \
-                         /path/to/node-<version>-<platform>-<arch>.tar.gz
-
-### To run the tests:
-
-Unix/Macintosh:
-
-    make test
-
-Windows:
-
-    vcbuild.bat test
-
-### To build the documentation:
-
-    make doc
-
-### To read the documentation:
-
-    man doc/node.1
-
-Resources for Newcomers
----
-  - [The Wiki](https://github.com/joyent/node/wiki)
-  - [nodejs.org](http://nodejs.org/)
-  - [how to install node.js and npm (node package manager)](http://www.joyent.com/blog/installing-node-and-npm/)
-  - [list of modules](https://github.com/joyent/node/wiki/modules)
-  - [searching the npm registry](http://npmjs.org/)
-  - [list of companies and projects using node](https://github.com/joyent/node/wiki/Projects,-Applications,-and-Companies-Using-Node)
-  - [node.js mailing list](http://groups.google.com/group/nodejs)
-  - irc chatroom, [#node.js on freenode.net](http://webchat.freenode.net?channels=node.js&uio=d4)
-  - [community](https://github.com/joyent/node/wiki/Community)
-  - [contributing](https://github.com/joyent/node/wiki/Contributing)
-  - [big list of all the helpful wiki pages](https://github.com/joyent/node/wiki/_pages)
Binary file node/node-v0.10.22-linux-x86/bin/node has changed
--- a/node/node-v0.10.22-linux-x86/bin/npm	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-../lib/node_modules/npm/bin/npm-cli.js
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/dtrace/node.d	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,315 +0,0 @@
-/* Copyright Joyent, Inc. and other Node contributors.
- *
- * Permission is hereby granted, free of charge, to any person obtaining a
- * copy of this software and associated documentation files (the
- * "Software"), to deal in the Software without restriction, including
- * without limitation the rights to use, copy, modify, merge, publish,
- * distribute, sublicense, and/or sell copies of the Software, and to permit
- * persons to whom the Software is furnished to do so, subject to the
- * following conditions:
- *
- * The above copyright notice and this permission notice shall be included
- * in all copies or substantial portions of the Software.
- *
- * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
- * OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
- * MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
- * NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
- * DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
- * OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
- * USE OR OTHER DEALINGS IN THE SOFTWARE.
- */
-
-/*
- * This is the DTrace library file for the node provider, which includes
- * the necessary translators to get from the args[] to something useful.
- * Be warned:  the mechanics here are seriously ugly -- and one must always
- * keep in mind that clean abstractions often require filthy systems.
- */
-#pragma D depends_on library procfs.d
-
-typedef struct {
-	int32_t fd;
-	int32_t port;
-	uint32_t remote;
-	uint32_t buffered;
-} node_dtrace_connection_t;
-
-typedef struct {
-	int32_t fd;
-	int32_t port;
-	uint64_t remote;
-	uint32_t buffered;
-} node_dtrace_connection64_t;
-
-typedef struct {
-	int fd;
-	string remoteAddress;
-	int remotePort;
-	int bufferSize;
-} node_connection_t;
-
-translator node_connection_t <node_dtrace_connection_t *nc> {
-	fd = *(int32_t *)copyin((uintptr_t)&nc->fd, sizeof (int32_t));
-	remotePort =
-	    *(int32_t *)copyin((uintptr_t)&nc->port, sizeof (int32_t));
-	remoteAddress = curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-	    copyinstr((uintptr_t)*(uint32_t *)copyin((uintptr_t)&nc->remote,
-	    sizeof (int32_t))) :
-	    copyinstr((uintptr_t)*(uint64_t *)copyin((uintptr_t)
-	    &((node_dtrace_connection64_t *)nc)->remote, sizeof (int64_t)));
-	bufferSize = curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-	    *(uint32_t *)copyin((uintptr_t)&nc->buffered, sizeof (int32_t)) :
-	    *(uint32_t *)copyin((uintptr_t)
-	    &((node_dtrace_connection64_t *)nc)->buffered, sizeof (int32_t));
-};
-
-/*
- * 32-bit and 64-bit structures received from node for HTTP client request
- * probe.
- */
-typedef struct {
-	uint32_t url;
-	uint32_t method;
-} node_dtrace_http_client_request_t;
-
-typedef struct {
-	uint64_t url;
-	uint64_t method;
-} node_dtrace_http_client_request64_t;
-
-/*
- * The following structures are never used directly, but must exist to bind the
- * types specified in the provider to the translators defined here.
- * Ultimately, they always get cast to a more specific type inside the
- * translator.  To add to the confusion, the DTrace compiler does not allow
- * declaring two translators with the same destination type if the source types
- * are structures with the same size (because libctf says they're compatible,
- * so dtrace considers them equivalent).  Since we must define translators from
- * node_dtrace_http_client_request_t (above), node_dtrace_http_request_t, and
- * node_dtrace_http_server_request_t (both below), each of these three structs
- * must be declared with a different size.
- */
-typedef struct {
-	uint32_t version;
-	uint64_t dummy1;
-} node_dtrace_http_request_t;
-
-typedef struct {
-	uint32_t version;
-	uint64_t dummy2;
-	uint64_t dummy3;
-} node_dtrace_http_server_request_t;
-
-/*
- * Actual 32-bit and 64-bit, v0 and v1 structures received from node for the
- * HTTP server request probe.
- */
-typedef struct {
-	uint32_t url;
-	uint32_t method;
-} node_dtrace_http_server_request_v0_t;
-
-typedef struct {
-	uint32_t version;
-	uint32_t url;
-	uint32_t method;
-	uint32_t forwardedFor;
-} node_dtrace_http_server_request_v1_t;
-
-typedef struct {
-	uint64_t url;
-	uint64_t method;
-} node_dtrace_http_server_request64_v0_t;
-
-typedef struct {
-	uint32_t version;
-	uint32_t pad;
-	uint64_t url;
-	uint64_t method;
-	uint64_t forwardedFor;
-} node_dtrace_http_server_request64_v1_t;
-
-/*
- * In the end, both client and server request probes from both old and new
- * binaries translate their arguments to node_http_request_t, which is what the
- * user's D script ultimately sees.
- */
-typedef struct {
-	string url;
-	string method;
-	string forwardedFor;
-} node_http_request_t;
-
-/*
- * The following translators are particularly filthy for reasons of backwards
- * compatibility.  Stable versions of node prior to 0.6 used a single
- * http_request struct with fields for "url" and "method" for both client and
- * server probes.  0.6 added a "forwardedFor" field intended for the server
- * probe only, and the http_request struct passed by the application was split
- * first into client_http_request and server_http_request and the latter was
- * again split for v0 (the old struct) and v1.
- *
- * To distinguish the binary representations of the two versions of these
- * structs, the new version prepends a "version" member (where the old one has
- * a "url" pointer).  Each field that we're translating below first switches on
- * the value of this "version" field: if it's larger than 4096, we know we must
- * be looking at the "url" pointer of the older structure version.  Otherwise,
- * we must be looking at the new version.  Besides this, we have the usual
- * switch based on the userland process data model.  This would all be simpler
- * with macros, but those aren't available in D library files since we cannot
- * rely on cpp being present at runtime.
- *
- * In retrospect, the versioning bit might have been unnecessary since the type
- * of the object passed in should allow DTrace to select which translator to
- * use.  However, DTrace does sometimes use translators whose source types
- * don't quite match, and since we know this versioning logic works, we just
- * leave it alone.  Each of the translators below is functionally identical
- * (except that the client -> client translator doesn't bother translating
- * forwardedFor) and should actually work with any version of any of the client
- * or server structs transmitted by the application up to this point.
- */
-
-/*
- * Translate from node_dtrace_http_server_request_t (received from node 0.6 and
- * later versions) to node_http_request_t.
- */
-translator node_http_request_t <node_dtrace_http_server_request_t *nd> {
-	url = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd,
-		sizeof (uint32_t))) >= 4096 ?
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v0_t *)nd)->url,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v0_t *)nd)->url,
-		    sizeof (uint64_t)))) :
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v1_t *)nd)->url,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v1_t *)nd)->url,
-		    sizeof (uint64_t))));
-
-	method = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd,
-		sizeof (uint32_t))) >= 4096 ?
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v0_t *)nd)->method,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v0_t *)nd)->method,
-		    sizeof (uint64_t)))) :
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v1_t *)nd)->method,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v1_t *)nd)->method,
-		    sizeof (uint64_t))));
-
-	forwardedFor = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd,
-		sizeof (uint32_t))) >= 4096 ? "" :
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v1_t *)nd)->forwardedFor,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v1_t *)nd)->
-		    forwardedFor, sizeof (uint64_t))));
-};
-
-/*
- * Translate from node_dtrace_http_client_request_t (received from node 0.6 and
- * later versions) to node_http_request_t.
- */
-translator node_http_request_t <node_dtrace_http_client_request_t *nd> {
-	url = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd,
-		sizeof (uint32_t))) >= 4096 ?
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v0_t *)nd)->url,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v0_t *)nd)->url,
-		    sizeof (uint64_t)))) :
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v1_t *)nd)->url,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v1_t *)nd)->url,
-		    sizeof (uint64_t))));
-
-	method = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd,
-		sizeof (uint32_t))) >= 4096 ?
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v0_t *)nd)->method,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v0_t *)nd)->method,
-		    sizeof (uint64_t)))) :
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v1_t *)nd)->method,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v1_t *)nd)->method,
-		    sizeof (uint64_t))));
-
-	forwardedFor = "";
-};
-
-/*
- * Translate from node_dtrace_http_request_t (received from versions of node
- * prior to 0.6) to node_http_request_t.  This is used for both the server and
- * client probes since these versions of node didn't distinguish between the
- * types used in these probes.
- */
-translator node_http_request_t <node_dtrace_http_request_t *nd> {
-	url = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd,
-		sizeof (uint32_t))) >= 4096 ?
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v0_t *)nd)->url,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v0_t *)nd)->url,
-		    sizeof (uint64_t)))) :
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v1_t *)nd)->url,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v1_t *)nd)->url,
-		    sizeof (uint64_t))));
-
-	method = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd,
-		sizeof (uint32_t))) >= 4096 ?
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v0_t *)nd)->method,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v0_t *)nd)->method,
-		    sizeof (uint64_t)))) :
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v1_t *)nd)->method,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v1_t *)nd)->method,
-		    sizeof (uint64_t))));
-
-	forwardedFor = (*(uint32_t *) copyin((uintptr_t)(uint32_t *)nd,
-		sizeof (uint32_t))) >= 4096 ? "" :
-	    (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ?
-		copyinstr(*(uint32_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request_v1_t *)nd)->forwardedFor,
-		    sizeof (uint32_t))) :
-		copyinstr(*(uint64_t *)copyin((uintptr_t)
-		    &((node_dtrace_http_server_request64_v1_t *)nd)->
-		    forwardedFor, sizeof (uint64_t))));
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,26 +0,0 @@
-*.swp
-.*.swp
-npm-debug.log
-/test/bin
-/test/output.log
-/test/packages/*/node_modules
-/test/packages/npm-test-depends-on-spark/which-spark.log
-/test/packages/test-package/random-data.txt
-/test/root
-node_modules/ronn
-node_modules/tap
-node_modules/.bin
-node_modules/npm-registry-mock
-/npmrc
-/release/
-
-# don't need these in the npm package.
-html/*.png
-
-# don't ignore .npmignore files
-# these are used in some tests.
-!.npmignore
-
-/npm-*.tgz
-
-*.pyc
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/.tern-project	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,7 +0,0 @@
-{
-  "libs": [
-  ],
-  "plugins": {
-    "node": {}
-  }
-}
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/AUTHORS	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,115 +0,0 @@
-# Authors sorted by whether or not they're me
-Isaac Z. Schlueter <i@izs.me>
-Steve Steiner <ssteinerX@gmail.com>
-Mikeal Rogers <mikeal.rogers@gmail.com>
-Aaron Blohowiak <aaron.blohowiak@gmail.com>
-Martyn Smith <martyn@dollyfish.net.nz>
-Mathias Pettersson <mape@mape.me>
-Brian Hammond <brian@fictorial.com>
-Charlie Robbins <charlie.robbins@gmail.com>
-Francisco Treacy <francisco.treacy@gmail.com>
-Cliffano Subagio <cliffano@gmail.com>
-Christian Eager <christian.eager@nokia.com>
-Dav Glass <davglass@gmail.com>
-Alex K. Wolfe <alexkwolfe@gmail.com>
-James Sanders <jimmyjazz14@gmail.com>
-Reid Burke <me@reidburke.com>
-Arlo Breault <arlolra@gmail.com>
-Timo Derstappen <teemow@gmail.com>
-Bradley Meck <bradley.meck@gmail.com>
-Bart Teeuwisse <bart.teeuwisse@thecodemill.biz>
-Ben Noordhuis <info@bnoordhuis.nl>
-Tor Valamo <tor.valamo@gmail.com>
-Whyme.Lyu <5longluna@gmail.com>
-Olivier Melcher <olivier.melcher@gmail.com>
-Tomaž Muraus <kami@k5-storitve.net>
-Evan Meagher <evan.meagher@gmail.com>
-Orlando Vazquez <ovazquez@gmail.com>
-George Miroshnykov <gmiroshnykov@lohika.com>
-Geoff Flarity <geoff.flarity@gmail.com>
-Pete Kruckenberg <pete@kruckenberg.com>
-Laurie Harper <laurie@holoweb.net>
-Chris Wong <chris@chriswongstudio.com>
-Max Goodman <c@chromacode.com>
-Scott Bronson <brons_github@rinspin.com>
-Federico Romero <federomero@gmail.com>
-Visnu Pitiyanuvath <visnupx@gmail.com>
-Irakli Gozalishvili <rfobic@gmail.com>
-Mark Cahill <mark@tiemonster.info>
-Zearin <zearin@gonk.net>
-Iain Sproat <iainsproat@gmail.com>
-Trent Mick <trentm@gmail.com>
-Felix Geisendörfer <felix@debuggable.com>
-Conny Brunnkvist <cbrunnkvist@gmail.com>
-Will Elwood <w.elwood08@gmail.com>
-Oleg Efimov <efimovov@gmail.com>
-Martin Cooper <mfncooper@gmail.com>
-Jameson Little <t.jameson.little@gmail.com>
-cspotcode <cspotcode@gmail.com>
-Maciej Małecki <maciej.malecki@notimplemented.org>
-Stephen Sugden <glurgle@gmail.com>
-Gautham Pai <buzypi@gmail.com>
-David Trejo <david.daniel.trejo@gmail.com>
-Paul Vorbach <paul@vorb.de>
-George Ornbo <george@shapeshed.com>
-Tim Oxley <secoif@gmail.com>
-Tyler Green <tyler.green2@gmail.com>
-atomizer <danila.gerasimov@gmail.com>
-Rod Vagg <rod@vagg.org>
-Christian Howe <coderarity@gmail.com>
-Andrew Lunny <alunny@gmail.com>
-Henrik Hodne <dvyjones@binaryhex.com>
-Adam Blackburn <regality@gmail.com>
-Kris Windham <kriswindham@gmail.com>
-Jens Grunert <jens.grunert@gmail.com>
-Joost-Wim Boekesteijn <joost-wim@boekesteijn.nl>
-Dalmais Maxence <github@maxired.fr>
-Marcus Ekwall <marcus.ekwall@gmail.com>
-Aaron Stacy <aaron.r.stacy@gmail.com>
-Phillip Howell <phowell@cothm.org>
-Domenic Denicola <domenic@domenicdenicola.com>
-James Halliday <mail@substack.net>
-Jeremy Cantrell <jmcantrell@gmail.com>
-Ribettes <patlogan29@gmail.com>
-Einar Otto Stangvik <einaros@gmail.com>
-Don Park <donpark@docuverse.com>
-Kei Son <heyacct@gmail.com>
-Nicolas Morel <marsup@gmail.com>
-Mark Dube <markisdee@gmail.com>
-Nathan Rajlich <nathan@tootallnate.net>
-Maxim Bogushevich <boga1@mail.ru>
-Justin Beckwith <justbe@microsoft.com>
-Meaglin <Meaglin.wasabi@gmail.com>
-Ben Evans <ben@bensbit.co.uk>
-Nathan Zadoks <nathan@nathan7.eu>
-Brian White <mscdex@gmail.com>
-Jed Schmidt <tr@nslator.jp>
-Ian Livingstone <ianl@cs.dal.ca>
-Patrick Pfeiffer <patrick@buzzle.at>
-Paul Miller <paul@paulmillr.com>
-seebees <seebees@gmail.com>
-Carl Lange <carl@flax.ie>
-Jan Lehnardt <jan@apache.org>
-Alexey Kreschuk <akrsch@gmail.com>
-Di Wu <dwu@palantir.com>
-Florian Margaine <florian@margaine.com>
-Forbes Lindesay <forbes@lindesay.co.uk>
-Ian Babrou <ibobrik@gmail.com>
-Jaakko Manninen <jaakko@rocketpack.fi>
-Johan Nordberg <its@johan-nordberg.com>
-Johan Sköld <johan@skold.cc>
-Larz Conwell <larz@larz-laptop.(none)>
-Luke Arduini <luke.arduini@gmail.com>
-Marcel Klehr <mklehr@gmx.net>
-Mathias Bynens <mathias@qiwi.be>
-Matt Lunn <matt@mattlunn.me.uk>
-Matt McClure <matt.mcclure@mapmyfitness.com>
-Nirk Niggler <nirk.niggler@gmail.com>
-Paolo Fragomeni <paolo@async.ly>
-Jake Verbaten (Raynos) <raynos2@gmail.com>
-Robert Kowalski <rok@kowalski.gd>
-Schabse Laks <Dev@SLaks.net>
-Stuart Knightley <stuart@stuartk.com>
-Stuart P. Bentley <stuart@testtrack4.com>
-Vaz Allen <vaz@tryptid.com>
-elisee <elisee@sparklin.org>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,235 +0,0 @@
-Copyright (c) Isaac Z. Schlueter
-All rights reserved.
-
-npm is released under the Artistic License 2.0.
-The text of the License follows:
-
-
---------
-
-
-The Artistic License 2.0
-
-Copyright (c) 2000-2006, The Perl Foundation.
-
-Everyone is permitted to copy and distribute verbatim copies
-of this license document, but changing it is not allowed.
-
-Preamble
-
-This license establishes the terms under which a given free software
-Package may be copied, modified, distributed, and/or redistributed.
-The intent is that the Copyright Holder maintains some artistic
-control over the development of that Package while still keeping the
-Package available as open source and free software.
-
-You are always permitted to make arrangements wholly outside of this
-license directly with the Copyright Holder of a given Package.  If the
-terms of this license do not permit the full use that you propose to
-make of the Package, you should contact the Copyright Holder and seek
-a different licensing arrangement.
-
-Definitions
-
-    "Copyright Holder" means the individual(s) or organization(s)
-    named in the copyright notice for the entire Package.
-
-    "Contributor" means any party that has contributed code or other
-    material to the Package, in accordance with the Copyright Holder's
-    procedures.
-
-    "You" and "your" means any person who would like to copy,
-    distribute, or modify the Package.
-
-    "Package" means the collection of files distributed by the
-    Copyright Holder, and derivatives of that collection and/or of
-    those files. A given Package may consist of either the Standard
-    Version, or a Modified Version.
-
-    "Distribute" means providing a copy of the Package or making it
-    accessible to anyone else, or in the case of a company or
-    organization, to others outside of your company or organization.
-
-    "Distributor Fee" means any fee that you charge for Distributing
-    this Package or providing support for this Package to another
-    party.  It does not mean licensing fees.
-
-    "Standard Version" refers to the Package if it has not been
-    modified, or has been modified only in ways explicitly requested
-    by the Copyright Holder.
-
-    "Modified Version" means the Package, if it has been changed, and
-    such changes were not explicitly requested by the Copyright
-    Holder.
-
-    "Original License" means this Artistic License as Distributed with
-    the Standard Version of the Package, in its current version or as
-    it may be modified by The Perl Foundation in the future.
-
-    "Source" form means the source code, documentation source, and
-    configuration files for the Package.
-
-    "Compiled" form means the compiled bytecode, object code, binary,
-    or any other form resulting from mechanical transformation or
-    translation of the Source form.
-
-
-Permission for Use and Modification Without Distribution
-
-(1)  You are permitted to use the Standard Version and create and use
-Modified Versions for any purpose without restriction, provided that
-you do not Distribute the Modified Version.
-
-
-Permissions for Redistribution of the Standard Version
-
-(2)  You may Distribute verbatim copies of the Source form of the
-Standard Version of this Package in any medium without restriction,
-either gratis or for a Distributor Fee, provided that you duplicate
-all of the original copyright notices and associated disclaimers.  At
-your discretion, such verbatim copies may or may not include a
-Compiled form of the Package.
-
-(3)  You may apply any bug fixes, portability changes, and other
-modifications made available from the Copyright Holder.  The resulting
-Package will still be considered the Standard Version, and as such
-will be subject to the Original License.
-
-
-Distribution of Modified Versions of the Package as Source
-
-(4)  You may Distribute your Modified Version as Source (either gratis
-or for a Distributor Fee, and with or without a Compiled form of the
-Modified Version) provided that you clearly document how it differs
-from the Standard Version, including, but not limited to, documenting
-any non-standard features, executables, or modules, and provided that
-you do at least ONE of the following:
-
-    (a)  make the Modified Version available to the Copyright Holder
-    of the Standard Version, under the Original License, so that the
-    Copyright Holder may include your modifications in the Standard
-    Version.
-
-    (b)  ensure that installation of your Modified Version does not
-    prevent the user installing or running the Standard Version. In
-    addition, the Modified Version must bear a name that is different
-    from the name of the Standard Version.
-
-    (c)  allow anyone who receives a copy of the Modified Version to
-    make the Source form of the Modified Version available to others
-    under
-
-        (i)  the Original License or
-
-        (ii)  a license that permits the licensee to freely copy,
-        modify and redistribute the Modified Version using the same
-        licensing terms that apply to the copy that the licensee
-        received, and requires that the Source form of the Modified
-        Version, and of any works derived from it, be made freely
-        available in that license fees are prohibited but Distributor
-        Fees are allowed.
-
-
-Distribution of Compiled Forms of the Standard Version
-or Modified Versions without the Source
-
-(5)  You may Distribute Compiled forms of the Standard Version without
-the Source, provided that you include complete instructions on how to
-get the Source of the Standard Version.  Such instructions must be
-valid at the time of your distribution.  If these instructions, at any
-time while you are carrying out such distribution, become invalid, you
-must provide new instructions on demand or cease further distribution.
-If you provide valid instructions or cease distribution within thirty
-days after you become aware that the instructions are invalid, then
-you do not forfeit any of your rights under this license.
-
-(6)  You may Distribute a Modified Version in Compiled form without
-the Source, provided that you comply with Section 4 with respect to
-the Source of the Modified Version.
-
-
-Aggregating or Linking the Package
-
-(7)  You may aggregate the Package (either the Standard Version or
-Modified Version) with other packages and Distribute the resulting
-aggregation provided that you do not charge a licensing fee for the
-Package.  Distributor Fees are permitted, and licensing fees for other
-components in the aggregation are permitted. The terms of this license
-apply to the use and Distribution of the Standard or Modified Versions
-as included in the aggregation.
-
-(8) You are permitted to link Modified and Standard Versions with
-other works, to embed the Package in a larger work of your own, or to
-build stand-alone binary or bytecode versions of applications that
-include the Package, and Distribute the result without restriction,
-provided the result does not expose a direct interface to the Package.
-
-
-Items That are Not Considered Part of a Modified Version
-
-(9) Works (including, but not limited to, modules and scripts) that
-merely extend or make use of the Package, do not, by themselves, cause
-the Package to be a Modified Version.  In addition, such works are not
-considered parts of the Package itself, and are not subject to the
-terms of this license.
-
-
-General Provisions
-
-(10)  Any use, modification, and distribution of the Standard or
-Modified Versions is governed by this Artistic License. By using,
-modifying or distributing the Package, you accept this license. Do not
-use, modify, or distribute the Package, if you do not accept this
-license.
-
-(11)  If your Modified Version has been derived from a Modified
-Version made by someone other than you, you are nevertheless required
-to ensure that your Modified Version complies with the requirements of
-this license.
-
-(12)  This license does not grant you the right to use any trademark,
-service mark, tradename, or logo of the Copyright Holder.
-
-(13)  This license includes the non-exclusive, worldwide,
-free-of-charge patent license to make, have made, use, offer to sell,
-sell, import and otherwise transfer the Package with respect to any
-patent claims licensable by the Copyright Holder that are necessarily
-infringed by the Package. If you institute patent litigation
-(including a cross-claim or counterclaim) against any party alleging
-that the Package constitutes direct or contributory patent
-infringement, then this Artistic License to you shall terminate on the
-date that such litigation is filed.
-
-(14)  Disclaimer of Warranty:
-THE PACKAGE IS PROVIDED BY THE COPYRIGHT HOLDER AND CONTRIBUTORS "AS
-IS' AND WITHOUT ANY EXPRESS OR IMPLIED WARRANTIES. THE IMPLIED
-WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR
-NON-INFRINGEMENT ARE DISCLAIMED TO THE EXTENT PERMITTED BY YOUR LOCAL
-LAW. UNLESS REQUIRED BY LAW, NO COPYRIGHT HOLDER OR CONTRIBUTOR WILL
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, OR CONSEQUENTIAL
-DAMAGES ARISING IN ANY WAY OUT OF THE USE OF THE PACKAGE, EVEN IF
-ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-
-
---------
-
-
-"Node.js" and "node" trademark Joyent, Inc. npm is not officially
-part of the Node.js project, and is neither owned by nor
-officially affiliated with Joyent, Inc.
-
-Packages published in the npm registry (other than the Software and
-its included dependencies) are not part of npm itself, are the sole
-property of their respective maintainers, and are not covered by
-this license.
-
-"npm Logo" created by Mathias Pettersson and Brian Hammond,
-used with permission.
-
-"Gubblebum Blocky" font
-Copyright (c) by Tjarda Koster, http://jelloween.deviantart.com
-included for use in the npm website and documentation,
-used with permission.
-
-This program uses several Node modules contained in the node_modules/
-subdirectory, according to the terms of their respective licenses.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/Makefile	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,219 +0,0 @@
-# vim: set softtabstop=2 shiftwidth=2:
-SHELL = bash
-
-markdowns = $(shell find doc -name '*.md' | grep -v 'index') README.md
-
-html_docdeps = html/dochead.html \
-               html/docfoot.html \
-               html/docfoot-script.html \
-               scripts/doc-build.sh \
-               package.json
-
-cli_mandocs = $(shell find doc/cli -name '*.md' \
-               |sed 's|.md|.1|g' \
-               |sed 's|doc/cli/|man/man1/|g' ) \
-               man/man1/npm-README.1
-
-api_mandocs = $(shell find doc/api -name '*.md' \
-               |sed 's|.md|.3|g' \
-               |sed 's|doc/api/|man/man3/|g' )
-
-files_mandocs = $(shell find doc/files -name '*.md' \
-               |sed 's|.md|.5|g' \
-               |sed 's|doc/files/|man/man5/|g' ) \
-               man/man5/npm-json.5 \
-               man/man5/npm-global.5
-
-misc_mandocs = $(shell find doc/misc -name '*.md' \
-               |sed 's|.md|.7|g' \
-               |sed 's|doc/misc/|man/man7/|g' ) \
-               man/man7/npm-index.7
-
-cli_htmldocs = $(shell find doc/cli -name '*.md' \
-                |sed 's|.md|.html|g' \
-                |sed 's|doc/cli/|html/doc/cli/|g' ) \
-                html/doc/README.html
-
-api_htmldocs = $(shell find doc/api -name '*.md' \
-                |sed 's|.md|.html|g' \
-                |sed 's|doc/api/|html/doc/api/|g' )
-
-files_htmldocs = $(shell find doc/files -name '*.md' \
-                  |sed 's|.md|.html|g' \
-                  |sed 's|doc/files/|html/doc/files/|g' ) \
-                  html/doc/files/npm-json.html \
-                  html/doc/files/npm-global.html
-
-misc_htmldocs = $(shell find doc/misc -name '*.md' \
-                 |sed 's|.md|.html|g' \
-                 |sed 's|doc/misc/|html/doc/misc/|g' ) \
-                 html/doc/index.html
-
-mandocs = $(api_mandocs) $(cli_mandocs) $(files_mandocs) $(misc_mandocs)
-
-htmldocs = $(api_htmldocs) $(cli_htmldocs) $(files_htmldocs) $(misc_htmldocs)
-
-all: doc
-
-latest:
-	@echo "Installing latest published npm"
-	@echo "Use 'make install' or 'make link' to install the code"
-	@echo "in this folder that you're looking at right now."
-	node cli.js install -g -f npm
-
-install: docclean all
-	node cli.js install -g -f
-
-# backwards compat
-dev: install
-
-link: uninstall
-	node cli.js link -f
-
-clean: ronnclean doc-clean uninstall
-	rm -rf npmrc
-	node cli.js cache clean
-
-uninstall:
-	node cli.js rm npm -g -f
-
-doc: $(mandocs) $(htmldocs)
-
-ronnclean:
-	rm -rf node_modules/ronn node_modules/.bin/ronn .building_ronn
-
-docclean: doc-clean
-doc-clean:
-	rm -rf \
-    .building_ronn \
-    html/doc \
-    html/api \
-    man
-
-# use `npm install ronn` for this to work.
-man/man1/npm-README.1: README.md scripts/doc-build.sh package.json
-	@[ -d man/man1 ] || mkdir -p man/man1
-	scripts/doc-build.sh $< $@
-
-man/man1/%.1: doc/cli/%.md scripts/doc-build.sh package.json
-	@[ -d man/man1 ] || mkdir -p man/man1
-	scripts/doc-build.sh $< $@
-
-man/man3/%.3: doc/api/%.md scripts/doc-build.sh package.json
-	@[ -d man/man3 ] || mkdir -p man/man3
-	scripts/doc-build.sh $< $@
-
-man/man5/npm-json.5: man/man5/package.json.5
-	cp $< $@
-
-man/man5/npm-global.5: man/man5/npm-folders.5
-	cp $< $@
-
-man/man5/%.5: doc/files/%.md scripts/doc-build.sh package.json
-	@[ -d man/man5 ] || mkdir -p man/man5
-	scripts/doc-build.sh $< $@
-
-doc/misc/npm-index.md: scripts/index-build.js package.json
-	node scripts/index-build.js > $@
-
-html/doc/index.html: doc/misc/npm-index.md $(html_docdeps)
-	@[ -d html/doc ] || mkdir -p html/doc
-	scripts/doc-build.sh $< $@
-
-man/man7/%.7: doc/misc/%.md scripts/doc-build.sh package.json
-	@[ -d man/man7 ] || mkdir -p man/man7
-	scripts/doc-build.sh $< $@
-
-html/doc/README.html: README.md $(html_docdeps)
-	@[ -d html/doc ] || mkdir -p html/doc
-	scripts/doc-build.sh $< $@
-
-html/doc/cli/%.html: doc/cli/%.md $(html_docdeps)
-	@[ -d html/doc/cli ] || mkdir -p html/doc/cli
-	scripts/doc-build.sh $< $@
-
-html/doc/api/%.html: doc/api/%.md $(html_docdeps)
-	@[ -d html/doc/api ] || mkdir -p html/doc/api
-	scripts/doc-build.sh $< $@
-
-html/doc/files/npm-json.html: html/doc/files/package.json.html
-	cp $< $@
-html/doc/files/npm-global.html: html/doc/files/npm-folders.html
-	cp $< $@
-
-html/doc/files/%.html: doc/files/%.md $(html_docdeps)
-	@[ -d html/doc/files ] || mkdir -p html/doc/files
-	scripts/doc-build.sh $< $@
-
-html/doc/misc/%.html: doc/misc/%.md $(html_docdeps)
-	@[ -d html/doc/misc ] || mkdir -p html/doc/misc
-	scripts/doc-build.sh $< $@
-
-
-
-ronn: node_modules/.bin/ronn
-
-node_modules/.bin/ronn:
-	node cli.js install ronn --no-global
-
-doc: man
-
-man: $(mandocs)
-
-test: doc
-	node cli.js test
-
-publish: link doc
-	@git push origin :v$(shell npm -v) 2>&1 || true
-	@npm unpublish npm@$(shell npm -v) 2>&1 || true
-	git clean -fd &&\
-	git push origin &&\
-	git push origin --tags &&\
-	npm publish &&\
-	npm tag npm@$(shell npm -v) $(shell npm -v | awk -F. '{print $$1 "." $$2}') &&\
-	make doc-publish &&\
-	make zip-publish
-
-docpublish: doc-publish
-doc-publish: doc
-	# legacy urls
-	for f in $$(find html/doc/{cli,files,misc}/ -name '*.html'); do \
-    j=$$(basename $$f | sed 's|^npm-||g'); \
-    if ! [ -f html/doc/$$j ] && [ $$j != README.html ] && [ $$j != index.html ]; then \
-      perl -pi -e 's/ href="\.\.\// href="/g' <$$f >html/doc/$$j; \
-    fi; \
-  done
-	mkdir -p html/api
-	for f in $$(find html/doc/api/ -name '*.html'); do \
-    j=$$(basename $$f | sed 's|^npm-||g'); \
-    perl -pi -e 's/ href="\.\.\// href="/g' <$$f >html/api/$$j; \
-  done
-	rsync -vazu --stats --no-implied-dirs --delete \
-    html/doc/* \
-    node@npmjs.org:/home/node/npm-www/doc
-	rsync -vazu --stats --no-implied-dirs --delete \
-    html/static/webfonts/ \
-    node@npmjs.org:/home/node/npm-www/static/webfonts
-	rsync -vazu --stats --no-implied-dirs --delete \
-    html/static/style.css \
-    node@npmjs.org:/home/node/npm-www/static/
-	#cleanup
-	rm -rf html/api
-	for f in html/doc/*.html; do \
-    case $$f in \
-      html/doc/README.html) continue ;; \
-      html/doc/index.html) continue ;; \
-      *) rm $$f ;; \
-    esac; \
-  done
-
-zip-publish: release
-	scp release/* node@nodejs.org:dist/npm/
-
-release:
-	@bash scripts/release.sh
-
-sandwich:
-	@[ $$(whoami) = "root" ] && (echo "ok"; echo "ham" > sandwich) || echo "make it yourself" && exit 13
-
-.PHONY: all latest install dev link doc clean uninstall test man doc-publish doc-clean docclean docpublish release zip-publish
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,238 +0,0 @@
-npm(1) -- node package manager
-==============================
-
-## SYNOPSIS
-
-This is just enough info to get you up and running.
-
-Much more info available via `npm help` once it's installed.
-
-## IMPORTANT
-
-**You need node v0.8 or higher to run this program.**
-
-To install an old **and unsupported** version of npm that works on node 0.3
-and prior, clone the git repo and dig through the old tags and branches.
-
-## Super Easy Install
-
-npm comes with node now.
-
-### Windows Computers
-
-Get the MSI.  npm is in it.
-
-### Apple Macintosh Computers
-
-Get the pkg.  npm is in it.
-
-### Other Sorts of Unices
-
-Run `make install`.  npm will be installed with node.
-
-If you want a more fancy pants install (a different version, customized
-paths, etc.) then read on.
-
-## Fancy Install (Unix)
-
-There's a pretty robust install script at
-<https://npmjs.org/install.sh>.  You can download that and run it.
-
-### Slightly Fancier
-
-You can set any npm configuration params with that script:
-
-    npm_config_prefix=/some/path sh install.sh
-
-Or, you can run it in uber-debuggery mode:
-
-    npm_debug=1 sh install.sh
-
-### Even Fancier
-
-Get the code with git.  Use `make` to build the docs and do other stuff.
-If you plan on hacking on npm, `make link` is your friend.
-
-If you've got the npm source code, you can also semi-permanently set
-arbitrary config keys using the `./configure --key=val ...`, and then
-run npm commands by doing `node cli.js <cmd> <args>`.  (This is helpful
-for testing, or running stuff without actually installing npm itself.)
-
-## Fancy Windows Install
-
-You can download a zip file from <https://npmjs.org/dist/>, and unpack it
-in the same folder where node.exe lives.
-
-If that's not fancy enough for you, then you can fetch the code with
-git, and mess with it directly.
-
-## Installing on Cygwin
-
-No.
-
-## Permissions when Using npm to Install Other Stuff
-
-**tl;dr**
-
-* Use `sudo` for greater safety.  Or don't, if you prefer not to.
-* npm will downgrade permissions if it's root before running any build
-  scripts that package authors specified.
-
-### More details...
-
-As of version 0.3, it is recommended to run npm as root.
-This allows npm to change the user identifier to the `nobody` user prior
-to running any package build or test commands.
-
-If you are not the root user, or if you are on a platform that does not
-support uid switching, then npm will not attempt to change the userid.
-
-If you would like to ensure that npm **always** runs scripts as the
-"nobody" user, and have it fail if it cannot downgrade permissions, then
-set the following configuration param:
-
-    npm config set unsafe-perm false
-
-This will prevent running in unsafe mode, even as non-root users.
-
-## Uninstalling
-
-So sad to see you go.
-
-    sudo npm uninstall npm -g
-
-Or, if that fails,
-
-    sudo make uninstall
-
-## More Severe Uninstalling
-
-Usually, the above instructions are sufficient.  That will remove
-npm, but leave behind anything you've installed.
-
-If you would like to remove all the packages that you have installed,
-then you can use the `npm ls` command to find them, and then `npm rm` to
-remove them.
-
-To remove cruft left behind by npm 0.x, you can use the included
-`clean-old.sh` script file.  You can run it conveniently like this:
-
-    npm explore npm -g -- sh scripts/clean-old.sh
-
-npm uses two configuration files, one for per-user configs, and another
-for global (every-user) configs.  You can view them by doing:
-
-    npm config get userconfig   # defaults to ~/.npmrc
-    npm config get globalconfig # defaults to /usr/local/etc/npmrc
-
-Uninstalling npm does not remove configuration files by default.  You
-must remove them yourself manually if you want them gone.  Note that
-this means that future npm installs will not remember the settings that
-you have chosen.
-
-## Using npm Programmatically
-
-If you would like to use npm programmatically, you can do that.
-It's not very well documented, but it *is* rather simple.
-
-Most of the time, unless you actually want to do all the things that
-npm does, you should try using one of npm's dependencies rather than
-using npm itself, if possible.
-
-Eventually, npm will be just a thin cli wrapper around the modules
-that it depends on, but for now, there are some things that you must
-use npm itself to do.
-
-    var npm = require("npm")
-    npm.load(myConfigObject, function (er) {
-      if (er) return handleError(er)
-      npm.commands.install(["some", "args"], function (er, data) {
-        if (er) return commandFailed(er)
-        // command succeeded, and data might have some info
-      })
-      npm.on("log", function (message) { .... })
-    })
-
-The `load` function takes an object hash of the command-line configs.
-The various `npm.commands.<cmd>` functions take an **array** of
-positional argument **strings**.  The last argument to any
-`npm.commands.<cmd>` function is a callback.  Some commands take other
-optional arguments.  Read the source.
-
-You cannot set configs individually for any single npm function at this
-time.  Since `npm` is a singleton, any call to `npm.config.set` will
-change the value for *all* npm commands in that process.
-
-See `./bin/npm-cli.js` for an example of pulling config values off of the
-command line arguments using nopt.  You may also want to check out `npm
-help config` to learn about all the options you can set there.
-
-## More Docs
-
-Check out the [docs](https://npmjs.org/doc/),
-especially the [faq](https://npmjs.org/doc/faq.html).
-
-You can use the `npm help` command to read any of them.
-
-If you're a developer, and you want to use npm to publish your program,
-you should [read this](https://npmjs.org/doc/developers.html).
-
-## Legal Stuff
-
-"npm" and "the npm registry" are owned by Isaac Z. Schlueter.
-All rights reserved.  See the included LICENSE file for more details.
-
-"Node.js" and "node" are trademarks owned by Joyent, Inc.  npm is not
-officially part of the Node.js project, and is neither owned by nor
-officially affiliated with Joyent, Inc.
-
-The packages in the npm registry are not part of npm itself, and are the
-sole property of their respective maintainers.  While every effort is
-made to ensure accountability, there is absolutely no guarantee,
-warranty, or assertion made as to the quality, fitness for a specific
-purpose, or lack of malice in any given npm package.  Modules
-published on the npm registry are not affiliated with or endorsed by
-Joyent, Inc., Isaac Z. Schlueter, Ryan Dahl, or the Node.js project.
-
-If you have a complaint about a package in the npm registry, and cannot
-resolve it with the package owner, please express your concerns to
-Isaac Z. Schlueter at <i@izs.me>.
-
-### In plain English
-
-This is mine; not my employer's, not Node's, not Joyent's, not Ryan
-Dahl's.
-
-If you publish something, it's yours, and you are solely accountable
-for it.  Not me, not Node, not Joyent, not Ryan Dahl.
-
-If other people publish something, it's theirs.  Not mine, not Node's,
-not Joyent's, not Ryan Dahl's.
-
-Yes, you can publish something evil.  It will be removed promptly if
-reported, and we'll lose respect for you.  But there is no vetting
-process for published modules.
-
-If this concerns you, inspect the source before using packages.
-
-## BUGS
-
-When you find issues, please report them:
-
-* web:
-  <https://github.com/isaacs/npm/issues>
-* email:
-  <npm-@googlegroups.com>
-
-Be sure to include *all* of the output from the npm command that didn't work
-as expected.  The `npm-debug.log` file is also helpful to provide.
-
-You can also look for isaacs in #node.js on irc://irc.freenode.net.  He
-will no doubt tell you to put the output in a gist or email.
-
-## SEE ALSO
-
-* npm(1)
-* npm-faq(7)
-* npm-help(1)
-* npm-index(7)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/bin/node-gyp-bin/node-gyp	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2 +0,0 @@
-#!/usr/bin/env sh
-node "`dirname "$0"`/../../node_modules/node-gyp/bin/node-gyp.js" "$@"
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/bin/node-gyp-bin/node-gyp.cmd	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-node "%~dp0\..\..\node_modules\node-gyp\bin\node-gyp.js" %*
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/bin/npm	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-#!/bin/sh
-
-basedir=`dirname "$0"`
-
-case `uname` in
-    *CYGWIN*) basedir=`cygpath -w "$basedir"`;;
-esac
-
-if [ -x "$basedir/node.exe" ]; then
-  "$basedir/node.exe" "$basedir/node_modules/npm/bin/npm-cli.js" "$@"
-else
-  node "$basedir/node_modules/npm/bin/npm-cli.js" "$@"
-fi
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/bin/npm-cli.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,87 +0,0 @@
-#!/bin/sh
-// 2>/dev/null; exec "`dirname "$0"`/node" "$0" "$@"
-;(function () { // wrapper in case we're in module_context mode
-
-// windows: running "npm blah" in this folder will invoke WSH, not node.
-if (typeof WScript !== "undefined") {
-  WScript.echo("npm does not work when run\n"
-              +"with the Windows Scripting Host\n\n"
-              +"'cd' to a different directory,\n"
-              +"or type 'npm.cmd <args>',\n"
-              +"or type 'node npm <args>'.")
-  WScript.quit(1)
-  return
-}
-
-
-process.title = "npm"
-
-var log = require("npmlog")
-log.pause() // will be unpaused when config is loaded.
-log.info("it worked if it ends with", "ok")
-
-var fs = require("graceful-fs")
-  , path = require("path")
-  , npm = require("../lib/npm.js")
-  , npmconf = require("npmconf")
-  , errorHandler = require("../lib/utils/error-handler.js")
-
-  , configDefs = npmconf.defs
-  , shorthands = configDefs.shorthands
-  , types = configDefs.types
-  , nopt = require("nopt")
-
-// if npm is called as "npmg" or "npm_g", then
-// run in global mode.
-if (path.basename(process.argv[1]).slice(-1)  === "g") {
-  process.argv.splice(1, 1, "npm", "-g")
-}
-
-log.verbose("cli", process.argv)
-
-var conf = nopt(types, shorthands)
-npm.argv = conf.argv.remain
-if (npm.deref(npm.argv[0])) npm.command = npm.argv.shift()
-else conf.usage = true
-
-
-if (conf.version) {
-  console.log(npm.version)
-  return
-}
-
-if (conf.versions) {
-  npm.command = "version"
-  conf.usage = false
-  npm.argv = []
-}
-
-log.info("using", "npm@%s", npm.version)
-log.info("using", "node@%s", process.version)
-
-// make sure that this version of node works with this version of npm.
-var semver = require("semver")
-  , nodeVer = process.version
-  , reqVer = npm.nodeVersionRequired
-if (reqVer && !semver.satisfies(nodeVer, reqVer)) {
-  return errorHandler(new Error(
-    "npm doesn't work with node " + nodeVer
-    + "\nRequired: node@" + reqVer), true)
-}
-
-process.on("uncaughtException", errorHandler)
-
-if (conf.usage && npm.command !== "help") {
-  npm.argv.unshift(npm.command)
-  npm.command = "help"
-}
-
-// now actually fire up npm and run the command.
-// this is how to use npm programmatically:
-conf._exit = true
-npm.load(conf, function (er) {
-  if (er) return errorHandler(er)
-  npm.commands[npm.command](npm.argv, errorHandler)
-})
-
-})()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/bin/npm.cmd	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-:: Created by npm, please don't edit manually.
-@IF EXIST "%~dp0\node.exe" (
-  "%~dp0\node.exe" "%~dp0\.\node_modules\npm\bin\npm-cli.js" %*
-) ELSE (
-  node "%~dp0\.\node_modules\npm\bin\npm-cli.js" %*
-)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/bin/read-package-json.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-var argv = process.argv
-if (argv.length < 3) {
-  console.error("Usage: read-package-json <file> [<fields> ...]")
-  process.exit(1)
-}
-
-var fs = require("fs")
-  , file = argv[2]
-  , readJson = require("read-package-json")
-
-readJson(file, function (er, data) {
-  if (er) throw er
-  if (argv.length === 3) console.log(data)
-  else argv.slice(3).forEach(function (field) {
-    field = field.split(".")
-    var val = data
-    field.forEach(function (f) {
-      val = val[f]
-    })
-    console.log(val)
-  })
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/cli.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2 +0,0 @@
-#!/usr/bin/env node
-require("./bin/npm-cli.js")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/configure	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,33 +0,0 @@
-#!/bin/bash
-
-# set configurations that will be "sticky" on this system,
-# surviving npm self-updates.
-
-CONFIGS=()
-i=0
-
-# get the location of this file.
-unset CDPATH
-CONFFILE=$(cd "$(dirname "$0")"; pwd -P)/npmrc
-
-while [ $# -gt 0 ]; do
-  conf="$1"
-  case $conf in
-    --help)
-      echo "./configure --param=value ..."
-      exit 0
-      ;;
-    --*)
-      CONFIGS[$i]="${conf:2}"
-      ;;
-    *)
-      CONFIGS[$i]="$conf"
-      ;;
-  esac
-  let i++
-  shift
-done
-
-for c in "${CONFIGS[@]}"; do
-  echo "$c" >> "$CONFFILE"
-done
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-bin.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-npm-bin(3) -- Display npm bin folder
-====================================
-
-## SYNOPSIS
-
-    npm.commands.bin(args, cb)
-
-## DESCRIPTION
-
-Print the folder where npm will install executables.
-
-This function should not be used programmatically.  Instead, just refer
-to the `npm.bin` member.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-bugs.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-npm-bugs(3) -- Bugs for a package in a web browser maybe
-========================================================
-
-## SYNOPSIS
-
-    npm.commands.bugs(package, callback)
-
-## DESCRIPTION
-
-This command tries to guess at the likely location of a package's
-bug tracker URL, and then tries to open it using the `--browser`
-config param.
-
-Like other commands, the first parameter is an array. This command only
-uses the first element, which is expected to be a package name with an
-optional version number.
-
-This command will launch a browser, so this command may not be the most
-friendly for programmatic use.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-commands.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-npm-commands(3) -- npm commands
-===============================
-
-## SYNOPSIS
-
-    npm.commands[<command>](args, callback)
-
-## DESCRIPTION
-
-npm comes with a full set of commands, and each of the commands takes a
-similar set of arguments.
-
-In general, all commands on the command object take an **array** of positional
-argument **strings**. The last argument to any function is a callback. Some
-commands are special and take other optional arguments.
-
-All commands have their own man page. See `man npm-<command>` for command-line
-usage, or `man 3 npm-<command>` for programmatic usage.
-
-## SEE ALSO
-
-* npm-index(7)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-config.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,45 +0,0 @@
-npm-config(3) -- Manage the npm configuration files
-===================================================
-
-## SYNOPSIS
-
-    npm.commands.config(args, callback)
-    var val = npm.config.get(key)
-    npm.config.set(key, val)
-
-## DESCRIPTION
-
-This function acts much the same way as the command-line version.  The first
-element in the array tells config what to do. Possible values are:
-
-* `set`
-
-    Sets a config parameter.  The second element in `args` is interpreted as the
-    key, and the third element is interpreted as the value.
-
-* `get`
-
-    Gets the value of a config parameter. The second element in `args` is the
-    key to get the value of.
-
-* `delete` (`rm` or `del`)
-
-    Deletes a parameter from the config. The second element in `args` is the
-    key to delete.
-
-* `list` (`ls`)
-
-    Show all configs that aren't secret. No parameters necessary.
-
-* `edit`:
-
-    Opens the config file in the default editor. This command isn't very useful
-    programmatically, but it is made available.
-
-To programmatically access npm configuration settings, or set them for
-the duration of a program, use the `npm.config.set` and `npm.config.get`
-functions instead.
-
-## SEE ALSO
-
-* npm(3)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-deprecate.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,34 +0,0 @@
-npm-deprecate(3) -- Deprecate a version of a package
-====================================================
-
-## SYNOPSIS
-
-    npm.commands.deprecate(args, callback)
-
-## DESCRIPTION
-
-This command will update the npm registry entry for a package, providing
-a deprecation warning to all who attempt to install it.
-
-The 'args' parameter must have exactly two elements:
-
-* `package[@version]`
-
-    The `version` portion is optional, and may be either a range, or a
-    specific version, or a tag.
-
-* `message`
-
-    The warning message that will be printed whenever a user attempts to
-    install the package.
-
-Note that you must be the package owner to deprecate something.  See the
-`owner` and `adduser` help topics.
-
-To un-deprecate a package, specify an empty string (`""`) for the `message` argument.
-
-## SEE ALSO
-
-* npm-publish(3)
-* npm-unpublish(3)
-* npm-registry(7)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-docs.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-npm-docs(3) -- Docs for a package in a web browser maybe
-========================================================
-
-## SYNOPSIS
-
-    npm.commands.docs(package, callback)
-
-## DESCRIPTION
-
-This command tries to guess at the likely location of a package's
-documentation URL, and then tries to open it using the `--browser`
-config param.
-
-Like other commands, the first parameter is an array. This command only
-uses the first element, which is expected to be a package name with an
-optional version number.
-
-This command will launch a browser, so this command may not be the most
-friendly for programmatic use.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-edit.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-npm-edit(3) -- Edit an installed package
-========================================
-
-## SYNOPSIS
-
-    npm.commands.edit(package, callback)
-
-## DESCRIPTION
-
-Opens the package folder in the default editor (or whatever you've
-configured as the npm `editor` config -- see `npm help config`.)
-
-After it has been edited, the package is rebuilt so as to pick up any
-changes in compiled packages.
-
-For instance, you can do `npm install connect` to install connect
-into your package, and then `npm.commands.edit(["connect"], callback)`
-to make a few changes to your locally installed copy.
-
-The first parameter is a string array with a single element, the package
-to open. The package can optionally have a version number attached.
-
-Since this command opens an editor in a new process, be careful about where
-and how this is used.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-explore.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-npm-explore(3) -- Browse an installed package
-=============================================
-
-## SYNOPSIS
-
-    npm.commands.explore(args, callback)
-
-## DESCRIPTION
-
-Spawn a subshell in the directory of the installed package specified.
-
-If a command is specified, then it is run in the subshell, which then
-immediately terminates.
-
-Note that the package is *not* automatically rebuilt afterwards, so be
-sure to use `npm rebuild <pkg>` if you make any changes.
-
-The first element in the 'args' parameter must be a package name.  After that is the optional command, which can be any number of strings. All of the strings will be combined into one, space-delimited command.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-help-search.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,30 +0,0 @@
-npm-help-search(3) -- Search the help pages
-===========================================
-
-## SYNOPSIS
-
-    npm.commands.helpSearch(args, [silent,] callback)
-
-## DESCRIPTION
-
-This command is rarely useful, but it exists in the rare case that it is.
-
-This command takes an array of search terms and returns the help pages that
-match in order of best match.
-
-If there is only one match, then npm displays that help section. If there
-are multiple results, the results are printed to the screen formatted and the
-array of results is returned. Each result is an object with these properties:
-
-* hits:
-  A map of args to number of hits on that arg. For example, {"npm": 3}
-* found:
-  Total number of unique args that matched.
-* totalHits:
-  Total number of hits.
-* lines:
-  An array of all matching lines (and some adjacent lines).
-* file:
-  Name of the file that matched.
-
-The silent parameter is not necessary and is not currently used, but it may be in the future.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-init.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,29 +0,0 @@
-npm init(3) -- Interactively create a package.json file
-=======================================================
-
-## SYNOPSIS
-
-    npm.commands.init(args, callback)
-
-## DESCRIPTION
-
-This will ask you a bunch of questions, and then write a package.json for you.
-
-It attempts to make reasonable guesses about what you want things to be set to,
-and then writes a package.json file with the options you've selected.
-
-If you already have a package.json file, it'll read that first, and default to
-the options in there.
-
-It is strictly additive, so it does not delete options from your package.json
-without a really good reason to do so.
-
-Since this function expects to be run on the command-line, it doesn't work very
-well programmatically. The best option is to roll your own, and since
-JavaScript makes it stupid simple to output formatted JSON, that is the
-preferred method. If you're sure you want to handle command-line prompting,
-then go ahead and use this programmatically.
-
-## SEE ALSO
-
-* package.json(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-install.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-npm-install(3) -- install a package programmatically
-====================================================
-
-## SYNOPSIS
-
-    npm.commands.install([where,] packages, callback)
-
-## DESCRIPTION
-
-This acts much the same way as installing on the command-line.
-
-The 'where' parameter is optional and only used internally, and it specifies
-where the packages should be installed to.
-
-The 'packages' parameter is an array of strings. Each element in the array is
-the name of a package to be installed.
-
-Finally, 'callback' is a function that will be called when all packages have been
-installed or when an error has been encountered.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-link.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,33 +0,0 @@
-npm-link(3) -- Symlink a package folder
-=======================================
-
-## SYNOPSIS
-
-    npm.commands.link(callback)
-    npm.commands.link(packages, callback)
-
-## DESCRIPTION
-
-Package linking is a two-step process.
-
-Without parameters, link will create a globally-installed
-symbolic link from `prefix/package-name` to the current folder.
-
-With parameters, link will create a symlink from the local `node_modules`
-folder to the global symlink.
-
-When creating tarballs for `npm publish`, the linked packages are
-"snapshotted" to their current state by resolving the symbolic links.
-
-This is
-handy for installing your own stuff, so that you can work on it and test it
-iteratively without having to continually rebuild.
-
-For example:
-
-    npm.commands.link(cb)           # creates global link from the cwd
-                                    # (say redis package)
-    npm.commands.link('redis', cb)  # link-install the package
-
-Now, any changes to the redis package will be reflected in
-the package in the current working directory.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-load.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,26 +0,0 @@
-npm-load(3) -- Load config settings
-===================================
-
-## SYNOPSIS
-
-    npm.load(conf, cb)
-
-## DESCRIPTION
-
-npm.load() must be called before any other function call.  Both parameters are
-optional, but the second is recommended.
-
-The first parameter is an object hash of command-line config params, and the
-second parameter is a callback that will be called when npm is loaded and
-ready to serve.
-
-The first parameter should follow a similar structure as the package.json
-config object.
-
-For example, to emulate the --dev flag, pass an object that looks like this:
-
-    {
-      "dev": true
-    }
-
-For a list of all the available command-line configs, see `npm help config`
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-ls.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,56 +0,0 @@
-npm-ls(3) -- List installed packages
-======================================
-
-## SYNOPSIS
-
-    npm.commands.ls(args, [silent,] callback)
-
-## DESCRIPTION
-
-This command will print to stdout all the versions of packages that are
-installed, as well as their dependencies, in a tree-structure. It will also
-return that data using the callback.
-
-This command does not take any arguments, but args must be defined.
-Beyond that, if any arguments are passed in, npm will politely warn that it
-does not take positional arguments, though you may set config flags
-like with any other command, such as `global` to list global packages.
-
-It will print out extraneous, missing, and invalid packages.
-
-If the silent parameter is set to true, nothing will be output to the screen,
-but the data will still be returned.
-
-Callback is provided an error if one occurred, the full data about which
-packages are installed and which dependencies they will receive, and a
-"lite" data object which just shows which versions are installed where.
-Note that the full data object is a circular structure, so care must be
-taken if it is serialized to JSON.
-
-## CONFIGURATION
-
-### long
-
-* Default: false
-* Type: Boolean
-
-Show extended information.
-
-### parseable
-
-* Default: false
-* Type: Boolean
-
-Show parseable output instead of tree view.
-
-### global
-
-* Default: false
-* Type: Boolean
-
-List packages in the global install prefix instead of in the current
-project.
-
-Note, if parseable is set or long isn't set, then duplicates will be trimmed.
-This means that if a submodule has the same dependency as a parent module, then the
-dependency will only be output once.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-outdated.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-npm-outdated(3) -- Check for outdated packages
-==============================================
-
-## SYNOPSIS
-
-    npm.commands.outdated([packages,] callback)
-
-## DESCRIPTION
-
-This command will check the registry to see if the specified packages are
-currently outdated.
-
-If the 'packages' parameter is left out, npm will check all packages.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-owner.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,31 +0,0 @@
-npm-owner(3) -- Manage package owners
-=====================================
-
-## SYNOPSIS
-
-    npm.commands.owner(args, callback)
-
-## DESCRIPTION
-
-The first element of the 'args' parameter defines what to do, and the subsequent
-elements depend on the action. Possible values for the action are (the order of
-parameters is given in parentheses):
-
-* ls (package):
-  List all the users who have access to modify a package and push new versions.
-  Handy when you need to know who to bug for help.
-* add (user, package):
-  Add a new user as a maintainer of a package.  This user is enabled to modify
-  metadata, publish new versions, and add other owners.
-* rm (user, package):
-  Remove a user from the package owner list.  This immediately revokes their
-  privileges.
-
-Note that there is only one level of access.  Either you can modify a package,
-or you can't.  Future versions may contain more fine-grained access levels, but
-that is not implemented at this time.
-
-## SEE ALSO
-
-* npm-publish(3)
-* npm-registry(7)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-pack.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-npm-pack(3) -- Create a tarball from a package
-==============================================
-
-## SYNOPSIS
-
-    npm.commands.pack([packages,] callback)
-
-## DESCRIPTION
-
-For anything that's installable (that is, a package folder, tarball,
-tarball url, name@tag, name@version, or name), this command will fetch
-it to the cache, and then copy the tarball to the current working
-directory as `<name>-<version>.tgz`, and then write the filenames out to
-stdout.
-
-If the same package is specified multiple times, then the file will be
-overwritten the second time.
-
-If no arguments are supplied, then npm packs the current package folder.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-prefix.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,15 +0,0 @@
-npm-prefix(3) -- Display prefix
-===============================
-
-## SYNOPSIS
-
-    npm.commands.prefix(args, callback)
-
-## DESCRIPTION
-
-Print the prefix to standard out.
-
-'args' is never used and callback is never called with data.
-'args' must be present or things will break.
-
-This function is not useful programmatically.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-prune.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,17 +0,0 @@
-npm-prune(3) -- Remove extraneous packages
-==========================================
-
-## SYNOPSIS
-
-    npm.commands.prune([packages,] callback)
-
-## DESCRIPTION
-
-This command removes "extraneous" packages.
-
-The first parameter is optional, and it specifies packages to be removed.
-
-If no packages are specified, then all packages will be checked.
-
-Extraneous packages are packages that are not listed on the parent
-package's dependencies list.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-publish.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,30 +0,0 @@
-npm-publish(3) -- Publish a package
-===================================
-
-## SYNOPSIS
-
-    npm.commands.publish([packages,] callback)
-
-## DESCRIPTION
-
-Publishes a package to the registry so that it can be installed by name.
-Possible values in the 'packages' array are:
-
-* `<folder>`:
-  A folder containing a package.json file
-
-* `<tarball>`:
-  A url or file path to a gzipped tar archive containing a single folder
-  with a package.json file inside.
-
-If the package array is empty, npm will try to publish something in the
-current working directory.
-
-This command will fail if one of the packages specified already exists in
-the registry.  It overwrites the existing entry when the "force" config is set.
-
-## SEE ALSO
-
-* npm-registry(7)
-* npm-adduser(1)
-* npm-owner(3)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-rebuild.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
-npm-rebuild(3) -- Rebuild a package
-===================================
-
-## SYNOPSIS
-
-    npm.commands.rebuild([packages,] callback)
-
-## DESCRIPTION
-
-This command runs the `npm build` command on each of the matched packages.  This is useful
-when you install a new version of node, and must recompile all your C++ addons with
-the new binary. If no 'packages' parameter is specified, every package will be rebuilt.
-
-## CONFIGURATION
-
-See `npm help build`
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-restart.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-npm-restart(3) -- Restart a package
-===================================
-
-## SYNOPSIS
-
-    npm.commands.restart(packages, callback)
-
-## DESCRIPTION
-
-This runs a package's "restart" script, if one was provided.
-Otherwise it runs the package's "stop" script, if one was provided, and then
-the "start" script.
-
-If no version is specified, then it restarts the "active" version.
-
-npm can restart multiple packages. Just specify multiple packages
-in the `packages` parameter.
-
-## SEE ALSO
-
-* npm-start(3)
-* npm-stop(3)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-root.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,15 +0,0 @@
-npm-root(3) -- Display npm root
-===============================
-
-## SYNOPSIS
-
-    npm.commands.root(args, callback)
-
-## DESCRIPTION
-
-Print the effective `node_modules` folder to standard out.
-
-'args' is never used and callback is never called with data.
-'args' must be present or things will break.
-
-This function is not useful programmatically.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-run-script.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-npm-run-script(3) -- Run arbitrary package scripts
-==================================================
-
-## SYNOPSIS
-
-    npm.commands["run-script"](args, callback)
-
-## DESCRIPTION
-
-This runs an arbitrary command from a package's "scripts" object.
-
-It is used by the test, start, restart, and stop commands, but can be
-called directly, as well.
-
-The 'args' parameter is an array of strings. Behavior depends on the number
-of elements.  If there is only one element, npm assumes that the element
-represents a command to be run on the local repository. If there is more than
-one element, then the first is assumed to be the package and the second is
-assumed to be the command to run. All other elements are ignored.
-
-## SEE ALSO
-
-* npm-scripts(7)
-* npm-test(3)
-* npm-start(3)
-* npm-restart(3)
-* npm-stop(3)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-search.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-npm-search(3) -- Search for packages
-====================================
-
-## SYNOPSIS
-
-    npm.commands.search(searchTerms, [silent,] [staleness,] callback)
-
-## DESCRIPTION
-
-Search the registry for packages matching the search terms. The available parameters are:
-
-* searchTerms:
-  Array of search terms. These terms are case-insensitive.
-* silent:
-  If true, npm will not log anything to the console.
-* staleness:
-  This is the threshold for stale packages. "Fresh" packages are not refreshed
-  from the registry. This value is measured in seconds.
-* callback:
-  Returns an object where each key is the name of a package, and the value
-  is information about that package along with a 'words' property, which is
-  a space-delimited string of all of the interesting words in that package.
-  The only properties included are those that are searched, which generally include:
-
-    * name
-    * description
-    * maintainers
-    * url
-    * keywords
-
-A search on the registry excludes any result that does not match all of the
-search terms. It also removes any items from the results that contain an
-excluded term (the "searchexclude" config). The search is case insensitive
-and doesn't try to read your mind (it doesn't do any verb tense matching or the
-like).
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-shrinkwrap.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,20 +0,0 @@
-npm-shrinkwrap(3) -- programmatically generate package shrinkwrap file
-======================================================================
-
-## SYNOPSIS
-
-    npm.commands.shrinkwrap(args, [silent,] callback)
-
-## DESCRIPTION
-
-This acts much the same way as shrinkwrapping on the command-line.
-
-This command does not take any arguments, but 'args' must be defined.
-Beyond that, if any arguments are passed in, npm will politely warn that it
-does not take positional arguments.
-
-If the 'silent' parameter is set to true, nothing will be output to the screen,
-but the shrinkwrap file will still be written.
-
-Finally, 'callback' is a function that will be called when the shrinkwrap has
-been saved.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-start.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-npm-start(3) -- Start a package
-===============================
-
-## SYNOPSIS
-
-    npm.commands.start(packages, callback)
-
-## DESCRIPTION
-
-This runs a package's "start" script, if one was provided.
-
-npm can start multiple packages. Just specify multiple packages
-in the `packages` parameter.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-stop.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-npm-stop(3) -- Stop a package
-=============================
-
-## SYNOPSIS
-
-    npm.commands.stop(packages, callback)
-
-## DESCRIPTION
-
-This runs a package's "stop" script, if one was provided.
-
-npm can run stop on multiple packages. Just specify multiple packages
-in the `packages` parameter.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-submodule.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-npm-submodule(3) -- Add a package as a git submodule
-====================================================
-
-## SYNOPSIS
-
-    npm.commands.submodule(packages, callback)
-
-## DESCRIPTION
-
-For each package specified, npm will check if it has a git repository url
-in its package.json description then add it as a git submodule at
-`node_modules/<pkg name>`.
-
-This is a convenience only.  From then on, it's up to you to manage
-updates by using the appropriate git commands.  npm will stubbornly
-refuse to update, modify, or remove anything with a `.git` subfolder
-in it.
-
-This command also does not install missing dependencies, if the package
-does not include them in its git repository.  If `npm ls` reports that
-things are missing, you can either install, link, or submodule them yourself,
-or you can do `npm explore <pkgname> -- npm install` to install the
-dependencies into the submodule folder.
-
-## SEE ALSO
-
-* npm help json
-* git help submodule
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-tag.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-npm-tag(3) -- Tag a published version
-=====================================
-
-## SYNOPSIS
-
-    npm.commands.tag(package@version, tag, callback)
-
-## DESCRIPTION
-
-Tags the specified version of the package with the specified tag, or the
-`--tag` config if not specified.
-
-The 'package@version' is an array of strings, but only the first two elements are
-currently used.
-
-The first element must be in the form package@version, where package
-is the package name and version is the version number (much like installing a
-specific version).
-
-The second element is the name of the tag to tag this version with. If this
-parameter is missing or falsy (empty), the default from the config will be
-used. For more information about how to set this config, check
-`man 3 npm-config` for programmatic usage or `man npm-config` for cli usage.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-test.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
-npm-test(3) -- Test a package
-=============================
-
-## SYNOPSIS
-
-      npm.commands.test(packages, callback)
-
-## DESCRIPTION
-
-This runs a package's "test" script, if one was provided.
-
-To run tests as a condition of installation, set the `npat` config to
-true.
-
-npm can run tests on multiple packages. Just specify multiple packages
-in the `packages` parameter.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-uninstall.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
-npm-uninstall(3) -- uninstall a package programmatically
-========================================================
-
-## SYNOPSIS
-
-    npm.commands.uninstall(packages, callback)
-
-## DESCRIPTION
-
-This acts much the same way as uninstalling on the command-line.
-
-The 'packages' parameter is an array of strings. Each element in the array is
-the name of a package to be uninstalled.
-
-Finally, 'callback' is a function that will be called when all packages have been
-uninstalled or when an error has been encountered.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-unpublish.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,20 +0,0 @@
-npm-unpublish(3) -- Remove a package from the registry
-======================================================
-
-## SYNOPSIS
-
-    npm.commands.unpublish(package, callback)
-
-## DESCRIPTION
-
-This removes a package version from the registry, deleting its
-entry and removing the tarball.
-
-The package parameter must be defined.
-
-Only the first element in the package parameter is used.  If there is no first
-element, then npm assumes that the package at the current working directory
-is what is meant.
-
-If no version is specified, or if all versions are removed, then
-the root package entry is removed from the registry entirely.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-update.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,11 +0,0 @@
-npm-update(3) -- Update a package
-=================================
-
-## SYNOPSIS
-    npm.commands.update(packages, callback)
-
-## DESCRIPTION
-
-Updates a package, upgrading it to the latest version. It also installs any missing packages.
-
-The 'packages' argument is an array of packages to update. The 'callback' parameter will be called when done or when an error occurs.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-version.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-npm-version(3) -- Bump a package version
-========================================
-
-## SYNOPSIS
-
-    npm.commands.version(newversion, callback)
-
-## DESCRIPTION
-
-Run this in a package directory to bump the version and write the new
-data back to the package.json file.
-
-If run in a git repo, it will also create a version commit and tag, and
-fail if the repo is not clean.
-
-Like all other commands, this function takes a string array as its first
-parameter. The difference, however, is this function will fail if it does
-not have exactly one element. The only element should be a version number.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-view.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,93 +0,0 @@
-npm-view(3) -- View registry info
-=================================
-
-## SYNOPSIS
-
-    npm.commands.view(args, [silent,] callback)
-
-## DESCRIPTION
-
-This command shows data about a package and prints it to the stream
-referenced by the `outfd` config, which defaults to stdout.
-
-The "args" parameter is an ordered list that closely resembles the command-line
-usage. The elements should be ordered such that the first element is
-the package and version (package@version). The version is optional. After that,
-the rest of the parameters are fields with optional subfields ("field.subfield")
-which can be used to get only the information desired from the registry.
-
-The callback will be passed all of the data returned by the query.
-
-For example, to get the package registry entry for the `connect` package,
-you can do this:
-
-    npm.commands.view(["connect"], callback)
-
-If no version is specified, "latest" is assumed.
-
-Field names can be specified after the package descriptor.
-For example, to show the dependencies of the `ronn` package at version
-0.3.5, you could do the following:
-
-    npm.commands.view(["ronn@0.3.5", "dependencies"], callback)
-
-You can view child fields by separating them with a period.
-To view the git repository URL for the latest version of npm, you could
-do this:
-
-    npm.commands.view(["npm", "repository.url"], callback)
-
-For fields that are arrays, requesting a non-numeric field will return
-all of the values from the objects in the list.  For example, to get all
-the contributor email addresses for the "express" project, you can do this:
-
-    npm.commands.view(["express", "contributors.email"], callback)
-
-You may also use numeric indices in square braces to specifically select
-an item in an array field.  To just get the email address of the first
-contributor in the list, you can do this:
-
-    npm.commands.view(["express", "contributors[0].email"], callback)
-
-Multiple fields may be specified, and will be printed one after another.
-For example, to get all the contributor names and email addresses, you
-can do this:
-
-    npm.commands.view(["express", "contributors.name", "contributors.email"], callback)
-
-"Person" fields are shown as a string if they would be shown as an
-object.  So, for example, this will show the list of npm contributors in
-the shortened string format.  (See `npm help json` for more on this.)
-
-    npm.commands.view(["npm", "contributors"], callback)
-
-If a version range is provided, then data will be printed for every
-matching version of the package.  This will show which version of jsdom
-was required by each matching version of yui3:
-
-    npm.commands.view(["yui3@'>0.5.4'", "dependencies.jsdom"], callback)
-
-## OUTPUT
-
-If only a single string field for a single version is output, then it
-will not be colorized or quoted, so as to enable piping the output to
-another command.
-
-If the version range matches multiple versions, then each printed value
-will be prefixed with the version it applies to.
-
-If multiple fields are requested, then each of them is prefixed with
-the field name.
-
-Console output can be disabled by setting the 'silent' parameter to true.
-
-## RETURN VALUE
-
-The data returned will be an object in this format:
-
-    { <version>:
-      { <field>: <value>
-      , ... }
-    , ... }
-
-corresponding to the list of fields selected.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm-whoami.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,15 +0,0 @@
-npm-whoami(3) -- Display npm username
-=====================================
-
-## SYNOPSIS
-
-    npm.commands.whoami(args, callback)
-
-## DESCRIPTION
-
-Print the `username` config to standard output.
-
-The 'args' parameter is never used, and the callback is never called
-with data; however, 'args' must still be present or the command will break.
-
-This function is not useful programmatically.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/npm.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,116 +0,0 @@
-npm(3) -- node package manager
-==============================
-
-## SYNOPSIS
-
-    var npm = require("npm")
-    npm.load([configObject], function (er, npm) {
-      // use the npm object, now that it's loaded.
-
-      npm.config.set(key, val)
-      val = npm.config.get(key)
-
-      console.log("prefix = %s", npm.prefix)
-
-      npm.commands.install(["package"], cb)
-    })
-
-## VERSION
-
-@VERSION@
-
-## DESCRIPTION
-
-This is the API documentation for npm.
-To find documentation of the command line
-client, see `npm(1)`.
-
-Prior to using npm's commands, `npm.load()` must be called.
-If you provide `configObject` as an object hash of top-level
-configs, they override the values stored in the various config
-locations. In the npm command line client, this set of configs
-is parsed from the command line options. Additional configuration
-params are loaded from two configuration files. See `npm-config(1)`,
-`npm-config(7)`, and `npmrc(5)` for more information.
-
-After that, each of the functions are accessible in the
-commands object: `npm.commands.<cmd>`.  See `npm-index(7)` for a list of
-all possible commands.
-
-All commands on the command object take an **array** of positional argument
-**strings**. The last argument to any function is a callback. Some
-commands take other optional arguments.
-
-Configs cannot currently be set on a per function basis, as each call to
-npm.config.set will change the value for *all* npm commands in that process.
-
-To find API documentation for a specific command, run the `npm apihelp`
-command.
-
-## METHODS AND PROPERTIES
-
-* `npm.load(configs, cb)`
-
-    Load the configuration params, and call the `cb` function once the
-    globalconfig and userconfig files have been loaded as well, or on
-    nextTick if they've already been loaded.
-
-* `npm.config`
-
-    An object for accessing npm configuration parameters.
-
-    * `npm.config.get(key)`
-    * `npm.config.set(key, val)`
-    * `npm.config.del(key)`
-
-* `npm.dir` or `npm.root`
-
-    The `node_modules` directory where npm will operate.
-
-* `npm.prefix`
-
-    The prefix where npm is operating.  (Most often the current working
-    directory.)
-
-* `npm.cache`
-
-    The place where npm keeps JSON and tarballs it fetches from the
-    registry (or uploads to the registry).
-
-* `npm.tmp`
-
-    npm's temporary working directory.
-
-* `npm.deref`
-
-    Get the "real" name for a command that has either an alias or
-    abbreviation.
-
-## MAGIC
-
-For each of the methods in the `npm.commands` hash, a method is added to
-the npm object, which takes a set of positional string arguments rather
-than an array and a callback.
-
-If the last argument is a callback, then it will use the supplied
-callback.  However, if no callback is provided, then it will print out
-the error or results.
-
-For example, this would work in a node repl:
-
-    > npm = require("npm")
-    > npm.load()  // wait a sec...
-    > npm.install("dnode", "express")
-
-Note that that *won't* work in a node program, since the `install`
-method will get called before the configuration load is completed.
-
-## ABBREVS
-
-In order to support `npm ins foo` instead of `npm install foo`, the
-`npm.commands` object has a set of abbreviations as well as the full
-method names.  Use the `npm.deref` method to find the real name.
-
-For example:
-
-    var cmd = npm.deref("unp") // cmd === "unpublish"
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/api/repo.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-npm-repo(3) -- Open package repository page in the browser
-==========================================================
-
-## SYNOPSIS
-
-    npm.commands.repo(package, callback)
-
-## DESCRIPTION
-
-This command tries to guess at the likely location of a package's
-repository URL, and then tries to open it using the `--browser`
-config param.
-
-Like other commands, the first parameter is an array. This command only
-uses the first element, which is expected to be a package name with an
-optional version number.
-
-This command will launch a browser, so this command may not be the most
-friendly for programmatic use.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-adduser.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,38 +0,0 @@
-npm-adduser(1) -- Add a registry user account
-=============================================
-
-## SYNOPSIS
-
-    npm adduser
-
-## DESCRIPTION
-
-Create or verify a user named `<username>` in the npm registry, and
-save the credentials to the `.npmrc` file.
-
-The username, password, and email are read in from prompts.
-
-You may use this command to change your email address, but not username
-or password.
-
-To reset your password, go to <http://admin.npmjs.org/>.
-
-You may use this command multiple times with the same user account to
-authorize on a new machine.
-
-## CONFIGURATION
-
-### registry
-
-Default: http://registry.npmjs.org/
-
-The base URL of the npm package registry.
-
-## SEE ALSO
-
-* npm-registry(7)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* npm-owner(1)
-* npm-whoami(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-bin.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-npm-bin(1) -- Display npm bin folder
-====================================
-
-## SYNOPSIS
-
-    npm bin
-
-## DESCRIPTION
-
-Print the folder where npm will install executables.
-
-## SEE ALSO
-
-* npm-prefix(1)
-* npm-root(1)
-* npm-folders(5)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-bugs.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,40 +0,0 @@
-npm-bugs(1) -- Bugs for a package in a web browser maybe
-========================================================
-
-## SYNOPSIS
-
-    npm bugs <pkgname>
-
-## DESCRIPTION
-
-This command tries to guess at the likely location of a package's
-bug tracker URL, and then tries to open it using the `--browser`
-config param.
-
-## CONFIGURATION
-
-### browser
-
-* Default: OS X: `"open"`, Windows: `"start"`, Others: `"xdg-open"`
-* Type: String
-
-The browser that is called by the `npm bugs` command to open websites.
-
-### registry
-
-* Default: https://registry.npmjs.org/
-* Type: url
-
-The base URL of the npm package registry.
-
-
-## SEE ALSO
-
-* npm-docs(1)
-* npm-view(1)
-* npm-publish(1)
-* npm-registry(7)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* package.json(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-build.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-npm-build(1) -- Build a package
-===============================
-
-## SYNOPSIS
-
-    npm build <package-folder>
-
-* `<package-folder>`:
-  A folder containing a `package.json` file in its root.
-
-## DESCRIPTION
-
-This is the plumbing command called by `npm link` and `npm install`.
-
-It should generally not be called directly.
-
-## SEE ALSO
-
-* npm-install(1)
-* npm-link(1)
-* npm-scripts(7)
-* package.json(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-bundle.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,14 +0,0 @@
-npm-bundle(1) -- REMOVED
-========================
-
-## DESCRIPTION
-
-The `npm bundle` command has been removed in 1.0, for the simple reason
-that it is no longer necessary, as the default behavior is now to
-install packages into the local space.
-
-Just use `npm install` now to do what `npm bundle` used to do.
-
-## SEE ALSO
-
-* npm-install(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-cache.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,72 +0,0 @@
-npm-cache(1) -- Manipulates packages cache
-==========================================
-
-## SYNOPSIS
-
-    npm cache add <tarball file>
-    npm cache add <folder>
-    npm cache add <tarball url>
-    npm cache add <name>@<version>
-
-    npm cache ls [<path>]
-
-    npm cache clean [<path>]
-
-## DESCRIPTION
-
-Used to add, list, or clear the npm cache folder.
-
-* add:
-  Add the specified package to the local cache.  This command is primarily
-  intended to be used internally by npm, but it can provide a way to
-  add data to the local installation cache explicitly.
-
-* ls:
-  Show the data in the cache.  Argument is a path to show in the cache
-  folder.  Works a bit like the `find` program, but limited by the
-  `depth` config.
-
-* clean:
-  Delete data out of the cache folder.  If an argument is provided, then
-  it specifies a subpath to delete.  If no argument is provided, then
-  the entire cache is cleared.
-
-## DETAILS
-
-npm stores cache data in the directory specified in `npm config get cache`.
-For each package that is added to the cache, three pieces of information are
-stored in `{cache}/{name}/{version}`:
-
-* .../package/:
-  A folder containing the package contents as they appear in the tarball.
-* .../package.json:
-  The package.json file, as npm sees it, with overlays applied and an `_id` attribute.
-* .../package.tgz:
-  The tarball for that version.
-
-Additionally, whenever a registry request is made, a `.cache.json` file
-is placed at the corresponding URI, to store the ETag and the requested
-data.
-
-Commands that make non-essential registry requests (such as `search` and
-`view`, or the completion scripts) generally specify a minimum timeout.
-If the `.cache.json` file is younger than the specified timeout, then
-they do not make an HTTP request to the registry.
-
-## CONFIGURATION
-
-### cache
-
-Default: `~/.npm` on Posix, or `%AppData%/npm-cache` on Windows.
-
-The root cache folder.
-
-## SEE ALSO
-
-* npm-folders(5)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* npm-install(1)
-* npm-publish(1)
-* npm-pack(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-completion.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,29 +0,0 @@
-npm-completion(1) -- Tab Completion for npm
-===========================================
-
-## SYNOPSIS
-
-    . <(npm completion)
-
-## DESCRIPTION
-
-Enables tab-completion in all npm commands.
-
-The synopsis above
-loads the completions into your current shell.  Adding it to
-your ~/.bashrc or ~/.zshrc will make the completions available
-everywhere.
-
-You may of course also pipe the output of npm completion to a file
-such as `/usr/local/etc/bash_completion.d/npm` if you have a system
-that will read that file for you.
-
-When `COMP_CWORD`, `COMP_LINE`, and `COMP_POINT` are defined in the
-environment, `npm completion` acts in "plumbing mode", and outputs
-completions based on the arguments.
-
-## SEE ALSO
-
-* npm-developers(7)
-* npm-faq(7)
-* npm(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-config.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,71 +0,0 @@
-npm-config(1) -- Manage the npm configuration files
-===================================================
-
-## SYNOPSIS
-
-    npm config set <key> <value> [--global]
-    npm config get <key>
-    npm config delete <key>
-    npm config list
-    npm config edit
-    npm c [set|get|delete|list]
-    npm get <key>
-    npm set <key> <value> [--global]
-
-## DESCRIPTION
-
-npm gets its config settings from the command line, environment
-variables, `npmrc` files, and in some cases, the `package.json` file.
-
-See npmrc(5) for more information about the npmrc files.
-
-See `npm-config(7)` for a more thorough discussion of the mechanisms
-involved.
-
-The `npm config` command can be used to update and edit the contents
-of the user and global npmrc files.
-
-## Sub-commands
-
-Config supports the following sub-commands:
-
-### set
-
-    npm config set key value
-
-Sets the config key to the value.
-
-If value is omitted, then it sets it to "true".
-
-### get
-
-    npm config get key
-
-Echo the config value to stdout.
-
-### list
-
-    npm config list
-
-Show all the config settings.
-
-### delete
-
-    npm config delete key
-
-Deletes the key from all configuration files.
-
-### edit
-
-    npm config edit
-
-Opens the config file in an editor.  Use the `--global` flag to edit the
-global config.
-
-## SEE ALSO
-
-* npm-folders(5)
-* npm-config(7)
-* package.json(5)
-* npmrc(5)
-* npm(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-dedupe.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,58 +0,0 @@
-npm-dedupe(1) -- Reduce duplication
-===================================
-
-## SYNOPSIS
-
-    npm dedupe [package names...]
-    npm ddp [package names...]
-
-## DESCRIPTION
-
-Searches the local package tree and attempts to simplify the overall
-structure by moving dependencies further up the tree, where they can
-be more effectively shared by multiple dependent packages.
-
-For example, consider this dependency graph:
-
-    a
-    +-- b <-- depends on c@1.0.x
-    |   `-- c@1.0.3
-    `-- d <-- depends on c@~1.0.9
-        `-- c@1.0.10
-
-In this case, `npm-dedupe(1)` will transform the tree to:
-
-    a
-    +-- b
-    +-- d
-    `-- c@1.0.10
-
-Because of the hierarchical nature of node's module lookup, b and d
-will both get their dependency met by the single c package at the root
-level of the tree.
-
-If a suitable version exists at the target location in the tree
-already, then it will be left untouched, but the other duplicates will
-be deleted.
-
-If no suitable version can be found, then a warning is printed, and
-nothing is done.
-
-If any arguments are supplied, then they are filters, and only the
-named packages will be touched.
-
-Note that this operation transforms the dependency tree, and may
-result in packages getting updated versions, perhaps from the npm
-registry.
-
-This feature is experimental, and may change in future versions.
-
-The `--tag` argument will apply to all of the affected dependencies. If a
-tag with the given name exists, the tagged version is preferred over newer
-versions.
-
-## SEE ALSO
-
-* npm-ls(1)
-* npm-update(1)
-* npm-install(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-deprecate.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,26 +0,0 @@
-npm-deprecate(1) -- Deprecate a version of a package
-====================================================
-
-## SYNOPSIS
-
-    npm deprecate <name>[@<version>] <message>
-
-## DESCRIPTION
-
-This command will update the npm registry entry for a package, providing
-a deprecation warning to all who attempt to install it.
-
-It works on version ranges as well as specific versions, so you can do
-something like this:
-
-    npm deprecate my-thing@"< 0.2.3" "critical bug fixed in v0.2.3"
-
-Note that you must be the package owner to deprecate something.  See the
-`owner` and `adduser` help topics.
-
-To un-deprecate a package, specify an empty string (`""`) for the `message` argument.
-
-## SEE ALSO
-
-* npm-publish(1)
-* npm-registry(7)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-docs.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,40 +0,0 @@
-npm-docs(1) -- Docs for a package in a web browser maybe
-========================================================
-
-## SYNOPSIS
-
-    npm docs <pkgname>
-    npm home <pkgname>
-
-## DESCRIPTION
-
-This command tries to guess at the likely location of a package's
-documentation URL, and then tries to open it using the `--browser`
-config param.
-
-## CONFIGURATION
-
-### browser
-
-* Default: OS X: `"open"`, Windows: `"start"`, Others: `"xdg-open"`
-* Type: String
-
-The browser that is called by the `npm docs` command to open websites.
-
-### registry
-
-* Default: https://registry.npmjs.org/
-* Type: url
-
-The base URL of the npm package registry.
-
-
-## SEE ALSO
-
-* npm-view(1)
-* npm-publish(1)
-* npm-registry(7)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* package.json(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-edit.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,37 +0,0 @@
-npm-edit(1) -- Edit an installed package
-========================================
-
-## SYNOPSIS
-
-    npm edit <name>[@<version>]
-
-## DESCRIPTION
-
-Opens the package folder in the default editor (or whatever you've
-configured as the npm `editor` config -- see `npm-config(7)`.)
-
-After it has been edited, the package is rebuilt so as to pick up any
-changes in compiled packages.
-
-For instance, you can do `npm install connect` to install connect
-into your package, and then `npm edit connect` to make a few
-changes to your locally installed copy.
-
-## CONFIGURATION
-
-### editor
-
-* Default: `EDITOR` environment variable if set, or `"vi"` on Posix,
-  or `"notepad"` on Windows.
-* Type: path
-
-The command to run for `npm edit` or `npm config edit`.
-
-## SEE ALSO
-
-* npm-folders(5)
-* npm-explore(1)
-* npm-install(1)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-explore.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,40 +0,0 @@
-npm-explore(1) -- Browse an installed package
-=============================================
-
-## SYNOPSIS
-
-    npm explore <name>[@<version>] [ -- <cmd>]
-
-## DESCRIPTION
-
-Spawn a subshell in the directory of the installed package specified.
-
-If a command is specified, then it is run in the subshell, which then
-immediately terminates.
-
-This is particularly handy in the case of git submodules in the
-`node_modules` folder:
-
-    npm explore some-dependency -- git pull origin master
-
-Note that the package is *not* automatically rebuilt afterwards, so be
-sure to use `npm rebuild <pkg>` if you make any changes.
-
-## CONFIGURATION
-
-### shell
-
-* Default: SHELL environment variable, or "bash" on Posix, or "cmd" on
-  Windows
-* Type: path
-
-The shell to run for the `npm explore` command.
-
-## SEE ALSO
-
-* npm-submodule(1)
-* npm-folders(5)
-* npm-edit(1)
-* npm-rebuild(1)
-* npm-build(1)
-* npm-install(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-help-search.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-npm-help-search(1) -- Search npm help documentation
-===================================================
-
-## SYNOPSIS
-
-    npm help-search some search terms
-
-## DESCRIPTION
-
-This command will search the npm markdown documentation files for the
-terms provided, and then list the results, sorted by relevance.
-
-If only one result is found, then it will show that help topic.
-
-If the argument to `npm help` is not a known help topic, then it will
-call `help-search`.  It is rarely if ever necessary to call this
-command directly.
-
-## CONFIGURATION
-
-### long
-
-* Type: Boolean
-* Default: false
-
-If true, the "long" flag will cause help-search to output context around
-where the terms were found in the documentation.
-
-If false, then help-search will just list out the help topics found.
-
-## SEE ALSO
-
-* npm(1)
-* npm-faq(7)
-* npm-help(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-help.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,40 +0,0 @@
-npm-help(1) -- Get help on npm
-==============================
-
-## SYNOPSIS
-
-    npm help <topic>
-    npm help some search terms
-
-## DESCRIPTION
-
-If supplied a topic, then show the appropriate documentation page.
-
-If the topic does not exist, or if multiple terms are provided, then run
-the `help-search` command to find a match.  Note that, if `help-search`
-finds a single subject, then it will run `help` on that topic, so unique
-matches are equivalent to specifying a topic name.
-
-## CONFIGURATION
-
-### viewer
-
-* Default: "man" on Posix, "browser" on Windows
-* Type: path
-
-The program to use to view help content.
-
-Set to `"browser"` to view html help content in the default web browser.
-
-## SEE ALSO
-
-* npm(1)
-* README
-* npm-faq(7)
-* npm-folders(5)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* package.json(5)
-* npm-help-search(1)
-* npm-index(7)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-init.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-npm-init(1) -- Interactively create a package.json file
-=======================================================
-
-## SYNOPSIS
-
-    npm init
-
-## DESCRIPTION
-
-This will ask you a bunch of questions, and then write a package.json for you.
-
-It attempts to make reasonable guesses about what you want things to be set to,
-and then writes a package.json file with the options you've selected.
-
-If you already have a package.json file, it'll read that first, and default to
-the options in there.
-
-It is strictly additive, so it does not delete options from your package.json
-without a really good reason to do so.
-
-## SEE ALSO
-
-* <https://github.com/isaacs/init-package-json>
-* package.json(5)
-* npm-version(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-install.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,249 +0,0 @@
-npm-install(1) -- Install a package
-===================================
-
-## SYNOPSIS
-
-    npm install (with no args in a package dir)
-    npm install <tarball file>
-    npm install <tarball url>
-    npm install <folder>
-    npm install <name> [--save|--save-dev|--save-optional]
-    npm install <name>@<tag>
-    npm install <name>@<version>
-    npm install <name>@<version range>
-    npm i (with any of the previous argument usage)
-
-## DESCRIPTION
-
-This command installs a package, and any packages that it depends on. If the
-package has a shrinkwrap file, the installation of dependencies will be driven
-by that. See npm-shrinkwrap(1).
-
-A `package` is:
-
-* a) a folder containing a program described by a package.json file
-* b) a gzipped tarball containing (a)
-* c) a url that resolves to (b)
-* d) a `<name>@<version>` that is published on the registry with (c)
-* e) a `<name>@<tag>` that points to (d)
-* f) a `<name>` that has a "latest" tag satisfying (e)
-* g) a `<git remote url>` that resolves to (b)
-
-Even if you never publish your package, you can still get a lot of
-benefits of using npm if you just want to write a node program (a), and
-perhaps if you also want to be able to easily install it elsewhere
-after packing it up into a tarball (b).
-
-
-* `npm install` (in package directory, no arguments):
-
-    Install the dependencies in the local node_modules folder.
-
-    In global mode (ie, with `-g` or `--global` appended to the command),
-    it installs the current package context (ie, the current working
-    directory) as a global package.
-
-    By default, `npm install` will install all modules listed as
-    dependencies. With the `--production` flag,
-    npm will not install modules listed in `devDependencies`.
-
-* `npm install <folder>`:
-
-    Install a package that is sitting in a folder on the filesystem.
-
-* `npm install <tarball file>`:
-
-    Install a package that is sitting on the filesystem.  Note: if you just want
-    to link a dev directory into your npm root, you can do this more easily by
-    using `npm link`.
-
-    Example:
-
-          npm install ./package.tgz
-
-* `npm install <tarball url>`:
-
-    Fetch the tarball url, and then install it.  In order to distinguish between
-    this and other options, the argument must start with "http://" or "https://".
-
-    Example:
-
-          npm install https://github.com/indexzero/forever/tarball/v0.5.6
-
-* `npm install <name> [--save|--save-dev|--save-optional]`:
-
-    Do a `<name>@<tag>` install, where `<tag>` is the "tag" config. (See
-    `npm-config(7)`.)
-
-    In most cases, this will install the latest version
-    of the module published on npm.
-
-    Example:
-
-          npm install sax
-
-    `npm install` takes 3 exclusive, optional flags which save or update
-    the package version in your main package.json:
-
-    * `--save`: Package will appear in your `dependencies`.
-
-    * `--save-dev`: Package will appear in your `devDependencies`.
-
-    * `--save-optional`: Package will appear in your `optionalDependencies`.
-
-    Examples:
-
-          npm install sax --save
-          npm install node-tap --save-dev
-          npm install dtrace-provider --save-optional
-
-
-    **Note**: If there is a file or folder named `<name>` in the current
-    working directory, then npm will try to install that, and will only
-    try to fetch the package by name if the local target is not valid.
-
-* `npm install <name>@<tag>`:
-
-    Install the version of the package that is referenced by the specified tag.
-    If the tag does not exist in the registry data for that package, then this
-    will fail.
-
-    Example:
-
-          npm install sax@latest
-
-* `npm install <name>@<version>`:
-
-    Install the specified version of the package.  This will fail if the version
-    has not been published to the registry.
-
-    Example:
-
-          npm install sax@0.1.1
-
-* `npm install <name>@<version range>`:
-
-    Install a version of the package matching the specified version range.  This
-    will follow the same rules for resolving dependencies described in `package.json(5)`.
-
-    Note that most version ranges must be put in quotes so that your shell will
-    treat it as a single argument.
-
-    Example:
-
-          npm install sax@">=0.1.0 <0.2.0"
-
-* `npm install <git remote url>`:
-
-    Install a package by cloning a git remote url.  The format of the git
-    url is:
-
-          <protocol>://[<user>@]<hostname><separator><path>[#<commit-ish>]
-
-    `<protocol>` is one of `git`, `git+ssh`, `git+http`, or
-    `git+https`.  If no `<commit-ish>` is specified, then `master` is
-    used.
-
-    Examples:
-
-          git+ssh://git@github.com:isaacs/npm.git#v1.0.27
-          git+https://isaacs@github.com/isaacs/npm.git
-          git://github.com/isaacs/npm.git#v1.0.27
-
-You may combine multiple arguments, and even multiple types of arguments.
-For example:
-
-    npm install sax@">=0.1.0 <0.2.0" bench supervisor
-
-The `--tag` argument will apply to all of the specified install targets. If a
-tag with the given name exists, the tagged version is preferred over newer
-versions.
-
-The `--force` argument will force npm to fetch remote resources even if a
-local copy exists on disk.
-
-    npm install sax --force
-
-The `--global` argument will cause npm to install the package globally
-rather than locally.  See `npm-folders(5)`.
-
-The `--link` argument will cause npm to link global installs into the
-local space in some cases.
-
-The `--no-bin-links` argument will prevent npm from creating symlinks for
-any binaries the package might contain.
-
-The `--no-shrinkwrap` argument will ignore an available shrinkwrap
-file and use the package.json instead.
-
-The `--nodedir=/path/to/node/source` argument will allow npm to find the
-node source code so that npm can compile native modules.
-
-See `npm-config(7)`.  Many of the configuration params have some
-effect on installation, since that's most of what npm does.
-
-## ALGORITHM
-
-To install a package, npm uses the following algorithm:
-
-    install(where, what, family, ancestors)
-    fetch what, unpack to <where>/node_modules/<what>
-    for each dep in what.dependencies
-      resolve dep to precise version
-    for each dep@version in what.dependencies
-        not in <where>/node_modules/<what>/node_modules/*
-        and not in <family>
-      add precise version deps to <family>
-      install(<where>/node_modules/<what>, dep, family)
-
-For this `package{dep}` structure: `A{B,C}, B{C}, C{D}`,
-this algorithm produces:
-
-    A
-    +-- B
-    `-- C
-        `-- D
-
-That is, the dependency from B to C is satisfied by the fact that A
-already caused C to be installed at a higher level.
-
-See npm-folders(5) for a more detailed description of the specific
-folder structures that npm creates.
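The pseudocode above can be sketched in Python. This is a hypothetical illustration, not npm internals: the `registry` dict stands in for resolved package.json dependency data, and the names follow the `A{B,C}, B{C}, C{D}` example.

```python
# Illustrative registry: package name -> list of dependency names.
registry = {"A": ["B", "C"], "B": ["C"], "C": ["D"], "D": []}

def install(where, what, family):
    """Unpack `what` into `where`, then recurse into deps that no
    ancestor (or sibling at an ancestor level) already satisfies."""
    node = {}
    where[what] = node
    new_deps = [d for d in registry[what] if d not in family]
    # "add precise version deps to <family>" before recursing, so that
    # siblings installed at this level satisfy each other's deps
    family = family | set(new_deps)
    for dep in new_deps:
        install(node, dep, family)

tree = {}
install(tree, "A", {"A"})
# B's dependency on C is satisfied by the C installed one level up,
# so only D ends up nested, under C.
```

Running this yields `{"A": {"B": {}, "C": {"D": {}}}}`, the same shape as the tree shown above.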
-
-### Limitations of npm's Install Algorithm
-
-There are some very rare and pathological edge-cases where a cycle can
-cause npm to try to install a never-ending tree of packages.  Here is
-the simplest case:
-
-    A -> B -> A' -> B' -> A -> B -> A' -> B' -> A -> ...
-
-where `A` is some version of a package, and `A'` is a different version
-of the same package.  Because `B` depends on a different version of `A`
-than the one that is already in the tree, it must install a separate
-copy.  The same is true of `A'`, which must install `B'`.  Because `B'`
-depends on the original version of `A`, which has been overridden, the
-cycle falls into infinite regress.
-
-To avoid this situation, npm flat-out refuses to install any
-`name@version` that is already present anywhere in the tree of package
-folder ancestors.  A more correct, but more complex, solution would be
-to symlink the existing version into the new location.  If this ever
-affects a real use-case, it will be investigated.
-
-## SEE ALSO
-
-* npm-folders(5)
-* npm-update(1)
-* npm-link(1)
-* npm-rebuild(1)
-* npm-scripts(7)
-* npm-build(1)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* npm-registry(7)
-* npm-tag(1)
-* npm-rm(1)
-* npm-shrinkwrap(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-link.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,63 +0,0 @@
-npm-link(1) -- Symlink a package folder
-=======================================
-
-## SYNOPSIS
-
-    npm link (in package folder)
-    npm link <pkgname>
-    npm ln (with any of the previous argument usages)
-
-## DESCRIPTION
-
-Package linking is a two-step process.
-
-First, `npm link` in a package folder will create a globally-installed
-symbolic link from `prefix/package-name` to the current folder.
-
-Next, in some other location, `npm link package-name` will create a
-symlink from the local `node_modules` folder to the global symlink.
-
-Note that `package-name` is taken from `package.json`,
-not from the directory name.
-
-When creating tarballs for `npm publish`, the linked packages are
-"snapshotted" to their current state by resolving the symbolic links.
-
-This is handy for installing your own stuff, so that you can work on it
-and test it iteratively without having to continually rebuild.
-
-For example:
-
-    cd ~/projects/node-redis    # go into the package directory
-    npm link                    # creates global link
-    cd ~/projects/node-bloggy   # go into some other package directory.
-    npm link redis              # link-install the package
-
-Now, any changes to ~/projects/node-redis will be reflected in
-~/projects/node-bloggy/node_modules/redis/
-
-You may also shortcut the two steps in one.  For example, to do the
-above use-case in a shorter way:
-
-    cd ~/projects/node-bloggy  # go into the dir of your main project
-    npm link ../node-redis     # link the dir of your dependency
-
-The second line is the equivalent of doing:
-
-    (cd ../node-redis; npm link)
-    npm link redis
-
-That is, it first creates a global link, and then links the global
-installation target into your project's `node_modules` folder.
-
-## SEE ALSO
-
-* npm-developers(7)
-* npm-faq(7)
-* package.json(5)
-* npm-install(1)
-* npm-folders(5)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-ls.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,74 +0,0 @@
-npm-ls(1) -- List installed packages
-======================================
-
-## SYNOPSIS
-
-    npm list [<pkg> ...]
-    npm ls [<pkg> ...]
-    npm la [<pkg> ...]
-    npm ll [<pkg> ...]
-
-## DESCRIPTION
-
-This command will print to stdout all the versions of packages that are
-installed, as well as their dependencies, in a tree-structure.
-
-Positional arguments are `name@version-range` identifiers, which will
-limit the results to only the paths to the packages named.  Note that
-nested packages will *also* show the paths to the specified packages.
-For example, running `npm ls promzard` in npm's source tree will show:
-
-    npm@@VERSION@ /path/to/npm
-    └─┬ init-package-json@0.0.4
-      └── promzard@0.1.5
-
-It will print out extraneous, missing, and invalid packages.
-
-If a project specifies git URLs for dependencies, these are shown in
-parentheses after the name@version to make it easier for users to
-recognize potential forks of a project.
-
-When run as `ll` or `la`, it shows extended information by default.
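The tree layout shown above can be sketched in Python. The rendering rules here are inferred from the sample output (`└─┬` for a last child that itself has children, `└──`/`├──` otherwise), not taken from npm's source, and the version numbers are illustrative.

```python
def render(name, deps):
    """Render a {name: {child: {...}}} dependency dict in the tree
    style used by `npm ls`."""
    lines = [name]
    children = list(deps.items())
    for i, (child, sub) in enumerate(children):
        last = i == len(children) - 1
        # branch head: corner or tee, then a "┬" when the child has deps
        head = ("└─" if last else "├─") + ("┬ " if sub else "─ ")
        pad = "  " if last else "│ "
        sub_lines = render(child, sub)
        lines.append(head + sub_lines[0])
        lines.extend(pad + line for line in sub_lines[1:])
    return lines

for line in render("npm@1.3.14 /path/to/npm",
                   {"init-package-json@0.0.4": {"promzard@0.1.5": {}}}):
    print(line)
```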
-
-## CONFIGURATION
-
-### json
-
-* Default: false
-* Type: Boolean
-
-Show information in JSON format.
-
-### long
-
-* Default: false
-* Type: Boolean
-
-Show extended information.
-
-### parseable
-
-* Default: false
-* Type: Boolean
-
-Show parseable output instead of tree view.
-
-### global
-
-* Default: false
-* Type: Boolean
-
-List packages in the global install prefix instead of in the current
-project.
-
-## SEE ALSO
-
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* npm-folders(5)
-* npm-install(1)
-* npm-link(1)
-* npm-prune(1)
-* npm-outdated(1)
-* npm-update(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-outdated.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,21 +0,0 @@
-npm-outdated(1) -- Check for outdated packages
-==============================================
-
-## SYNOPSIS
-
-    npm outdated [<name> [<name> ...]]
-
-## DESCRIPTION
-
-This command will check the registry to see if any (or specific)
-installed packages are currently outdated.
-
-The resulting field 'wanted' shows the latest version according to the
-version specified in package.json, while the field 'latest' shows the
-very latest version of the package.
-
-## SEE ALSO
-
-* npm-update(1)
-* npm-registry(7)
-* npm-folders(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-owner.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,33 +0,0 @@
-npm-owner(1) -- Manage package owners
-=====================================
-
-## SYNOPSIS
-
-    npm owner ls <package name>
-    npm owner add <user> <package name>
-    npm owner rm <user> <package name>
-
-## DESCRIPTION
-
-Manage ownership of published packages.
-
-* ls:
-  List all the users who have access to modify a package and push new versions.
-  Handy when you need to know who to bug for help.
-* add:
-  Add a new user as a maintainer of a package.  This user will be able to
-  modify metadata, publish new versions, and add other owners.
-* rm:
-  Remove a user from the package owner list.  This immediately revokes their
-  privileges.
-
-Note that there is only one level of access.  Either you can modify a package,
-or you can't.  Future versions may contain more fine-grained access levels, but
-that is not implemented at this time.
-
-## SEE ALSO
-
-* npm-publish(1)
-* npm-registry(7)
-* npm-adduser(1)
-* npm-disputes(7)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-pack.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-npm-pack(1) -- Create a tarball from a package
-==============================================
-
-## SYNOPSIS
-
-    npm pack [<pkg> [<pkg> ...]]
-
-## DESCRIPTION
-
-For anything that's installable (that is, a package folder, tarball,
-tarball url, name@tag, name@version, or name), this command will fetch
-it to the cache, and then copy the tarball to the current working
-directory as `<name>-<version>.tgz`, and then write the filenames out to
-stdout.
-
-If the same package is specified multiple times, then the file will be
-overwritten the second time.
-
-If no arguments are supplied, then npm packs the current package folder.
-
-## SEE ALSO
-
-* npm-cache(1)
-* npm-publish(1)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-prefix.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-npm-prefix(1) -- Display prefix
-===============================
-
-## SYNOPSIS
-
-    npm prefix
-
-## DESCRIPTION
-
-Print the prefix to standard out.
-
-## SEE ALSO
-
-* npm-root(1)
-* npm-bin(1)
-* npm-folders(5)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-prune.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-npm-prune(1) -- Remove extraneous packages
-==========================================
-
-## SYNOPSIS
-
-    npm prune [<name> [<name> ...]]
-    npm prune [<name> [<name> ...]] [--production]
-
-## DESCRIPTION
-
-This command removes "extraneous" packages.  If a package name is
-provided, then only packages matching one of the supplied names are
-removed.
-
-Extraneous packages are packages that are not listed on the parent
-package's dependencies list.
-
-If the `--production` flag is specified, this command will remove the
-packages specified in your `devDependencies`.
-
-## SEE ALSO
-
-* npm-rm(1)
-* npm-folders(5)
-* npm-ls(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-publish.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,30 +0,0 @@
-npm-publish(1) -- Publish a package
-===================================
-
-
-## SYNOPSIS
-
-    npm publish <tarball>
-    npm publish <folder>
-
-## DESCRIPTION
-
-Publishes a package to the registry so that it can be installed by name.
-
-* `<folder>`:
-  A folder containing a package.json file
-
-* `<tarball>`:
-  A url or file path to a gzipped tar archive containing a single folder
-  with a package.json file inside.
-
-Fails if the package name and version combination already exists in
-the registry.  Overwrites when the "--force" flag is set.
-
-## SEE ALSO
-
-* npm-registry(7)
-* npm-adduser(1)
-* npm-owner(1)
-* npm-deprecate(1)
-* npm-tag(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-rebuild.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,21 +0,0 @@
-npm-rebuild(1) -- Rebuild a package
-===================================
-
-## SYNOPSIS
-
-    npm rebuild [<name> [<name> ...]]
-    npm rb [<name> [<name> ...]]
-
-* `<name>`:
-  The package to rebuild
-
-## DESCRIPTION
-
-This command runs the `npm build` command on the matched folders.  This is useful
-when you install a new version of node, and must recompile all your C++ addons with
-the new binary.
-
-## SEE ALSO
-
-* npm-build(1)
-* npm-install(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-restart.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-npm-restart(1) -- Restart a package
-=================================
-
-## SYNOPSIS
-
-    npm restart <name>
-
-## DESCRIPTION
-
-This runs a package's "restart" script, if one was provided.
-Otherwise it runs the package's "stop" script, if one was provided, and then
-the "start" script.
-
-If no version is specified, then it restarts the "active" version.
-
-## SEE ALSO
-
-* npm-run-script(1)
-* npm-scripts(7)
-* npm-test(1)
-* npm-start(1)
-* npm-stop(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-rm.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-npm-rm(1) -- Remove a package
-=============================
-
-## SYNOPSIS
-
-    npm rm <name>
-    npm r <name>
-    npm uninstall <name>
-    npm un <name>
-
-## DESCRIPTION
-
-This uninstalls a package, completely removing everything npm installed
-on its behalf.
-
-## SEE ALSO
-
-* npm-prune(1)
-* npm-install(1)
-* npm-folders(5)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-root.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-npm-root(1) -- Display npm root
-===============================
-
-## SYNOPSIS
-
-    npm root
-
-## DESCRIPTION
-
-Print the effective `node_modules` folder to standard out.
-
-## SEE ALSO
-
-* npm-prefix(1)
-* npm-bin(1)
-* npm-folders(5)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-run-script.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,21 +0,0 @@
-npm-run-script(1) -- Run arbitrary package scripts
-==================================================
-
-## SYNOPSIS
-
-    npm run-script <script> <name>
-
-## DESCRIPTION
-
-This runs an arbitrary command from a package's "scripts" object.
-
-It is used by the test, start, restart, and stop commands, but can be
-called directly, as well.
-
-## SEE ALSO
-
-* npm-scripts(7)
-* npm-test(1)
-* npm-start(1)
-* npm-restart(1)
-* npm-stop(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-search.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-npm-search(1) -- Search for packages
-====================================
-
-## SYNOPSIS
-
-    npm search [search terms ...]
-    npm s [search terms ...]
-    npm se [search terms ...]
-
-## DESCRIPTION
-
-Search the registry for packages matching the search terms.
-
-If a term starts with `/`, then it's interpreted as a regular expression.
-A trailing `/` will be ignored in this case.  (Note that many regular
-expression characters must be escaped or quoted in most shells.)
-
-## SEE ALSO
-
-* npm-registry(7)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* npm-view(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-shrinkwrap.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,185 +0,0 @@
-npm-shrinkwrap(1) -- Lock down dependency versions
-=====================================================
-
-## SYNOPSIS
-
-    npm shrinkwrap
-
-## DESCRIPTION
-
-This command locks down the versions of a package's dependencies so
-that you can control exactly which versions of each dependency will be
-used when your package is installed. The "package.json" file is still
-required if you want to use "npm install".
-
-By default, "npm install" recursively installs the target's
-dependencies (as specified in package.json), choosing the latest
-available version that satisfies the dependency's semver pattern. In
-some situations, particularly when shipping software where each change
-is tightly managed, it's desirable to fully specify each version of
-each dependency recursively so that subsequent builds and deploys do
-not inadvertently pick up newer versions of a dependency that satisfy
-the semver pattern. Specifying specific semver patterns in each
-dependency's package.json would facilitate this, but that's not always
-possible or desirable, as when another author owns the npm package.
-It's also possible to check dependencies directly into source control,
-but that may be undesirable for other reasons.
-
-As an example, consider package A:
-
-    {
-      "name": "A",
-      "version": "0.1.0",
-      "dependencies": {
-        "B": "<0.1.0"
-      }
-    }
-
-package B:
-
-    {
-      "name": "B",
-      "version": "0.0.1",
-      "dependencies": {
-        "C": "<0.1.0"
-      }
-    }
-
-and package C:
-
-    {
-      "name": "C",
-      "version": "0.0.1"
-    }
-
-If these are the only versions of A, B, and C available in the
-registry, then a normal "npm install A" will install:
-
-    A@0.1.0
-    `-- B@0.0.1
-        `-- C@0.0.1
-
-However, if B@0.0.2 is published, then a fresh "npm install A" will
-install:
-
-    A@0.1.0
-    `-- B@0.0.2
-        `-- C@0.0.1
-
-assuming the new version did not modify B's dependencies. Of course,
-the new version of B could include a new version of C and any number
-of new dependencies. If such changes are undesirable, the author of A
-could specify a dependency on B@0.0.1. However, if A's author and B's
-author are not the same person, there's no way for A's author to say
-that he or she does not want to pull in newly published versions of C
-when B hasn't changed at all.
-
-In this case, A's author can run
-
-    npm shrinkwrap
-
-This generates npm-shrinkwrap.json, which will look something like this:
-
-    {
-      "name": "A",
-      "version": "0.1.0",
-      "dependencies": {
-        "B": {
-          "version": "0.0.1",
-          "dependencies": {
-            "C": {
-              "version": "0.0.1"
-            }
-          }
-        }
-      }
-    }
-
-The shrinkwrap command has locked down the dependencies based on
-what's currently installed in node_modules.  When "npm install"
-installs a package with an npm-shrinkwrap.json file in the package
-root, the shrinkwrap file (rather than package.json files) completely
-drives the installation of that package and all of its dependencies
-(recursively).  So now the author publishes A@0.1.0, and subsequent
-installs of this package will use B@0.0.1 and C@0.0.1, regardless of
-the dependencies and versions listed in A's, B's, and C's package.json
-files.
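The way a parsed shrinkwrap file pins the whole tree can be sketched in Python. This is an illustration, not npm internals; the names and versions follow the A/B/C example above.

```python
# A parsed npm-shrinkwrap.json, shaped like the sample above.
shrinkwrap = {
    "name": "A",
    "version": "0.1.0",
    "dependencies": {
        "B": {
            "version": "0.0.1",
            "dependencies": {"C": {"version": "0.0.1"}},
        }
    },
}

def resolved(entry, prefix=""):
    """Walk the shrinkwrap dict, returning (install path, name@version)
    for every pinned dependency, recursively."""
    out = []
    for name, info in sorted(entry.get("dependencies", {}).items()):
        path = prefix + "node_modules/" + name
        out.append((path, name + "@" + info["version"]))
        out.extend(resolved(info, path + "/"))
    return out
```

Here `resolved(shrinkwrap)` names every folder the install must produce, with no package.json consulted along the way, which is the sense in which the shrinkwrap "completely drives" the installation.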
-
-
-### Using shrinkwrapped packages
-
-Using a shrinkwrapped package is no different than using any other
-package: you can "npm install" it by hand, or add a dependency to your
-package.json file and "npm install" it.
-
-### Building shrinkwrapped packages
-
-To shrinkwrap an existing package:
-
-1. Run "npm install" in the package root to install the current
-   versions of all dependencies.
-2. Validate that the package works as expected with these versions.
-3. Run "npm shrinkwrap", add npm-shrinkwrap.json to git, and publish
-   your package.
-
-To add or update a dependency in a shrinkwrapped package:
-
-1. Run "npm install" in the package root to install the current
-   versions of all dependencies.
-2. Add or update dependencies. "npm install" each new or updated
-   package individually and then update package.json.  Note that they
-   must be explicitly named in order to be installed: running `npm
-   install` with no arguments will merely reproduce the existing
-   shrinkwrap.
-3. Validate that the package works as expected with the new
-   dependencies.
-4. Run "npm shrinkwrap", commit the new npm-shrinkwrap.json, and
-   publish your package.
-
-You can use npm-outdated(1) to view dependencies with newer versions
-available.
-
-### Other Notes
-
-A shrinkwrap file must be consistent with the package's package.json
-file. "npm shrinkwrap" will fail if required dependencies are not
-already installed, since that would result in a shrinkwrap that
-wouldn't actually work. Similarly, the command will fail if there are
-extraneous packages (not referenced by package.json), since that would
-indicate that package.json is not correct.
-
-Since "npm shrinkwrap" is intended to lock down your dependencies for
-production use, `devDependencies` will not be included unless you
-explicitly set the `--dev` flag when you run `npm shrinkwrap`.  If
-installed `devDependencies` are excluded, then npm will print a
-warning.  If you want them to be installed with your module by
-default, please consider adding them to `dependencies` instead.
-
-If shrinkwrapped package A depends on shrinkwrapped package B, B's
-shrinkwrap will not be used as part of the installation of A. However,
-because A's shrinkwrap is constructed from a valid installation of B
-and recursively specifies all dependencies, the contents of B's
-shrinkwrap will implicitly be included in A's shrinkwrap.
-
-### Caveats
-
-Shrinkwrap files only lock down package versions, not actual package
-contents.  While discouraged, a package author can republish an
-existing version of a package, causing shrinkwrapped packages using
-that version to pick up different code than they were before. If you
-want to avoid any risk that a byzantine author replaces a package
-you're using with code that breaks your application, you could modify
-the shrinkwrap file to use git URL references rather than version
-numbers so that npm always fetches all packages from git.
-
-If you wish to lock down the specific bytes included in a package, for
-example to have 100% confidence in being able to reproduce a
-deployment or build, then you ought to check your dependencies into
-source control, or pursue some other mechanism that can verify
-contents rather than versions.
-
-## SEE ALSO
-
-* npm-install(1)
-* package.json(5)
-* npm-ls(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-star.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-npm-star(1) -- Mark your favorite packages
-==========================================
-
-## SYNOPSIS
-
-    npm star <pkgname> [<pkg>, ...]
-    npm unstar <pkgname> [<pkg>, ...]
-
-## DESCRIPTION
-
-"Starring" a package means that you have some interest in it.  It's
-a vaguely positive way to show that you care.
-
-"Unstarring" is the same thing, but in reverse.
-
-It's a boolean thing.  Starring repeatedly has no additional effect.
-
-## SEE ALSO
-
-* npm-view(1)
-* npm-whoami(1)
-* npm-adduser(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-stars.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-npm-stars(1) -- View packages marked as favorites
-=================================================
-
-## SYNOPSIS
-
-    npm stars
-    npm stars [username]
-
-## DESCRIPTION
-
-If you have starred a lot of neat things and want to find them again
-quickly this command lets you do just that.
-
-You may also want to see your friends' favorite packages; in that case,
-you will most certainly enjoy this command.
-
-## SEE ALSO
-
-* npm-star(1)
-* npm-view(1)
-* npm-whoami(1)
-* npm-adduser(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-start.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-npm-start(1) -- Start a package
-===============================
-
-## SYNOPSIS
-
-    npm start <name>
-
-## DESCRIPTION
-
-This runs a package's "start" script, if one was provided.
-
-## SEE ALSO
-
-* npm-run-script(1)
-* npm-scripts(7)
-* npm-test(1)
-* npm-restart(1)
-* npm-stop(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-stop.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-npm-stop(1) -- Stop a package
-=============================
-
-## SYNOPSIS
-
-    npm stop <name>
-
-## DESCRIPTION
-
-This runs a package's "stop" script, if one was provided.
-
-## SEE ALSO
-
-* npm-run-script(1)
-* npm-scripts(7)
-* npm-test(1)
-* npm-start(1)
-* npm-restart(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-submodule.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-npm-submodule(1) -- Add a package as a git submodule
-====================================================
-
-## SYNOPSIS
-
-    npm submodule <pkg>
-
-## DESCRIPTION
-
-If the specified package has a git repository url in its package.json
-description, then this command will add it as a git submodule at
-`node_modules/<pkg name>`.
-
-This is a convenience only.  From then on, it's up to you to manage
-updates by using the appropriate git commands.  npm will stubbornly
-refuse to update, modify, or remove anything with a `.git` subfolder
-in it.
-
-This command also does not install missing dependencies, if the package
-does not include them in its git repository.  If `npm ls` reports that
-things are missing, you can either install, link, or submodule them yourself,
-or you can do `npm explore <pkgname> -- npm install` to install the
-dependencies into the submodule folder.
-
-## SEE ALSO
-
-* package.json(5)
-* git help submodule
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-tag.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,34 +0,0 @@
-npm-tag(1) -- Tag a published version
-=====================================
-
-## SYNOPSIS
-
-    npm tag <name>@<version> [<tag>]
-
-## DESCRIPTION
-
-Tags the specified version of the package with the specified tag, or the
-`--tag` config if not specified.
-
-A tag can be used when installing packages as a reference to a version instead
-of using a specific version number:
-
-    npm install <name>@<tag>
-
-When installing dependencies, a preferred tagged version may be specified:
-
-    npm install --tag <tag>
-
-This also applies to `npm dedupe`.
-
-Publishing a package always sets the "latest" tag to the published version.
-
-## SEE ALSO
-
-* npm-publish(1)
-* npm-install(1)
-* npm-dedupe(1)
-* npm-registry(7)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-test.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-npm-test(1) -- Test a package
-=============================
-
-## SYNOPSIS
-
-    npm test <name>
-    npm tst <name>
-
-## DESCRIPTION
-
-This runs a package's "test" script, if one was provided.
-
-To run tests as a condition of installation, set the `npat` config to
-true.
-
-## SEE ALSO
-
-* npm-run-script(1)
-* npm-scripts(7)
-* npm-start(1)
-* npm-restart(1)
-* npm-stop(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-uninstall.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,21 +0,0 @@
-npm-rm(1) -- Remove a package
-=============================
-
-## SYNOPSIS
-
-    npm rm <name>
-    npm uninstall <name>
-
-## DESCRIPTION
-
-This uninstalls a package, completely removing everything npm installed
-on its behalf.
-
-## SEE ALSO
-
-* npm-prune(1)
-* npm-install(1)
-* npm-folders(5)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-unpublish.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,32 +0,0 @@
-npm-unpublish(1) -- Remove a package from the registry
-======================================================
-
-## SYNOPSIS
-
-    npm unpublish <name>[@<version>]
-
-## WARNING
-
-**It is generally considered bad behavior to remove versions of a library
-that others are depending on!**
-
-Consider using the `deprecate` command
-instead, if your intent is to encourage users to upgrade.
-
-There is plenty of room on the registry.
-
-## DESCRIPTION
-
-This removes a package version from the registry, deleting its
-entry and removing the tarball.
-
-If no version is specified, or if all versions are removed, then the
-root package entry is removed from the registry entirely.
-
-## SEE ALSO
-
-* npm-deprecate(1)
-* npm-publish(1)
-* npm-registry(7)
-* npm-adduser(1)
-* npm-owner(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-update.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-npm-update(1) -- Update a package
-=================================
-
-## SYNOPSIS
-
-    npm update [-g] [<name> [<name> ...]]
-
-## DESCRIPTION
-
-This command will update all the packages listed to the latest version
-(specified by the `tag` config).
-
-It will also install missing packages.
-
-If the `-g` flag is specified, this command will update globally installed packages.
-If no package name is specified, all packages in the specified location (global or local) will be updated.
-
-## SEE ALSO
-
-* npm-install(1)
-* npm-outdated(1)
-* npm-registry(7)
-* npm-folders(5)
-* npm-ls(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-version.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,45 +0,0 @@
-npm-version(1) -- Bump a package version
-========================================
-
-## SYNOPSIS
-
-    npm version [<newversion> | major | minor | patch | build]
-
-## DESCRIPTION
-
-Run this in a package directory to bump the version and write the new
-data back to the package.json file.
-
-The `newversion` argument should be a valid semver string, *or* a valid
-second argument to semver.inc (one of "build", "patch", "minor", or
-"major"). In the second case, the existing version will be incremented
-by 1 in the specified field.
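A minimal Python sketch of the increment behavior described above. This is not npm's actual semver implementation: pre-release suffixes are ignored, and the "build" case from the synopsis is omitted.

```python
def inc(version, release):
    """Bump one field of an X.Y.Z version string, zeroing the fields
    to its right (mirrors semver.inc for major/minor/patch)."""
    major, minor, patch = (int(n) for n in version.split("."))
    if release == "major":
        return "%d.0.0" % (major + 1)
    if release == "minor":
        return "%d.%d.0" % (major, minor + 1)
    if release == "patch":
        return "%d.%d.%d" % (major, minor, patch + 1)
    raise ValueError("unknown release type: " + release)

print(inc("0.1.0", "patch"))  # -> 0.1.1
```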
-
-If run in a git repo, it will also create a version commit and tag, and
-fail if the repo is not clean.
-
-If supplied with `--message` (shorthand: `-m`) config option, npm will
-use it as a commit message when creating a version commit.  If the
-`message` config contains `%s` then that will be replaced with the
-resulting version number.  For example:
-
-    npm version patch -m "Upgrade to %s for reasons"
-
-If the `sign-git-tag` config is set, then the tag will be signed using
-the `-s` flag to git.  Note that you must have a default GPG key set up
-in your git config for this to work properly.  For example:
-
-    $ npm config set sign-git-tag true
-    $ npm version patch
-
-    You need a passphrase to unlock the secret key for
-    user: "isaacs (http://blog.izs.me/) <i@izs.me>"
-    2048-bit RSA key, ID 6C481CF6, created 2010-08-31
-
-    Enter passphrase:
-
-## SEE ALSO
-
-* npm-init(1)
-* package.json(5)
-* semver(7)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-view.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,90 +0,0 @@
-npm-view(1) -- View registry info
-=================================
-
-## SYNOPSIS
-
-    npm view <name>[@<version>] [<field>[.<subfield>]...]
-    npm v <name>[@<version>] [<field>[.<subfield>]...]
-
-## DESCRIPTION
-
-This command shows data about a package and prints it to the stream
-referenced by the `outfd` config, which defaults to stdout.
-
-To show the package registry entry for the `connect` package, you can do
-this:
-
-    npm view connect
-
-The default version is "latest" if unspecified.
-
-Field names can be specified after the package descriptor.
-For example, to show the dependencies of the `ronn` package at version
-0.3.5, you could do the following:
-
-    npm view ronn@0.3.5 dependencies
-
-You can view child field by separating them with a period.
-To view the git repository URL for the latest version of npm, you could
-do this:
-
-    npm view npm repository.url
-
-This makes it easy to view information about a dependency with a bit of
-shell scripting.  For example, to view all the data about the version of
-opts that ronn depends on, you can do this:
-
-    npm view opts@$(npm view ronn dependencies.opts)
-
-For fields that are arrays, requesting a non-numeric field will return
-all of the values from the objects in the list.  For example, to get all
-the contributor names for the "express" project, you can do this:
-
-    npm view express contributors.email
-
-You may also use numeric indices in square braces to specifically select
-an item in an array field.  To just get the email address of the first
-contributor in the list, you can do this:
-
-    npm view express contributors[0].email
-
-Multiple fields may be specified, and will be printed one after another.
-For example, to get all the contributor names and email addresses, you
-can do this:
-
-    npm view express contributors.name contributors.email
-
-"Person" fields are shown as a string instead of an object.  So, for
-example, this will show the list of npm contributors in the shortened
-string format.  (See `package.json(5)` for more on this.)
-
-    npm view npm contributors
-
-If a version range is provided, then data will be printed for every
-matching version of the package.  This will show which version of jsdom
-was required by each matching version of yui3:
-
-    npm view yui3@'>0.5.4' dependencies.jsdom
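The field-selection rules described above (dotted subfields, numeric indices in square braces, and array fields collecting a subfield from every element) can be sketched in Python. This is an illustration against parsed registry JSON, not npm's actual implementation, and the `pkg` data is invented for the example.

```python
import re

def pick(data, field):
    """Resolve a field path like "repository.url" or
    "contributors[0].email" against parsed registry JSON.  A
    non-numeric field applied to an array collects that field
    from every element."""
    for part in field.split("."):
        m = re.match(r"([\w-]+)(?:\[(\d+)\])?$", part)
        key, idx = m.group(1), m.group(2)
        if isinstance(data, list) and idx is None:
            data = [item[key] for item in data]
            continue
        data = data[key]
        if idx is not None:
            data = data[int(idx)]
    return data

pkg = {
    "repository": {"url": "git://github.com/isaacs/npm.git"},
    "contributors": [
        {"name": "a", "email": "a@example.com"},
        {"name": "b", "email": "b@example.com"},
    ],
}
```

With this `pkg`, `pick(pkg, "repository.url")` returns the single URL, `pick(pkg, "contributors.email")` returns the list of both addresses, and `pick(pkg, "contributors[0].email")` returns just the first.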
-
-## OUTPUT
-
-If only a single string field for a single version is output, then it
-will not be colorized or quoted, so as to enable piping the output to
-another command.  If the field is an object, it will be output as a
-JavaScript object literal.
-
-If the --json flag is given, the outputted fields will be JSON.
-
-If the version range matches multiple versions, then each printed value
-will be prefixed with the version it applies to.
-
-If multiple fields are requested, then each of them is prefixed with
-the field name.
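-
-For example, combining a version range with several fields prints each
-value prefixed with both its version and its field name (the package and
-range here are illustrative only):
-
-    npm view yui3@'>0.5.4' version dependencies.jsdom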
-
-## SEE ALSO
-
-* npm-search(1)
-* npm-registry(7)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* npm-docs(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm-whoami.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,17 +0,0 @@
-npm-whoami(1) -- Display npm username
-=====================================
-
-## SYNOPSIS
-
-    npm whoami
-
-## DESCRIPTION
-
-Print the `username` config to standard output.
-
-## SEE ALSO
-
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* npm-adduser(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/npm.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,157 +0,0 @@
-npm(1) -- node package manager
-==============================
-
-## SYNOPSIS
-
-    npm <command> [args]
-
-## VERSION
-
-@VERSION@
-
-## DESCRIPTION
-
-npm is the package manager for the Node JavaScript platform.  It puts
-modules in place so that node can find them, and manages dependency
-conflicts intelligently.
-
-It is extremely configurable to support a wide variety of use cases.
-Most commonly, it is used to publish, discover, install, and develop node
-programs.
-
-Run `npm help` to get a list of available commands.
-
-## INTRODUCTION
-
-You probably got npm because you want to install stuff.
-
-Use `npm install blerg` to install the latest version of "blerg".  Check out
-`npm-install(1)` for more info.  It can do a lot of stuff.
-
-Use the `npm search` command to show everything that's available.
-Use `npm ls` to show everything you've installed.
-
-## DIRECTORIES
-
-See `npm-folders(5)` to learn about where npm puts stuff.
-
-In particular, npm has two modes of operation:
-
-* global mode:  
-  npm installs packages into the install prefix at
-  `prefix/lib/node_modules` and bins are installed in `prefix/bin`.
-* local mode:  
-  npm installs packages into the current project directory, which
-  defaults to the current working directory.  Packages are installed to
-  `./node_modules`, and bins are installed to `./node_modules/.bin`.
-
-Local mode is the default.  Use `--global` or `-g` on any command to
-operate in global mode instead.
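-
-For example, these two commands install the same (hypothetical) package
-"blerg" in local and global mode, respectively:
-
-    npm install blerg
-    npm install -g blerg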
-
-## DEVELOPER USAGE
-
-If you're using npm to develop and publish your code, check out the
-following help topics:
-
-* json:
-  Make a package.json file.  See `package.json(5)`.
-* link:
-  For linking your current working code into Node's path, so that you
-  don't have to reinstall every time you make a change.  Use
-  `npm link` to do this.
-* install:
-  It's a good idea to install things if you don't need the symbolic link.
-  In particular, installing other people's code from the registry is done
-  via `npm install`.
-* adduser:
-  Create an account or log in.  Credentials are stored in the
-  user config file.
-* publish:
-  Use the `npm publish` command to upload your code to the registry.
-
-## CONFIGURATION
-
-npm is extremely configurable.  It reads its configuration options from
-5 places.
-
-* Command line switches:  
-  Set a config with `--key val`.  All keys take a value, even if they
-  are booleans (the config parser doesn't know what the options are at
-  the time of parsing.)  If no value is provided, then the option is set
-  to boolean `true`.
-* Environment Variables:  
-  Set any config by prefixing the name in an environment variable with
-  `npm_config_`.  For example, `export npm_config_key=val`.
-* User Configs:  
-  The file at $HOME/.npmrc is an ini-formatted list of configs.  If
-  present, it is parsed.  If the `userconfig` option is set in the cli
-  or env, then that will be used instead.
-* Global Configs:  
-  The file found at ../etc/npmrc (from the node executable, by default
-  this resolves to /usr/local/etc/npmrc) will be parsed if it is found.
-  If the `globalconfig` option is set in the cli, env, or user config,
-  then that file is parsed instead.
-* Defaults:  
-  npm's default configuration options are defined in
-  lib/utils/config-defs.js.  These must not be changed.
-
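-As an illustration, these are three equivalent ways to set the `proxy`
-config (the URL is a placeholder):
-
-    npm --proxy http://proxy.example.com:8080 install
-    export npm_config_proxy=http://proxy.example.com:8080
-    echo 'proxy = http://proxy.example.com:8080' >> ~/.npmrc
-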
-See `npm-config(7)` for much much more information.
-
-## CONTRIBUTIONS
-
-Patches welcome!
-
-* code:
-  Read through `npm-coding-style(7)` if you plan to submit code.
-  You don't have to agree with it, but you do have to follow it.
-* docs:
-  If you find an error in the documentation, edit the appropriate markdown
-  file in the "doc" folder.  (Don't worry about generating the man page.)
-
-Contributors are listed in npm's `package.json` file.  You can view them
-easily by doing `npm view npm contributors`.
-
-If you would like to contribute, but don't know what to work on, check
-the issues list or ask on the mailing list.
-
-* <http://github.com/isaacs/npm/issues>
-* <npm-@googlegroups.com>
-
-## BUGS
-
-When you find issues, please report them:
-
-* web:
-  <http://github.com/isaacs/npm/issues>
-* email:
-  <npm-@googlegroups.com>
-
-Be sure to include *all* of the output from the npm command that didn't work
-as expected.  The `npm-debug.log` file is also helpful to provide.
-
-You can also look for isaacs in #node.js on irc://irc.freenode.net.  He
-will no doubt tell you to put the output in a gist or email.
-
-## HISTORY
-
-See npm-changelog(1)
-
-## AUTHOR
-
-[Isaac Z. Schlueter](http://blog.izs.me/) ::
-[isaacs](https://github.com/isaacs/) ::
-[@izs](http://twitter.com/izs) ::
-<i@izs.me>
-
-## SEE ALSO
-
-* npm-help(1)
-* npm-faq(7)
-* README
-* package.json(5)
-* npm-install(1)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* npm-index(7)
-* npm(3)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/cli/repo.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,26 +0,0 @@
-npm-repo(1) -- Open package repository page in the browser
-==========================================================
-
-## SYNOPSIS
-
-    npm repo <pkgname>
-
-## DESCRIPTION
-
-This command tries to guess at the likely location of a package's
-repository URL, and then tries to open it using the `--browser`
-config param.
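-
-For example (the package name is illustrative):
-
-    npm repo express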
-
-## CONFIGURATION
-
-### browser
-
-* Default: OS X: `"open"`, Windows: `"start"`, Others: `"xdg-open"`
-* Type: String
-
-The browser that is called by the `npm repo` command to open websites.
-
-## SEE ALSO
-
-* npm-docs(1)
-* npm-config(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/files/npm-folders.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,211 +0,0 @@
-npm-folders(5) -- Folder Structures Used by npm
-===============================================
-
-## DESCRIPTION
-
-npm puts various things on your computer.  That's its job.
-
-This document will tell you what it puts where.
-
-### tl;dr
-
-* Local install (default): puts stuff in `./node_modules` of the current
-  package root.
-* Global install (with `-g`): puts stuff in /usr/local or wherever node
-  is installed.
-* Install it **locally** if you're going to `require()` it.
-* Install it **globally** if you're going to run it on the command line.
-* If you need both, then install it in both places, or use `npm link`.
-
-### prefix Configuration
-
-The `prefix` config defaults to the location where node is installed.
-On most systems, this is `/usr/local`, and most of the time is the same
-as node's `process.installPrefix`.
-
-On Windows, this is the exact location of the node.exe binary.  On Unix
-systems, it's one level up, since node is typically installed at
-`{prefix}/bin/node` rather than `{prefix}/node.exe`.
-
-When the `global` flag is set, npm installs things into this prefix.
-When it is not set, it uses the root of the current package, or the
-current working directory if not in a package already.
-
-### Node Modules
-
-Packages are dropped into the `node_modules` folder under the `prefix`.
-When installing locally, this means that you can
-`require("packagename")` to load its main module, or
-`require("packagename/lib/path/to/sub/module")` to load other modules.
-
-Global installs on Unix systems go to `{prefix}/lib/node_modules`.
-Global installs on Windows go to `{prefix}/node_modules` (that is, no
-`lib` folder.)
-
-If you wish to `require()` a package, then install it locally.
-
-### Executables
-
-When in global mode, executables are linked into `{prefix}/bin` on Unix,
-or directly into `{prefix}` on Windows.
-
-When in local mode, executables are linked into
-`./node_modules/.bin` so that they can be made available to scripts run
-through npm.  (For example, so that a test runner will be in the path
-when you run `npm test`.)
-
-### Man Pages
-
-When in global mode, man pages are linked into `{prefix}/share/man`.
-
-When in local mode, man pages are not installed.
-
-Man pages are not installed on Windows systems.
-
-### Cache
-
-See `npm-cache(1)`.  Cache files are stored in `~/.npm` on Posix, or
-`~/npm-cache` on Windows.
-
-This is controlled by the `cache` configuration param.
-
-### Temp Files
-
-Temporary files are stored by default in the folder specified by the
-`tmp` config, which defaults to the TMPDIR, TMP, or TEMP environment
-variables, or `/tmp` on Unix and `c:\windows\temp` on Windows.
-
-Temp files are given a unique folder under this root for each run of the
-program, and are deleted upon successful exit.
-
-## More Information
-
-When installing locally, npm first tries to find an appropriate
-`prefix` folder.  This is so that `npm install foo@1.2.3` will install
-to the sensible root of your package, even if you happen to have `cd`ed
-into some other folder.
-
-Starting at the $PWD, npm will walk up the folder tree checking for a
-folder that contains either a `package.json` file, or a `node_modules`
-folder.  If such a thing is found, then that is treated as the effective
-"current directory" for the purpose of running npm commands.  (This
-behavior is inspired by and similar to git's .git-folder seeking
-logic when running git commands in a working dir.)
-
-If no package root is found, then the current folder is used.
-
-When you run `npm install foo@1.2.3`, then the package is loaded into
-the cache, and then unpacked into `./node_modules/foo`.  Then, any of
-foo's dependencies are similarly unpacked into
-`./node_modules/foo/node_modules/...`.
-
-Any bin files are symlinked to `./node_modules/.bin/`, so that they may
-be found by npm scripts when necessary.
-
-### Global Installation
-
-If the `global` configuration is set to true, then npm will
-install packages "globally".
-
-For global installation, packages are installed roughly the same way,
-but using the folders described above.
-
-### Cycles, Conflicts, and Folder Parsimony
-
-Cycles are handled using the property of node's module system that it
-walks up the directories looking for `node_modules` folders.  So, at every
-stage, if a package is already installed in an ancestor `node_modules`
-folder, then it is not installed at the current location.
-
-Consider a case where `foo -> bar -> baz`.  Imagine if, in
-addition to that, baz depended on bar, so you'd have:
-`foo -> bar -> baz -> bar -> baz ...`.  However, since the folder
-structure is: `foo/node_modules/bar/node_modules/baz`, there's no need to
-put another copy of bar into `.../baz/node_modules`, since when it calls
-require("bar"), it will get the copy that is installed in
-`foo/node_modules/bar`.
-
-This shortcut is only used if the exact same
-version would be installed in multiple nested `node_modules` folders.  It
-is still possible to have `a/node_modules/b/node_modules/a` if the two
-"a" packages are different versions.  However, without repeating the
-exact same package multiple times, an infinite regress will always be
-prevented.
-
-Another optimization can be made by installing dependencies at the
-highest level possible, below the localized "target" folder.
-
-#### Example
-
-Consider this dependency graph:
-
-    foo
-    +-- blerg@1.2.5
-    +-- bar@1.2.3
-    |   +-- blerg@1.x (latest=1.3.7)
-    |   +-- baz@2.x
-    |   |   `-- quux@3.x
-    |   |       `-- bar@1.2.3 (cycle)
-    |   `-- asdf@*
-    `-- baz@1.2.3
-        `-- quux@3.x
-            `-- bar
-
-In this case, we might expect a folder structure like this:
-
-    foo
-    +-- node_modules
-        +-- blerg (1.2.5) <---[A]
-        +-- bar (1.2.3) <---[B]
-        |   `-- node_modules
-        |       +-- baz (2.0.2) <---[C]
-        |       |   `-- node_modules
-        |       |       `-- quux (3.2.0)
-        |       `-- asdf (2.3.4)
-        `-- baz (1.2.3) <---[D]
-            `-- node_modules
-                `-- quux (3.2.0) <---[E]
-
-Since foo depends directly on `bar@1.2.3` and `baz@1.2.3`, those are
-installed in foo's `node_modules` folder.
-
-Even though the latest copy of blerg is 1.3.7, foo has a specific
-dependency on version 1.2.5.  So, that gets installed at [A].  Since the
-parent installation of blerg satisfies bar's dependency on `blerg@1.x`,
-it does not install another copy under [B].
-
-Bar [B] also has dependencies on baz and asdf, so those are installed in
-bar's `node_modules` folder.  Because it depends on `baz@2.x`, it cannot
-re-use the `baz@1.2.3` installed in the parent `node_modules` folder [D],
-and must install its own copy [C].
-
-Underneath bar, the `baz -> quux -> bar` dependency creates a cycle.
-However, because bar is already in quux's ancestry [B], it does not
-unpack another copy of bar into that folder.
-
-Underneath `foo -> baz` [D], quux's [E] folder tree is empty, because its
-dependency on bar is satisfied by the parent folder copy installed at [B].
-
-For a graphical breakdown of what is installed where, use `npm ls`.
-
-### Publishing
-
-Upon publishing, npm will look in the `node_modules` folder.  If any of
-the items there are not in the `bundledDependencies` array, then they will
-not be included in the package tarball.
-
-This allows a package maintainer to install all of their dependencies
-(and dev dependencies) locally, but only re-publish those items that
-cannot be found elsewhere.  See `package.json(5)` for more information.
-
-## SEE ALSO
-
-* npm-faq(7)
-* package.json(5)
-* npm-install(1)
-* npm-pack(1)
-* npm-cache(1)
-* npm-config(1)
-* npmrc(5)
-* npm-config(7)
-* npm-publish(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/files/npmrc.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
-npmrc(5) -- The npm config files
-================================
-
-## DESCRIPTION
-
-npm gets its config settings from the command line, environment
-variables, and `npmrc` files.
-
-The `npm config` command can be used to update and edit the contents
-of the user and global npmrc files.
-
-For a list of available configuration options, see npm-config(7).
-
-## FILES
-
-The three relevant files are:
-
-* per-user config file (~/.npmrc)
-* global config file ($PREFIX/etc/npmrc)
-* npm builtin config file (/path/to/npm/npmrc)
-
-All npm config files are an ini-formatted list of `key = value`
-parameters.  Environment variables can be replaced using
-`${VARIABLE_NAME}`. For example:
-
-    prefix = ${HOME}/.npm-packages
-
-Each of these files is loaded, and config options are resolved in
-priority order.  For example, a setting in the userconfig file would
-override the setting in the globalconfig file.
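-
-For example, with these two (illustrative) files, npm would use the
-userconfig value of `prefix`:
-
-    ; $PREFIX/etc/npmrc
-    prefix = /usr/local
-
-    ; $HOME/.npmrc
-    prefix = ${HOME}/.npm-packages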
-
-### Per-user config file
-
-`$HOME/.npmrc` (or the `userconfig` param, if set in the environment
-or on the command line)
-
-### Global config file
-
-`$PREFIX/etc/npmrc` (or the `globalconfig` param, if set above):
-This file is an ini-file formatted list of `key = value` parameters.
-Environment variables can be replaced as above.
-
-### Built-in config file
-
-`path/to/npm/itself/npmrc`
-
-This is an unchangeable "builtin" configuration file that npm keeps
-consistent across updates.  Set fields in here using the `./configure`
-script that comes with npm.  This is primarily for distribution
-maintainers to override default configs in a standard and consistent
-manner.
-
-## SEE ALSO
-
-* npm-folders(5)
-* npm-config(1)
-* npm-config(7)
-* package.json(5)
-* npm(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/files/package.json.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,584 +0,0 @@
-package.json(5) -- Specifics of npm's package.json handling
-===========================================================
-
-## DESCRIPTION
-
-This document is all you need to know about what's required in your package.json
-file.  It must be actual JSON, not just a JavaScript object literal.
-
-A lot of the behavior described in this document is affected by the config
-settings described in `npm-config(7)`.
-
-## DEFAULT VALUES
-
-npm will default some values based on package contents.
-
-* `"scripts": {"start": "node server.js"}`
-
-  If there is a `server.js` file in the root of your package, then npm
-  will default the `start` command to `node server.js`.
-
-* `"scripts":{"preinstall": "node-waf clean || true; node-waf configure build"}`
-
-  If there is a `wscript` file in the root of your package, npm will
-  default the `preinstall` command to compile using node-waf.
-
-* `"scripts":{"preinstall": "node-gyp rebuild"}`
-
-  If there is a `binding.gyp` file in the root of your package, npm will
-  default the `preinstall` command to compile using node-gyp.
-
-* `"contributors": [...]`
-
-  If there is an `AUTHORS` file in the root of your package, npm will
-  treat each line as a `Name <email> (url)` format, where email and url
-  are optional.  Lines which start with a `#` or are blank will be
-  ignored.
-
-## name
-
-The *most* important things in your package.json are the name and version fields.
-Those are actually required, and your package won't install without
-them.  The name and version together form an identifier that is assumed
-to be completely unique.  Changes to the package should come along with
-changes to the version.
-
-The name is what your thing is called.  Some tips:
-
-* Don't put "js" or "node" in the name.  It's assumed that it's js, since you're
-  writing a package.json file, and you can specify the engine using the "engines"
-  field.  (See below.)
-* The name ends up being part of a URL, an argument on the command line, and a
-  folder name. Any name with non-url-safe characters will be rejected.
-  Also, it can't start with a dot or an underscore.
-* The name will probably be passed as an argument to require(), so it should
-  be something short, but also reasonably descriptive.
-* You may want to check the npm registry to see if there's something by that name
-  already, before you get too attached to it.  http://registry.npmjs.org/
-
-## version
-
-The *most* important things in your package.json are the name and version fields.
-Those are actually required, and your package won't install without
-them.  The name and version together form an identifier that is assumed
-to be completely unique.  Changes to the package should come along with
-changes to the version.
-
-Version must be parseable by
-[node-semver](https://github.com/isaacs/node-semver), which is bundled
-with npm as a dependency.  (`npm install semver` to use it yourself.)
-
-More on version numbers and ranges at semver(7).
-
-## description
-
-Put a description in it.  It's a string.  This helps people discover your
-package, as it's listed in `npm search`.
-
-## keywords
-
-Put keywords in it.  It's an array of strings.  This helps people
-discover your package as it's listed in `npm search`.
-
-## homepage
-
-The url to the project homepage.
-
-**NOTE**: This is *not* the same as "url".  If you put a "url" field,
-then the registry will think it's a redirection to your package that has
-been published somewhere else, and spit at you.
-
-Literally.  Spit.  I'm so not kidding.
-
-## bugs
-
-The url to your project's issue tracker and / or the email address to which
-issues should be reported. These are helpful for people who encounter issues
-with your package.
-
-It should look like this:
-
-    { "url" : "http://github.com/owner/project/issues"
-    , "email" : "project@hostname.com"
-    }
-
-You can specify either one or both values. If you want to provide only a url,
-you can specify the value for "bugs" as a simple string instead of an object.
-
-If a url is provided, it will be used by the `npm bugs` command.
-
-## license
-
-You should specify a license for your package so that people know how they are
-permitted to use it, and any restrictions you're placing on it.
-
-The simplest way, assuming you're using a common license such as BSD or MIT, is
-to just specify the name of the license you're using, like this:
-
-    { "license" : "BSD" }
-
-If you have more complex licensing terms, or you want to provide more detail
-in your package.json file, you can use the more verbose plural form, like this:
-
-    "licenses" : [
-      { "type" : "MyLicense"
-      , "url" : "http://github.com/owner/project/path/to/license"
-      }
-    ]
-
-It's also a good idea to include a license file at the top level in your package.
-
-## people fields: author, contributors
-
-The "author" is one person.  "contributors" is an array of people.  A "person"
-is an object with a "name" field and optionally "url" and "email", like this:
-
-    { "name" : "Barney Rubble"
-    , "email" : "b@rubble.com"
-    , "url" : "http://barnyrubble.tumblr.com/"
-    }
-
-Or you can shorten that all into a single string, and npm will parse it for you:
-
-    "Barney Rubble <b@rubble.com> (http://barnyrubble.tumblr.com/)
-
-Both email and url are optional either way.
-
-npm also sets a top-level "maintainers" field with your npm user info.
-
-## files
-
-The "files" field is an array of files to include in your project.  If
-you name a folder in the array, then it will also include the files
-inside that folder. (Unless they would be ignored by another rule.)
-
-You can also provide a ".npmignore" file in the root of your package,
-which will keep files from being included, even if they would be picked
-up by the files array.  The ".npmignore" file works just like a
-".gitignore".
-
-## main
-
-The main field is a module ID that is the primary entry point to your program.
-That is, if your package is named `foo`, and a user installs it, and then does
-`require("foo")`, then your main module's exports object will be returned.
-
-This should be a module ID relative to the root of your package folder.
-
-For most modules, it makes the most sense to have a main script and often not
-much else.
-
-## bin
-
-A lot of packages have one or more executable files that they'd like to
-install into the PATH. npm makes this pretty easy (in fact, it uses this
-feature to install the "npm" executable.)
-
-To use this, supply a `bin` field in your package.json which is a map of
-command name to local file name. On install, npm will symlink that file into
-`prefix/bin` for global installs, or `./node_modules/.bin/` for local
-installs.
-
-
-For example, npm has this:
-
-    { "bin" : { "npm" : "./cli.js" } }
-
-So, when you install npm, it'll create a symlink from the `cli.js` script to
-`/usr/local/bin/npm`.
-
-If you have a single executable, and its name should be the name
-of the package, then you can just supply it as a string.  For example:
-
-    { "name": "my-program"
-    , "version": "1.2.5"
-    , "bin": "./path/to/program" }
-
-would be the same as this:
-
-    { "name": "my-program"
-    , "version": "1.2.5"
-    , "bin" : { "my-program" : "./path/to/program" } }
-
-## man
-
-Specify either a single file or an array of filenames to put in place for the
-`man` program to find.
-
-If only a single file is provided, then it's installed such that it is the
-result from `man <pkgname>`, regardless of its actual filename.  For example:
-
-    { "name" : "foo"
-    , "version" : "1.2.3"
-    , "description" : "A packaged foo fooer for fooing foos"
-    , "main" : "foo.js"
-    , "man" : "./man/doc.1"
-    }
-
-would link the `./man/doc.1` file in such that it is the target for `man foo`
-
-If the filename doesn't start with the package name, then it's prefixed.
-So, this:
-
-    { "name" : "foo"
-    , "version" : "1.2.3"
-    , "description" : "A packaged foo fooer for fooing foos"
-    , "main" : "foo.js"
-    , "man" : [ "./man/foo.1", "./man/bar.1" ]
-    }
-
-will create files to do `man foo` and `man foo-bar`.
-
-Man files must end with a number, and optionally a `.gz` suffix if they are
-compressed.  The number dictates which man section the file is installed into.
-
-    { "name" : "foo"
-    , "version" : "1.2.3"
-    , "description" : "A packaged foo fooer for fooing foos"
-    , "main" : "foo.js"
-    , "man" : [ "./man/foo.1", "./man/foo.2" ]
-    }
-
-will create entries for `man foo` and `man 2 foo`
-
-## directories
-
-The CommonJS [Packages](http://wiki.commonjs.org/wiki/Packages/1.0) spec details a
-few ways that you can indicate the structure of your package using a `directories`
-hash. If you look at [npm's package.json](http://registry.npmjs.org/npm/latest),
-you'll see that it has directories for doc, lib, and man.
-
-In the future, this information may be used in other creative ways.
-
-### directories.lib
-
-Tell people where the bulk of your library is.  Nothing special is done
-with the lib folder in any way, but it's useful meta info.
-
-### directories.bin
-
-If you specify a "bin" directory, then all the files in that folder will
-be used as the "bin" hash.
-
-If you have a "bin" hash already, then this has no effect.
-
-### directories.man
-
-A folder that is full of man pages.  Sugar to generate a "man" array by
-walking the folder.
-
-### directories.doc
-
-Put markdown files in here.  Eventually, these will be displayed nicely,
-maybe, someday.
-
-### directories.example
-
-Put example scripts in here.  Someday, it might be exposed in some clever way.
-
-## repository
-
-Specify the place where your code lives. This is helpful for people who
-want to contribute.  If the git repo is on github, then the `npm docs`
-command will be able to find you.
-
-Do it like this:
-
-    "repository" :
-      { "type" : "git"
-      , "url" : "http://github.com/isaacs/npm.git"
-      }
-
-    "repository" :
-      { "type" : "svn"
-      , "url" : "http://v8.googlecode.com/svn/trunk/"
-      }
-
-The URL should be a publicly available (perhaps read-only) url that can be handed
-directly to a VCS program without any modification.  It should not be a url to an
-html project page that you put in your browser.  It's for computers.
-
-## scripts
-
-The "scripts" member is an object hash of script commands that are run
-at various times in the lifecycle of your package.  The key is the lifecycle
-event, and the value is the command to run at that point.
-
-See `npm-scripts(7)` to find out more about writing package scripts.
-
-## config
-
-A "config" hash can be used to set configuration
-parameters used in package scripts that persist across upgrades.  For
-instance, if a package had the following:
-
-    { "name" : "foo"
-    , "config" : { "port" : "8080" } }
-
-and then had a "start" command that then referenced the
-`npm_package_config_port` environment variable, then the user could
-override that by doing `npm config set foo:port 8001`.
-
-See `npm-config(7)` and `npm-scripts(7)` for more on package
-configs.
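-
-For instance, a "start" script could read the value at run time via the
-environment (the file name `server.js` is illustrative):
-
-    // server.js -- read the package config, falling back to a default
-    var port = process.env.npm_package_config_port || 8080
-    console.log('listening on ' + port)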
-
-## dependencies
-
-Dependencies are specified with a simple hash of package name to
-version range. The version range is a string which has one or more
-space-separated descriptors.  Dependencies can also be identified with
-a tarball or git URL.
-
-**Please do not put test harnesses or transpilers in your
-`dependencies` hash.**  See `devDependencies`, below.
-
-See semver(7) for more details about specifying version ranges.
-
-* `version` Must match `version` exactly
-* `>version` Must be greater than `version`
-* `>=version` etc
-* `<version`
-* `<=version`
-* `~version` "Approximately equivalent to version"  See semver(7)
-* `1.2.x` 1.2.0, 1.2.1, etc., but not 1.3.0
-* `http://...` See 'URLs as Dependencies' below
-* `*` Matches any version
-* `""` (just an empty string) Same as `*`
-* `version1 - version2` Same as `>=version1 <=version2`.
-* `range1 || range2` Passes if either range1 or range2 are satisfied.
-* `git...` See 'Git URLs as Dependencies' below
-* `user/repo` See 'GitHub URLs' below
-
-For example, these are all valid:
-
-    { "dependencies" :
-      { "foo" : "1.0.0 - 2.9999.9999"
-      , "bar" : ">=1.0.2 <2.1.2"
-      , "baz" : ">1.0.2 <=2.3.4"
-      , "boo" : "2.0.1"
-      , "qux" : "<1.0.0 || >=2.3.1 <2.4.5 || >=2.5.2 <3.0.0"
-      , "asd" : "http://asdf.com/asdf.tar.gz"
-      , "til" : "~1.2"
-      , "elf" : "~1.2.3"
-      , "two" : "2.x"
-      , "thr" : "3.3.x"
-      }
-    }
-
-### URLs as Dependencies
-
-You may specify a tarball URL in place of a version range.
-
-This tarball will be downloaded and installed locally to your package at
-install time.
-
-### Git URLs as Dependencies
-
-Git urls can be of the form:
-
-    git://github.com/user/project.git#commit-ish
-    git+ssh://user@hostname:project.git#commit-ish
-    git+ssh://user@hostname/project.git#commit-ish
-    git+http://user@hostname/project/blah.git#commit-ish
-    git+https://user@hostname/project/blah.git#commit-ish
-
-The `commit-ish` can be any tag, sha, or branch which can be supplied as
-an argument to `git checkout`.  The default is `master`.
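-
-For example, a dependency pinned to a tag might look like this (the
-repository URL and tag are placeholders):
-
-    { "dependencies" :
-      { "bar" : "git://github.com/user/project.git#v1.0.27" }
-    }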
-
-## GitHub URLs
-
-As of version 1.1.65, you can refer to GitHub urls as just "foo": "user/foo-project". For example:
-
-```json
-{
-  "name": "foo",
-  "version": "0.0.0",
-  "dependencies": {
-    "express": "visionmedia/express"
-  }
-}
-```
-
-## devDependencies
-
-If someone is planning on downloading and using your module in their
-program, then they probably don't want or need to download and build
-the external test or documentation framework that you use.
-
-In this case, it's best to list these additional items in a
-`devDependencies` hash.
-
-These things will be installed when doing `npm link` or `npm install`
-from the root of a package, and can be managed like any other npm
-configuration param.  See `npm-config(7)` for more on the topic.
-
-For build steps that are not platform-specific, such as compiling
-CoffeeScript or other languages to JavaScript, use the `prepublish`
-script to do this, and make the required package a devDependency.
-
-For example:
-
-```json
-{ "name": "ethopia-waza",
-  "description": "a delightfully fruity coffee varietal",
-  "version": "1.2.3",
-  "devDependencies": {
-    "coffee-script": "~1.6.3"
-  },
-  "scripts": {
-    "prepublish": "coffee -o lib/ -c src/waza.coffee"
-  },
-  "main": "lib/waza.js"
-}
-```
-
-The `prepublish` script will be run before publishing, so that users
-can consume the functionality without requiring them to compile it
-themselves.  In dev mode (ie, locally running `npm install`), it'll
-run this script as well, so that you can test it easily.
-
-## bundledDependencies
-
-Array of package names that will be bundled when publishing the package.
-
-If this is spelled `"bundleDependencies"`, then that is also honorable.
-
-## optionalDependencies
-
-If a dependency can be used, but you would like npm to proceed if it
-cannot be found or fails to install, then you may put it in the
-`optionalDependencies` hash.  This is a map of package name to version
-or url, just like the `dependencies` hash.  The difference is that
-failure is tolerated.
-
-It is still your program's responsibility to handle the lack of the
-dependency.  For example, something like this:
-
-    try {
-      var foo = require('foo')
-      var fooVersion = require('foo/package.json').version
-    } catch (er) {
-      foo = null
-    }
-    if ( notGoodFooVersion(fooVersion) ) {
-      foo = null
-    }
-
-    // .. then later in your program ..
-
-    if (foo) {
-      foo.doFooThings()
-    }
-
-Entries in `optionalDependencies` will override entries of the same
-name in `dependencies`, so it's usually best to put an entry in only
-one place.
-
-## engines
-
-You can specify the version of node that your stuff works on:
-
-    { "engines" : { "node" : ">=0.10.3 <0.12" } }
-
-And, like with dependencies, if you don't specify the version (or if you
-specify "\*" as the version), then any version of node will do.
-
-If you specify an "engines" field, then npm will require that "node" be
-somewhere in that hash.  If "engines" is omitted, then npm will just
-assume that it works on node.
-
-You can also use the "engines" field to specify which versions of npm
-are capable of properly installing your program.  For example:
-
-    { "engines" : { "npm" : "~1.0.20" } }
-
-Note that, unless the user has set the `engine-strict` config flag, this
-field is advisory only.
-
-## engineStrict
-
-If you are sure that your module will *definitely not* run properly on
-versions of Node/npm other than those specified in the `engines` hash,
-then you can set `"engineStrict": true` in your package.json file.
-This will override the user's `engine-strict` config setting.
-
-Please do not do this unless you are really very very sure.  If your
-engines hash is overly restrictive, you can quite easily and
-inadvertently lock yourself into obscurity and prevent your users from
-updating to new versions of Node.  Consider this choice carefully.  If
-people abuse it, it will be removed in a future version of npm.
-
-## os
-
-You can specify which operating systems your
-module will run on:
-
-    "os" : [ "darwin", "linux" ]
-
-You can also blacklist instead of whitelist operating systems,
-just prepend the blacklisted os with a '!':
-
-    "os" : [ "!win32" ]
-
-The host operating system is determined by `process.platform`.
-
-You may both blacklist and whitelist, although there isn't any good
-reason to do so.
-
-## cpu
-
-If your code only runs on certain cpu architectures,
-you can specify which ones.
-
-    "cpu" : [ "x64", "ia32" ]
-
-Like the `os` option, you can also blacklist architectures:
-
-    "cpu" : [ "!arm", "!mips" ]
-
-The host architecture is determined by `process.arch`.
-
-## preferGlobal
-
-If your package is primarily a command-line application that should be
-installed globally, then set this value to `true` to provide a warning
-if it is installed locally.
-
-It doesn't actually prevent users from installing it locally, but it
-does help prevent some confusion if it doesn't work as expected.
-
-## private
-
-If you set `"private": true` in your package.json, then npm will refuse
-to publish it.
-
-This is a way to prevent accidental publication of private repositories.
-If you would like to ensure that a given package is only ever published
-to a specific registry (for example, an internal registry),
-then use the `publishConfig` hash described below
-to override the `registry` config param at publish-time.
-
-## publishConfig
-
-This is a set of config values that will be used at publish-time.  It's
-especially handy if you want to set the tag or registry, so that you can
-ensure that a given package is not tagged with "latest" or published to
-the global public registry by default.
-
-Any config values can be overridden, but of course only "tag" and
-"registry" probably matter for the purposes of publishing.
-
-See `npm-config(7)` to see the list of config options that can be
-overridden.
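For example, a package.json might pin publishes to an internal registry (the registry URL and tag here are hypothetical placeholders, not defaults):

```json
{
  "name": "my-internal-tool",
  "version": "1.0.0",
  "publishConfig": {
    "registry": "https://registry.internal.example.com/",
    "tag": "beta"
  }
}
```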
-
-## SEE ALSO
-
-* semver(7)
-* npm-init(1)
-* npm-version(1)
-* npm-config(1)
-* npm-config(7)
-* npm-help(1)
-* npm-faq(7)
-* npm-install(1)
-* npm-publish(1)
-* npm-rm(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-coding-style.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,181 +0,0 @@
-npm-coding-style(7) -- npm's "funny" coding style
-=================================================
-
-## DESCRIPTION
-
-npm's coding style is a bit unconventional.  It is not different for
-difference's sake, but rather a carefully crafted style that is
-designed to reduce visual clutter and make bugs more apparent.
-
-If you want to contribute to npm (which is very encouraged), you should
-make your code conform to npm's style.
-
-Note: this concerns npm's own code, not the packages published at npmjs.org.
-
-## Line Length
-
-Keep lines shorter than 80 characters.  It's better for lines to be
-too short than to be too long.  Break up long lists, objects, and other
-statements onto multiple lines.
-
-## Indentation
-
-Two-spaces.  Tabs are better, but they look like hell in web browsers
-(and on github), and node uses 2 spaces, so that's that.
-
-Configure your editor appropriately.
-
-## Curly braces
-
-Curly braces belong on the same line as the thing that necessitates them.
-
-Bad:
-
-    function ()
-    {
-
-Good:
-
-    function () {
-
-If a block needs to wrap to the next line, use a curly brace.  Don't
-use it if it doesn't.
-
-Bad:
-
-    if (foo) { bar() }
-    while (foo)
-      bar()
-
-Good:
-
-    if (foo) bar()
-    while (foo) {
-      bar()
-    }
-
-## Semicolons
-
-Don't use them except in four situations:
-
-* `for (;;)` loops.  They're actually required.
-* null loops like: `while (something) ;` (But you'd better have a good
-  reason for doing that.)
-* `case "foo": doSomething(); break`
-* In front of a leading `(` or `[` at the start of the line.
-  This prevents the expression from being interpreted
-  as a function call or property access, respectively.
-
-Some examples of good semicolon usage:
-
-    ;(x || y).doSomething()
-    ;[a, b, c].forEach(doSomething)
-    for (var i = 0; i < 10; i ++) {
-      switch (state) {
-        case "begin": start(); continue
-        case "end": finish(); break
-        default: throw new Error("unknown state")
-      }
-      end()
-    }
-
-Note that lines starting with `-` or `+` should also be prefixed with
-a semicolon, but this is much less common.
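As a sketch of why the leading semicolons matter (the variable names here are invented for illustration):

```javascript
// Without the leading `;`, the `[` on the next line would be parsed as
// a property access on the previous expression (`x[1, 2, 3]`), and a
// leading `(` as a call of whatever the previous line evaluated to.
var x = 2
var doubled = []
;[1, 2, 3].forEach(function double (n) { doubled.push(n * x) })
;(doubled.length ? console.log : console.error)("doubled:", doubled)
```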
-
-## Comma First
-
-If there is a list of things separated by commas, and it wraps
-across multiple lines, put the comma at the start of the next
-line, directly below the token that starts the list.  Put the
-final token in the list on a line by itself.  For example:
-
-    var magicWords = [ "abracadabra"
-                     , "gesundheit"
-                     , "ventrilo"
-                     ]
-      , spells = { "fireball" : function () { setOnFire() }
-                 , "water" : function () { putOut() }
-                 }
-      , a = 1
-      , b = "abc"
-      , etc
-      , somethingElse
-
-## Whitespace
-
-Put a single space in front of `(` for anything other than a function call.
-Also use a single space wherever it makes things more readable.
-
-Don't leave trailing whitespace at the end of lines.  Don't indent empty
-lines.  Don't use more spaces than are helpful.
-
-## Functions
-
-Use named functions.  They make stack traces a lot easier to read.
-
-## Callbacks, Sync/async Style
-
-Use the asynchronous/non-blocking versions of things as much as possible.
-It might make more sense for npm to use the synchronous fs APIs, but this
-way, the fs and http and child process stuff all uses the same callback-passing
-methodology.
-
-The callback should always be the last argument in the list.  Its first
-argument is the Error or null.
-
-Be very careful never to ever ever throw anything.  It's worse than
-useless.  Just pass the error back as the first argument to the
-callback.
-
-## Errors
-
-Always create a new Error object with your message.  Don't just return a
-string message to the callback.  Stack traces are handy.
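For instance (`loadConfig` is a hypothetical function, shown only to contrast the two forms):

```javascript
// Good: a real Error object carries a stack trace back to the caller.
function loadConfig (file, cb) {
  if (!file) return cb(new Error("loadConfig requires a file"))
  process.nextTick(function () { cb(null, { file: file }) })
}

// Bad: cb("no file given") would hand the caller a bare string with
// no stack trace, which is much harder to debug.
```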
-
-## Logging
-
-Logging is done using the [npmlog](https://github.com/isaacs/npmlog)
-utility.
-
-Please clean up logs when they are no longer helpful.  In particular,
-logging the same object over and over again is not helpful.  Logs should
-report what's happening so that it's easier to track down where a fault
-occurs.
-
-Use appropriate log levels.  See `npm-config(7)` and search for
-"loglevel".
-
-## Case, naming, etc.
-
-Use `lowerCamelCase` for multiword identifiers when they refer to objects,
-functions, methods, members, or anything not specified in this section.
-
-Use `UpperCamelCase` for class names (things that you'd pass to "new").
-
-Use `all-lower-hyphen-css-case` for multiword filenames and config keys.
-
-Use named functions.  They make stack traces easier to follow.
-
-Use `CAPS_SNAKE_CASE` for constants, things that should never change
-and are rarely used.
-
-Use a single uppercase letter for function names where the function
-would normally be anonymous, but needs to call itself recursively.  It
-makes it clear that it's a "throwaway" function.
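A sketch of the pattern (the surrounding `flatten` function is invented for illustration):

```javascript
// `F` is a throwaway recursive worker; the single capital letter marks
// it as internal to `flatten` rather than part of any public API.
function flatten (list) {
  var out = []
  ;(function F (l) {
    l.forEach(function each (item) {
      if (Array.isArray(item)) F(item)
      else out.push(item)
    })
  })(list)
  return out
}
```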
-
-## null, undefined, false, 0
-
-Boolean variables and functions should always be either `true` or
-`false`.  Don't set them to 0 unless they're supposed to be numbers.
-
-When something is intentionally missing or removed, set it to `null`.
-
-Don't set things to `undefined`.  Reserve that value to mean "not yet
-set to anything."
-
-Boolean objects are verboten.
-
-## SEE ALSO
-
-* npm-developers(7)
-* npm-faq(7)
-* npm(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-config.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,827 +0,0 @@
-npm-config(7) -- More than you probably want to know about npm configuration
-============================================================================
-
-## DESCRIPTION
-
-npm gets its configuration values from 6 sources, in this priority:
-
-### Command Line Flags
-
-Putting `--foo bar` on the command line sets the `foo` configuration
-parameter to `"bar"`.  A `--` argument tells the cli parser to stop
-reading flags.  A `--flag` parameter that is at the *end* of the
-command will be given the value of `true`.
-
-### Environment Variables
-
-Any environment variables that start with `npm_config_` will be
-interpreted as a configuration parameter.  For example, putting
-`npm_config_foo=bar` in your environment will set the `foo`
-configuration parameter to `bar`.  Any environment configurations that
-are not given a value will be given the value of `true`.  Config
-values are case-insensitive, so `NPM_CONFIG_FOO=bar` will work the
-same.
-
-### npmrc Files
-
-The three relevant files are:
-
-* per-user config file (~/.npmrc)
-* global config file ($PREFIX/npmrc)
-* npm builtin config file (/path/to/npm/npmrc)
-
-See npmrc(5) for more details.
-
-### Default Configs
-
-A set of configuration parameters that are internal to npm, and are
-defaults if nothing else is specified.
-
-## Shorthands and Other CLI Niceties
-
-The following shorthands are parsed on the command-line:
-
-* `-v`: `--version`
-* `-h`, `-?`, `--help`, `-H`: `--usage`
-* `-s`, `--silent`: `--loglevel silent`
-* `-q`, `--quiet`: `--loglevel warn`
-* `-d`: `--loglevel info`
-* `-dd`, `--verbose`: `--loglevel verbose`
-* `-ddd`: `--loglevel silly`
-* `-g`: `--global`
-* `-l`: `--long`
-* `-m`: `--message`
-* `-p`, `--porcelain`: `--parseable`
-* `-reg`: `--registry`
-* `-f`: `--force`
-* `-desc`: `--description`
-* `-S`: `--save`
-* `-D`: `--save-dev`
-* `-O`: `--save-optional`
-* `-B`: `--save-bundle`
-* `-y`: `--yes`
-* `-n`: `--yes false`
-* `ll` and `la` commands: `ls --long`
-
-If the specified configuration param resolves unambiguously to a known
-configuration parameter, then it is expanded to that configuration
-parameter.  For example:
-
-    npm ls --par
-    # same as:
-    npm ls --parseable
-
-If multiple single-character shorthands are strung together, and the
-resulting combination is unambiguously not some other configuration
-param, then it is expanded to its various component pieces.  For
-example:
-
-    npm ls -gpld
-    # same as:
-    npm ls --global --parseable --long --loglevel info
-
-## Per-Package Config Settings
-
-When running scripts (see `npm-scripts(7)`) the package.json "config"
-keys are overwritten in the environment if there is a config param of
-`<name>[@<version>]:<key>`.  For example, if the package.json has
-this:
-
-    { "name" : "foo"
-    , "config" : { "port" : "8080" }
-    , "scripts" : { "start" : "node server.js" } }
-
-and the server.js is this:
-
-    http.createServer(...).listen(process.env.npm_package_config_port)
-
-then the user could change the behavior by doing:
-
-    npm config set foo:port 80
-
-See package.json(5) for more information.
-
-## Config Settings
-
-### always-auth
-
-* Default: false
-* Type: Boolean
-
-Force npm to always require authentication when accessing the registry,
-even for `GET` requests.
-
-### bin-links
-
-* Default: `true`
-* Type: Boolean
-
-Tells npm to create symlinks (or `.cmd` shims on Windows) for package
-executables.
-
-Set to false to have it not do this.  This can be used to work around
-the fact that some file systems don't support symlinks, even on
-ostensibly Unix systems.
-
-### browser
-
-* Default: OS X: `"open"`, Windows: `"start"`, Others: `"xdg-open"`
-* Type: String
-
-The browser that is called by the `npm docs` command to open websites.
-
-### ca
-
-* Default: The npm CA certificate
-* Type: String or null
-
-The Certificate Authority signing certificate that is trusted for SSL
-connections to the registry.
-
-Set to `null` to only allow "known" registrars, or to a specific CA cert
-to trust only that specific signing authority.
-
-See also the `strict-ssl` config.
-
-### cache
-
-* Default: Windows: `%AppData%\npm-cache`, Posix: `~/.npm`
-* Type: path
-
-The location of npm's cache directory.  See `npm-cache(1)`
-
-### cache-lock-stale
-
-* Default: 60000 (1 minute)
-* Type: Number
-
-The number of ms before cache folder lockfiles are considered stale.
-
-### cache-lock-retries
-
-* Default: 10
-* Type: Number
-
-Number of times to retry to acquire a lock on cache folder lockfiles.
-
-### cache-lock-wait
-
-* Default: 10000 (10 seconds)
-* Type: Number
-
-Number of ms to wait for cache lock files to expire.
-
-### cache-max
-
-* Default: Infinity
-* Type: Number
-
-The maximum time (in seconds) to keep items in the registry cache before
-re-checking against the registry.
-
-Note that no purging is done unless the `npm cache clean` command is
-explicitly used, and that only GET requests use the cache.
-
-### cache-min
-
-* Default: 10
-* Type: Number
-
-The minimum time (in seconds) to keep items in the registry cache before
-re-checking against the registry.
-
-Note that no purging is done unless the `npm cache clean` command is
-explicitly used, and that only GET requests use the cache.
-
-### color
-
-* Default: true on Posix, false on Windows
-* Type: Boolean or `"always"`
-
-If false, never shows colors.  If `"always"` then always shows colors.
-If true, then only prints color codes for tty file descriptors.
-
-### coverage
-
-* Default: false
-* Type: Boolean
-
-A flag to tell test harnesses to run with their coverage options
-enabled, if they respond to the `npm_config_coverage` environment
-variable.
-
-### depth
-
-* Default: Infinity
-* Type: Number
-
-The depth to go when recursing directories for `npm ls` and
-`npm cache ls`.
-
-### description
-
-* Default: true
-* Type: Boolean
-
-Show the description in `npm search`
-
-### dev
-
-* Default: false
-* Type: Boolean
-
-Install `dev-dependencies` along with packages.
-
-Note that `dev-dependencies` are also installed if the `npat` flag is
-set.
-
-### editor
-
-* Default: `EDITOR` environment variable if set, or `"vi"` on Posix,
-  or `"notepad"` on Windows.
-* Type: path
-
-The command to run for `npm edit` or `npm config edit`.
-
-### engine-strict
-
-* Default: false
-* Type: Boolean
-
-If set to true, then npm will stubbornly refuse to install (or even
-consider installing) any package that claims to not be compatible with
-the current Node.js version.
-
-### force
-
-* Default: false
-* Type: Boolean
-
-Makes various commands more forceful.
-
-* lifecycle script failure does not block progress.
-* publishing clobbers previously published versions.
-* skips cache when requesting from the registry.
-* prevents checks against clobbering non-npm files.
-
-### fetch-retries
-
-* Default: 2
-* Type: Number
-
-The "retries" config for the `retry` module to use when fetching
-packages from the registry.
-
-### fetch-retry-factor
-
-* Default: 10
-* Type: Number
-
-The "factor" config for the `retry` module to use when fetching
-packages.
-
-### fetch-retry-mintimeout
-
-* Default: 10000 (10 seconds)
-* Type: Number
-
-The "minTimeout" config for the `retry` module to use when fetching
-packages.
-
-### fetch-retry-maxtimeout
-
-* Default: 60000 (1 minute)
-* Type: Number
-
-The "maxTimeout" config for the `retry` module to use when fetching
-packages.
-
-### git
-
-* Default: `"git"`
-* Type: String
-
-The command to use for git commands.  If git is installed on the
-computer, but is not in the `PATH`, then set this to the full path to
-the git binary.
-
-### global
-
-* Default: false
-* Type: Boolean
-
-Operates in "global" mode, so that packages are installed into the
-`prefix` folder instead of the current working directory.  See
-`npm-folders(5)` for more on the differences in behavior.
-
-* packages are installed into the `{prefix}/lib/node_modules` folder, instead of the
-  current working directory.
-* bin files are linked to `{prefix}/bin`
-* man pages are linked to `{prefix}/share/man`
-
-### globalconfig
-
-* Default: {prefix}/etc/npmrc
-* Type: path
-
-The config file to read for global config options.
-
-### globalignorefile
-
-* Default: {prefix}/etc/npmignore
-* Type: path
-
-The config file to read for global ignore patterns to apply to all users
-and all projects.
-
-If not found, but there is a `.gitignore` file in the same directory,
-then that will be used instead.
-
-### group
-
-* Default: GID of the current process
-* Type: String or Number
-
-The group to use when running package scripts in global mode as the root
-user.
-
-### https-proxy
-
-* Default: the `HTTPS_PROXY` or `https_proxy` or `HTTP_PROXY` or
-  `http_proxy` environment variables.
-* Type: url
-
-A proxy to use for outgoing https requests.
-
-### user-agent
-
-* Default: node/{process.version} {process.platform} {process.arch}
-* Type: String
-
-Sets the `User-Agent` request header.
-
-### ignore
-
-* Default: ""
-* Type: string
-
-A white-space separated list of glob patterns of files to always exclude
-from packages when building tarballs.
-
-### init-module
-
-* Default: ~/.npm-init.js
-* Type: path
-
-A module that will be loaded by the `npm init` command.  See the
-documentation for the
-[init-package-json](https://github.com/isaacs/init-package-json) module
-for more information, or npm-init(1).
-
-### init.version
-
-* Default: "0.0.0"
-* Type: semver
-
-The value `npm init` should use by default for the package version.
-
-### init.author.name
-
-* Default: ""
-* Type: String
-
-The value `npm init` should use by default for the package author's name.
-
-### init.author.email
-
-* Default: ""
-* Type: String
-
-The value `npm init` should use by default for the package author's email.
-
-### init.author.url
-
-* Default: ""
-* Type: String
-
-The value `npm init` should use by default for the package author's homepage.
-
-### json
-
-* Default: false
-* Type: Boolean
-
-Whether or not to output JSON data, rather than the normal output.
-
-This feature is currently experimental, and the output data structures
-for many commands is either not implemented in JSON yet, or subject to
-change.  Only the output from `npm ls --json` is currently valid.
-
-### link
-
-* Default: false
-* Type: Boolean
-
-If true, then local installs will link if there is a suitable globally
-installed package.
-
-Note that this means that local installs can cause things to be
-installed into the global space at the same time.  The link is only done
-if one of the two conditions is met:
-
-* The package is not already installed globally, or
-* the globally installed version is identical to the version that is
-  being installed locally.
-
-### loglevel
-
-* Default: "http"
-* Type: String
-* Values: "silent", "error", "warn", "http", "info", "verbose", "silly"
-
-What level of logs to report.  On failure, *all* logs are written to
-`npm-debug.log` in the current working directory.
-
-Any logs of a higher level than the setting are shown.
-The default is "http", which shows http, warn, and error output.
-
-### logstream
-
-* Default: process.stderr
-* Type: Stream
-
-This is the stream that is passed to the
-[npmlog](https://github.com/isaacs/npmlog) module at run time.
-
-It cannot be set from the command line, but if you are using npm
-programmatically, you may wish to send logs to somewhere other than
-stderr.
-
-If the `color` config is set to true, then this stream will receive
-colored output if it is a TTY.
-
-### long
-
-* Default: false
-* Type: Boolean
-
-Show extended information in `npm ls`
-
-### message
-
-* Default: "%s"
-* Type: String
-
-Commit message which is used by `npm version` when creating version commit.
-
-Any "%s" in the message will be replaced with the version number.
-
-### node-version
-
-* Default: process.version
-* Type: semver or false
-
-The node version to use when checking a package's "engines" hash.
-
-### npat
-
-* Default: false
-* Type: Boolean
-
-Run tests on installation and report results to the
-`npaturl`.
-
-### npaturl
-
-* Default: Not yet implemented
-* Type: url
-
-The url to report npat test results.
-
-### onload-script
-
-* Default: false
-* Type: path
-
-A node module to `require()` when npm loads.  Useful for programmatic
-usage.
-
-### optional
-
-* Default: true
-* Type: Boolean
-
-Attempt to install packages in the `optionalDependencies` hash.  Note
-that if these packages fail to install, the overall installation
-process is not aborted.
-
-### parseable
-
-* Default: false
-* Type: Boolean
-
-Output parseable results from commands that write to
-standard output.
-
-### prefix
-
-* Default: see npm-folders(5)
-* Type: path
-
-The location to install global items.  If set on the command line, then
-it forces non-global commands to run in the specified folder.
-
-### production
-
-* Default: false
-* Type: Boolean
-
-Set to true to run in "production" mode.
-
-1. devDependencies are not installed at the topmost level when running
-   local `npm install` without any arguments.
-2. `NODE_ENV` is set to `"production"` for lifecycle scripts.
-
-### proprietary-attribs
-
-* Default: true
-* Type: Boolean
-
-Whether or not to include proprietary extended attributes in the
-tarballs created by npm.
-
-Unless you are expecting to unpack package tarballs with something other
-than npm -- particularly a very outdated tar implementation -- leave
-this as true.
-
-### proxy
-
-* Default: `HTTP_PROXY` or `http_proxy` environment variable, or null
-* Type: url
-
-A proxy to use for outgoing http requests.
-
-### rebuild-bundle
-
-* Default: true
-* Type: Boolean
-
-Rebuild bundled dependencies after installation.
-
-### registry
-
-* Default: https://registry.npmjs.org/
-* Type: url
-
-The base URL of the npm package registry.
-
-### rollback
-
-* Default: true
-* Type: Boolean
-
-Remove failed installs.
-
-### save
-
-* Default: false
-* Type: Boolean
-
-Save installed packages to a package.json file as dependencies.
-
-When used with the `npm rm` command, it removes it from the dependencies
-hash.
-
-Only works if there is already a package.json file present.
-
-### save-bundle
-
-* Default: false
-* Type: Boolean
-
-If a package would be saved at install time by the use of `--save`,
-`--save-dev`, or `--save-optional`, then also put it in the
-`bundleDependencies` list.
-
-When used with the `npm rm` command, it removes it from the
-bundledDependencies list.
-
-### save-dev
-
-* Default: false
-* Type: Boolean
-
-Save installed packages to a package.json file as devDependencies.
-
-When used with the `npm rm` command, it removes it from the devDependencies
-hash.
-
-Only works if there is already a package.json file present.
-
-### save-optional
-
-* Default: false
-* Type: Boolean
-
-Save installed packages to a package.json file as optionalDependencies.
-
-When used with the `npm rm` command, it removes it from the
-optionalDependencies hash.
-
-Only works if there is already a package.json file present.
-
-### searchopts
-
-* Default: ""
-* Type: String
-
-Space-separated options that are always passed to search.
-
-### searchexclude
-
-* Default: ""
-* Type: String
-
-Space-separated options that limit the results from search.
-
-### searchsort
-
-* Default: "name"
-* Type: String
-* Values: "name", "-name", "date", "-date", "description",
-  "-description", "keywords", "-keywords"
-
-Indication of which field to sort search results by.  Prefix with a `-`
-character to indicate reverse sort.
-
-### shell
-
-* Default: SHELL environment variable, or "bash" on Posix, or "cmd" on
-  Windows
-* Type: path
-
-The shell to run for the `npm explore` command.
-
-### shrinkwrap
-
-* Default: true
-* Type: Boolean
-
-If set to false, then ignore `npm-shrinkwrap.json` files when
-installing.
-
-### sign-git-tag
-
-* Default: false
-* Type: Boolean
-
-If set to true, then the `npm version` command will tag the version
-using `-s` to add a signature.
-
-Note that git requires you to have set up GPG keys in your git configs
-for this to work properly.
-
-### strict-ssl
-
-* Default: true
-* Type: Boolean
-
-Whether or not to do SSL key validation when making requests to the
-registry via https.
-
-See also the `ca` config.
-
-### tag
-
-* Default: latest
-* Type: String
-
-If you ask npm to install a package and don't tell it a specific version, then
-it will install the specified tag.
-
-Also the tag that is added to the package@version specified by the `npm
-tag` command, if no explicit tag is given.
-
-### tmp
-
-* Default: TMPDIR environment variable, or "/tmp"
-* Type: path
-
-Where to store temporary files and folders.  All temp files are deleted
-on success, but left behind on failure for forensic purposes.
-
-### unicode
-
-* Default: true
-* Type: Boolean
-
-When set to true, npm uses unicode characters in the tree output.  When
-false, it uses ascii characters to draw trees.
-
-### unsafe-perm
-
-* Default: false if running as root, true otherwise
-* Type: Boolean
-
-Set to true to suppress the UID/GID switching when running package
-scripts.  If set explicitly to false, then installing as a non-root user
-will fail.
-
-### usage
-
-* Default: false
-* Type: Boolean
-
-Set to show short usage output (like the -H output)
-instead of complete help when doing `npm-help(1)`.
-
-### user
-
-* Default: "nobody"
-* Type: String or Number
-
-The UID to set to when running package scripts as root.
-
-### username
-
-* Default: null
-* Type: String
-
-The username on the npm registry.  Set with `npm adduser`
-
-### userconfig
-
-* Default: ~/.npmrc
-* Type: path
-
-The location of user-level configuration settings.
-
-### userignorefile
-
-* Default: ~/.npmignore
-* Type: path
-
-The location of a user-level ignore file to apply to all packages.
-
-If not found, but there is a .gitignore file in the same directory, then
-that will be used instead.
-
-### umask
-
-* Default: 022
-* Type: Octal numeric string
-
-The "umask" value to use when setting the file creation mode on files
-and folders.
-
-Folders and executables are given a mode which is `0777` masked against
-this value.  Other files are given a mode which is `0666` masked against
-this value.  Thus, the defaults are `0755` and `0644` respectively.
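The arithmetic can be checked directly (a sketch of the masking; npm computes these modes internally):

```javascript
// 0777 and 0666 with the default umask bits (022) cleared:
var umask = parseInt("022", 8)
var dirMode  = (parseInt("777", 8) & ~umask).toString(8)  // dirs/executables
var fileMode = (parseInt("666", 8) & ~umask).toString(8)  // other files
```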
-
-### version
-
-* Default: false
-* Type: boolean
-
-If true, output the npm version and exit successfully.
-
-Only relevant when specified explicitly on the command line.
-
-### versions
-
-* Default: false
-* Type: boolean
-
-If true, output the npm version as well as node's `process.versions`
-hash, and exit successfully.
-
-Only relevant when specified explicitly on the command line.
-
-### viewer
-
-* Default: "man" on Posix, "browser" on Windows
-* Type: path
-
-The program to use to view help content.
-
-Set to `"browser"` to view html help content in the default web browser.
-
-### yes
-
-* Default: null
-* Type: Boolean or null
-
-If set to `null`, then prompt the user for responses in some
-circumstances.
-
-If set to `true`, then answer "yes" to any prompt.  If set to `false`
-then answer "no" to any prompt.
-
-## SEE ALSO
-
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* npm-scripts(7)
-* npm-folders(5)
-* npm(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-developers.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,207 +0,0 @@
-npm-developers(7) -- Developer Guide
-====================================
-
-## DESCRIPTION
-
-So, you've decided to use npm to develop (and maybe publish/deploy)
-your project.
-
-Fantastic!
-
-There are a few things that you need to do above the simple steps
-that your users will do to install your program.
-
-## About These Documents
-
-These are man pages.  If you install npm, you should be able to
-then do `man npm-thing` to get the documentation on a particular
-topic, or `npm help thing` to see the same information.
-
-## What is a `package`
-
-A package is:
-
-* a) a folder containing a program described by a package.json file
-* b) a gzipped tarball containing (a)
-* c) a url that resolves to (b)
-* d) a `<name>@<version>` that is published on the registry with (c)
-* e) a `<name>@<tag>` that points to (d)
-* f) a `<name>` that has a "latest" tag satisfying (e)
-* g) a `git` url that, when cloned, results in (a).
-
-Even if you never publish your package, you can still get a lot of
-benefits of using npm if you just want to write a node program (a), and
-perhaps if you also want to be able to easily install it elsewhere
-after packing it up into a tarball (b).
-
-Git urls can be of the form:
-
-    git://github.com/user/project.git#commit-ish
-    git+ssh://user@hostname:project.git#commit-ish
-    git+http://user@hostname/project/blah.git#commit-ish
-    git+https://user@hostname/project/blah.git#commit-ish
-
-The `commit-ish` can be any tag, sha, or branch which can be supplied as
-an argument to `git checkout`.  The default is `master`.
-
-## The package.json File
-
-You need to have a `package.json` file in the root of your project to do
-much of anything with npm.  That is basically the whole interface.
-
-See `package.json(5)` for details about what goes in that file.  At the very
-least, you need:
-
-* name:
-  This should be a string that identifies your project.  Please do not
-  use the name to specify that it runs on node, or is in JavaScript.
-  You can use the "engines" field to explicitly state the versions of
-  node (or whatever else) that your program requires, and it's pretty
-  well assumed that it's javascript.
-
-  It does not necessarily need to match your github repository name.
-
-  So, `node-foo` and `bar-js` are bad names.  `foo` or `bar` are better.
-
-* version:
-  A semver-compatible version.
-
-* engines:
-  Specify the versions of node (or whatever else) that your program
-  runs on.  The node API changes a lot, and there may be bugs or new
-  functionality that you depend on.  Be explicit.
-
-* author:
-  Take some credit.
-
-* scripts:
-  If you have a special compilation or installation script, then you
-  should put it in the `scripts` hash.  You should definitely have at
-  least a basic smoke-test command as the "scripts.test" field.
-  See npm-scripts(7).
-
-* main:
-  If you have a single module that serves as the entry point to your
-  program (like what the "foo" package gives you at require("foo")),
-  then you need to specify that in the "main" field.
-
-* directories:
-  This is a hash of folders.  The best ones to include are "lib" and
-  "doc", but if you specify a folder full of man pages in "man", then
-  they'll get installed just like these ones.
-
-You can use `npm init` in the root of your package in order to get you
-started with a pretty basic package.json file.  See `npm-init(1)` for
-more info.
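Putting the fields above together, a minimal package.json might look
like this (the name, version, paths, and author are of course
placeholders):

```json
{
  "name": "foo",
  "version": "1.2.3",
  "author": "Your Name <you@example.com>",
  "main": "lib/foo.js",
  "scripts": {
    "test": "node test/basic.js"
  },
  "engines": {
    "node": ">=0.10"
  }
}
```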
-
-## Keeping files *out* of your package
-
-Use a `.npmignore` file to keep stuff out of your package.  If there's
-no `.npmignore` file, but there *is* a `.gitignore` file, then npm will
-ignore the stuff matched by the `.gitignore` file.  If you *want* to
-include something that is excluded by your `.gitignore` file, you can
-create an empty `.npmignore` file to override it.
-
-By default, the following paths and files are ignored, so there's no
-need to add them to `.npmignore` explicitly:
-
-* `.*.swp`
-* `._*`
-* `.DS_Store`
-* `.git`
-* `.hg`
-* `.lock-wscript`
-* `.svn`
-* `.wafpickle-*`
-* `CVS`
-* `npm-debug.log`
-
-Additionally, everything in `node_modules` is ignored, except for
-bundled dependencies. npm automatically handles this for you, so don't
-bother adding `node_modules` to `.npmignore`.
-
-The following paths and files are never ignored, so adding them to
-`.npmignore` is pointless:
-
-* `package.json`
-* `README.*`
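As an illustration, a `.npmignore` uses the same glob syntax as
`.gitignore`; the entries below are hypothetical:

```
# one glob per line, comments start with #
test/
build/
*.orig
.travis.yml
```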
-
-## Link Packages
-
-`npm link` is designed to install a development package and see the
-changes in real time without having to keep re-installing it.  (You do
-need to either re-link or `npm rebuild -g` to update compiled packages,
-of course.)
-
-More info at `npm-link(1)`.
-
-## Before Publishing: Make Sure Your Package Installs and Works
-
-**This is important.**
-
-If you cannot install it locally, you'll have problems trying to
-publish it.  Or, worse yet, you'll be able to publish it, but you'll
-be publishing a broken or pointless package.  So don't do that.
-
-In the root of your package, do this:
-
-    npm install . -g
-
-That'll show you that it's working.  If you'd rather just create a symlink
-package that points to your working directory, then do this:
-
-    npm link
-
-Use `npm ls -g` to see if it's there.
-
-To test a local install, go into some other folder, and then do:
-
-    cd ../some-other-folder
-    npm install ../my-package
-
-to install it locally into the node_modules folder in that other place.
-
-Then go into the node REPL, and try using `require("my-thing")` to
-bring in your module's main module.
-
-## Create a User Account
-
-Create a user with the adduser command.  It works like this:
-
-    npm adduser
-
-and then follow the prompts.
-
-This is documented better in npm-adduser(1).
-
-## Publish your package
-
-This part's easy.  In the root of your folder, do this:
-
-    npm publish
-
-You can give publish a url to a tarball, or a filename of a tarball,
-or a path to a folder.
-
-Note that pretty much **everything in that folder will be exposed**
-by default.  So, if you have secret stuff in there, use a
-`.npmignore` file to list out the globs to ignore, or publish
-from a fresh checkout.
-
-## Brag about it
-
-Send emails, write blogs, blab in IRC.
-
-Tell the world how easy it is to install your program!
-
-## SEE ALSO
-
-* npm-faq(7)
-* npm(1)
-* npm-init(1)
-* package.json(5)
-* npm-scripts(7)
-* npm-publish(1)
-* npm-adduser(1)
-* npm-registry(7)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-disputes.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,98 +0,0 @@
-npm-disputes(7) -- Handling Module Name Disputes
-================================================
-
-## SYNOPSIS
-
-1. Get the author email with `npm owner ls <pkgname>`
-2. Email the author, CC <i@izs.me>.
-3. After a few weeks, if there's no resolution, we'll sort it out.
-
-Don't squat on package names.  Publish code or move out of the way.
-
-## DESCRIPTION
-
-There sometimes arise cases where a user publishes a module, and then
-later, some other user wants to use that name.  Here are some common
-ways that happens (each of these is based on actual events.)
-
-1. Joe writes a JavaScript module `foo`, which is not node-specific.
-   Joe doesn't use node at all.  Bob wants to use `foo` in node, so he
-   wraps it in an npm module.  Some time later, Joe starts using node,
-   and wants to take over management of his program.
-2. Bob writes an npm module `foo`, and publishes it.  Perhaps much
-   later, Joe finds a bug in `foo`, and fixes it.  He sends a pull
-   request to Bob, but Bob doesn't have the time to deal with it,
-   because he has a new job and a new baby and is focused on his new
-   erlang project, and kind of not involved with node any more.  Joe
-   would like to publish a new `foo`, but can't, because the name is
-   taken.
-3. Bob writes a 10-line flow-control library, and calls it `foo`, and
-   publishes it to the npm registry.  Being a simple little thing, it
-   never really has to be updated.  Joe works for Foo Inc, the makers
-   of the critically acclaimed and widely-marketed `foo` JavaScript
-   toolkit framework.  They publish it to npm as `foojs`, but people are
-   routinely confused when `npm install foo` is some different thing.
-4. Bob writes a parser for the widely-known `foo` file format, because
-   he needs it for work.  Then, he gets a new job, and never updates the
-   prototype.  Later on, Joe writes a much more complete `foo` parser,
-   but can't publish, because Bob's `foo` is in the way.
-
-The validity of Joe's claim in each situation can be debated.  However,
-Joe's appropriate course of action in each case is the same.
-
-1. `npm owner ls foo`.  This will tell Joe the email address of the
-   owner (Bob).
-2. Joe emails Bob, explaining the situation **as respectfully as possible**,
-   and what he would like to do with the module name.  He adds
-   isaacs <i@izs.me> to the CC list of the email.  Mention in the email
-   that Bob can run `npm owner add joe foo` to add Joe as an owner of
-   the `foo` package.
-3. After a reasonable amount of time, if Bob has not responded, or if
-   Bob and Joe can't come to any sort of resolution, email isaacs
-   <i@izs.me> and we'll sort it out.  ("Reasonable" is usually about 4
-   weeks, but extra time is allowed around common holidays.)
-
-## REASONING
-
-In almost every case so far, the parties involved have been able to reach
-an amicable resolution without any major intervention.  Most people
-really do want to be reasonable, and are probably not even aware that
-they're in your way.
-
-Module ecosystems are most vibrant and powerful when they are as
-self-directed as possible.  If an admin one day deletes something you
-had worked on, then that is going to make most people quite upset,
-regardless of the justification.  When humans solve their problems by
-talking to other humans with respect, everyone has the chance to end up
-feeling good about the interaction.
-
-## EXCEPTIONS
-
-Some things are not allowed, and will be removed without discussion if
-they are brought to the attention of the npm registry admins, including
-but not limited to:
-
-1. Malware (that is, a package designed to exploit or harm the machine on
-   which it is installed).
-2. Violations of copyright or licenses (for example, cloning an
-   MIT-licensed program, and then removing or changing the copyright and
-   license statement).
-3. Illegal content.
-4. "Squatting" on a package name that you *plan* to use, but aren't
-   actually using.  Sorry, I don't care how great the name is, or how
-   perfect a fit it is for the thing that someday might happen.  If
-   someone wants to use it today, and you're just taking up space with
-   an empty tarball, you're going to be evicted.
-5. Putting empty packages in the registry.  Packages must have SOME
-   functionality.  It can be silly, but it can't be *nothing*.  (See
-   also: squatting.)
-6. Doing weird things with the registry, like using it as your own
-   personal application database or otherwise putting non-packagey
-   things into it.
-
-If you see bad behavior like this, please report it right away.
-
-## SEE ALSO
-
-* npm-registry(7)
-* npm-owner(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-faq.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,355 +0,0 @@
-npm-faq(7) -- Frequently Asked Questions
-========================================
-
-## Where can I find these docs in HTML?
-
-<https://npmjs.org/doc/>, or run:
-
-    npm config set viewer browser
-
-to open these documents in your default web browser rather than `man`.
-
-## It didn't work.
-
-That's not really a question.
-
-## Why didn't it work?
-
-I don't know yet.
-
-Read the error output, and if you can't figure out what it means,
-do what it says and post a bug with all the information it asks for.
-
-## Where does npm put stuff?
-
-See `npm-folders(5)`
-
-tl;dr:
-
-* Use the `npm root` command to see where modules go, and the `npm bin`
-  command to see where executables go
-* Global installs are different from local installs.  If you install
-  something with the `-g` flag, then its executables go in `npm bin -g`
-  and its modules go in `npm root -g`.
-
-## How do I install something on my computer in a central location?
-
-Install it globally by tacking `-g` or `--global` onto the command.  (This
-is especially important for command line utilities that need to add
-their bins to the global system `PATH`.)
-
-## I installed something globally, but I can't `require()` it
-
-Install it locally.
-
-The global install location is a place for command-line utilities
-to put their bins in the system `PATH`.  It's not for use with `require()`.
-
-If you `require()` a module in your code, then that means it's a
-dependency, and a part of your program.  You need to install it locally
-in your program.
-
-## Why can't npm just put everything in one place, like other package managers?
-
-Not every change is an improvement, but every improvement is a change.
-This would be like asking git to do network IO for every commit.  It's
-not going to happen, because it's a terrible idea that causes more
-problems than it solves.
-
-It is much harder to avoid dependency conflicts without nesting
-dependencies.  This is fundamental to the way that npm works, and has
-proven to be an extremely successful approach.  See `npm-folders(5)` for
-more details.
-
-If you want a package to be installed in one place, and have all your
-programs reference the same copy of it, then use the `npm link` command.
-That's what it's for.  Install it globally, then link it into each
-program that uses it.
-
-## Whatever, I really want the old style 'everything global' style.
-
-Write your own package manager, then.  It's not that hard.
-
-npm will not help you do something that is known to be a bad idea.
-
-## Should I check my `node_modules` folder into git?
-
-Mikeal Rogers answered this question very well:
-
-<http://www.mikealrogers.com/posts/nodemodules-in-git.html>
-
-tl;dr
-
-* Check `node_modules` into git for things you **deploy**, such as
-  websites and apps.
-* Do not check `node_modules` into git for libraries and modules
-  intended to be reused.
-* Use npm to manage dependencies in your dev environment, but not in
-  your deployment scripts.
-
-## Is it 'npm' or 'NPM' or 'Npm'?
-
-npm should never be capitalized unless it is being displayed in a
-location that is customarily all-caps (such as the title of man pages.)
-
-## If 'npm' is an acronym, why is it never capitalized?
-
-Contrary to the belief of many, "npm" is not in fact an abbreviation for
-"Node Package Manager".  It is a recursive bacronymic abbreviation for
-"npm is not an acronym".  (If it was "ninaa", then it would be an
-acronym, and thus incorrectly named.)
-
-"NPM", however, *is* an acronym (more precisely, a capitonym) for the
-National Association of Pastoral Musicians.  You can learn more
-about them at <http://npm.org/>.
-
-In software, "NPM" is a Non-Parametric Mapping utility written by
-Chris Rorden.  You can analyze pictures of brains with it.  Learn more
-about the (capitalized) NPM program at <http://www.cabiatl.com/mricro/npm/>.
-
-The first seed that eventually grew into this flower was a bash utility
-named "pm", which was a shortened descendant of "pkgmakeinst", a
-bash function that was used to install various different things on different
-platforms, most often using Yahoo's `yinst`.  If `npm` was ever an
-acronym for anything, it was `node pm` or maybe `new pm`.
-
-So, in all seriousness, the "npm" project is named after its command-line
-utility, which was organically selected to be easily typed by a right-handed
-programmer using a US QWERTY keyboard layout, ending with the
-right-ring-finger in a position to type the `-` key for flags and
-other command-line arguments.  That command-line utility is always
-lower-case, even when it starts most sentences it is a part of.
-
-## How do I list installed packages?
-
-`npm ls`
-
-## How do I search for packages?
-
-`npm search`
-
-Arguments are greps.  `npm search jsdom` shows jsdom packages.
-
-## How do I update npm?
-
-    npm update npm -g
-
-You can also update all outdated local packages by doing `npm update` without
-any arguments, or global packages by doing `npm update -g`.
-
-Occasionally, the version of npm will progress such that the current
-version cannot be properly installed with the version that you have
-installed already.  (Consider, if there is ever a bug in the `update`
-command.)
-
-In those cases, you can do this:
-
-    curl https://npmjs.org/install.sh | sh
-
-## What is a `package`?
-
-A package is:
-
-* a) a folder containing a program described by a package.json file
-* b) a gzipped tarball containing (a)
-* c) a url that resolves to (b)
-* d) a `<name>@<version>` that is published on the registry with (c)
-* e) a `<name>@<tag>` that points to (d)
-* f) a `<name>` that has a "latest" tag satisfying (e)
-* g) a `git` url that, when cloned, results in (a).
-
-Even if you never publish your package, you can still get a lot of
-benefit from using npm if you just want to write a node program (a), and
-perhaps if you also want to be able to easily install it elsewhere
-after packing it up into a tarball (b).
-
-Git urls can be of the form:
-
-    git://github.com/user/project.git#commit-ish
-    git+ssh://user@hostname:project.git#commit-ish
-    git+http://user@hostname/project/blah.git#commit-ish
-    git+https://user@hostname/project/blah.git#commit-ish
-
-The `commit-ish` can be any tag, sha, or branch which can be supplied as
-an argument to `git checkout`.  The default is `master`.
-
-## What is a `module`?
-
-A module is anything that can be loaded with `require()` in a Node.js
-program.  The following things are all examples of things that can be
-loaded as modules:
-
-* A folder with a `package.json` file containing a `main` field.
-* A folder with an `index.js` file in it.
-* A JavaScript file.
-
-Most npm packages are modules, because they are libraries that you
-load with `require`.  However, there's no requirement that an npm
-package be a module!  Some only contain an executable command-line
-interface, and don't provide a `main` field for use in Node programs.
-
-Almost all npm packages (at least, those that are Node programs)
-*contain* many modules within them (because every file they load with
-`require()` is a module).
-
-In the context of a Node program, the `module` is also the thing that
-was loaded *from* a file.  For example, in the following program:
-
-    var req = require('request')
-
-we might say that "The variable `req` refers to the `request` module".
-
-## So, why is it the "`node_modules`" folder, but "`package.json`" file?  Why not `node_packages` or `module.json`?
-
-The `package.json` file defines the package.  (See "What is a
-package?" above.)
-
-The `node_modules` folder is the place Node.js looks for modules.
-(See "What is a module?" above.)
-
-For example, if you create a file at `node_modules/foo.js` and then
-have a program that does `var f = require('foo.js')`, then it will load
-the module.  However, `foo.js` is not a "package" in this case,
-because it does not have a package.json.
-
-Alternatively, if you create a package which does not have an
-`index.js` or a `"main"` field in the `package.json` file, then it is
-not a module.  Even if it's installed in `node_modules`, it can't be
-an argument to `require()`.
-
-## `"node_modules"` is the name of my deity's arch-rival, and a Forbidden Word in my religion.  Can I configure npm to use a different folder?
-
-No.  This will never happen.  This question comes up sometimes,
-because it seems silly from the outside that npm couldn't just be
-configured to put stuff somewhere else, and then npm could load them
-from there.  It's an arbitrary spelling choice, right?  What's the big
-deal?
-
-At the time of this writing, the string `'node_modules'` appears 151
-times in 53 separate files in npm and node core (excluding tests and
-documentation).
-
-Some of these references are in node's built-in module loader.  Since
-npm is not involved **at all** at run-time, node itself would have to
-be configured to know where you've decided to stick stuff.  Complexity
-hurdle #1.  Since the Node module system is locked, this cannot be
-changed, and is enough to kill this request.  But I'll continue, in
-deference to your deity's delicate feelings regarding spelling.
-
-Many of the others are in dependencies that npm uses, which are not
-necessarily tightly coupled to npm (in the sense that they do not read
-npm's configuration files, etc.)  Each of these would have to be
-configured to take the name of the `node_modules` folder as a
-parameter.  Complexity hurdle #2.
-
-Furthermore, npm has the ability to "bundle" dependencies by adding
-the dep names to the `"bundledDependencies"` list in package.json,
-which causes the folder to be included in the package tarball.  What
-if the author of a module bundles its dependencies, and they use a
-different spelling for `node_modules`?  npm would have to rename the
-folder at publish time, and then be smart enough to unpack it using
-your locally configured name.  Complexity hurdle #3.
-
-Furthermore, what happens when you *change* this name?  Fine, it's
-easy enough the first time, just rename the `node_modules` folders to
-`./blergyblerp/` or whatever name you choose.  But what about when you
-change it again?  npm doesn't currently track any state about past
-configuration settings, so this would be rather difficult to do
-properly.  It would have to track every previous value for this
-config, and always accept any of them, or else yesterday's install may
-be broken tomorrow.  Complexity hurdle #4.
-
-Never going to happen.  The folder is named `node_modules`.  It is
-written indelibly in the Node Way, handed down from the ancient times
-of Node 0.3.
-
-## How do I install node with npm?
-
-You don't.  Try one of these node version managers:
-
-Unix:
-
-* <http://github.com/isaacs/nave>
-* <http://github.com/visionmedia/n>
-* <http://github.com/creationix/nvm>
-
-Windows:
-
-* <http://github.com/marcelklehr/nodist>
-* <https://github.com/hakobera/nvmw>
-
-## How can I use npm for development?
-
-See `npm-developers(7)` and `package.json(5)`.
-
-You'll most likely want to `npm link` your development folder.  That's
-awesomely handy.
-
-To set up your own private registry, check out `npm-registry(7)`.
-
-## Can I list a url as a dependency?
-
-Yes.  It should be a url to a gzipped tarball containing a single folder
-that has a package.json in its root, or a git url.
-(See "what is a package?" above.)
-
-## How do I symlink to a dev folder so I don't have to keep re-installing?
-
-See `npm-link(1)`
-
-## The package registry website.  What is that exactly?
-
-See `npm-registry(7)`.
-
-## I forgot my password, and can't publish.  How do I reset it?
-
-Go to <https://npmjs.org/forgot>.
-
-## I get ECONNREFUSED a lot.  What's up?
-
-Either the registry is down, or node's DNS isn't able to reach out.
-
-To check if the registry is down, open up <http://registry.npmjs.org/>
-in a web browser.  This will also tell you if you are just unable to
-access the internet for some reason.
-
-If the registry IS down, let me know by emailing <i@izs.me> or posting
-an issue at <https://github.com/isaacs/npm/issues>.  We'll have
-someone kick it or something.
-
-## Why no namespaces?
-
-Please see this discussion: <https://github.com/isaacs/npm/issues/798>
-
-tl;dr - It doesn't actually make things better, and can make them worse.
-
-If you want to namespace your own packages, you may: simply use the
-`-` character to separate the names.  npm is a mostly anarchic system.
-There is not sufficient need to impose namespace rules on everyone.
-
-## Who does npm?
-
-`npm view npm author`
-
-`npm view npm contributors`
-
-## I have a question or request not addressed here. Where should I put it?
-
-Post an issue on the github project:
-
-* <https://github.com/isaacs/npm/issues>
-
-## Why does npm hate me?
-
-npm is not capable of hatred.  It loves everyone, especially you.
-
-## SEE ALSO
-
-* npm(1)
-* npm-developers(7)
-* package.json(5)
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* npm-folders(5)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-index.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,411 +0,0 @@
-npm-index(7) -- Index of all npm documentation
-==============================================
-
-## README(1)
-
-node package manager
-
-# Command Line Documentation
-
-## npm(1)
-
-node package manager
-
-## npm-adduser(1)
-
-Add a registry user account
-
-## npm-bin(1)
-
-Display npm bin folder
-
-## npm-bugs(1)
-
-Bugs for a package in a web browser maybe
-
-## npm-build(1)
-
-Build a package
-
-## npm-bundle(1)
-
-REMOVED
-
-## npm-cache(1)
-
-Manipulates packages cache
-
-## npm-completion(1)
-
-Tab Completion for npm
-
-## npm-config(1)
-
-Manage the npm configuration files
-
-## npm-dedupe(1)
-
-Reduce duplication
-
-## npm-deprecate(1)
-
-Deprecate a version of a package
-
-## npm-docs(1)
-
-Docs for a package in a web browser maybe
-
-## npm-edit(1)
-
-Edit an installed package
-
-## npm-explore(1)
-
-Browse an installed package
-
-## npm-help-search(1)
-
-Search npm help documentation
-
-## npm-help(1)
-
-Get help on npm
-
-## npm-init(1)
-
-Interactively create a package.json file
-
-## npm-install(1)
-
-Install a package
-
-## npm-link(1)
-
-Symlink a package folder
-
-## npm-ls(1)
-
-List installed packages
-
-## npm-outdated(1)
-
-Check for outdated packages
-
-## npm-owner(1)
-
-Manage package owners
-
-## npm-pack(1)
-
-Create a tarball from a package
-
-## npm-prefix(1)
-
-Display prefix
-
-## npm-prune(1)
-
-Remove extraneous packages
-
-## npm-publish(1)
-
-Publish a package
-
-## npm-rebuild(1)
-
-Rebuild a package
-
-## npm-restart(1)
-
-Restart a package
-
-## npm-rm(1)
-
-Remove a package
-
-## npm-root(1)
-
-Display npm root
-
-## npm-run-script(1)
-
-Run arbitrary package scripts
-
-## npm-search(1)
-
-Search for packages
-
-## npm-shrinkwrap(1)
-
-Lock down dependency versions
-
-## npm-star(1)
-
-Mark your favorite packages
-
-## npm-stars(1)
-
-View packages marked as favorites
-
-## npm-start(1)
-
-Start a package
-
-## npm-stop(1)
-
-Stop a package
-
-## npm-submodule(1)
-
-Add a package as a git submodule
-
-## npm-tag(1)
-
-Tag a published version
-
-## npm-test(1)
-
-Test a package
-
-## npm-uninstall(1)
-
-Remove a package
-
-## npm-unpublish(1)
-
-Remove a package from the registry
-
-## npm-update(1)
-
-Update a package
-
-## npm-version(1)
-
-Bump a package version
-
-## npm-view(1)
-
-View registry info
-
-## npm-whoami(1)
-
-Display npm username
-
-## repo(1)
-
-Open package repository page in the browser
-
-# API Documentation
-
-## npm(3)
-
-node package manager
-
-## npm-bin(3)
-
-Display npm bin folder
-
-## npm-bugs(3)
-
-Bugs for a package in a web browser maybe
-
-## npm-commands(3)
-
-npm commands
-
-## npm-config(3)
-
-Manage the npm configuration files
-
-## npm-deprecate(3)
-
-Deprecate a version of a package
-
-## npm-docs(3)
-
-Docs for a package in a web browser maybe
-
-## npm-edit(3)
-
-Edit an installed package
-
-## npm-explore(3)
-
-Browse an installed package
-
-## npm-help-search(3)
-
-Search the help pages
-
-## npm-init(3)
-
-Interactively create a package.json file
-
-## npm-install(3)
-
-install a package programmatically
-
-## npm-link(3)
-
-Symlink a package folder
-
-## npm-load(3)
-
-Load config settings
-
-## npm-ls(3)
-
-List installed packages
-
-## npm-outdated(3)
-
-Check for outdated packages
-
-## npm-owner(3)
-
-Manage package owners
-
-## npm-pack(3)
-
-Create a tarball from a package
-
-## npm-prefix(3)
-
-Display prefix
-
-## npm-prune(3)
-
-Remove extraneous packages
-
-## npm-publish(3)
-
-Publish a package
-
-## npm-rebuild(3)
-
-Rebuild a package
-
-## npm-restart(3)
-
-Restart a package
-
-## npm-root(3)
-
-Display npm root
-
-## npm-run-script(3)
-
-Run arbitrary package scripts
-
-## npm-search(3)
-
-Search for packages
-
-## npm-shrinkwrap(3)
-
-programmatically generate package shrinkwrap file
-
-## npm-start(3)
-
-Start a package
-
-## npm-stop(3)
-
-Stop a package
-
-## npm-submodule(3)
-
-Add a package as a git submodule
-
-## npm-tag(3)
-
-Tag a published version
-
-## npm-test(3)
-
-Test a package
-
-## npm-uninstall(3)
-
-uninstall a package programmatically
-
-## npm-unpublish(3)
-
-Remove a package from the registry
-
-## npm-update(3)
-
-Update a package
-
-## npm-version(3)
-
-Bump a package version
-
-## npm-view(3)
-
-View registry info
-
-## npm-whoami(3)
-
-Display npm username
-
-## repo(3)
-
-Open package repository page in the browser
-
-# Files
-
-## npm-folders(5)
-
-Folder Structures Used by npm
-
-## npmrc(5)
-
-The npm config files
-
-## package.json(5)
-
-Specifics of npm's package.json handling
-
-# Misc
-
-## npm-coding-style(7)
-
-npm's "funny" coding style
-
-## npm-config(7)
-
-More than you probably want to know about npm configuration
-
-## npm-developers(7)
-
-Developer Guide
-
-## npm-disputes(7)
-
-Handling Module Name Disputes
-
-## npm-faq(7)
-
-Frequently Asked Questions
-
-## npm-index(7)
-
-Index of all npm documentation
-
-## npm-registry(7)
-
-The JavaScript Package Registry
-
-## npm-scripts(7)
-
-How npm handles the "scripts" field
-
-## removing-npm(7)
-
-Cleaning the Slate
-
-## semver(7)
-
-The semantic versioner for npm
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-registry.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,69 +0,0 @@
-npm-registry(7) -- The JavaScript Package Registry
-==================================================
-
-## DESCRIPTION
-
-To resolve packages by name and version, npm talks to a registry website
-that implements the CommonJS Package Registry specification for reading
-package info.
-
-Additionally, npm's package registry implementation supports several
-write APIs as well, to allow for publishing packages and managing user
-account information.
-
-The official public npm registry is at <http://registry.npmjs.org/>.  It
-is powered by a CouchDB database at
-<http://isaacs.iriscouch.com/registry>.  The code for the couchapp is
-available at <http://github.com/isaacs/npmjs.org>.  npm user accounts
-are CouchDB users, stored in the <http://isaacs.iriscouch.com/_users>
-database.
-
-The registry URL is supplied by the `registry` config parameter.  See
-`npm-config(1)`, `npmrc(5)`, and `npm-config(7)` for more on managing
-npm's configuration.
-
-## Can I run my own private registry?
-
-Yes!
-
-The easiest way is to replicate the couch database, and use the same (or
-similar) design doc to implement the APIs.
-
-If you set up continuous replication from the official CouchDB, and then
-set your internal CouchDB as the registry config, then you'll be able
-to read any published packages, in addition to your private ones, and by
-default will only publish internally.  If you then want to publish a
-package for the whole world to see, you can simply override the
-`--registry` config for that command.
-
-## I don't want my package published in the official registry. It's private.
-
-Set `"private": true` in your package.json to prevent it from being
-published at all, or
-`"publishConfig":{"registry":"http://my-internal-registry.local"}`
-to force it to be published only to your internal registry.
-
-See `package.json(5)` for more info on what goes in the package.json file.
-
-## Will you replicate from my registry into the public one?
-
-No.  If you want things to be public, then publish them into the public
-registry using npm.  What little security there is would be for nought
-otherwise.
-
-## Do I have to use couchdb to build a registry that npm can talk to?
-
-No, but it's way easier.  Basically, yes, you do, or you have to
-effectively implement the entire CouchDB API anyway.
-
-## Is there a website or something to see package docs and such?
-
-Yes, head over to <https://npmjs.org/>
-
-## SEE ALSO
-
-* npm-config(1)
-* npm-config(7)
-* npmrc(5)
-* npm-developers(7)
-* npm-disputes(7)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/npm-scripts.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,245 +0,0 @@
-npm-scripts(7) -- How npm handles the "scripts" field
-=====================================================
-
-## DESCRIPTION
-
-npm supports the "scripts" member of the package.json file, for the
-following scripts:
-
-* prepublish:
-  Run BEFORE the package is published.  (Also run on local `npm
-  install` without any arguments.)
-* publish, postpublish:
-  Run AFTER the package is published.
-* preinstall:
-  Run BEFORE the package is installed
-* install, postinstall:
-  Run AFTER the package is installed.
-* preuninstall, uninstall:
-  Run BEFORE the package is uninstalled.
-* postuninstall:
-  Run AFTER the package is uninstalled.
-* preupdate:
-  Run BEFORE the package is updated with the update command.
-* update, postupdate:
-  Run AFTER the package is updated with the update command.
-* pretest, test, posttest:
-  Run by the `npm test` command.
-* prestop, stop, poststop:
-  Run by the `npm stop` command.
-* prestart, start, poststart:
-  Run by the `npm start` command.
-* prerestart, restart, postrestart:
-  Run by the `npm restart` command. Note: `npm restart` will run the
-  stop and start scripts if no `restart` script is provided.
-
-Additionally, arbitrary scripts can be run by doing
-`npm run-script <stage> <pkg>`.
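For example, a package.json might wire up several of the stages above
(the commands shown are placeholders, not npm conventions):

```json
{
  "name": "foo",
  "version": "0.0.1",
  "scripts": {
    "test": "node test/run.js",
    "start": "node server.js",
    "stop": "node stop-server.js"
  }
}
```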
-
-## NOTE: INSTALL SCRIPTS ARE AN ANTIPATTERN
-
-**tl;dr** Don't use `install`.  Use a `.gyp` file for compilation, and
-`prepublish` for anything else.
-
-You should almost never have to explicitly set a `preinstall` or
-`install` script.  If you are doing this, please consider if there is
-another option.
-
-The only valid use of `install` or `preinstall` scripts is for
-compilation which must be done on the target architecture.  In early
-versions of node, this was often done using the `node-waf` scripts, or
-a standalone `Makefile`, and early versions of npm required that it be
-explicitly set in package.json.  This was not portable, and harder to
-do properly.
-
-In the current version of node, the standard way to do this is using a
-`.gyp` file.  If you have a file with a `.gyp` extension in the root
-of your package, then npm will run the appropriate `node-gyp` commands
-automatically at install time.  This is the only officially supported
-method for compiling binary addons, and does not require that you add
-anything to your package.json file.
-
-If you have to do other things before your package is used, in a way
-that is not dependent on the operating system or architecture of the
-target system, then use a `prepublish` script instead.  This includes
-tasks such as:
-
-* Compile CoffeeScript source code into JavaScript.
-* Create minified versions of JavaScript source code.
-* Fetch remote resources that your package will use.
-
-The advantage of doing these things at `prepublish` time instead of
-`preinstall` or `install` time is that they can be done once, in a
-single place, and thus greatly reduce complexity and variability.
-Additionally, this means that:
-
-* You can depend on `coffee-script` as a `devDependency`, and thus
-  your users don't need to have it installed.
-* You don't need to include the minifiers in your package, reducing
-  the size for your users.
-* You don't need to rely on your users having `curl` or `wget` or
-  other system tools on the target machines.
-
-## DEFAULT VALUES
-
-npm will default some script values based on package contents.
-
-* `"start": "node server.js"`:
-
-  If there is a `server.js` file in the root of your package, then npm
-  will default the `start` command to `node server.js`.
-
-* `"preinstall": "node-waf clean || true; node-waf configure build"`:
-
-  If there is a `wscript` file in the root of your package, npm will
-  default the `preinstall` command to compile using node-waf.
-
-## USER
-
-If npm was invoked with root privileges, then it will change the uid
-to the user account or uid specified by the `user` config, which
-defaults to `nobody`.  Set the `unsafe-perm` flag to run scripts with
-root privileges.
-
-## ENVIRONMENT
-
-Package scripts run in an environment where many pieces of information
-are made available regarding the setup of npm and the current state of
-the process.
-
-
-### path
-
-If you depend on modules that define executable scripts, like test
-suites, then those executables will be added to the `PATH` for
-executing the scripts.  So, if your package.json has this:
-
-    { "name" : "foo"
-    , "dependencies" : { "bar" : "0.1.x" }
-    , "scripts": { "start" : "bar ./test" } }
-
-then you could run `npm start` to execute the `bar` script, which is
-exported into the `node_modules/.bin` directory on `npm install`.
-
-### package.json vars
-
-The package.json fields are placed in the environment with the
-`npm_package_` prefix. So, for instance, if you had
-`{"name":"foo", "version":"1.2.5"}` in your package.json file, then your
-package scripts would have the `npm_package_name` environment variable
-set to "foo", and the `npm_package_version` set to "1.2.5".
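As a sketch, a script can read these variables like so. `packageId` is a hypothetical helper, and the `npm_package_*` variables only exist when npm itself launches the script:

```javascript
// Hypothetical helper: the npm_package_* variables are only present
// when the script is launched by npm, so fall back gracefully.
function packageId (env) {
  if (!env.npm_package_name) return null
  return env.npm_package_name + "@" + env.npm_package_version
}

// Simulated env, as npm would set it for {"name":"foo","version":"1.2.5"}:
console.log(packageId({ npm_package_name: "foo", npm_package_version: "1.2.5" }))
// "foo@1.2.5"
```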
-
-### configuration
-
-Configuration parameters are put in the environment with the
-`npm_config_` prefix. For instance, you can view the effective `root`
-config by checking the `npm_config_root` environment variable.
-
-### Special: package.json "config" hash
-
-The package.json "config" keys are overwritten in the environment if
-there is a config param of `<name>[@<version>]:<key>`.  For example,
-if the package.json has this:
-
-    { "name" : "foo"
-    , "config" : { "port" : "8080" }
-    , "scripts" : { "start" : "node server.js" } }
-
-and the server.js is this:
-
-    http.createServer(...).listen(process.env.npm_package_config_port)
-
-then the user could change the behavior by doing:
-
-    npm config set foo:port 80
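A minimal sketch of the `server.js` side of this example. `configuredPort` is a hypothetical helper; the 8080 fallback mirrors the package.json default above, and the 80 case shows the effect of the user's `npm config set foo:port 80`:

```javascript
// Hypothetical helper: read the port npm injects from the package.json
// "config" section, falling back to the package default (8080) when
// the script is not run via npm.
function configuredPort (env) {
  var p = parseInt(env.npm_package_config_port, 10)
  return isNaN(p) ? 8080 : p
}

console.log(configuredPort({}))                                // 8080
console.log(configuredPort({ npm_package_config_port: "80" })) // 80
```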
-
-### current lifecycle event
-
-Lastly, the `npm_lifecycle_event` environment variable is set to
-whichever stage of the cycle is being executed. So, you could have a
-single script used for different parts of the process which switches
-based on what's currently happening.
-
-Objects are flattened following this format, so if you had
-`{"scripts":{"install":"foo.js"}}` in your package.json, then you'd
-see this in the script:
-
-    process.env.npm_package_scripts_install === "foo.js"
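The flattening above can be sketched as follows. This is an illustrative toy, not npm's actual implementation (among other things, real npm also sanitizes keys that aren't valid environment variable names):

```javascript
// Toy sketch of how nested package.json objects become flat
// npm_package_* environment variables (NOT npm's real code).
function flatten (obj, prefix, out) {
  out = out || {}
  for (var k in obj) {
    var key = prefix + "_" + k
    if (typeof obj[k] === "object") flatten(obj[k], key, out)
    else out[key] = String(obj[k])
  }
  return out
}

var env = flatten({ "scripts": { "install": "foo.js" } }, "npm_package")
console.log(env.npm_package_scripts_install) // "foo.js"
```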
-
-## EXAMPLES
-
-For example, if your package.json contains this:
-
-    { "scripts" :
-      { "install" : "scripts/install.js"
-      , "postinstall" : "scripts/install.js"
-      , "uninstall" : "scripts/uninstall.js"
-      }
-    }
-
-then `scripts/install.js` will be called for the install and
-post-install stages of the lifecycle, and `scripts/uninstall.js` will
-be called when the package is uninstalled.  Since `scripts/install.js`
-is running for two different phases, it would be wise in this case to
-look at the `npm_lifecycle_event` environment variable.
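A single script like the `scripts/install.js` above can branch on `npm_lifecycle_event` as in this sketch; the stage actions returned here ("compiling", "linking") are purely illustrative:

```javascript
// scripts/install.js sketch: one file serving both the install and
// postinstall stages.  The returned strings are illustrative only.
function handleStage (stage) {
  switch (stage) {
    case "install": return "compiling"
    case "postinstall": return "linking"
    default: return "ignoring " + stage
  }
}

console.log(handleStage(process.env.npm_lifecycle_event || "install"))
```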
-
-If you want to run a make command, you can do so.  This works just
-fine:
-
-    { "scripts" :
-      { "preinstall" : "./configure"
-      , "install" : "make && make install"
-      , "test" : "make test"
-      }
-    }
-
-## EXITING
-
-Scripts are run by passing the line as a script argument to `sh`.
-
-If the script exits with a code other than 0, then this will abort the
-process.
-
-Note that these script files don't have to be nodejs or even
-javascript programs. They just have to be some kind of executable
-file.
-
-## HOOK SCRIPTS
-
-If you want to run a specific script at a specific lifecycle event for
-ALL packages, then you can use a hook script.
-
-Place an executable file at `node_modules/.hooks/{eventname}`, and it
-will be run for any package installed in that root whenever it passes
-through that point in the package lifecycle.
-
-Hook scripts are run exactly the same way as package.json scripts.
-That is, they are in a separate child process, with the env described
-above.
-
-## BEST PRACTICES
-
-* Don't exit with a non-zero error code unless you *really* mean it.
-  Except for uninstall scripts, this will cause the npm action to
-  fail, and potentially be rolled back.  If the failure is minor or
-  will only prevent some optional features, then it's better to just
-  print a warning and exit successfully.
-* Try not to use scripts to do what npm can do for you.  Read through
-  `package.json(5)` to see all the things that you can specify and enable
-  by simply describing your package appropriately.  In general, this
-  will lead to a more robust and consistent state.
-* Inspect the env to determine where to put things.  For instance, if
-  the `npm_config_binroot` environ is set to `/home/user/bin`, then
-  don't try to install executables into `/usr/local/bin`.  The user
-  probably set it up that way for a reason.
-* Don't prefix your script commands with "sudo".  If root permissions
-  are required for some reason, then it'll fail with that error, and
-  the user will sudo the npm command in question.
-
-## SEE ALSO
-
-* npm-run-script(1)
-* package.json(5)
-* npm-developers(7)
-* npm-install(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/removing-npm.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-npm-removal(1) -- Cleaning the Slate
-====================================
-
-## SYNOPSIS
-
-So sad to see you go.
-
-    sudo npm uninstall npm -g
-
-Or, if that fails, get the npm source code, and do:
-
-    sudo make uninstall
-
-## More Severe Uninstalling
-
-Usually, the above instructions are sufficient.  That will remove
-npm, but leave behind anything you've installed.
-
-If that doesn't work, or if you require more drastic measures,
-continue reading.
-
-Note that this is only necessary for globally-installed packages.  Local
-installs are completely contained within a project's `node_modules`
-folder.  Delete that folder, and everything is gone (unless a package's
-install script is particularly ill-behaved).
-
-This assumes that you installed node and npm in the default place.  If
-you configured node with a different `--prefix`, or installed npm with a
-different prefix setting, then adjust the paths accordingly, replacing
-`/usr/local` with your install prefix.
-
-To remove everything npm-related manually:
-
-    rm -rf /usr/local/{lib/node{,/.npm,_modules},bin,share/man}/npm*
-
-If you installed things *with* npm, then your best bet is to uninstall
-them with npm first, and then install them again once you have a
-proper install.  This can help find any symlinks that are lying
-around:
-
-    ls -laF /usr/local/{lib/node{,/.npm},bin,share/man} | grep npm
-
-Prior to version 0.3, npm used shim files for executables and node
-modules.  To track those down, you can do the following:
-
-    find /usr/local/{lib/node,bin} -exec grep -l npm \{\} \; ;
-
-(This is also in the README file.)
-
-## SEE ALSO
-
-* README
-* npm-rm(1)
-* npm-prune(1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/doc/misc/semver.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,111 +0,0 @@
-semver(7) -- The semantic versioner for npm
-===========================================
-
-## Usage
-
-    $ npm install semver
-
-    semver.valid('1.2.3') // '1.2.3'
-    semver.valid('a.b.c') // null
-    semver.clean('  =v1.2.3   ') // '1.2.3'
-    semver.satisfies('1.2.3', '1.x || >=2.5.0 || 5.0.0 - 7.2.3') // true
-    semver.gt('1.2.3', '9.8.7') // false
-    semver.lt('1.2.3', '9.8.7') // true
-
-As a command-line utility:
-
-    $ semver -h
-
-    Usage: semver <version> [<version> [...]] [-r <range> | -i <inc> | -d <dec>]
-    Test if version(s) satisfy the supplied range(s), and sort them.
-
-    Multiple versions or ranges may be supplied, unless increment
-    or decrement options are specified.  In that case, only a single
-    version may be used, and it is incremented by the specified level
-
-    Program exits successfully if any valid version satisfies
-    all supplied ranges, and prints all satisfying versions.
-
-    If no versions are valid, or ranges are not satisfied,
-    then exits failure.
-
-    Versions are printed in ascending order, so supplying
-    multiple versions to the utility will just sort them.
-
-## Versions
-
-A "version" is described by the v2.0.0 specification found at
-<http://semver.org/>.
-
-A leading `"="` or `"v"` character is stripped off and ignored.
-
-## Ranges
-
-The following range styles are supported:
-
-* `1.2.3` A specific version.  When nothing else will do.  Note that
-  build metadata is still ignored, so `1.2.3+build2012` will satisfy
-  this range.
-* `>1.2.3` Greater than a specific version.
-* `<1.2.3` Less than a specific version.  If there is no prerelease
-  tag on the version range, then no prerelease version will be allowed
-  either, even though these are technically "less than".
-* `>=1.2.3` Greater than or equal to.  Note that prerelease versions
-  are NOT equal to their "normal" equivalents, so `1.2.3-beta` will
-  not satisfy this range, but `2.3.0-beta` will.
-* `<=1.2.3` Less than or equal to.  In this case, prerelease versions
-  ARE allowed, so `1.2.3-beta` would satisfy.
-* `1.2.3 - 2.3.4` := `>=1.2.3 <=2.3.4`
-* `~1.2.3` := `>=1.2.3-0 <1.3.0-0`  "Reasonably close to 1.2.3".  When
-  using tilde operators, prerelease versions are supported as well,
-  but a prerelease of the next significant digit will NOT be
-  satisfactory, so `1.3.0-beta` will not satisfy `~1.2.3`.
-* `~1.2` := `>=1.2.0-0 <1.3.0-0` "Any version starting with 1.2"
-* `1.2.x` := `>=1.2.0-0 <1.3.0-0` "Any version starting with 1.2"
-* `~1` := `>=1.0.0-0 <2.0.0-0` "Any version starting with 1"
-* `1.x` := `>=1.0.0-0 <2.0.0-0` "Any version starting with 1"
-
-
-Ranges can be joined with either a space (which implies "and") or a
-`||` (which implies "or").
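To illustrate the "and"/"or" joining, here is a deliberately simplified comparator that only understands plain x.y.z versions — no prereleases, tildes, or X-ranges; use the real semver module for those:

```javascript
// Toy x.y.z comparator — NOT real semver (no prerelease handling).
function cmp (a, b) {
  a = a.split(".").map(Number)
  b = b.split(".").map(Number)
  for (var i = 0; i < 3; i++) {
    if (a[i] !== b[i]) return a[i] < b[i] ? -1 : 1
  }
  return 0
}

// ">=1.2.3 <2.0.0": a space joins the two comparators with "and".
function inBand (v) { return cmp(v, "1.2.3") >= 0 && cmp(v, "2.0.0") < 0 }

console.log(inBand("1.5.0")) // true: satisfies both comparators
console.log(inBand("2.1.0")) // false: fails "<2.0.0"
```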
-
-## Functions
-
-All methods and classes take a final `loose` boolean argument that, if
-true, will be more forgiving about not-quite-valid semver strings.
-The resulting output will always be 100% strict, of course.
-
-Strict-mode Comparators and Ranges will be strict about the SemVer
-strings that they parse.
-
-* valid(v): Return the parsed version, or null if it's not valid.
-* inc(v, release): Return the version incremented by the release type
-  (major, minor, patch, or prerelease), or null if it's not valid.
-
-### Comparison
-
-* gt(v1, v2): `v1 > v2`
-* gte(v1, v2): `v1 >= v2`
-* lt(v1, v2): `v1 < v2`
-* lte(v1, v2): `v1 <= v2`
-* eq(v1, v2): `v1 == v2` This is true if they're logically equivalent,
-  even if they're not the exact same string.  You already know how to
-  compare strings.
-* neq(v1, v2): `v1 != v2` The opposite of eq.
-* cmp(v1, comparator, v2): Pass in a comparison string, and it'll call
-  the corresponding function above.  `"==="` and `"!=="` do simple
-  string comparison, but are included for completeness.  Throws if an
-  invalid comparison string is provided.
-* compare(v1, v2): Return 0 if v1 == v2, or 1 if v1 is greater, or -1 if
-  v2 is greater.  Sorts in ascending order if passed to Array.sort().
-* rcompare(v1, v2): The reverse of compare.  Sorts an array of versions
-  in descending order when passed to Array.sort().
-
-
-### Ranges
-
-* validRange(range): Return the valid range, or null if it's not valid.
-* satisfies(version, range): Return true if the version satisfies the
-  range.
-* maxSatisfying(versions, range): Return the highest version in the list
-  that satisfies the range, or null if none of them do.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/README.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,273 +0,0 @@
-<!doctype html>
-<html>
-  <title>README</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="cli/npm.html">npm</a></h1> <p>node package manager</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<p>This is just enough info to get you up and running.</p>
-
-<p>Much more info available via <code>npm help</code> once it&#39;s installed.</p>
-
-<h2 id="IMPORTANT">IMPORTANT</h2>
-
-<p><strong>You need node v0.8 or higher to run this program.</strong></p>
-
-<p>To install an old <strong>and unsupported</strong> version of npm that works on node 0.3
-and prior, clone the git repo and dig through the old tags and branches.</p>
-
-<h2 id="Super-Easy-Install">Super Easy Install</h2>
-
-<p>npm comes with node now.</p>
-
-<h3 id="Windows-Computers">Windows Computers</h3>
-
-<p>Get the MSI.  npm is in it.</p>
-
-<h3 id="Apple-Macintosh-Computers">Apple Macintosh Computers</h3>
-
-<p>Get the pkg.  npm is in it.</p>
-
-<h3 id="Other-Sorts-of-Unices">Other Sorts of Unices</h3>
-
-<p>Run <code>make install</code>.  npm will be installed with node.</p>
-
-<p>If you want a more fancy pants install (a different version, customized
-paths, etc.) then read on.</p>
-
-<h2 id="Fancy-Install-Unix">Fancy Install (Unix)</h2>
-
-<p>There&#39;s a pretty robust install script at
-<a href="https://npmjs.org/install.sh">https://npmjs.org/install.sh</a>.  You can download that and run it.</p>
-
-<h3 id="Slightly-Fancier">Slightly Fancier</h3>
-
-<p>You can set any npm configuration params with that script:</p>
-
-<pre><code>npm_config_prefix=/some/path sh install.sh</code></pre>
-
-<p>Or, you can run it in uber-debuggery mode:</p>
-
-<pre><code>npm_debug=1 sh install.sh</code></pre>
-
-<h3 id="Even-Fancier">Even Fancier</h3>
-
-<p>Get the code with git.  Use <code>make</code> to build the docs and do other stuff.
-If you plan on hacking on npm, <code>make link</code> is your friend.</p>
-
-<p>If you&#39;ve got the npm source code, you can also semi-permanently set
-arbitrary config keys using the <code>./configure --key=val ...</code>, and then
-run npm commands by doing <code>node cli.js &lt;cmd&gt; &lt;args&gt;</code>.  (This is helpful
-for testing, or running stuff without actually installing npm itself.)</p>
-
-<h2 id="Fancy-Windows-Install">Fancy Windows Install</h2>
-
-<p>You can download a zip file from <a href="https://npmjs.org/dist/">https://npmjs.org/dist/</a>, and unpack it
-in the same folder where node.exe lives.</p>
-
-<p>If that&#39;s not fancy enough for you, then you can fetch the code with
-git, and mess with it directly.</p>
-
-<h2 id="Installing-on-Cygwin">Installing on Cygwin</h2>
-
-<p>No.</p>
-
-<h2 id="Permissions-when-Using-npm-to-Install-Other-Stuff">Permissions when Using npm to Install Other Stuff</h2>
-
-<p><strong>tl;dr</strong></p>
-
-<ul><li>Use <code>sudo</code> for greater safety.  Or don&#39;t, if you prefer not to.</li><li>npm will downgrade permissions if it&#39;s root before running any build
-scripts that package authors specified.</li></ul>
-
-<h3 id="More-details">More details...</h3>
-
-<p>As of version 0.3, it is recommended to run npm as root.
-This allows npm to change the user identifier to the <code>nobody</code> user prior
-to running any package build or test commands.</p>
-
-<p>If you are not the root user, or if you are on a platform that does not
-support uid switching, then npm will not attempt to change the userid.</p>
-
-<p>If you would like to ensure that npm <strong>always</strong> runs scripts as the
-&quot;nobody&quot; user, and have it fail if it cannot downgrade permissions, then
-set the following configuration param:</p>
-
-<pre><code>npm config set unsafe-perm false</code></pre>
-
-<p>This will prevent running in unsafe mode, even as non-root users.</p>
-
-<h2 id="Uninstalling">Uninstalling</h2>
-
-<p>So sad to see you go.</p>
-
-<pre><code>sudo npm uninstall npm -g</code></pre>
-
-<p>Or, if that fails,</p>
-
-<pre><code>sudo make uninstall</code></pre>
-
-<h2 id="More-Severe-Uninstalling">More Severe Uninstalling</h2>
-
-<p>Usually, the above instructions are sufficient.  That will remove
-npm, but leave behind anything you&#39;ve installed.</p>
-
-<p>If you would like to remove all the packages that you have installed,
-then you can use the <code>npm ls</code> command to find them, and then <code>npm rm</code> to
-remove them.</p>
-
-<p>To remove cruft left behind by npm 0.x, you can use the included
-<code>clean-old.sh</code> script file.  You can run it conveniently like this:</p>
-
-<pre><code>npm explore npm -g -- sh scripts/clean-old.sh</code></pre>
-
-<p>npm uses two configuration files, one for per-user configs, and another
-for global (every-user) configs.  You can view them by doing:</p>
-
-<pre><code>npm config get userconfig   # defaults to ~/.npmrc
-npm config get globalconfig # defaults to /usr/local/etc/npmrc</code></pre>
-
-<p>Uninstalling npm does not remove configuration files by default.  You
-must remove them yourself manually if you want them gone.  Note that
-this means that future npm installs will not remember the settings that
-you have chosen.</p>
-
-<h2 id="Using-npm-Programmatically">Using npm Programmatically</h2>
-
-<p>If you would like to use npm programmatically, you can do that.
-It&#39;s not very well documented, but it <em>is</em> rather simple.</p>
-
-<p>Most of the time, unless you actually want to do all the things that
-npm does, you should try using one of npm&#39;s dependencies rather than
-using npm itself, if possible.</p>
-
-<p>Eventually, npm will be just a thin cli wrapper around the modules
-that it depends on, but for now, there are some things that you must
-use npm itself to do.</p>
-
-<pre><code>var npm = require(&quot;npm&quot;)
-npm.load(myConfigObject, function (er) {
-  if (er) return handleError(er)
-  npm.commands.install([&quot;some&quot;, &quot;args&quot;], function (er, data) {
-    if (er) return commandFailed(er)
-    // command succeeded, and data might have some info
-  })
-  npm.on(&quot;log&quot;, function (message) { .... })
-})</code></pre>
-
-<p>The <code>load</code> function takes an object hash of the command-line configs.
-The various <code>npm.commands.&lt;cmd&gt;</code> functions take an <strong>array</strong> of
-positional argument <strong>strings</strong>.  The last argument to any
-<code>npm.commands.&lt;cmd&gt;</code> function is a callback.  Some commands take other
-optional arguments.  Read the source.</p>
-
-<p>You cannot set configs individually for any single npm function at this
-time.  Since <code>npm</code> is a singleton, any call to <code>npm.config.set</code> will
-change the value for <em>all</em> npm commands in that process.</p>
-
-<p>See <code>./bin/npm-cli.js</code> for an example of pulling config values off of the
-command line arguments using nopt.  You may also want to check out <code>npm
-help config</code> to learn about all the options you can set there.</p>
-
-<h2 id="More-Docs">More Docs</h2>
-
-<p>Check out the <a href="https://npmjs.org/doc/">docs</a>,
-especially the <a href="https://npmjs.org/doc/faq.html">faq</a>.</p>
-
-<p>You can use the <code>npm help</code> command to read any of them.</p>
-
-<p>If you&#39;re a developer, and you want to use npm to publish your program,
-you should <a href="https://npmjs.org/doc/developers.html">read this</a></p>
-
-<h2 id="Legal-Stuff">Legal Stuff</h2>
-
-<p>&quot;npm&quot; and &quot;the npm registry&quot; are owned by Isaac Z. Schlueter.
-All rights reserved.  See the included LICENSE file for more details.</p>
-
-<p>&quot;Node.js&quot; and &quot;node&quot; are trademarks owned by Joyent, Inc.  npm is not
-officially part of the Node.js project, and is neither owned by nor
-officially affiliated with Joyent, Inc.</p>
-
-<p>The packages in the npm registry are not part of npm itself, and are the
-sole property of their respective maintainers.  While every effort is
-made to ensure accountability, there is absolutely no guarantee,
-warranty, or assertion made as to the quality, fitness for a specific
-purpose, or lack of malice in any given npm package.  Modules
-published on the npm registry are not affiliated with or endorsed by
-Joyent, Inc., Isaac Z. Schlueter, Ryan Dahl, or the Node.js project.</p>
-
-<p>If you have a complaint about a package in the npm registry, and cannot
-resolve it with the package owner, please express your concerns to
-Isaac Z. Schlueter at <a href="mailto:i@izs.me">i@izs.me</a>.</p>
-
-<h3 id="In-plain-english">In plain English</h3>
-
-<p>This is mine; not my employer&#39;s, not Node&#39;s, not Joyent&#39;s, not Ryan
-Dahl&#39;s.</p>
-
-<p>If you publish something, it&#39;s yours, and you are solely accountable
-for it.  Not me, not Node, not Joyent, not Ryan Dahl.</p>
-
-<p>If other people publish something, it&#39;s theirs.  Not mine, not Node&#39;s,
-not Joyent&#39;s, not Ryan Dahl&#39;s.</p>
-
-<p>Yes, you can publish something evil.  It will be removed promptly if
-reported, and we&#39;ll lose respect for you.  But there is no vetting
-process for published modules.</p>
-
-<p>If this concerns you, inspect the source before using packages.</p>
-
-<h2 id="BUGS">BUGS</h2>
-
-<p>When you find issues, please report them:</p>
-
-<ul><li>web:
-<a href="https://github.com/isaacs/npm/issues">https://github.com/isaacs/npm/issues</a></li><li>email:
-<a href="mailto:npm-@googlegroups.com">npm-@googlegroups.com</a></li></ul>
-
-<p>Be sure to include <em>all</em> of the output from the npm command that didn&#39;t work
-as expected.  The <code>npm-debug.log</code> file is also helpful to provide.</p>
-
-<p>You can also look for isaacs in #node.js on irc://irc.freenode.net.  He
-will no doubt tell you to put the output in a gist or email.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="cli/npm.html">npm(1)</a></li><li><a href="misc/npm-faq.html">npm-faq(7)</a></li><li><a href="cli/npm-help.html">npm-help(1)</a></li><li><a href="misc/npm-index.html">npm-index(7)</a></li></ul>
-</div>
-<p id="footer"><a href="../doc/README.html">README</a> &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-bin.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-bin</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-bin.html">npm-bin</a></h1> <p>Display npm bin folder</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.bin(args, cb)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Print the folder where npm will install executables.</p>
-
-<p>This function should not be used programmatically.  Instead, just refer
-to the <code>npm.bin</code> member.</p>
-</div>
-<p id="footer">npm-bin &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-bugs.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-bugs</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-bugs.html">npm-bugs</a></h1> <p>Bugs for a package in a web browser maybe</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.bugs(package, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command tries to guess at the likely location of a package&#39;s
-bug tracker URL, and then tries to open it using the <code>--browser</code>
-config param.</p>
-
-<p>Like other commands, the first parameter is an array. This command only
-uses the first element, which is expected to be a package name with an
-optional version number.</p>
-
-<p>This command will launch a browser, so this command may not be the most
-friendly for programmatic use.</p>
-</div>
-<p id="footer">npm-bugs &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-commands.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,62 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-commands</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-commands.html">npm-commands</a></h1> <p>npm commands</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands[&lt;command&gt;](args, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>npm comes with a full set of commands, and each of the commands takes a
-similar set of arguments.</p>
-
-<p>In general, all commands on the command object take an <strong>array</strong> of positional
-argument <strong>strings</strong>. The last argument to any function is a callback. Some
-commands are special and take other optional arguments.</p>
-
-<p>All commands have their own man page. See <code>man npm-&lt;command&gt;</code> for command-line
-usage, or <code>man 3 npm-&lt;command&gt;</code> for programmatic usage.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-index.html">npm-index(7)</a></li></ul>
-</div>
-<p id="footer">npm-commands &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-config.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,67 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-config</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-config.html">npm-config</a></h1> <p>Manage the npm configuration files</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.config(args, callback)
-var val = npm.config.get(key)
-npm.config.set(key, val)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This function acts much the same way as the command-line version.  The first
-element in the array tells config what to do. Possible values are:</p>
-
-<ul><li><p><code>set</code></p><p>Sets a config parameter.  The second element in <code>args</code> is interpreted as the
-key, and the third element is interpreted as the value.</p></li><li><p><code>get</code></p><p>Gets the value of a config parameter. The second element in <code>args</code> is the
-key to get the value of.</p></li><li><p><code>delete</code> (<code>rm</code> or <code>del</code>)</p><p>Deletes a parameter from the config. The second element in <code>args</code> is the
-key to delete.</p></li><li><p><code>list</code> (<code>ls</code>)</p><p>Show all configs that aren&#39;t secret. No parameters necessary.</p></li><li><p><code>edit</code></p><p>Opens the config file in the default editor. This command isn&#39;t very useful
-programmatically, but it is made available.</p></li></ul>
-
-<p>To programmatically access npm configuration settings, or set them for
-the duration of a program, use the <code>npm.config.set</code> and <code>npm.config.get</code>
-functions instead.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../api/npm.html">npm(3)</a></li></ul>
-</div>
-<p id="footer">npm-config &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-deprecate.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,66 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-deprecate</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-deprecate.html">npm-deprecate</a></h1> <p>Deprecate a version of a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.deprecate(args, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command will update the npm registry entry for a package, providing
-a deprecation warning to all who attempt to install it.</p>
-
-<p>The &#39;args&#39; parameter must have exactly two elements:</p>
-
-<ul><li><p><code>package[@version]</code></p><p>The <code>version</code> portion is optional, and may be either a range, or a
-specific version, or a tag.</p></li><li><p><code>message</code></p><p>The warning message that will be printed whenever a user attempts to
-install the package.</p></li></ul>
-
-<p>Note that you must be the package owner to deprecate something.  See the
-<code>owner</code> and <code>adduser</code> help topics.</p>
-
-<p>To un-deprecate a package, specify an empty string (<code>&quot;&quot;</code>) for the <code>message</code> argument.</p>
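A minimal sketch of the two-element 'args' shape described above; `deprecateArgs` is a hypothetical helper for illustration, not part of npm's API.

```javascript
// Illustration of the 'args' shape described above; deprecateArgs is a
// hypothetical validation helper, not part of npm's API.
function deprecateArgs(pkg, message) {
  if (typeof message !== "string") {
    throw new TypeError('message must be a string ("" un-deprecates)');
  }
  return [pkg, message]; // exactly two elements
}

var args = deprecateArgs("mypkg@<0.2.0", "critical bug, upgrade to 0.2.x");
console.log(args.length); // 2

// Un-deprecate by passing an empty string for the message:
var undo = deprecateArgs("mypkg@<0.2.0", "");
```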
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../api/npm-publish.html">npm-publish(3)</a></li><li><a href="../api/npm-unpublish.html">npm-unpublish(3)</a></li><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li></ul>
-</div>
-<p id="footer">npm-deprecate &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-docs.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-docs</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-docs.html">npm-docs</a></h1> <p>Docs for a package in a web browser maybe</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.docs(package, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command tries to guess at the likely location of a package&#39;s
-documentation URL, and then tries to open it using the <code>--browser</code>
-config param.</p>
-
-<p>Like other commands, the first parameter is an array. This command only
-uses the first element, which is expected to be a package name with an
-optional version number.</p>
-
-<p>This command will launch a browser, so it may not be the most
-friendly for programmatic use.</p>

-</div>
-<p id="footer">npm-docs &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-edit.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,64 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-edit</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-edit.html">npm-edit</a></h1> <p>Edit an installed package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.edit(package, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Opens the package folder in the default editor (or whatever you&#39;ve
-configured as the npm <code>editor</code> config -- see <code>npm help config</code>.)</p>
-
-<p>After it has been edited, the package is rebuilt so as to pick up any
-changes in compiled packages.</p>
-
-<p>For instance, you can do <code>npm install connect</code> to install connect
-into your package, and then <code>npm.commands.edit([&quot;connect&quot;], callback)</code>
-to make a few changes to your locally installed copy.</p>
-
-<p>The first parameter is a string array with a single element, the package
-to open. The package can optionally have a version number attached.</p>
-
-<p>Since this command opens an editor in a new process, be careful about where
-and how this is used.</p>
-</div>
-<p id="footer">npm-edit &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-explore.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,58 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-explore</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-explore.html">npm-explore</a></h1> <p>Browse an installed package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.explore(args, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Spawn a subshell in the directory of the installed package specified.</p>
-
-<p>If a command is specified, then it is run in the subshell, which then
-immediately terminates.</p>
-
-<p>Note that the package is <em>not</em> automatically rebuilt afterwards, so be
-sure to use <code>npm rebuild &lt;pkg&gt;</code> if you make any changes.</p>
-
-<p>The first element in the &#39;args&#39; parameter must be a package name.  After that is the optional command, which can be any number of strings. All of the strings will be combined into one, space-delimited command.</p>
-</div>
-<p id="footer">npm-explore &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-help-search.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,66 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-help-search</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-help-search.html">npm-help-search</a></h1> <p>Search the help pages</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.helpSearch(args, [silent,] callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command is rarely useful, but it exists in the rare case that it is.</p>
-
-<p>This command takes an array of search terms and returns the help pages that
-match in order of best match.</p>
-
-<p>If there is only one match, then npm displays that help section. If there
-are multiple results, the results are printed to the screen formatted and the
-array of results is returned. Each result is an object with these properties:</p>
-
-<ul><li>hits:
-A map of args to number of hits on that arg. For example, {&quot;npm&quot;: 3}</li><li>found:
-Total number of unique args that matched.</li><li>totalHits:
-Total number of hits.</li><li>lines:
-An array of all matching lines (and some adjacent lines).</li><li>file:
-Name of the file that matched</li></ul>
-
-<p>The silent parameter is not necessary and is not currently used, but it may be in the future.</p>
-</div>
-<p id="footer">npm-help-search &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-init.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,69 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-init</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1>npm <a href="../api/init.html">init</a></h1> <p>Interactively create a package.json file</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.init(args, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This will ask you a bunch of questions, and then write a package.json for you.</p>
-
-<p>It attempts to make reasonable guesses about what you want things to be set to,
-and then writes a package.json file with the options you&#39;ve selected.</p>
-
-<p>If you already have a package.json file, it&#39;ll read that first, and default to
-the options in there.</p>
-
-<p>It is strictly additive, so it does not delete options from your package.json
-without a really good reason to do so.</p>
-
-<p>Since this function expects to be run on the command-line, it doesn&#39;t work very
-well programmatically. The best option is to roll your own, and since
-JavaScript makes it stupid simple to output formatted JSON, that is the
-preferred method. If you&#39;re sure you want to handle command-line prompting,
-then go ahead and use this programmatically.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<p><a href="../files/package.json.html">package.json(5)</a></p>
-</div>
-<p id="footer">npm-init &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-install.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-install</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-install.html">npm-install</a></h1> <p>install a package programmatically</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.install([where,] packages, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This acts much the same way as installing on the command-line.</p>
-
-<p>The &#39;where&#39; parameter is optional and only used internally, and it specifies
-where the packages should be installed to.</p>
-
-<p>The &#39;packages&#39; parameter is an array of strings. Each element in the array is
-the name of a package to be installed.</p>
-
-<p>Finally, &#39;callback&#39; is a function that will be called when all packages have been
-installed or when an error has been encountered.</p>
-</div>
-<p id="footer">npm-install &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-link.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,73 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-link</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-link.html">npm-link</a></h1> <p>Symlink a package folder</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.link(callback)
-npm.commands.link(packages, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Package linking is a two-step process.</p>
-
-<p>Without parameters, link will create a globally-installed
-symbolic link from <code>prefix/package-name</code> to the current folder.</p>
-
-<p>With a parameter, link will create a symlink from the local <code>node_modules</code>
-folder to the global symlink.</p>
-
-<p>When creating tarballs for <code>npm publish</code>, the linked packages are
-&quot;snapshotted&quot; to their current state by resolving the symbolic links.</p>
-
-<p>This is
-handy for installing your own stuff, so that you can work on it and test it
-iteratively without having to continually rebuild.</p>
-
-<p>For example:</p>
-
-<pre><code>npm.commands.link(cb)           # creates global link from the cwd
-                                # (say redis package)
-npm.commands.link(&#39;redis&#39;, cb)  # link-install the package</code></pre>
-
-<p>Now, any changes to the redis package will be reflected in
-the package in the current working directory.</p>
-</div>
-<p id="footer">npm-link &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-load.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,66 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-load</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-load.html">npm-load</a></h1> <p>Load config settings</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.load(conf, cb)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>npm.load() must be called before any other function call.  Both parameters are
-optional, but the second is recommended.</p>
-
-<p>The first parameter is an object hash of command-line config params, and the
-second parameter is a callback that will be called when npm is loaded and
-ready to serve.</p>
-
-<p>The first parameter should follow a similar structure to the package.json
-config object.</p>
-
-<p>For example, to emulate the --dev flag, pass an object that looks like this:</p>
-
-<pre><code>{
-  &quot;dev&quot;: true
-}</code></pre>
-
-<p>For a list of all the available command-line configs, see <code>npm help config</code></p>
-</div>
-<p id="footer">npm-load &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-ls.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,93 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-ls</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-ls.html">npm-ls</a></h1> <p>List installed packages</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.ls(args, [silent,] callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command will print to stdout all the versions of packages that are
-installed, as well as their dependencies, in a tree-structure. It will also
-return that data using the callback.</p>
-
-<p>This command does not take any arguments, but args must be defined.
-Beyond that, if any arguments are passed in, npm will politely warn that it
-does not take positional arguments, though you may set config flags
-as with any other command, such as <code>global</code> to list global packages.</p>
-
-<p>It will print out extraneous, missing, and invalid packages.</p>
-
-<p>If the silent parameter is set to true, nothing will be output to the screen,
-but the data will still be returned.</p>
-
-<p>Callback is provided an error if one occurred, the full data about which
-packages are installed and which dependencies they will receive, and a
-&quot;lite&quot; data object which just shows which versions are installed where.
-Note that the full data object is a circular structure, so care must be
-taken if it is serialized to JSON.</p>
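The caveat above about the full data object being circular can be handled with a `JSON.stringify` replacer that drops already-seen objects. This is a generic sketch, not npm-specific code; the `safeStringify` name and the sample `tree` structure are illustrative only.

```javascript
// Generic sketch: serializing a circular structure (like the full data
// object described above) by replacing repeated references.
function safeStringify(obj) {
  var seen = [];
  return JSON.stringify(obj, function (key, value) {
    if (value && typeof value === "object") {
      if (seen.indexOf(value) !== -1) return "[Circular]";
      seen.push(value);
    }
    return value;
  });
}

var tree = { name: "a", dependencies: {} };
tree.dependencies.b = { name: "b", parent: tree }; // cycle back to the root

// Plain JSON.stringify(tree) would throw; safeStringify does not.
console.log(safeStringify(tree));
```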
-
-<h2 id="CONFIGURATION">CONFIGURATION</h2>
-
-<h3 id="long">long</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Show extended information.</p>
-
-<h3 id="parseable">parseable</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Show parseable output instead of tree view.</p>
-
-<h3 id="global">global</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>List packages in the global install prefix instead of in the current
-project.</p>
-
-<p>Note, if parseable is set or long isn&#39;t set, then duplicates will be trimmed.
-This means that if a submodule has the same dependency as a parent module, then the
-dependency will only be output once.</p>
-</div>
-<p id="footer">npm-ls &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-outdated.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-outdated</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-outdated.html">npm-outdated</a></h1> <p>Check for outdated packages</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.outdated([packages,] callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command will check the registry to see if the specified packages are
-currently outdated.</p>
-
-<p>If the &#39;packages&#39; parameter is left out, npm will check all packages.</p>
-</div>
-<p id="footer">npm-outdated &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-owner.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,68 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-owner</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-owner.html">npm-owner</a></h1> <p>Manage package owners</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.owner(args, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>The first element of the &#39;args&#39; parameter defines what to do, and the subsequent
-elements depend on the action. Possible values for the action are (the order of
-parameters is given in parentheses):</p>
-
-<ul><li>ls (package):
-List all the users who have access to modify a package and push new versions.
-Handy when you need to know who to bug for help.</li><li>add (user, package):
-Add a new user as a maintainer of a package.  This user is enabled to modify
-metadata, publish new versions, and add other owners.</li><li>rm (user, package):
-Remove a user from the package owner list.  This immediately revokes their
-privileges.</li></ul>
-
-<p>Note that there is only one level of access.  Either you can modify a package,
-or you can&#39;t.  Future versions may contain more fine-grained access levels, but
-that is not implemented at this time.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../api/npm-publish.html">npm-publish(3)</a></li><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li></ul>
-</div>
-<p id="footer">npm-owner &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-pack.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-pack</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-pack.html">npm-pack</a></h1> <p>Create a tarball from a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.pack([packages,] callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>For anything that&#39;s installable (that is, a package folder, tarball,
-tarball url, name@tag, name@version, or name), this command will fetch
-it to the cache, and then copy the tarball to the current working
-directory as <code>&lt;name&gt;-&lt;version&gt;.tgz</code>, and then write the filenames out to
-stdout.</p>
-
-<p>If the same package is specified multiple times, then the file will be
-overwritten the second time.</p>
-
-<p>If no arguments are supplied, then npm packs the current package folder.</p>
-</div>
-<p id="footer">npm-pack &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-prefix.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,55 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-prefix</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-prefix.html">npm-prefix</a></h1> <p>Display prefix</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.prefix(args, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Print the prefix to standard out.</p>
-
-<p>&#39;args&#39; is never used and callback is never called with data.
-&#39;args&#39; must be present or things will break.</p>
-
-<p>This function is not useful programmatically.</p>
-</div>
-<p id="footer">npm-prefix &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-prune.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,57 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-prune</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-prune.html">npm-prune</a></h1> <p>Remove extraneous packages</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.prune([packages,] callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command removes &quot;extraneous&quot; packages.</p>
-
-<p>The first parameter is optional, and it specifies packages to be removed.</p>
-
-<p>If no packages are specified, then all packages will be checked.</p>
-
-<p>Extraneous packages are packages that are not listed on the parent
-package&#39;s dependencies list.</p>
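The definition above amounts to a set difference: installed packages minus the parent's dependencies list. A minimal sketch of that check, with illustrative names only (not npm's internals):

```javascript
// Sketch of the definition above: a package is "extraneous" when it is
// installed but absent from the parent package's dependencies list.
// The names here are illustrative, not npm's internals.
function extraneous(installed, dependencies) {
  return installed.filter(function (name) {
    return !(name in dependencies);
  });
}

var installed = ["express", "leftover-test-lib", "connect"];
var deps = { express: "3.x", connect: "2.x" };
console.log(extraneous(installed, deps)); // only "leftover-test-lib"
```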
-</div>
-<p id="footer">npm-prune &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
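The npm-prune description above hinges on what counts as "extraneous". As a stand-alone sketch (this is not npm's implementation; `findExtraneous` and its inputs are hypothetical), a package is extraneous when it is installed but absent from the parent package's dependencies list:

```javascript
// Illustrative sketch only, NOT npm internals: a package is "extraneous"
// when it sits under node_modules but is not listed in the parent
// package's dependencies.
function findExtraneous(installed, dependencies) {
  return installed.filter(function (name) {
    return !(name in dependencies)
  })
}

var deps = { connect: "2.x", semver: "1.x" }
var installed = ["connect", "semver", "leftover-pkg"]
console.log(findExtraneous(installed, deps)) // [ 'leftover-pkg' ]
```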
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-publish.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,66 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-publish</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-publish.html">npm-publish</a></h1> <p>Publish a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.publish([packages,] callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Publishes a package to the registry so that it can be installed by name.
-Possible values in the &#39;packages&#39; array are:</p>
-
-<ul><li><p><code>&lt;folder&gt;</code>:
-A folder containing a package.json file</p></li><li><p><code>&lt;tarball&gt;</code>:
-A url or file path to a gzipped tar archive containing a single folder
-with a package.json file inside.</p></li></ul>
-
-<p>If the package array is empty, npm will try to publish something in the
-current working directory.</p>
-
-<p>This command can fail if one of the packages specified already exists in
-the registry.  It overwrites when the &quot;force&quot; environment variable is set.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../cli/npm-adduser.html">npm-adduser(1)</a></li><li><a href="../api/npm-owner.html">npm-owner(3)</a></li></ul>
-</div>
-<p id="footer">npm-publish &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-rebuild.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,56 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-rebuild</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-rebuild.html">npm-rebuild</a></h1> <p>Rebuild a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.rebuild([packages,] callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command runs the <code>npm build</code> command on each of the matched packages.  This is useful
-when you install a new version of node, and must recompile all your C++ addons with
-the new binary. If no &#39;packages&#39; parameter is specified, every package will be rebuilt.</p>
-
-<h2 id="CONFIGURATION">CONFIGURATION</h2>
-
-<p>See <code>npm help build</code></p>
-</div>
-<p id="footer">npm-rebuild &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-restart.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,61 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-restart</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-restart.html">npm-restart</a></h1> <p>Restart a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.restart(packages, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This runs a package&#39;s &quot;restart&quot; script, if one was provided.
-Otherwise it runs the package&#39;s &quot;stop&quot; script, if one was provided, and then
-the &quot;start&quot; script.</p>
-
-<p>If no version is specified, then it restarts the &quot;active&quot; version.</p>
-
-<p>npm can restart multiple packages. Just specify multiple packages
-in the <code>packages</code> parameter.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../api/npm-start.html">npm-start(3)</a></li><li><a href="../api/npm-stop.html">npm-stop(3)</a></li></ul>
-</div>
-<p id="footer">npm-restart &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
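The restart rule in the description above (run "restart" if provided, otherwise "stop" then "start") can be mirrored with a small mock. This is not npm's code; `scriptsToRun` is a hypothetical helper operating on a package.json-style `scripts` object:

```javascript
// Illustrative sketch only (not npm's implementation): returns which
// lifecycle scripts npm-restart would run for a given "scripts" object.
function scriptsToRun(scripts) {
  if (scripts.restart) return ["restart"]
  var out = []
  if (scripts.stop) out.push("stop")
  if (scripts.start) out.push("start")
  return out
}

console.log(scriptsToRun({ restart: "node restart.js" }))            // [ 'restart' ]
console.log(scriptsToRun({ stop: "node stop.js", start: "node a.js" })) // [ 'stop', 'start' ]
```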
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-root.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,55 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-root</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-root.html">npm-root</a></h1> <p>Display npm root</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.root(args, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Print the effective <code>node_modules</code> folder to standard out.</p>
-
-<p>&#39;args&#39; is never used and callback is never called with data.
-&#39;args&#39; must be present or things will break.</p>
-
-<p>This function is not useful programmatically.</p>
-</div>
-<p id="footer">npm-root &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-run-script.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,63 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-run-script</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-run-script.html">npm-run-script</a></h1> <p>Run arbitrary package scripts</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.run-script(args, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This runs an arbitrary command from a package&#39;s &quot;scripts&quot; object.</p>
-
-<p>It is used by the test, start, restart, and stop commands, but can be
-called directly, as well.</p>
-
-<p>The &#39;args&#39; parameter is an array of strings. Behavior depends on the number
-of elements.  If there is only one element, npm assumes that the element
-represents a command to be run on the local repository. If there is more than
-one element, then the first is assumed to be the package and the second is
-assumed to be the command to run. All other elements are ignored.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-scripts.html">npm-scripts(7)</a></li><li><a href="../api/npm-test.html">npm-test(3)</a></li><li><a href="../api/npm-start.html">npm-start(3)</a></li><li><a href="../api/npm-restart.html">npm-restart(3)</a></li><li><a href="../api/npm-stop.html">npm-stop(3)</a></li></ul>
-</div>
-<p id="footer">npm-run-script &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
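The npm-run-script argument handling described above is a small dispatch: one element means "run this script in the local package", two or more mean "package, then script", with the rest ignored. A hedged sketch (hypothetical helper, not npm internals):

```javascript
// Illustrative sketch only: interprets the 'args' array the way the
// npm-run-script description specifies. interpretArgs and localName
// are hypothetical, not part of npm's API.
function interpretArgs(args, localName) {
  if (args.length === 1) return { pkg: localName, script: args[0] }
  // first element is the package, second is the script; extras ignored
  return { pkg: args[0], script: args[1] }
}

console.log(interpretArgs(["test"], "."))                  // { pkg: '.', script: 'test' }
console.log(interpretArgs(["connect", "test", "x"], "."))  // { pkg: 'connect', script: 'test' }
```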
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-search.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,66 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-search</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-search.html">npm-search</a></h1> <p>Search for packages</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.search(searchTerms, [silent,] [staleness,] callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Search the registry for packages matching the search terms. The available parameters are:</p>
-
-<ul><li>searchTerms:
-Array of search terms. These terms are case-insensitive.</li><li>silent:
-If true, npm will not log anything to the console.</li><li>staleness:
-This is the threshold for stale packages. &quot;Fresh&quot; packages are not refreshed
-from the registry. This value is measured in seconds.</li><li><p>callback:
-Returns an object where each key is the name of a package, and the value
-is information about that package along with a &#39;words&#39; property, which is
-a space-delimited string of all of the interesting words in that package.
-The only properties included are those that are searched, which generally include:</p><ul><li>name</li><li>description</li><li>maintainers</li><li>url</li><li>keywords</li></ul></li></ul>
-
-<p>A search on the registry excludes any result that does not match all of the
-search terms. It also removes any items from the results that contain an
-excluded term (the &quot;searchexclude&quot; config). The search is case insensitive
-and doesn&#39;t try to read your mind (it doesn&#39;t do any verb tense matching or the
-like).</p>
-</div>
-<p id="footer">npm-search &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-shrinkwrap.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,60 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-shrinkwrap</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-shrinkwrap.html">npm-shrinkwrap</a></h1> <p>programmatically generate package shrinkwrap file</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.shrinkwrap(args, [silent,] callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This acts much the same way as shrinkwrapping on the command-line.</p>
-
-<p>This command does not take any arguments, but &#39;args&#39; must be defined.
-Beyond that, if any arguments are passed in, npm will politely warn that it
-does not take positional arguments.</p>
-
-<p>If the &#39;silent&#39; parameter is set to true, nothing will be output to the screen,
-but the shrinkwrap file will still be written.</p>
-
-<p>Finally, &#39;callback&#39; is a function that will be called when the shrinkwrap has
-been saved.</p>
-</div>
-<p id="footer">npm-shrinkwrap &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-start.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-start</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-start.html">npm-start</a></h1> <p>Start a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.start(packages, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This runs a package&#39;s &quot;start&quot; script, if one was provided.</p>
-
-<p>npm can start multiple packages. Just specify multiple packages
-in the <code>packages</code> parameter.</p>
-</div>
-<p id="footer">npm-start &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-stop.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-stop</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-stop.html">npm-stop</a></h1> <p>Stop a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.stop(packages, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This runs a package&#39;s &quot;stop&quot; script, if one was provided.</p>
-
-<p>npm can stop multiple packages. Just specify multiple packages
-in the <code>packages</code> parameter.</p>
-</div>
-<p id="footer">npm-stop &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-submodule.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,67 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-submodule</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-submodule.html">npm-submodule</a></h1> <p>Add a package as a git submodule</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.submodule(packages, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>For each package specified, npm will check if it has a git repository url
-in its package.json description, and then add it as a git submodule at
-<code>node_modules/&lt;pkg name&gt;</code>.</p>
-
-<p>This is a convenience only.  From then on, it&#39;s up to you to manage
-updates by using the appropriate git commands.  npm will stubbornly
-refuse to update, modify, or remove anything with a <code>.git</code> subfolder
-in it.</p>
-
-<p>This command also does not install missing dependencies, if the package
-does not include them in its git repository.  If <code>npm ls</code> reports that
-things are missing, you can either install, link, or submodule them yourself,
-or you can do <code>npm explore &lt;pkgname&gt; -- npm install</code> to install the
-dependencies into the submodule folder.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li>npm help json</li><li>git help submodule</li></ul>
-</div>
-<p id="footer">npm-submodule &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-tag.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,63 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-tag</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-tag.html">npm-tag</a></h1> <p>Tag a published version</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.tag(package@version, tag, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Tags the specified version of the package with the specified tag, or the
-<code>--tag</code> config if not specified.</p>
-
-<p>The &#39;package@version&#39; parameter is an array of strings, but only the first two elements are
-currently used.</p>
-
-<p>The first element must be in the form package@version, where package
-is the package name and version is the version number (much like installing a
-specific version).</p>
-
-<p>The second element is the name of the tag to tag this version with. If this
-parameter is missing or falsey (empty), the default from the config will be
-used. For more information about how to set this config, check
-<code>man 3 npm-config</code> for programmatic usage or <code>man npm-config</code> for cli usage.</p>
-</div>
-<p id="footer">npm-tag &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
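The npm-tag argument shape above (a `package@version` string plus an optional tag falling back to the config default) can be shown with a small parser. This is purely illustrative; `parseTagArgs` and `defaultTag` are hypothetical names, not npm API:

```javascript
// Illustrative sketch only: splits "package@version" into name/version and
// applies the default tag when the tag argument is missing or falsey,
// matching the npm-tag description. Not npm's implementation.
function parseTagArgs(args, tagName, defaultTag) {
  var at = args[0].lastIndexOf("@")
  return {
    name: args[0].slice(0, at),
    version: args[0].slice(at + 1),
    tag: tagName || defaultTag
  }
}

console.log(parseTagArgs(["connect@2.0.0"], "beta", "latest"))
// { name: 'connect', version: '2.0.0', tag: 'beta' }
console.log(parseTagArgs(["connect@2.0.0"], "", "latest").tag) // latest
```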
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-test.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,56 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-test</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-test.html">npm-test</a></h1> <p>Test a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.test(packages, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This runs a package&#39;s &quot;test&quot; script, if one was provided.</p>
-
-<p>To run tests as a condition of installation, set the <code>npat</code> config to
-true.</p>
-
-<p>npm can run tests on multiple packages. Just specify multiple packages
-in the <code>packages</code> parameter.</p>
-</div>
-<p id="footer">npm-test &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-uninstall.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,56 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-uninstall</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-uninstall.html">npm-uninstall</a></h1> <p>uninstall a package programmatically</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.uninstall(packages, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This acts much the same way as uninstalling on the command-line.</p>
-
-<p>The &#39;packages&#39; parameter is an array of strings. Each element in the array is
-the name of a package to be uninstalled.</p>
-
-<p>Finally, &#39;callback&#39; is a function that will be called when all packages have been
-uninstalled or when an error has been encountered.</p>
-</div>
-<p id="footer">npm-uninstall &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-unpublish.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,60 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-unpublish</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-unpublish.html">npm-unpublish</a></h1> <p>Remove a package from the registry</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.unpublish(package, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This removes a package version from the registry, deleting its
-entry and removing the tarball.</p>
-
-<p>The package parameter must be defined.</p>
-
-<p>Only the first element in the package parameter is used.  If there is no first
-element, then npm assumes that the package at the current working directory
-is what is meant.</p>
-
-<p>If no version is specified, or if all versions are removed then
-the root package entry is removed from the registry entirely.</p>
-</div>
-<p id="footer">npm-unpublish &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-update.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,52 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-update</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-update.html">npm-update</a></h1> <p>Update a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.update(packages, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Updates a package, upgrading it to the latest version. It also installs any missing packages.</p>
-
-<p>The &#39;packages&#39; argument is an array of packages to update. The &#39;callback&#39; parameter will be called when done or when an error occurs.</p>
-</div>
-<p id="footer">npm-update &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-version.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,58 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-version</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-version.html">npm-version</a></h1> <p>Bump a package version</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.version(newversion, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Run this in a package directory to bump the version and write the new
-data back to the package.json file.</p>
-
-<p>If run in a git repo, it will also create a version commit and tag, and
-fail if the repo is not clean.</p>
-
-<p>Like all other commands, this function takes a string array as its first
-parameter. The difference, however, is that this function will fail if it does
-not have exactly one element. The only element should be a version number.</p>
-</div>
-<p id="footer">npm-version &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
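The exactly-one-element contract described for npm-version above amounts to a simple argument check. A minimal sketch (hypothetical helper, not npm's implementation):

```javascript
// Illustrative sketch only: enforces npm-version's documented rule that the
// argument array must hold exactly one element, a version string.
// checkVersionArgs is a hypothetical name, not npm API.
function checkVersionArgs(args) {
  if (args.length !== 1) {
    throw new Error("npm version requires exactly one argument")
  }
  return args[0]
}

console.log(checkVersionArgs(["1.2.3"])) // 1.2.3
```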
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-view.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,133 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-view</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-view.html">npm-view</a></h1> <p>View registry info</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.view(args, [silent,] callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command shows data about a package and prints it to the stream
-referenced by the <code>outfd</code> config, which defaults to stdout.</p>
-
-<p>The &quot;args&quot; parameter is an ordered list that closely resembles the command-line
-usage. The elements should be ordered such that the first element is
-the package and version (package@version). The version is optional. After that,
-the rest of the parameters are fields with optional subfields (&quot;field.subfield&quot;)
-which can be used to get only the information desired from the registry.</p>
-
-<p>The callback will be passed all of the data returned by the query.</p>
-
-<p>For example, to get the package registry entry for the <code>connect</code> package,
-you can do this:</p>
-
-<pre><code>npm.commands.view([&quot;connect&quot;], callback)</code></pre>
-
-<p>If no version is specified, &quot;latest&quot; is assumed.</p>
-
-<p>Field names can be specified after the package descriptor.
-For example, to show the dependencies of the <code>ronn</code> package at version
-0.3.5, you could do the following:</p>
-
-<pre><code>npm.commands.view([&quot;ronn@0.3.5&quot;, &quot;dependencies&quot;], callback)</code></pre>
-
-<p>You can view child fields by separating them with a period.
-To view the git repository URL for the latest version of npm, you could
-do this:</p>
-
-<pre><code>npm.commands.view([&quot;npm&quot;, &quot;repository.url&quot;], callback)</code></pre>
-
-<p>For fields that are arrays, requesting a non-numeric field will return
-all of the values from the objects in the list.  For example, to get all
-the contributor email addresses for the &quot;express&quot; project, you can do this:</p>
-
-<pre><code>npm.commands.view([&quot;express&quot;, &quot;contributors.email&quot;], callback)</code></pre>
-
-<p>You may also use numeric indices in square brackets to specifically select
-an item in an array field.  To just get the email address of the first
-contributor in the list, you can do this:</p>
-
-<pre><code>npm.commands.view([&quot;express&quot;, &quot;contributors[0].email&quot;], callback)</code></pre>
-
-<p>Multiple fields may be specified, and will be printed one after another.
-For example, to get all the contributor names and email addresses, you
-can do this:</p>
-
-<pre><code>npm.commands.view([&quot;express&quot;, &quot;contributors.name&quot;, &quot;contributors.email&quot;], callback)</code></pre>
-
-<p>&quot;Person&quot; fields are shown as a string if they would be shown as an
-object.  So, for example, this will show the list of npm contributors in
-the shortened string format.  (See <code>npm help json</code> for more on this.)</p>
-
-<pre><code>npm.commands.view([&quot;npm&quot;, &quot;contributors&quot;], callback)</code></pre>
-
-<p>If a version range is provided, then data will be printed for every
-matching version of the package.  This will show which version of jsdom
-was required by each matching version of yui3:</p>
-
-<pre><code>npm.commands.view([&quot;yui3@&#39;&gt;0.5.4&#39;&quot;, &quot;dependencies.jsdom&quot;], callback)</code></pre>
-
-<h2 id="OUTPUT">OUTPUT</h2>
-
-<p>If only a single string field for a single version is output, then it
-will not be colorized or quoted, so as to enable piping the output to
-another command.</p>
-
-<p>If the version range matches multiple versions, then each printed value
-will be prefixed with the version it applies to.</p>
-
-<p>If multiple fields are requested, then each of them is prefixed with
-the field name.</p>
-
-<p>Console output can be disabled by setting the &#39;silent&#39; parameter to true.</p>
-
-<h2 id="RETURN-VALUE">RETURN VALUE</h2>
-
-<p>The data returned will be an object in this format:</p>
-
-<pre><code>{ &lt;version&gt;:
-  { &lt;field&gt;: &lt;value&gt;
-  , ... }
-, ... }</code></pre>
-
-<p>corresponding to the list of fields selected.</p>
-</div>
-<p id="footer">npm-view &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm-whoami.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,55 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-whoami</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-whoami.html">npm-whoami</a></h1> <p>Display npm username</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.whoami(args, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Print the <code>username</code> config to standard output.</p>
-
-<p>&#39;args&#39; is never used, and the callback is never called with data;
-however, &#39;args&#39; must still be present or things will break.</p>
-
-<p>This function is not useful programmatically.</p>
-</div>
-<p id="footer">npm-whoami &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/npm.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,126 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm.html">npm</a></h1> <p>node package manager</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>var npm = require(&quot;npm&quot;)
-npm.load([configObject], function (er, npm) {
-  // use the npm object, now that it&#39;s loaded.
-
-  npm.config.set(key, val)
-  val = npm.config.get(key)
-
-  console.log(&quot;prefix = %s&quot;, npm.prefix)
-
-  npm.commands.install([&quot;package&quot;], cb)
-})</code></pre>
-
-<h2 id="VERSION">VERSION</h2>
-
-<p>1.3.14</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This is the API documentation for npm.
-To find documentation of the command line
-client, see <code><a href="../cli/npm.html">npm(1)</a></code>.</p>
-
-<p>Prior to using npm&#39;s commands, <code>npm.load()</code> must be called.
-If you provide <code>configObject</code> as an object hash of top-level
-configs, they override the values stored in the various config
-locations. In the npm command line client, this set of configs
-is parsed from the command line options. Additional configuration
-params are loaded from two configuration files. See <code><a href="../cli/npm-config.html">npm-config(1)</a></code>,
-<code><a href="../misc/npm-config.html">npm-config(7)</a></code>, and <code><a href="../files/npmrc.html">npmrc(5)</a></code> for more information.</p>
-
-<p>After that, each of the functions is accessible in the
-commands object: <code>npm.commands.&lt;cmd&gt;</code>.  See <code><a href="../misc/npm-index.html">npm-index(7)</a></code> for a list of
-all possible commands.</p>
-
-<p>All commands on the command object take an <strong>array</strong> of positional argument
-<strong>strings</strong>. The last argument to any function is a callback. Some
-commands take other optional arguments.</p>
-
-<p>Configs cannot currently be set on a per-function basis, as each call to
-npm.config.set will change the value for <em>all</em> npm commands in that process.</p>
-
-<p>To find API documentation for a specific command, run the <code>npm apihelp</code>
-command.</p>
-
-<h2 id="METHODS-AND-PROPERTIES">METHODS AND PROPERTIES</h2>
-
-<ul><li><p><code>npm.load(configs, cb)</code></p><p>Load the configuration params, and call the <code>cb</code> function once the
-globalconfig and userconfig files have been loaded as well, or on
-nextTick if they&#39;ve already been loaded.</p></li><li><p><code>npm.config</code></p><p>An object for accessing npm configuration parameters.</p><ul><li><p><code>npm.config.get(key)</code></p></li><li><code>npm.config.set(key, val)</code></li><li><p><code>npm.config.del(key)</code></p></li></ul></li><li><p><code>npm.dir</code> or <code>npm.root</code></p><p>The <code>node_modules</code> directory where npm will operate.</p></li><li><p><code>npm.prefix</code></p><p>The prefix where npm is operating.  (Most often the current working
-directory.)</p></li><li><p><code>npm.cache</code></p><p>The place where npm keeps JSON and tarballs it fetches from the
-registry (or uploads to the registry).</p></li><li><p><code>npm.tmp</code></p><p>npm&#39;s temporary working directory.</p></li><li><p><code>npm.deref</code></p><p>Get the &quot;real&quot; name for a command that has either an alias or
-abbreviation.</p></li></ul>
-
-<h2 id="MAGIC">MAGIC</h2>
-
-<p>For each of the methods in the <code>npm.commands</code> hash, a method is added to
-the npm object, which takes a set of positional string arguments rather
-than an array and a callback.</p>
-
-<p>If the last argument is a callback, then it will use the supplied
-callback.  However, if no callback is provided, then it will print out
-the error or results.</p>
-
-<p>For example, this would work in a node repl:</p>
-
-<pre><code>&gt; npm = require(&quot;npm&quot;)
-&gt; npm.load()  // wait a sec...
-&gt; npm.install(&quot;dnode&quot;, &quot;express&quot;)</code></pre>
-
-<p>Note that that <em>won&#39;t</em> work in a node program, since the <code>install</code>
-method will get called before the configuration load is completed.</p>
-
-<h2 id="ABBREVS">ABBREVS</h2>
-
-<p>In order to support <code>npm ins foo</code> instead of <code>npm install foo</code>, the
-<code>npm.commands</code> object has a set of abbreviations as well as the full
-method names.  Use the <code>npm.deref</code> method to find the real name.</p>
-
-<p>For example:</p>
-
-<pre><code>var cmd = npm.deref(&quot;unp&quot;) // cmd === &quot;unpublish&quot;</code></pre>
-</div>
-<p id="footer">npm &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/api/repo.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
-<!doctype html>
-<html>
-  <title>repo</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../api/npm-repo.html">npm-repo</a></h1> <p>Open package repository page in the browser</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm.commands.repo(package, callback)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command tries to guess at the likely location of a package&#39;s
-repository URL, and then tries to open it using the <code>--browser</code>
-config param.</p>
-
-<p>Like other commands, the first parameter is an array. This command only
-uses the first element, which is expected to be a package name with an
-optional version number.</p>
-
-<p>This command will launch a browser, so this command may not be the most
-friendly for programmatic use.</p>
-</div>
-<p id="footer">repo &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-adduser.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,73 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-adduser</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-adduser.html">npm-adduser</a></h1> <p>Add a registry user account</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm adduser</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Create or verify a user named <code>&lt;username&gt;</code> in the npm registry, and
-save the credentials to the <code>.npmrc</code> file.</p>
-
-<p>The username, password, and email are read in from prompts.</p>
-
-<p>You may use this command to change your email address, but not username
-or password.</p>
-
-<p>To reset your password, go to <a href="http://admin.npmjs.org/">http://admin.npmjs.org/</a></p>
-
-<p>You may use this command multiple times with the same user account to
-authorize on a new machine.</p>
-
-<h2 id="CONFIGURATION">CONFIGURATION</h2>
-
-<h3 id="registry">registry</h3>
-
-<p>Default: http://registry.npmjs.org/</p>
-
-<p>The base URL of the npm package registry.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../cli/npm-owner.html">npm-owner(1)</a></li><li><a href="../cli/npm-whoami.html">npm-whoami(1)</a></li></ul>
-</div>
-<p id="footer">npm-adduser &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-bin.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-bin</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-bin.html">npm-bin</a></h1> <p>Display npm bin folder</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm bin</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Print the folder where npm will install executables.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-prefix.html">npm-prefix(1)</a></li><li><a href="../cli/npm-root.html">npm-root(1)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li></ul>
-</div>
-<p id="footer">npm-bin &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-bugs.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,70 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-bugs</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-bugs.html">npm-bugs</a></h1> <p>Bugs for a package in a web browser maybe</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm bugs &lt;pkgname&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command tries to guess at the likely location of a package&#39;s
-bug tracker URL, and then tries to open it using the <code>--browser</code>
-config param.</p>
-
-<h2 id="CONFIGURATION">CONFIGURATION</h2>
-
-<h3 id="browser">browser</h3>
-
-<ul><li>Default: OS X: <code>&quot;open&quot;</code>, Windows: <code>&quot;start&quot;</code>, Others: <code>&quot;xdg-open&quot;</code></li><li>Type: String</li></ul>
-
-<p>The browser that is called by the <code>npm bugs</code> command to open websites.</p>
-
-<h3 id="registry">registry</h3>
-
-<ul><li>Default: https://registry.npmjs.org/</li><li>Type: url</li></ul>
-
-<p>The base URL of the npm package registry.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-docs.html">npm-docs(1)</a></li><li><a href="../cli/npm-view.html">npm-view(1)</a></li><li><a href="../cli/npm-publish.html">npm-publish(1)</a></li><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li></ul>
-</div>
-<p id="footer">npm-bugs &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-build.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-build</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-build.html">npm-build</a></h1> <p>Build a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm build &lt;package-folder&gt;</code></pre>
-
-<ul><li><code>&lt;package-folder&gt;</code>:
-A folder containing a <code>package.json</code> file in its root.</li></ul>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This is the plumbing command called by <code>npm link</code> and <code>npm install</code>.</p>
-
-<p>It should generally not be called directly.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../cli/npm-link.html">npm-link(1)</a></li><li><a href="../misc/npm-scripts.html">npm-scripts(7)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li></ul>
-</div>
-<p id="footer">npm-build &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-bundle.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-bundle</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-bundle.html">npm-bundle</a></h1> <p>REMOVED</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>The <code>npm bundle</code> command has been removed in 1.0, for the simple reason
-that it is no longer necessary, as the default behavior is now to
-install packages into the local space.</p>
-
-<p>Just use <code>npm install</code> now to do what <code>npm bundle</code> used to do.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-install.html">npm-install(1)</a></li></ul>
-</div>
-<p id="footer">npm-bundle &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-cache.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,100 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-cache</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-cache.html">npm-cache</a></h1> <p>Manipulates packages cache</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm cache add &lt;tarball file&gt;
-npm cache add &lt;folder&gt;
-npm cache add &lt;tarball url&gt;
-npm cache add &lt;name&gt;@&lt;version&gt;
-
-npm cache ls [&lt;path&gt;]
-
-npm cache clean [&lt;path&gt;]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Used to add, list, or clear the npm cache folder.</p>
-
-<ul><li><p>add:
-Add the specified package to the local cache.  This command is primarily
-intended to be used internally by npm, but it can provide a way to
-add data to the local installation cache explicitly.</p></li><li><p>ls:
-Show the data in the cache.  Argument is a path to show in the cache
-folder.  Works a bit like the <code>find</code> program, but limited by the
-<code>depth</code> config.</p></li><li><p>clean:
-Delete data out of the cache folder.  If an argument is provided, then
-it specifies a subpath to delete.  If no argument is provided, then
-the entire cache is cleared.</p></li></ul>
-
-<h2 id="DETAILS">DETAILS</h2>
-
-<p>npm stores cache data in the directory specified in <code>npm config get cache</code>.
-For each package that is added to the cache, three pieces of information are
-stored in <code>{cache}/{name}/{version}</code>:</p>
-
-<ul><li>.../package/:
-A folder containing the package contents as they appear in the tarball.</li><li>.../package.json:
-The package.json file, as npm sees it, with overlays applied and a _id attribute.</li><li>.../package.tgz:
-The tarball for that version.</li></ul>
-
-<p>Additionally, whenever a registry request is made, a <code>.cache.json</code> file
-is placed at the corresponding URI, to store the ETag and the requested
-data.</p>
-
-<p>Commands that make non-essential registry requests (such as <code>search</code> and
-<code>view</code>, or the completion scripts) generally specify a minimum timeout.
-If the <code>.cache.json</code> file is younger than the specified timeout, then
-they do not make an HTTP request to the registry.</p>
-
-<h2 id="CONFIGURATION">CONFIGURATION</h2>
-
-<h3 id="cache">cache</h3>
-
-<p>Default: <code>~/.npm</code> on Posix, or <code>%AppData%/npm-cache</code> on Windows.</p>
-
-<p>The root cache folder.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../cli/npm-publish.html">npm-publish(1)</a></li><li><a href="../cli/npm-pack.html">npm-pack(1)</a></li></ul>
-</div>
-<p id="footer">npm-cache &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-completion.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,67 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-completion</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-completion.html">npm-completion</a></h1> <p>Tab Completion for npm</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>. &lt;(npm completion)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Enables tab-completion in all npm commands.</p>
-
-<p>The synopsis above
-loads the completions into your current shell.  Adding it to
-your ~/.bashrc or ~/.zshrc will make the completions available
-everywhere.</p>
-
-<p>You may of course also pipe the output of npm completion to a file
-such as <code>/usr/local/etc/bash_completion.d/npm</code> if you have a system
-that will read that file for you.</p>
-
-<p>When <code>COMP_CWORD</code>, <code>COMP_LINE</code>, and <code>COMP_POINT</code> are defined in the
-environment, <code>npm completion</code> acts in &quot;plumbing mode&quot;, and outputs
-completions based on the arguments.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-developers.html">npm-developers(7)</a></li><li><a href="../misc/npm-faq.html">npm-faq(7)</a></li><li><a href="../cli/npm.html">npm(1)</a></li></ul>
-</div>
-<p id="footer">npm-completion &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-config.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,107 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-config</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-config.html">npm-config</a></h1> <p>Manage the npm configuration files</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm config set &lt;key&gt; &lt;value&gt; [--global]
-npm config get &lt;key&gt;
-npm config delete &lt;key&gt;
-npm config list
-npm config edit
-npm c [set|get|delete|list]
-npm get &lt;key&gt;
-npm set &lt;key&gt; &lt;value&gt; [--global]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>npm gets its config settings from the command line, environment
-variables, <code>npmrc</code> files, and in some cases, the <code>package.json</code> file.</p>
-
-<p>See <a href="../files/npmrc.html">npmrc(5)</a> for more information about the npmrc files.</p>
-
-<p>See <code><a href="../misc/npm-config.html">npm-config(7)</a></code> for a more thorough discussion of the mechanisms
-involved.</p>
-
-<p>The <code>npm config</code> command can be used to update and edit the contents
-of the user and global npmrc files.</p>
-
-<h2 id="Sub-commands">Sub-commands</h2>
-
-<p>Config supports the following sub-commands:</p>
-
-<h3 id="set">set</h3>
-
-<pre><code>npm config set key value</code></pre>
-
-<p>Sets the config key to the value.</p>
-
-<p>If value is omitted, then it sets it to &quot;true&quot;.</p>
-
-<h3 id="get">get</h3>
-
-<pre><code>npm config get key</code></pre>
-
-<p>Echo the config value to stdout.</p>
-
-<h3 id="list">list</h3>
-
-<pre><code>npm config list</code></pre>
-
-<p>Show all the config settings.</p>
-
-<h3 id="delete">delete</h3>
-
-<pre><code>npm config delete key</code></pre>
-
-<p>Deletes the key from all configuration files.</p>
-
-<h3 id="edit">edit</h3>
-
-<pre><code>npm config edit</code></pre>
-
-<p>Opens the config file in an editor.  Use the <code>--global</code> flag to edit the
-global config.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../cli/npm.html">npm(1)</a></li></ul>
-</div>
-<p id="footer">npm-config &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-dedupe.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,96 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-dedupe</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-dedupe.html">npm-dedupe</a></h1> <p>Reduce duplication</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm dedupe [package names...]
-npm ddp [package names...]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Searches the local package tree and attempts to simplify the overall
-structure by moving dependencies further up the tree, where they can
-be more effectively shared by multiple dependent packages.</p>
-
-<p>For example, consider this dependency graph:</p>
-
-<pre><code>a
-+-- b &lt;-- depends on c@1.0.x
-|   `-- c@1.0.3
-`-- d &lt;-- depends on c@~1.0.9
-    `-- c@1.0.10</code></pre>
-
-<p>In this case, <code><a href="../cli/npm-dedupe.html">npm-dedupe(1)</a></code> will transform the tree to:</p>
-
-<pre><code>a
-+-- b
-+-- d
-`-- c@1.0.10</code></pre>
-
-<p>Because of the hierarchical nature of node&#39;s module lookup, b and d
-will both get their dependency met by the single c package at the root
-level of the tree.</p>
-
-<p>If a suitable version exists at the target location in the tree
-already, then it will be left untouched, but the other duplicates will
-be deleted.</p>
-
-<p>If no suitable version can be found, then a warning is printed, and
-nothing is done.</p>
-
-<p>If any arguments are supplied, then they are filters, and only the
-named packages will be touched.</p>
-
-<p>Note that this operation transforms the dependency tree, and may
-result in packages getting updated versions, perhaps from the npm
-registry.</p>
-
-<p>This feature is experimental, and may change in future versions.</p>
-
-<p>The <code>--tag</code> argument will apply to all of the affected dependencies. If a
-tag with the given name exists, the tagged version is preferred over newer
-versions.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-ls.html">npm-ls(1)</a></li><li><a href="../cli/npm-update.html">npm-update(1)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li></ul>
-</div>
-<p id="footer">npm-dedupe &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-deprecate.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,65 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-deprecate</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-deprecate.html">npm-deprecate</a></h1> <p>Deprecate a version of a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm deprecate &lt;name&gt;[@&lt;version&gt;] &lt;message&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command will update the npm registry entry for a package, providing
-a deprecation warning to all who attempt to install it.</p>
-
-<p>It works on version ranges as well as specific versions, so you can do
-something like this:</p>
-
-<pre><code>npm deprecate my-thing@&quot;&lt; 0.2.3&quot; &quot;critical bug fixed in v0.2.3&quot;</code></pre>
-
-<p>Note that you must be the package owner to deprecate something.  See the
-<code>owner</code> and <code>adduser</code> help topics.</p>
-
-<p>To un-deprecate a package, specify an empty string (<code>&quot;&quot;</code>) for the <code>message</code> argument.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-publish.html">npm-publish(1)</a></li><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li></ul>
-</div>
-<p id="footer">npm-deprecate &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-docs.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,71 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-docs</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-docs.html">npm-docs</a></h1> <p>Docs for a package in a web browser maybe</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm docs &lt;pkgname&gt;
-npm home &lt;pkgname&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command tries to guess at the likely location of a package&#39;s
-documentation URL, and then tries to open it using the <code>--browser</code>
-config param.</p>
-
-<h2 id="CONFIGURATION">CONFIGURATION</h2>
-
-<h3 id="browser">browser</h3>
-
-<ul><li>Default: OS X: <code>&quot;open&quot;</code>, Windows: <code>&quot;start&quot;</code>, Others: <code>&quot;xdg-open&quot;</code></li><li>Type: String</li></ul>
-
-<p>The browser that is called by the <code>npm docs</code> command to open websites.</p>
-
-<h3 id="registry">registry</h3>
-
-<ul><li>Default: https://registry.npmjs.org/</li><li>Type: url</li></ul>
-
-<p>The base URL of the npm package registry.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-view.html">npm-view(1)</a></li><li><a href="../cli/npm-publish.html">npm-publish(1)</a></li><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li></ul>
-</div>
-<p id="footer">npm-docs &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-edit.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,71 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-edit</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-edit.html">npm-edit</a></h1> <p>Edit an installed package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm edit &lt;name&gt;[@&lt;version&gt;]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Opens the package folder in the default editor (or whatever you&#39;ve
-configured as the npm <code>editor</code> config -- see <code><a href="../misc/npm-config.html">npm-config(7)</a></code>.)</p>
-
-<p>After it has been edited, the package is rebuilt so as to pick up any
-changes in compiled packages.</p>
-
-<p>For instance, you can do <code>npm install connect</code> to install connect
-into your package, and then <code>npm edit connect</code> to make a few
-changes to your locally installed copy.</p>
-
-<h2 id="CONFIGURATION">CONFIGURATION</h2>
-
-<h3 id="editor">editor</h3>
-
-<ul><li>Default: <code>EDITOR</code> environment variable if set, or <code>&quot;vi&quot;</code> on Posix,
-or <code>&quot;notepad&quot;</code> on Windows.</li><li>Type: path</li></ul>
-
-<p>The command to run for <code>npm edit</code> or <code>npm config edit</code>.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-explore.html">npm-explore(1)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li></ul>
-</div>
-<p id="footer">npm-edit &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-explore.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,74 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-explore</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-explore.html">npm-explore</a></h1> <p>Browse an installed package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm explore &lt;name&gt;[@&lt;version&gt;] [ -- &lt;cmd&gt;]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Spawn a subshell in the directory of the installed package specified.</p>
-
-<p>If a command is specified, then it is run in the subshell, which then
-immediately terminates.</p>
-
-<p>This is particularly handy in the case of git submodules in the
-<code>node_modules</code> folder:</p>
-
-<pre><code>npm explore some-dependency -- git pull origin master</code></pre>
-
-<p>Note that the package is <em>not</em> automatically rebuilt afterwards, so be
-sure to use <code>npm rebuild &lt;pkg&gt;</code> if you make any changes.</p>
-
-<h2 id="CONFIGURATION">CONFIGURATION</h2>
-
-<h3 id="shell">shell</h3>
-
-<ul><li>Default: SHELL environment variable, or &quot;bash&quot; on Posix, or &quot;cmd&quot; on
-Windows</li><li>Type: path</li></ul>
-
-<p>The shell to run for the <code>npm explore</code> command.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-submodule.html">npm-submodule(1)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-edit.html">npm-edit(1)</a></li><li><a href="../cli/npm-rebuild.html">npm-rebuild(1)</a></li><li><a href="../cli/npm-build.html">npm-build(1)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li></ul>
-</div>
-<p id="footer">npm-explore &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-help-search.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,72 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-help-search</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-help-search.html">npm-help-search</a></h1> <p>Search npm help documentation</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm help-search some search terms</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command will search the npm markdown documentation files for the
-terms provided, and then list the results, sorted by relevance.</p>
-
-<p>If only one result is found, then it will show that help topic.</p>
-
-<p>If the argument to <code>npm help</code> is not a known help topic, then it will
-call <code>help-search</code>.  It is rarely if ever necessary to call this
-command directly.</p>
-
-<h2 id="CONFIGURATION">CONFIGURATION</h2>
-
-<h3 id="long">long</h3>
-
-<ul><li>Type: Boolean</li><li>Default false</li></ul>
-
-<p>If true, the &quot;long&quot; flag will cause help-search to output context around
-where the terms were found in the documentation.</p>
-
-<p>If false, then help-search will just list out the help topics found.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm.html">npm(1)</a></li><li><a href="../misc/npm-faq.html">npm-faq(7)</a></li><li><a href="../cli/npm-help.html">npm-help(1)</a></li></ul>
-</div>
-<p id="footer">npm-help-search &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-help.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,70 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-help</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-help.html">npm-help</a></h1> <p>Get help on npm</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm help &lt;topic&gt;
-npm help some search terms</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>If supplied a topic, then show the appropriate documentation page.</p>
-
-<p>If the topic does not exist, or if multiple terms are provided, then run
-the <code>help-search</code> command to find a match.  Note that, if <code>help-search</code>
-finds a single subject, then it will run <code>help</code> on that topic, so unique
-matches are equivalent to specifying a topic name.</p>
-
-<h2 id="CONFIGURATION">CONFIGURATION</h2>
-
-<h3 id="viewer">viewer</h3>
-
-<ul><li>Default: &quot;man&quot; on Posix, &quot;browser&quot; on Windows</li><li>Type: path</li></ul>
-
-<p>The program to use to view help content.</p>
-
-<p>Set to <code>&quot;browser&quot;</code> to view html help content in the default web browser.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm.html">npm(1)</a></li><li><a href="../../doc/README.html">README</a></li><li><a href="../misc/npm-faq.html">npm-faq(7)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li><li><a href="../cli/npm-help-search.html">npm-help-search(1)</a></li><li><a href="../misc/npm-index.html">npm-index(7)</a></li></ul>
-</div>
-<p id="footer">npm-help &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-init.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,63 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-init</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-init.html">npm-init</a></h1> <p>Interactively create a package.json file</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm init</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This will ask you a bunch of questions, and then write a package.json for you.</p>
-
-<p>It attempts to make reasonable guesses about what you want things to be set to,
-and then writes a package.json file with the options you&#39;ve selected.</p>
-
-<p>If you already have a package.json file, it&#39;ll read that first, and default to
-the options in there.</p>
-
-<p>It is strictly additive, so it does not delete options from your package.json
-without a really good reason to do so.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="https://github.com/isaacs/init-package-json">https://github.com/isaacs/init-package-json</a></li><li><a href="../files/package.json.html">package.json(5)</a></li><li><a href="../cli/npm-version.html">npm-version(1)</a></li></ul>
-</div>
-<p id="footer">npm-init &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-install.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,180 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-install</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-install.html">npm-install</a></h1> <p>Install a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm install (with no args in a package dir)
-npm install &lt;tarball file&gt;
-npm install &lt;tarball url&gt;
-npm install &lt;folder&gt;
-npm install &lt;name&gt; [--save|--save-dev|--save-optional]
-npm install &lt;name&gt;@&lt;tag&gt;
-npm install &lt;name&gt;@&lt;version&gt;
-npm install &lt;name&gt;@&lt;version range&gt;
-npm i (with any of the previous argument usage)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command installs a package, and any packages that it depends on. If the
-package has a shrinkwrap file, the installation of dependencies will be driven
-by that. See <a href="../cli/npm-shrinkwrap.html">npm-shrinkwrap(1)</a>.</p>
-
-<p>A <code>package</code> is:</p>
-
-<ul><li>a) a folder containing a program described by a package.json file</li><li>b) a gzipped tarball containing (a)</li><li>c) a url that resolves to (b)</li><li>d) a <code>&lt;name&gt;@&lt;version&gt;</code> that is published on the registry with (c)</li><li>e) a <code>&lt;name&gt;@&lt;tag&gt;</code> that points to (d)</li><li>f) a <code>&lt;name&gt;</code> that has a &quot;latest&quot; tag satisfying (e)</li><li>g) a <code>&lt;git remote url&gt;</code> that resolves to (b)</li></ul>
-
-<p>Even if you never publish your package, you can still get a lot of
-benefits of using npm if you just want to write a node program (a), and
-perhaps if you also want to be able to easily install it elsewhere
-after packing it up into a tarball (b).</p>
-
-<ul><li><p><code>npm install</code> (in package directory, no arguments):</p><p>Install the dependencies in the local node_modules folder.</p><p>In global mode (ie, with <code>-g</code> or <code>--global</code> appended to the command),
-it installs the current package context (ie, the current working
-directory) as a global package.</p><p>By default, <code>npm install</code> will install all modules listed as
-dependencies. With the <code>--production</code> flag,
-npm will not install modules listed in <code>devDependencies</code>.</p></li><li><p><code>npm install &lt;folder&gt;</code>:</p><p>Install a package that is sitting in a folder on the filesystem.</p></li><li><p><code>npm install &lt;tarball file&gt;</code>:</p><p>Install a package that is sitting on the filesystem.  Note: if you just want
-to link a dev directory into your npm root, you can do this more easily by
-using <code>npm link</code>.</p><p>Example:</p><pre><code>  npm install ./package.tgz</code></pre></li><li><p><code>npm install &lt;tarball url&gt;</code>:</p><p>Fetch the tarball url, and then install it.  In order to distinguish between
-this and other options, the argument must start with &quot;http://&quot; or &quot;https://&quot;</p><p>Example:</p><pre><code>  npm install https://github.com/indexzero/forever/tarball/v0.5.6</code></pre></li><li><p><code>npm install &lt;name&gt; [--save|--save-dev|--save-optional]</code>:</p><p>Do a <code>&lt;name&gt;@&lt;tag&gt;</code> install, where <code>&lt;tag&gt;</code> is the &quot;tag&quot; config. (See
-<code><a href="../misc/npm-config.html">npm-config(7)</a></code>.)</p><p>In most cases, this will install the latest version
-of the module published on npm.</p><p>Example:</p><p>      npm install sax</p><p><code>npm install</code> takes 3 exclusive, optional flags which save or update
-the package version in your main package.json:</p><ul><li><p><code>--save</code>: Package will appear in your <code>dependencies</code>.</p></li><li><p><code>--save-dev</code>: Package will appear in your <code>devDependencies</code>.</p></li><li><p><code>--save-optional</code>: Package will appear in your <code>optionalDependencies</code>.</p><p>Examples:</p><p>  npm install sax --save
-  npm install node-tap --save-dev
-  npm install dtrace-provider --save-optional</p><p><strong>Note</strong>: If there is a file or folder named <code>&lt;name&gt;</code> in the current
-working directory, then it will try to install that, and only try to
-fetch the package by name if it is not valid.</p></li></ul></li><li><p><code>npm install &lt;name&gt;@&lt;tag&gt;</code>:</p><p>Install the version of the package that is referenced by the specified tag.
-If the tag does not exist in the registry data for that package, then this
-will fail.</p><p>Example:</p><pre><code>  npm install sax@latest</code></pre></li><li><p><code>npm install &lt;name&gt;@&lt;version&gt;</code>:</p><p>Install the specified version of the package.  This will fail if the version
-has not been published to the registry.</p><p>Example:</p><pre><code>  npm install sax@0.1.1</code></pre></li><li><p><code>npm install &lt;name&gt;@&lt;version range&gt;</code>:</p><p>Install a version of the package matching the specified version range.  This
-will follow the same rules for resolving dependencies described in <code><a href="../files/package.json.html">package.json(5)</a></code>.</p><p>Note that most version ranges must be put in quotes so that your shell will
-treat it as a single argument.</p><p>Example:</p><p>      npm install sax@&quot;&gt;=0.1.0 &lt;0.2.0&quot;</p></li><li><p><code>npm install &lt;git remote url&gt;</code>:</p><p>Install a package by cloning a git remote url.  The format of the git
-url is:</p><p>      &lt;protocol&gt;://[&lt;user&gt;@]&lt;hostname&gt;&lt;separator&gt;&lt;path&gt;[#&lt;commit-ish&gt;]</p><p><code>&lt;protocol&gt;</code> is one of <code>git</code>, <code>git+ssh</code>, <code>git+http</code>, or
-<code>git+https</code>.  If no <code>&lt;commit-ish&gt;</code> is specified, then <code>master</code> is
-used.</p><p>Examples:</p><pre><code>  git+ssh://git@github.com:isaacs/npm.git#v1.0.27
-  git+https://isaacs@github.com/isaacs/npm.git
-  git://github.com/isaacs/npm.git#v1.0.27</code></pre></li></ul>
-
-<p>You may combine multiple arguments, and even multiple types of arguments.
-For example:</p>
-
-<pre><code>npm install sax@&quot;&gt;=0.1.0 &lt;0.2.0&quot; bench supervisor</code></pre>
-
-<p>The <code>--tag</code> argument will apply to all of the specified install targets. If a
-tag with the given name exists, the tagged version is preferred over newer
-versions.</p>
-
-<p>The <code>--force</code> argument will force npm to fetch remote resources even if a
-local copy exists on disk.</p>
-
-<pre><code>npm install sax --force</code></pre>
-
-<p>The <code>--global</code> argument will cause npm to install the package globally
-rather than locally.  See <code><a href="../files/npm-folders.html">npm-folders(5)</a></code>.</p>
-
-<p>The <code>--link</code> argument will cause npm to link global installs into the
-local space in some cases.</p>
-
-<p>The <code>--no-bin-links</code> argument will prevent npm from creating symlinks for
-any binaries the package might contain.</p>
-
-<p>The <code>--no-shrinkwrap</code> argument, which will ignore an available
-shrinkwrap file and use the package.json instead.</p>
-
-<p>The <code>--nodedir=/path/to/node/source</code> argument will allow npm to find the
-node source code so that npm can compile native modules.</p>
-
-<p>See <code><a href="../misc/npm-config.html">npm-config(7)</a></code>.  Many of the configuration params have some
-effect on installation, since that&#39;s most of what npm does.</p>
-
-<h2 id="ALGORITHM">ALGORITHM</h2>
-
-<p>To install a package, npm uses the following algorithm:</p>
-
-<pre><code>install(where, what, family, ancestors)
-fetch what, unpack to &lt;where&gt;/node_modules/&lt;what&gt;
-for each dep in what.dependencies
-  resolve dep to precise version
-for each dep@version in what.dependencies
-    not in &lt;where&gt;/node_modules/&lt;what&gt;/node_modules/*
-    and not in &lt;family&gt;
-  add precise version deps to &lt;family&gt;
-  install(&lt;where&gt;/node_modules/&lt;what&gt;, dep, family)</code></pre>
-
-<p>For this <code>package{dep}</code> structure: <code>A{B,C}, B{C}, C{D}</code>,
-this algorithm produces:</p>
-
-<pre><code>A
-+-- B
-`-- C
-    `-- D</code></pre>
-
-<p>That is, the dependency from B to C is satisfied by the fact that A
-already caused C to be installed at a higher level.</p>
-
-<p>See <a href="../files/npm-folders.html">npm-folders(5)</a> for a more detailed description of the specific
-folder structures that npm creates.</p>
-
-<h3 id="Limitations-of-npm-s-Install-Algorithm">Limitations of npm&#39;s Install Algorithm</h3>
-
-<p>There are some very rare and pathological edge-cases where a cycle can
-cause npm to try to install a never-ending tree of packages.  Here is
-the simplest case:</p>
-
-<pre><code>A -&gt; B -&gt; A&#39; -&gt; B&#39; -&gt; A -&gt; B -&gt; A&#39; -&gt; B&#39; -&gt; A -&gt; ...</code></pre>
-
-<p>where <code>A</code> is some version of a package, and <code>A&#39;</code> is a different version
-of the same package.  Because <code>B</code> depends on a different version of <code>A</code>
-than the one that is already in the tree, it must install a separate
-copy.  The same is true of <code>A&#39;</code>, which must install <code>B&#39;</code>.  Because <code>B&#39;</code>
-depends on the original version of <code>A</code>, which has been overridden, the
-cycle falls into infinite regress.</p>
-
-<p>To avoid this situation, npm flat-out refuses to install any
-<code>name@version</code> that is already present anywhere in the tree of package
-folder ancestors.  A more correct, but more complex, solution would be
-to symlink the existing version into the new location.  If this ever
-affects a real use-case, it will be investigated.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-update.html">npm-update(1)</a></li><li><a href="../cli/npm-link.html">npm-link(1)</a></li><li><a href="../cli/npm-rebuild.html">npm-rebuild(1)</a></li><li><a href="../misc/npm-scripts.html">npm-scripts(7)</a></li><li><a href="../cli/npm-build.html">npm-build(1)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-tag.html">npm-tag(1)</a></li><li><a href="../cli/npm-rm.html">npm-rm(1)</a></li><li><a href="../cli/npm-shrinkwrap.html">npm-shrinkwrap(1)</a></li></ul>
-</div>
-<p id="footer">npm-install &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
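The install pseudocode in the npm-install page above can be sketched as a short simulation. This is an illustration only (simplified to one version per package name, with made-up data structures), not npm's actual implementation:

```javascript
// Sketch of the nested-install algorithm from the npm-install page above.
// Simplification (assumption): each name resolves to exactly one version,
// so the "family" set tracks bare names instead of name@version pairs.
function install(registry, name, family) {
  const node = { name, node_modules: {} };
  const fam = new Set(family);
  // Refuse to reinstall anything already present among the ancestors; this
  // is the guard that stops the A -> B -> A' -> B' -> A ... infinite regress.
  fam.add(name);
  const todo = (registry[name] || []).filter((dep) => !fam.has(dep));
  // "add precise version deps to <family>" happens before recursing, so a
  // sibling dep (C below) satisfies B's requirement at the higher level.
  for (const dep of todo) fam.add(dep);
  for (const dep of todo) node.node_modules[dep] = install(registry, dep, fam);
  return node;
}

// The package{dep} example from the page: A{B,C}, B{C}, C{D}.
const registry = { A: ["B", "C"], B: ["C"], C: ["D"], D: [] };
const tree = install(registry, "A", new Set());
```

Running this yields B and C directly under A, with D nested under C and no copy of C under B, matching the tree shown above.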
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-link.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,96 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-link</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-link.html">npm-link</a></h1> <p>Symlink a package folder</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm link (in package folder)
-npm link &lt;pkgname&gt;
-npm ln (with any of the previous argument usage)</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Package linking is a two-step process.</p>
-
-<p>First, <code>npm link</code> in a package folder will create a globally-installed
-symbolic link from <code>prefix/package-name</code> to the current folder.</p>
-
-<p>Next, in some other location, <code>npm link package-name</code> will create a
-symlink from the local <code>node_modules</code> folder to the global symlink.</p>
-
-<p>Note that <code>package-name</code> is taken from <code>package.json</code>,
-not from the directory name.</p>
-
-<p>When creating tarballs for <code>npm publish</code>, the linked packages are
-&quot;snapshotted&quot; to their current state by resolving the symbolic links.</p>
-
-<p>This is
-handy for installing your own stuff, so that you can work on it and test it
-iteratively without having to continually rebuild.</p>
-
-<p>For example:</p>
-
-<pre><code>cd ~/projects/node-redis    # go into the package directory
-npm link                    # creates global link
-cd ~/projects/node-bloggy   # go into some other package directory.
-npm link redis              # link-install the package</code></pre>
-
-<p>Now, any changes to ~/projects/node-redis will be reflected in
-~/projects/node-bloggy/node_modules/redis/</p>
-
-<p>You may also shortcut the two steps in one.  For example, to do the
-above use-case in a shorter way:</p>
-
-<pre><code>cd ~/projects/node-bloggy  # go into the dir of your main project
-npm link ../node-redis     # link the dir of your dependency</code></pre>
-
-<p>The second line is the equivalent of doing:</p>
-
-<pre><code>(cd ../node-redis; npm link)
-npm link redis</code></pre>
-
-<p>That is, it first creates a global link, and then links the global
-installation target into your project&#39;s <code>node_modules</code> folder.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-developers.html">npm-developers(7)</a></li><li><a href="../misc/npm-faq.html">npm-faq(7)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li></ul>
-</div>
-<p id="footer">npm-link &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-ls.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,102 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-ls</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-ls.html">npm-ls</a></h1> <p>List installed packages</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm list [&lt;pkg&gt; ...]
-npm ls [&lt;pkg&gt; ...]
-npm la [&lt;pkg&gt; ...]
-npm ll [&lt;pkg&gt; ...]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command will print to stdout all the versions of packages that are
-installed, as well as their dependencies, in a tree-structure.</p>
-
-<p>Positional arguments are <code>name@version-range</code> identifiers, which will
-limit the results to only the paths to the packages named.  Note that
-nested packages will <em>also</em> show the paths to the specified packages.
-For example, running <code>npm ls promzard</code> in npm&#39;s source tree will show:</p>
-
-<pre><code>npm@1.3.14 /path/to/npm
-└─┬ init-package-json@0.0.4
-  └── promzard@0.1.5</code></pre>
-
-<p>It will print out extraneous, missing, and invalid packages.</p>
-
-<p>If a project specifies git urls for dependencies, these are shown
-in parentheses after the name@version to make it easier for users to
-recognize potential forks of a project.</p>
-
-<p>When run as <code>ll</code> or <code>la</code>, it shows extended information by default.</p>
-
-<h2 id="CONFIGURATION">CONFIGURATION</h2>
-
-<h3 id="json">json</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Show information in JSON format.</p>
-
-<h3 id="long">long</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Show extended information.</p>
-
-<h3 id="parseable">parseable</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Show parseable output instead of tree view.</p>
-
-<h3 id="global">global</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>List packages in the global install prefix instead of in the current
-project.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../cli/npm-link.html">npm-link(1)</a></li><li><a href="../cli/npm-prune.html">npm-prune(1)</a></li><li><a href="../cli/npm-outdated.html">npm-outdated(1)</a></li><li><a href="../cli/npm-update.html">npm-update(1)</a></li></ul>
-</div>
-<p id="footer">npm-ls &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
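The tree output shown in the npm-ls page above is straightforward to produce. A sketch of the rendering (illustrative only, not npm's formatter):

```javascript
// Sketch of npm-ls-style tree rendering (illustration, not npm's code).
// deps is a nested object: { "name@version": { ...children } }.
function renderTree(deps, prefix = "") {
  const lines = [];
  const entries = Object.entries(deps);
  entries.forEach(([name, children], i) => {
    const last = i === entries.length - 1;
    const hasKids = Object.keys(children).length > 0;
    // "└─┬" for a last child with children, "└──" for a last leaf, etc.
    lines.push(prefix + (last ? "└─" : "├─") + (hasKids ? "┬ " : "─ ") + name);
    lines.push(...renderTree(children, prefix + (last ? "  " : "│ ")));
  });
  return lines;
}

// The promzard example from the page above:
const out = renderTree({ "init-package-json@0.0.4": { "promzard@0.1.5": {} } });
```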
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-outdated.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-outdated</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-outdated.html">npm-outdated</a></h1> <p>Check for outdated packages</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm outdated [&lt;name&gt; [&lt;name&gt; ...]]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command will check the registry to see if any (or, specific) installed
-packages are currently outdated.</p>
-
-<p>The resulting field &#39;wanted&#39; shows the latest version according to the
-version specified in the package.json; the field &#39;latest&#39; shows the very latest
-version of the package.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-update.html">npm-update(1)</a></li><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li></ul>
-</div>
-<p id="footer">npm-outdated &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-owner.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,68 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-owner</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-owner.html">npm-owner</a></h1> <p>Manage package owners</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm owner ls &lt;package name&gt;
-npm owner add &lt;user&gt; &lt;package name&gt;
-npm owner rm &lt;user&gt; &lt;package name&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Manage ownership of published packages.</p>
-
-<ul><li>ls:
-List all the users who have access to modify a package and push new versions.
-Handy when you need to know who to bug for help.</li><li>add:
-Add a new user as a maintainer of a package.  This user is enabled to modify
-metadata, publish new versions, and add other owners.</li><li>rm:
-Remove a user from the package owner list.  This immediately revokes their
-privileges.</li></ul>
-
-<p>Note that there is only one level of access.  Either you can modify a package,
-or you can&#39;t.  Future versions may contain more fine-grained access levels, but
-that is not implemented at this time.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-publish.html">npm-publish(1)</a></li><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../cli/npm-adduser.html">npm-adduser(1)</a></li><li><a href="../misc/npm-disputes.html">npm-disputes(7)</a></li></ul>
-</div>
-<p id="footer">npm-owner &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-pack.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,63 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-pack</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-pack.html">npm-pack</a></h1> <p>Create a tarball from a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm pack [&lt;pkg&gt; [&lt;pkg&gt; ...]]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>For anything that&#39;s installable (that is, a package folder, tarball,
-tarball url, name@tag, name@version, or name), this command will fetch
-it to the cache, and then copy the tarball to the current working
-directory as <code>&lt;name&gt;-&lt;version&gt;.tgz</code>, and then write the filenames out to
-stdout.</p>
-
-<p>If the same package is specified multiple times, then the file will be
-overwritten the second time.</p>
-
-<p>If no arguments are supplied, then npm packs the current package folder.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-cache.html">npm-cache(1)</a></li><li><a href="../cli/npm-publish.html">npm-publish(1)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li></ul>
-</div>
-<p id="footer">npm-pack &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-prefix.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-prefix</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-prefix.html">npm-prefix</a></h1> <p>Display prefix</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm prefix</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Print the prefix to standard out.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-root.html">npm-root(1)</a></li><li><a href="../cli/npm-bin.html">npm-bin(1)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li></ul>
-</div>
-<p id="footer">npm-prefix &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-prune.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,63 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-prune</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-prune.html">npm-prune</a></h1> <p>Remove extraneous packages</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm prune [&lt;name&gt; ...]
-npm prune [&lt;name&gt; ...] [--production]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command removes &quot;extraneous&quot; packages.  If a package name is
-provided, then only packages matching one of the supplied names are
-removed.</p>
-
-<p>Extraneous packages are packages that are not listed on the parent
-package&#39;s dependencies list.</p>
-
-<p>If the <code>--production</code> flag is specified, this command will remove the
-packages specified in your <code>devDependencies</code>.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-rm.html">npm-rm(1)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-ls.html">npm-ls(1)</a></li></ul>
-</div>
-<p id="footer">npm-prune &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-publish.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,63 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-publish</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-publish.html">npm-publish</a></h1> <p>Publish a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm publish &lt;tarball&gt;
-npm publish &lt;folder&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Publishes a package to the registry so that it can be installed by name.</p>
-
-<ul><li><p><code>&lt;folder&gt;</code>:
-A folder containing a package.json file</p></li><li><p><code>&lt;tarball&gt;</code>:
-A url or file path to a gzipped tar archive containing a single folder
-with a package.json file inside.</p></li></ul>
-
-<p>Fails if the package name and version combination already exists in
-the registry.  Overwrites when the &quot;--force&quot; flag is set.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../cli/npm-adduser.html">npm-adduser(1)</a></li><li><a href="../cli/npm-owner.html">npm-owner(1)</a></li><li><a href="../cli/npm-deprecate.html">npm-deprecate(1)</a></li><li><a href="../cli/npm-tag.html">npm-tag(1)</a></li></ul>
-</div>
-<p id="footer">npm-publish &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-rebuild.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,60 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-rebuild</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-rebuild.html">npm-rebuild</a></h1> <p>Rebuild a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm rebuild [&lt;name&gt; [&lt;name&gt; ...]]
-npm rb [&lt;name&gt; [&lt;name&gt; ...]]</code></pre>
-
-<ul><li><code>&lt;name&gt;</code>:
-The package to rebuild</li></ul>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command runs the <code>npm build</code> command on the matched folders.  This is useful
-when you install a new version of node, and must recompile all your C++ addons with
-the new binary.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-build.html">npm-build(1)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li></ul>
-</div>
-<p id="footer">npm-rebuild &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-restart.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,58 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-restart</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-restart.html">npm-restart</a></h1> <p>Start a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm restart &lt;name&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This runs a package&#39;s &quot;restart&quot; script, if one was provided.
-Otherwise it runs the package&#39;s &quot;stop&quot; script, if one was provided, and then
-the &quot;start&quot; script.</p>
-
-<p>If no version is specified, then it restarts the &quot;active&quot; version.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-run-script.html">npm-run-script(1)</a></li><li><a href="../misc/npm-scripts.html">npm-scripts(7)</a></li><li><a href="../cli/npm-test.html">npm-test(1)</a></li><li><a href="../cli/npm-start.html">npm-start(1)</a></li><li><a href="../cli/npm-stop.html">npm-stop(1)</a></li></ul>
-</div>
-<p id="footer">npm-restart &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
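The fallback order described in the npm-restart page above ("restart" if provided, otherwise "stop" then "start") can be sketched as:

```javascript
// Sketch of the restart fallback (illustration; npm's real logic lives in
// its lifecycle-script machinery). scripts is the package.json "scripts" map.
function restartPlan(scripts) {
  if ("restart" in scripts) return ["restart"];
  return ["stop", "start"].filter((name) => name in scripts);
}
```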
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-rm.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,58 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-rm</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-rm.html">npm-rm</a></h1> <p>Remove a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm rm &lt;name&gt;
-npm r &lt;name&gt;
-npm uninstall &lt;name&gt;
-npm un &lt;name&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This uninstalls a package, completely removing everything npm installed
-on its behalf.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-prune.html">npm-prune(1)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li></ul>
-</div>
-<p id="footer">npm-rm &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-root.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-root</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-root.html">npm-root</a></h1> <p>Display npm root</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm root</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Print the effective <code>node_modules</code> folder to standard out.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-prefix.html">npm-prefix(1)</a></li><li><a href="../cli/npm-bin.html">npm-bin(1)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li></ul>
-</div>
-<p id="footer">npm-root &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-run-script.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,57 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-run-script</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-run-script.html">npm-run-script</a></h1> <p>Run arbitrary package scripts</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm run-script &lt;script&gt; &lt;name&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This runs an arbitrary command from a package&#39;s &quot;scripts&quot; object.</p>
-
-<p>It is used by the test, start, restart, and stop commands, but can be
-called directly, as well.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-scripts.html">npm-scripts(7)</a></li><li><a href="../cli/npm-test.html">npm-test(1)</a></li><li><a href="../cli/npm-start.html">npm-start(1)</a></li><li><a href="../cli/npm-restart.html">npm-restart(1)</a></li><li><a href="../cli/npm-stop.html">npm-stop(1)</a></li></ul>
-</div>
-<p id="footer">npm-run-script &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-search.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,60 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-search</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-search.html">npm-search</a></h1> <p>Search for packages</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm search [search terms ...]
-npm s [search terms ...]
-npm se [search terms ...]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Search the registry for packages matching the search terms.</p>
-
-<p>If a term starts with <code>/</code>, then it&#39;s interpreted as a regular expression.
-A trailing <code>/</code> will be ignored in this case.  (Note that many regular
-expression characters must be escaped or quoted in most shells.)</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../cli/npm-view.html">npm-view(1)</a></li></ul>
-</div>
-<p id="footer">npm-search &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-shrinkwrap.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,217 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-shrinkwrap</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-shrinkwrap.html">npm-shrinkwrap</a></h1> <p>Lock down dependency versions</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm shrinkwrap</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command locks down the versions of a package&#39;s dependencies so
-that you can control exactly which versions of each dependency will be
-used when your package is installed. The &quot;package.json&quot; file is still
-required if you want to use &quot;npm install&quot;.</p>
-
-<p>By default, &quot;npm install&quot; recursively installs the target&#39;s
-dependencies (as specified in package.json), choosing the latest
-available version that satisfies the dependency&#39;s semver pattern. In
-some situations, particularly when shipping software where each change
-is tightly managed, it&#39;s desirable to fully specify each version of
-each dependency recursively so that subsequent builds and deploys do
-not inadvertently pick up newer versions of a dependency that satisfy
-the semver pattern. Specifying specific semver patterns in each
-dependency&#39;s package.json would facilitate this, but that&#39;s not always
-possible or desirable, as when another author owns the npm package.
-It&#39;s also possible to check dependencies directly into source control,
-but that may be undesirable for other reasons.</p>
-
-<p>As an example, consider package A:</p>
-
-<pre><code>{
-  &quot;name&quot;: &quot;A&quot;,
-  &quot;version&quot;: &quot;0.1.0&quot;,
-  &quot;dependencies&quot;: {
-    &quot;B&quot;: &quot;&lt;0.1.0&quot;
-  }
-}</code></pre>
-
-<p>package B:</p>
-
-<pre><code>{
-  &quot;name&quot;: &quot;B&quot;,
-  &quot;version&quot;: &quot;0.0.1&quot;,
-  &quot;dependencies&quot;: {
-    &quot;C&quot;: &quot;&lt;0.1.0&quot;
-  }
-}</code></pre>
-
-<p>and package C:</p>
-
-<pre><code>{
-  &quot;name&quot;: &quot;C&quot;,
-  &quot;version&quot;: &quot;0.0.1&quot;
-}</code></pre>
-
-<p>If these are the only versions of A, B, and C available in the
-registry, then a normal &quot;npm install A&quot; will install:</p>
-
-<pre><code>A@0.1.0
-`-- B@0.0.1
-    `-- C@0.0.1</code></pre>
-
-<p>However, if B@0.0.2 is published, then a fresh &quot;npm install A&quot; will
-install:</p>
-
-<pre><code>A@0.1.0
-`-- B@0.0.2
-    `-- C@0.0.1</code></pre>
-
-<p>assuming the new version did not modify B&#39;s dependencies. Of course,
-the new version of B could include a new version of C and any number
-of new dependencies. If such changes are undesirable, the author of A
-could specify a dependency on B@0.0.1. However, if A&#39;s author and B&#39;s
-author are not the same person, there&#39;s no way for A&#39;s author to say
-that he or she does not want to pull in newly published versions of C
-when B hasn&#39;t changed at all.</p>
-
-<p>In this case, A&#39;s author can run</p>
-
-<pre><code>npm shrinkwrap</code></pre>
-
-<p>This generates npm-shrinkwrap.json, which will look something like this:</p>
-
-<pre><code>{
-  &quot;name&quot;: &quot;A&quot;,
-  &quot;version&quot;: &quot;0.1.0&quot;,
-  &quot;dependencies&quot;: {
-    &quot;B&quot;: {
-      &quot;version&quot;: &quot;0.0.1&quot;,
-      &quot;dependencies&quot;: {
-        &quot;C&quot;: {
-          &quot;version&quot;: &quot;0.0.1&quot;
-        }
-      }
-    }
-  }
-}</code></pre>
-
-<p>The shrinkwrap command has locked down the dependencies based on
-what&#39;s currently installed in node_modules.  When &quot;npm install&quot;
-installs a package with a npm-shrinkwrap.json file in the package
-root, the shrinkwrap file (rather than package.json files) completely
-drives the installation of that package and all of its dependencies
-(recursively).  So now the author publishes A@0.1.0, and subsequent
-installs of this package will use B@0.0.1 and C@0.0.1, regardless of the
-dependencies and versions listed in A&#39;s, B&#39;s, and C&#39;s package.json
-files.</p>
-
-<h3 id="Using-shrinkwrapped-packages">Using shrinkwrapped packages</h3>
-
-<p>Using a shrinkwrapped package is no different than using any other
-package: you can &quot;npm install&quot; it by hand, or add a dependency to your
-package.json file and &quot;npm install&quot; it.</p>
-
-<h3 id="Building-shrinkwrapped-packages">Building shrinkwrapped packages</h3>
-
-<p>To shrinkwrap an existing package:</p>
-
-<ol><li>Run &quot;npm install&quot; in the package root to install the current
-versions of all dependencies.</li><li>Validate that the package works as expected with these versions.</li><li>Run &quot;npm shrinkwrap&quot;, add npm-shrinkwrap.json to git, and publish
-your package.</li></ol>
-
-<p>To add or update a dependency in a shrinkwrapped package:</p>
-
-<ol><li>Run &quot;npm install&quot; in the package root to install the current
-versions of all dependencies.</li><li>Add or update dependencies. &quot;npm install&quot; each new or updated
-package individually and then update package.json.  Note that they
-must be explicitly named in order to be installed: running <code>npm
-install</code> with no arguments will merely reproduce the existing
-shrinkwrap.</li><li>Validate that the package works as expected with the new
-dependencies.</li><li>Run &quot;npm shrinkwrap&quot;, commit the new npm-shrinkwrap.json, and
-publish your package.</li></ol>
-
-<p>You can use <a href="../cli/npm-outdated.html">npm-outdated(1)</a> to view dependencies with newer versions
-available.</p>
-
-<h3 id="Other-Notes">Other Notes</h3>
-
-<p>A shrinkwrap file must be consistent with the package&#39;s package.json
-file. &quot;npm shrinkwrap&quot; will fail if required dependencies are not
-already installed, since that would result in a shrinkwrap that
-wouldn&#39;t actually work. Similarly, the command will fail if there are
-extraneous packages (not referenced by package.json), since that would
-indicate that package.json is not correct.</p>
-
-<p>Since &quot;npm shrinkwrap&quot; is intended to lock down your dependencies for
-production use, <code>devDependencies</code> will not be included unless you
-explicitly set the <code>--dev</code> flag when you run <code>npm shrinkwrap</code>.  If
-installed <code>devDependencies</code> are excluded, then npm will print a
-warning.  If you want them to be installed with your module by
-default, please consider adding them to <code>dependencies</code> instead.</p>
-
-<p>If shrinkwrapped package A depends on shrinkwrapped package B, B&#39;s
-shrinkwrap will not be used as part of the installation of A. However,
-because A&#39;s shrinkwrap is constructed from a valid installation of B
-and recursively specifies all dependencies, the contents of B&#39;s
-shrinkwrap will implicitly be included in A&#39;s shrinkwrap.</p>
-
-<h3 id="Caveats">Caveats</h3>
-
-<p>Shrinkwrap files only lock down package versions, not actual package
-contents.  While discouraged, a package author can republish an
-existing version of a package, causing shrinkwrapped packages using
-that version to pick up different code than they were before. If you
-want to avoid any risk that a byzantine author replaces a package
-you&#39;re using with code that breaks your application, you could modify
-the shrinkwrap file to use git URL references rather than version
-numbers so that npm always fetches all packages from git.</p>
-
-<p>If you wish to lock down the specific bytes included in a package, for
-example to have 100% confidence in being able to reproduce a
-deployment or build, then you ought to check your dependencies into
-source control, or pursue some other mechanism that can verify
-contents rather than versions.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li><li><a href="../cli/npm-ls.html">npm-ls(1)</a></li></ul>
-</div>
-<p id="footer">npm-shrinkwrap &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-star.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,60 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-star</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-star.html">npm-star</a></h1> <p>Mark your favorite packages</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm star &lt;pkgname&gt; [&lt;pkg&gt;, ...]
-npm unstar &lt;pkgname&gt; [&lt;pkg&gt;, ...]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>&quot;Starring&quot; a package means that you have some interest in it.  It&#39;s
-a vaguely positive way to show that you care.</p>
-
-<p>&quot;Unstarring&quot; is the same thing, but in reverse.</p>
-
-<p>It&#39;s a boolean thing.  Starring repeatedly has no additional effect.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-view.html">npm-view(1)</a></li><li><a href="../cli/npm-whoami.html">npm-whoami(1)</a></li><li><a href="../cli/npm-adduser.html">npm-adduser(1)</a></li></ul>
-</div>
-<p id="footer">npm-star &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-stars.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-stars</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-stars.html">npm-stars</a></h1> <p>View packages marked as favorites</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm stars
-npm stars [username]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>If you have starred a lot of neat things and want to find them again
-quickly this command lets you do just that.</p>
-
-<p>You may also want to see your friend&#39;s favorite packages; in this case
-you will most certainly enjoy this command.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-star.html">npm-star(1)</a></li><li><a href="../cli/npm-view.html">npm-view(1)</a></li><li><a href="../cli/npm-whoami.html">npm-whoami(1)</a></li><li><a href="../cli/npm-adduser.html">npm-adduser(1)</a></li></ul>
-</div>
-<p id="footer">npm-stars &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-start.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-start</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-start.html">npm-start</a></h1> <p>Start a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm start &lt;name&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This runs a package&#39;s &quot;start&quot; script, if one was provided.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-run-script.html">npm-run-script(1)</a></li><li><a href="../misc/npm-scripts.html">npm-scripts(7)</a></li><li><a href="../cli/npm-test.html">npm-test(1)</a></li><li><a href="../cli/npm-restart.html">npm-restart(1)</a></li><li><a href="../cli/npm-stop.html">npm-stop(1)</a></li></ul>
-</div>
-<p id="footer">npm-start &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-stop.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-stop</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-stop.html">npm-stop</a></h1> <p>Stop a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm stop &lt;name&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This runs a package&#39;s &quot;stop&quot; script, if one was provided.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-run-script.html">npm-run-script(1)</a></li><li><a href="../misc/npm-scripts.html">npm-scripts(7)</a></li><li><a href="../cli/npm-test.html">npm-test(1)</a></li><li><a href="../cli/npm-start.html">npm-start(1)</a></li><li><a href="../cli/npm-restart.html">npm-restart(1)</a></li></ul>
-</div>
-<p id="footer">npm-stop &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-submodule.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,67 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-submodule</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-submodule.html">npm-submodule</a></h1> <p>Add a package as a git submodule</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm submodule &lt;pkg&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>If the specified package has a git repository url in its package.json
-description, then this command will add it as a git submodule at
-<code>node_modules/&lt;pkg name&gt;</code>.</p>
-
-<p>This is a convenience only.  From then on, it&#39;s up to you to manage
-updates by using the appropriate git commands.  npm will stubbornly
-refuse to update, modify, or remove anything with a <code>.git</code> subfolder
-in it.</p>
-
-<p>This command also does not install missing dependencies, if the package
-does not include them in its git repository.  If <code>npm ls</code> reports that
-things are missing, you can either install, link, or submodule them yourself,
-or you can do <code>npm explore &lt;pkgname&gt; -- npm install</code> to install the
-dependencies into the submodule folder.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../files/package.json.html">package.json(5)</a></li><li>git help submodule</li></ul>
-</div>
-<p id="footer">npm-submodule &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-tag.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,68 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-tag</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-tag.html">npm-tag</a></h1> <p>Tag a published version</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm tag &lt;name&gt;@&lt;version&gt; [&lt;tag&gt;]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Tags the specified version of the package with the specified tag, or the
-<code>--tag</code> config if not specified.</p>
-
-<p>A tag can be used when installing packages as a reference to a version instead
-of using a specific version number:</p>
-
-<pre><code>npm install &lt;name&gt;@&lt;tag&gt;</code></pre>
-
-<p>When installing dependencies, a preferred tagged version may be specified:</p>
-
-<pre><code>npm install --tag &lt;tag&gt;</code></pre>
-
-<p>This also applies to <code>npm dedupe</code>.</p>
-
-<p>Publishing a package always sets the &quot;latest&quot; tag to the published version.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-publish.html">npm-publish(1)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../cli/npm-dedupe.html">npm-dedupe(1)</a></li><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li></ul>
-</div>
-<p id="footer">npm-tag &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-test.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,58 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-test</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-test.html">npm-test</a></h1> <p>Test a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>  npm test &lt;name&gt;
-  npm tst &lt;name&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This runs a package&#39;s &quot;test&quot; script, if one was provided.</p>
-
-<p>To run tests as a condition of installation, set the <code>npat</code> config to
-true.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-run-script.html">npm-run-script(1)</a></li><li><a href="../misc/npm-scripts.html">npm-scripts(7)</a></li><li><a href="../cli/npm-start.html">npm-start(1)</a></li><li><a href="../cli/npm-restart.html">npm-restart(1)</a></li><li><a href="../cli/npm-stop.html">npm-stop(1)</a></li></ul>
-</div>
-<p id="footer">npm-test &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-uninstall.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,56 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-uninstall</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-rm.html">npm-rm</a></h1> <p>Remove a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm rm &lt;name&gt;
-npm uninstall &lt;name&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This uninstalls a package, completely removing everything npm installed
-on its behalf.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-prune.html">npm-prune(1)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li></ul>
-</div>
-<p id="footer">npm-uninstall &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-unpublish.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,68 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-unpublish</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-unpublish.html">npm-unpublish</a></h1> <p>Remove a package from the registry</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm unpublish &lt;name&gt;[@&lt;version&gt;]</code></pre>
-
-<h2 id="WARNING">WARNING</h2>
-
-<p><strong>It is generally considered bad behavior to remove versions of a library
-that others are depending on!</strong></p>
-
-<p>Consider using the <code>deprecate</code> command
-instead, if your intent is to encourage users to upgrade.</p>
-
-<p>There is plenty of room on the registry.</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This removes a package version from the registry, deleting its
-entry and removing the tarball.</p>
-
-<p>If no version is specified, or if all versions are removed, then
-the root package entry is removed from the registry entirely.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-deprecate.html">npm-deprecate(1)</a></li><li><a href="../cli/npm-publish.html">npm-publish(1)</a></li><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../cli/npm-adduser.html">npm-adduser(1)</a></li><li><a href="../cli/npm-owner.html">npm-owner(1)</a></li></ul>
-</div>
-<p id="footer">npm-unpublish &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-update.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,60 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-update</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-update.html">npm-update</a></h1> <p>Update a package</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm update [-g] [&lt;name&gt; [&lt;name&gt; ...]]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command will update all the packages listed to the latest version
-(specified by the <code>tag</code> config).</p>
-
-<p>It will also install missing packages.</p>
-
-<p>If the <code>-g</code> flag is specified, this command will update globally installed packages.
-If no package name is specified, all packages in the specified location (global or local) will be updated.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../cli/npm-outdated.html">npm-outdated(1)</a></li><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-ls.html">npm-ls(1)</a></li></ul>
-</div>
-<p id="footer">npm-update &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-version.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,83 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-version</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-version.html">npm-version</a></h1> <p>Bump a package version</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm version [&lt;newversion&gt; | major | minor | patch | build]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Run this in a package directory to bump the version and write the new
-data back to the package.json file.</p>
-
-<p>The <code>newversion</code> argument should be a valid semver string, <em>or</em> a valid
-second argument to semver.inc (one of &quot;build&quot;, &quot;patch&quot;, &quot;minor&quot;, or
-&quot;major&quot;). In the second case, the existing version will be incremented
-by 1 in the specified field.</p>
-
-<p>If run in a git repo, it will also create a version commit and tag, and
-fail if the repo is not clean.</p>
-
-<p>If supplied with <code>--message</code> (shorthand: <code>-m</code>) config option, npm will
-use it as a commit message when creating a version commit.  If the
-<code>message</code> config contains <code>%s</code> then that will be replaced with the
-resulting version number.  For example:</p>
-
-<pre><code>npm version patch -m &quot;Upgrade to %s for reasons&quot;</code></pre>
-
-<p>If the <code>sign-git-tag</code> config is set, then the tag will be signed using
-the <code>-s</code> flag to git.  Note that you must have a default GPG key set up
-in your git config for this to work properly.  For example:</p>
-
-<pre><code>$ npm config set sign-git-tag true
-$ npm version patch
-
-You need a passphrase to unlock the secret key for
-user: &quot;isaacs (http://blog.izs.me/) &lt;i@izs.me&gt;&quot;
-2048-bit RSA key, ID 6C481CF6, created 2010-08-31
-
-Enter passphrase:</code></pre>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-init.html">npm-init(1)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li><li><a href="../misc/semver.html">semver(7)</a></li></ul>
-</div>
-<p id="footer">npm-version &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
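The deleted npm-version page says the `newversion` argument may be one of semver.inc's release types ("build", "patch", "minor", or "major"), incrementing the existing version by 1 in that field. A toy sketch of just the numeric increments (this is illustrative only, not the real `semver` package, which also handles prerelease tags and build metadata):

```javascript
// Toy illustration of the increment behaviour npm-version describes.
// The real logic lives in the `semver` package's inc() function.
function inc(version, release) {
  let [major, minor, patch] = version.split(".").map(Number);
  if (release === "major") { major++; minor = 0; patch = 0; }
  else if (release === "minor") { minor++; patch = 0; }
  else if (release === "patch") { patch++; }
  else throw new Error("unsupported release type: " + release);
  return [major, minor, patch].join(".");
}
```

So running `npm version patch` in a package at 1.2.3 would write 1.2.4 back to package.json, while `minor` resets the patch field and `major` resets both.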
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-view.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,125 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-view</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-view.html">npm-view</a></h1> <p>View registry info</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm view &lt;name&gt;[@&lt;version&gt;] [&lt;field&gt;[.&lt;subfield&gt;]...]
-npm v &lt;name&gt;[@&lt;version&gt;] [&lt;field&gt;[.&lt;subfield&gt;]...]</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command shows data about a package and prints it to the stream
-referenced by the <code>outfd</code> config, which defaults to stdout.</p>
-
-<p>To show the package registry entry for the <code>connect</code> package, you can do
-this:</p>
-
-<pre><code>npm view connect</code></pre>
-
-<p>The default version is &quot;latest&quot; if unspecified.</p>
-
-<p>Field names can be specified after the package descriptor.
-For example, to show the dependencies of the <code>ronn</code> package at version
-0.3.5, you could do the following:</p>
-
-<pre><code>npm view ronn@0.3.5 dependencies</code></pre>
-
-<p>You can view child fields by separating them with a period.
-To view the git repository URL for the latest version of npm, you could
-do this:</p>
-
-<pre><code>npm view npm repository.url</code></pre>
-
-<p>This makes it easy to view information about a dependency with a bit of
-shell scripting.  For example, to view all the data about the version of
-opts that ronn depends on, you can do this:</p>
-
-<pre><code>npm view opts@$(npm view ronn dependencies.opts)</code></pre>
-
-<p>For fields that are arrays, requesting a non-numeric field will return
-all of the values from the objects in the list.  For example, to get all
-the contributor email addresses for the &quot;express&quot; project, you can do this:
-
-<pre><code>npm view express contributors.email</code></pre>
-
-<p>You may also use numeric indices in square braces to specifically select
-an item in an array field.  To just get the email address of the first
-contributor in the list, you can do this:</p>
-
-<pre><code>npm view express contributors[0].email</code></pre>
-
-<p>Multiple fields may be specified, and will be printed one after another.
-For example, to get all the contributor names and email addresses, you
-can do this:</p>
-
-<pre><code>npm view express contributors.name contributors.email</code></pre>
-
-<p>&quot;Person&quot; fields are shown as a string if they would be shown as an
-object.  So, for example, this will show the list of npm contributors in
-the shortened string format.  (See <code><a href="../files/package.json.html">package.json(5)</a></code> for more on this.)</p>
-
-<pre><code>npm view npm contributors</code></pre>
-
-<p>If a version range is provided, then data will be printed for every
-matching version of the package.  This will show which version of jsdom
-was required by each matching version of yui3:</p>
-
-<pre><code>npm view yui3@&#39;&gt;0.5.4&#39; dependencies.jsdom</code></pre>
-
-<h2 id="OUTPUT">OUTPUT</h2>
-
-<p>If only a single string field for a single version is output, then it
-will not be colorized or quoted, so as to enable piping the output to
-another command. If the field is an object, it will be output as a JavaScript object literal.</p>
-
-<p>If the --json flag is given, the outputted fields will be JSON.</p>
-
-<p>If the version range matches multiple versions, then each printed value
-will be prefixed with the version it applies to.</p>
-
-<p>If multiple fields are requested, then each of them is prefixed with
-the field name.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-search.html">npm-search(1)</a></li><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../cli/npm-docs.html">npm-docs(1)</a></li></ul>
-</div>
-<p id="footer">npm-view &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
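The inline TOC script repeated at the foot of each of these deleted pages tracks the current heading level and emits opening and closing `<ul>` tags as the level rises and falls. The same level-tracking logic, extracted as a DOM-free function for clarity (the function name and input shape here are illustrative, not part of the original script):

```javascript
// DOM-free sketch of the heading-to-nested-list logic used by the
// inline TOC script above. Input: array of {level, id, text} objects
// (level 2 for <h2>, 3 for <h3>, ...). Output: HTML for the list items.
function buildToc(headings) {
  var level = 2; // the original script starts its counter at 2 (<h2>)
  var out = "";
  headings.forEach(function (h) {
    while (h.level > level) { out += "<ul>"; level++; }  // descend
    while (h.level < level) { out += "</ul>"; level--; } // ascend
    out += "<li><a href='#" + h.id + "'>" + h.text + "</a>";
  });
  return out;
}
```

Note the original (and this sketch, which mirrors it) leaves `<li>` elements unclosed and relies on the HTML parser's implicit closing; browsers tolerate this, which is presumably why the generated pages get away with it.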
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm-whoami.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-whoami</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-whoami.html">npm-whoami</a></h1> <p>Display npm username</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm whoami</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>Print the <code>username</code> config to standard output.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../cli/npm-adduser.html">npm-adduser(1)</a></li></ul>
-</div>
-<p id="footer">npm-whoami &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/npm.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,169 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm.html">npm</a></h1> <p>node package manager</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm &lt;command&gt; [args]</code></pre>
-
-<h2 id="VERSION">VERSION</h2>
-
-<p>1.3.14</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>npm is the package manager for the Node JavaScript platform.  It puts
-modules in place so that node can find them, and manages dependency
-conflicts intelligently.</p>
-
-<p>It is extremely configurable to support a wide variety of use cases.
-Most commonly, it is used to publish, discover, install, and develop node
-programs.</p>
-
-<p>Run <code>npm help</code> to get a list of available commands.</p>
-
-<h2 id="INTRODUCTION">INTRODUCTION</h2>
-
-<p>You probably got npm because you want to install stuff.</p>
-
-<p>Use <code>npm install blerg</code> to install the latest version of &quot;blerg&quot;.  Check out
-<code><a href="../cli/npm-install.html">npm-install(1)</a></code> for more info.  It can do a lot of stuff.</p>
-
-<p>Use the <code>npm search</code> command to show everything that&#39;s available.
-Use <code>npm ls</code> to show everything you&#39;ve installed.</p>
-
-<h2 id="DIRECTORIES">DIRECTORIES</h2>
-
-<p>See <code><a href="../files/npm-folders.html">npm-folders(5)</a></code> to learn about where npm puts stuff.</p>
-
-<p>In particular, npm has two modes of operation:</p>
-
-<ul><li>global mode:<br />npm installs packages into the install prefix at
-<code>prefix/lib/node_modules</code> and bins are installed in <code>prefix/bin</code>.</li><li>local mode:<br />npm installs packages into the current project directory, which
-defaults to the current working directory.  Packages are installed to
-<code>./node_modules</code>, and bins are installed to <code>./node_modules/.bin</code>.</li></ul>
-
-<p>Local mode is the default.  Use <code>--global</code> or <code>-g</code> on any command to
-operate in global mode instead.</p>
-
-<h2 id="DEVELOPER-USAGE">DEVELOPER USAGE</h2>
-
-<p>If you&#39;re using npm to develop and publish your code, check out the
-following help topics:</p>
-
-<ul><li>json:
-Make a package.json file.  See <code><a href="../files/package.json.html">package.json(5)</a></code>.</li><li>link:
-For linking your current working code into Node&#39;s path, so that you
-don&#39;t have to reinstall every time you make a change.  Use
-<code>npm link</code> to do this.</li><li>install:
-It&#39;s a good idea to install things if you don&#39;t need the symbolic link.
-In particular, installing other people&#39;s code from the registry is done via
-<code>npm install</code></li><li>adduser:
-Create an account or log in.  Credentials are stored in the
-user config file.</li><li>publish:
-Use the <code>npm publish</code> command to upload your code to the registry.</li></ul>
-
-<h2 id="CONFIGURATION">CONFIGURATION</h2>
-
-<p>npm is extremely configurable.  It reads its configuration options from
-5 places.</p>
-
-<ul><li>Command line switches:<br />Set a config with <code>--key val</code>.  All keys take a value, even if they
-are booleans (the config parser doesn&#39;t know what the options are at
-the time of parsing.)  If no value is provided, then the option is set
-to boolean <code>true</code>.</li><li>Environment Variables:<br />Set any config by prefixing the name in an environment variable with
-<code>npm_config_</code>.  For example, <code>export npm_config_key=val</code>.</li><li>User Configs:<br />The file at $HOME/.npmrc is an ini-formatted list of configs.  If
-present, it is parsed.  If the <code>userconfig</code> option is set in the cli
-or env, then that will be used instead.</li><li>Global Configs:<br />The file found at ../etc/npmrc (from the node executable, by default
-this resolves to /usr/local/etc/npmrc) will be parsed if it is found.
-If the <code>globalconfig</code> option is set in the cli, env, or user config,
-then that file is parsed instead.</li><li>Defaults:<br />npm&#39;s default configuration options are defined in
-lib/utils/config-defs.js.  These must not be changed.</li></ul>
-
-<p>See <code><a href="../misc/npm-config.html">npm-config(7)</a></code> for much much more information.</p>
-
-<h2 id="CONTRIBUTIONS">CONTRIBUTIONS</h2>
-
-<p>Patches welcome!</p>
-
-<ul><li>code:
-Read through <code><a href="../misc/npm-coding-style.html">npm-coding-style(7)</a></code> if you plan to submit code.
-You don&#39;t have to agree with it, but you do have to follow it.</li><li>docs:
-If you find an error in the documentation, edit the appropriate markdown
-file in the &quot;doc&quot; folder.  (Don&#39;t worry about generating the man page.)</li></ul>
-
-<p>Contributors are listed in npm&#39;s <code>package.json</code> file.  You can view them
-easily by doing <code>npm view npm contributors</code>.</p>
-
-<p>If you would like to contribute, but don&#39;t know what to work on, check
-the issues list or ask on the mailing list.</p>
-
-<ul><li><a href="http://github.com/isaacs/npm/issues">http://github.com/isaacs/npm/issues</a></li><li><a href="mailto:npm-@googlegroups.com">npm-@googlegroups.com</a></li></ul>
-
-<h2 id="BUGS">BUGS</h2>
-
-<p>When you find issues, please report them:</p>
-
-<ul><li>web:
-<a href="http://github.com/isaacs/npm/issues">http://github.com/isaacs/npm/issues</a></li><li>email:
-<a href="mailto:npm-@googlegroups.com">npm-@googlegroups.com</a></li></ul>
-
-<p>Be sure to include <em>all</em> of the output from the npm command that didn&#39;t work
-as expected.  The <code>npm-debug.log</code> file is also helpful to provide.</p>
-
-<p>You can also look for isaacs in #node.js on irc://irc.freenode.net.  He
-will no doubt tell you to put the output in a gist or email.</p>
-
-<h2 id="HISTORY">HISTORY</h2>
-
-<p>See <a href="../cli/npm-changelog.html">npm-changelog(1)</a></p>
-
-<h2 id="AUTHOR">AUTHOR</h2>
-
-<p><a href="http://blog.izs.me/">Isaac Z. Schlueter</a> ::
-<a href="https://github.com/isaacs/">isaacs</a> ::
-<a href="http://twitter.com/izs">@izs</a> ::
-<a href="mailto:i@izs.me">i@izs.me</a></p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-help.html">npm-help(1)</a></li><li><a href="../misc/npm-faq.html">npm-faq(7)</a></li><li><a href="../../doc/README.html">README</a></li><li><a href="../files/package.json.html">package.json(5)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../misc/npm-index.html">npm-index(7)</a></li><li><a href="../api/npm.html">npm(3)</a></li></ul>
-</div>
-<p id="footer">npm &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/cli/repo.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,64 +0,0 @@
-<!doctype html>
-<html>
-  <title>repo</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-repo.html">npm-repo</a></h1> <p>Open package repository page in the browser</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<pre><code>npm repo &lt;pkgname&gt;</code></pre>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This command tries to guess at the likely location of a package&#39;s
-repository URL, and then tries to open it using the <code>--browser</code>
-config param.</p>
-
-<h2 id="CONFIGURATION">CONFIGURATION</h2>
-
-<h3 id="browser">browser</h3>
-
-<ul><li>Default: OS X: <code>&quot;open&quot;</code>, Windows: <code>&quot;start&quot;</code>, Others: <code>&quot;xdg-open&quot;</code></li><li>Type: String</li></ul>
-
-<p>The browser that is called by the <code>npm repo</code> command to open websites.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-docs.html">npm-docs(1)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li></ul>
-</div>
-<p id="footer">repo &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/files/npm-folders.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,239 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-folders</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../files/npm-folders.html">npm-folders</a></h1> <p>Folder Structures Used by npm</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>npm puts various things on your computer.  That&#39;s its job.</p>
-
-<p>This document will tell you what it puts where.</p>
-
-<h3 id="tl-dr">tl;dr</h3>
-
-<ul><li>Local install (default): puts stuff in <code>./node_modules</code> of the current
-package root.</li><li>Global install (with <code>-g</code>): puts stuff in /usr/local or wherever node
-is installed.</li><li>Install it <strong>locally</strong> if you&#39;re going to <code>require()</code> it.</li><li>Install it <strong>globally</strong> if you&#39;re going to run it on the command line.</li><li>If you need both, then install it in both places, or use <code>npm link</code>.</li></ul>
-
-<h3 id="prefix-Configuration">prefix Configuration</h3>
-
-<p>The <code>prefix</code> config defaults to the location where node is installed.
-On most systems, this is <code>/usr/local</code>, and most of the time it is the same
-as node&#39;s <code>process.installPrefix</code>.</p>
-
-<p>On windows, this is the exact location of the node.exe binary.  On Unix
-systems, it&#39;s one level up, since node is typically installed at
-<code>{prefix}/bin/node</code> rather than <code>{prefix}/node.exe</code>.</p>
-
-<p>When the <code>global</code> flag is set, npm installs things into this prefix.
-When it is not set, it uses the root of the current package, or the
-current working directory if not in a package already.</p>
-
-<h3 id="Node-Modules">Node Modules</h3>
-
-<p>Packages are dropped into the <code>node_modules</code> folder under the <code>prefix</code>.
-When installing locally, this means that you can
-<code>require(&quot;packagename&quot;)</code> to load its main module, or
-<code>require(&quot;packagename/lib/path/to/sub/module&quot;)</code> to load other modules.</p>
-
-<p>Global installs on Unix systems go to <code>{prefix}/lib/node_modules</code>.
-Global installs on Windows go to <code>{prefix}/node_modules</code> (that is, no
-<code>lib</code> folder.)</p>
-
-<p>If you wish to <code>require()</code> a package, then install it locally.</p>
-
-<h3 id="Executables">Executables</h3>
-
-<p>When in global mode, executables are linked into <code>{prefix}/bin</code> on Unix,
-or directly into <code>{prefix}</code> on Windows.</p>
-
-<p>When in local mode, executables are linked into
-<code>./node_modules/.bin</code> so that they can be made available to scripts run
-through npm.  (For example, so that a test runner will be in the path
-when you run <code>npm test</code>.)</p>
-
-<h3 id="Man-Pages">Man Pages</h3>
-
-<p>When in global mode, man pages are linked into <code>{prefix}/share/man</code>.</p>
-
-<p>When in local mode, man pages are not installed.</p>
-
-<p>Man pages are not installed on Windows systems.</p>
-
-<h3 id="Cache">Cache</h3>
-
-<p>See <code><a href="../cli/npm-cache.html">npm-cache(1)</a></code>.  Cache files are stored in <code>~/.npm</code> on Posix, or
-<code>~/npm-cache</code> on Windows.</p>
-
-<p>This is controlled by the <code>cache</code> configuration param.</p>
-
-<h3 id="Temp-Files">Temp Files</h3>
-
-<p>Temporary files are stored by default in the folder specified by the
-<code>tmp</code> config, which defaults to the TMPDIR, TMP, or TEMP environment
-variables, or <code>/tmp</code> on Unix and <code>c:\windows\temp</code> on Windows.</p>
-
-<p>Temp files are given a unique folder under this root for each run of the
-program, and are deleted upon successful exit.</p>
-
-<h2 id="More-Information">More Information</h2>
-
-<p>When installing locally, npm first tries to find an appropriate
-<code>prefix</code> folder.  This is so that <code>npm install foo@1.2.3</code> will install
-to the sensible root of your package, even if you happen to have <code>cd</code>ed
-into some other folder.</p>
-
-<p>Starting at the $PWD, npm will walk up the folder tree checking for a
-folder that contains either a <code>package.json</code> file, or a <code>node_modules</code>
-folder.  If such a thing is found, then that is treated as the effective
-&quot;current directory&quot; for the purpose of running npm commands.  (This
-behavior is inspired by and similar to git&#39;s .git-folder seeking
-logic when running git commands in a working dir.)</p>
-
-<p>If no package root is found, then the current folder is used.</p>
-
-<p>When you run <code>npm install foo@1.2.3</code>, then the package is loaded into
-the cache, and then unpacked into <code>./node_modules/foo</code>.  Then, any of
-foo&#39;s dependencies are similarly unpacked into
-<code>./node_modules/foo/node_modules/...</code>.</p>
-
-<p>Any bin files are symlinked to <code>./node_modules/.bin/</code>, so that they may
-be found by npm scripts when necessary.</p>
-
-<h3 id="Global-Installation">Global Installation</h3>
-
-<p>If the <code>global</code> configuration is set to true, then npm will
-install packages &quot;globally&quot;.</p>
-
-<p>For global installation, packages are installed roughly the same way,
-but using the folders described above.</p>
-
-<h3 id="Cycles-Conflicts-and-Folder-Parsimony">Cycles, Conflicts, and Folder Parsimony</h3>
-
-<p>Cycles are handled using the property of node&#39;s module system that it
-walks up the directories looking for <code>node_modules</code> folders.  So, at every
-stage, if a package is already installed in an ancestor <code>node_modules</code>
-folder, then it is not installed at the current location.</p>
-
-<p>Consider the case above, where <code>foo -&gt; bar -&gt; baz</code>.  Imagine if, in
-addition to that, baz depended on bar, so you&#39;d have:
-<code>foo -&gt; bar -&gt; baz -&gt; bar -&gt; baz ...</code>.  However, since the folder
-structure is: <code>foo/node_modules/bar/node_modules/baz</code>, there&#39;s no need to
-put another copy of bar into <code>.../baz/node_modules</code>, since when it calls
-require(&quot;bar&quot;), it will get the copy that is installed in
-<code>foo/node_modules/bar</code>.</p>
-
-<p>This shortcut is only used if the exact same
-version would be installed in multiple nested <code>node_modules</code> folders.  It
-is still possible to have <code>a/node_modules/b/node_modules/a</code> if the two
-&quot;a&quot; packages are different versions.  However, without repeating the
-exact same package multiple times, an infinite regress will always be
-prevented.</p>
-
-<p>Another optimization can be made by installing dependencies at the
-highest level possible, below the localized &quot;target&quot; folder.</p>
-
-<h4 id="Example">Example</h4>
-
-<p>Consider this dependency graph:</p>
-
-<pre><code>foo
-+-- blerg@1.2.5
-+-- bar@1.2.3
-|   +-- blerg@1.x (latest=1.3.7)
-|   +-- baz@2.x
-|   |   `-- quux@3.x
-|   |       `-- bar@1.2.3 (cycle)
-|   `-- asdf@*
-`-- baz@1.2.3
-    `-- quux@3.x
-        `-- bar</code></pre>
-
-<p>In this case, we might expect a folder structure like this:</p>
-
-<pre><code>foo
-+-- node_modules
-    +-- blerg (1.2.5) &lt;---[A]
-    +-- bar (1.2.3) &lt;---[B]
-    |   `-- node_modules
-    |       +-- baz (2.0.2) &lt;---[C]
-    |       |   `-- node_modules
-    |       |       `-- quux (3.2.0)
-    |       `-- asdf (2.3.4)
-    `-- baz (1.2.3) &lt;---[D]
-        `-- node_modules
-            `-- quux (3.2.0) &lt;---[E]</code></pre>
-
-<p>Since foo depends directly on <code>bar@1.2.3</code> and <code>baz@1.2.3</code>, those are
-installed in foo&#39;s <code>node_modules</code> folder.</p>
-
-<p>Even though the latest copy of blerg is 1.3.7, foo has a specific
-dependency on version 1.2.5.  So, that gets installed at [A].  Since the
-parent installation of blerg satisfies bar&#39;s dependency on <code>blerg@1.x</code>,
-it does not install another copy under [B].</p>
-
-<p>Bar [B] also has dependencies on baz and asdf, so those are installed in
-bar&#39;s <code>node_modules</code> folder.  Because it depends on <code>baz@2.x</code>, it cannot
-re-use the <code>baz@1.2.3</code> installed in the parent <code>node_modules</code> folder [D],
-and must install its own copy [C].</p>
-
-<p>Underneath bar, the <code>baz -&gt; quux -&gt; bar</code> dependency creates a cycle.
-However, because bar is already in quux&#39;s ancestry [B], it does not
-unpack another copy of bar into that folder.</p>
-
-<p>Underneath <code>foo -&gt; baz</code> [D], quux&#39;s [E] folder tree is empty, because its
-dependency on bar is satisfied by the parent folder copy installed at [B].</p>
-
-<p>For a graphical breakdown of what is installed where, use <code>npm ls</code>.</p>
-
-<h3 id="Publishing">Publishing</h3>
-
-<p>Upon publishing, npm will look in the <code>node_modules</code> folder.  If any of
-the items there are not in the <code>bundledDependencies</code> array, then they will
-not be included in the package tarball.</p>
-
-<p>This allows a package maintainer to install all of their dependencies
-(and dev dependencies) locally, but only re-publish those items that
-cannot be found elsewhere.  See <code><a href="../files/package.json.html">package.json(5)</a></code> for more information.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-faq.html">npm-faq(7)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../cli/npm-pack.html">npm-pack(1)</a></li><li><a href="../cli/npm-cache.html">npm-cache(1)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../cli/npm-publish.html">npm-publish(1)</a></li></ul>
-</div>
-<p id="footer">npm-folders &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/files/npm-global.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,239 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-folders</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../files/npm-folders.html">npm-folders</a></h1> <p>Folder Structures Used by npm</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>npm puts various things on your computer.  That&#39;s its job.</p>
-
-<p>This document will tell you what it puts where.</p>
-
-<h3 id="tl-dr">tl;dr</h3>
-
-<ul><li>Local install (default): puts stuff in <code>./node_modules</code> of the current
-package root.</li><li>Global install (with <code>-g</code>): puts stuff in /usr/local or wherever node
-is installed.</li><li>Install it <strong>locally</strong> if you&#39;re going to <code>require()</code> it.</li><li>Install it <strong>globally</strong> if you&#39;re going to run it on the command line.</li><li>If you need both, then install it in both places, or use <code>npm link</code>.</li></ul>
-
-<h3 id="prefix-Configuration">prefix Configuration</h3>
-
-<p>The <code>prefix</code> config defaults to the location where node is installed.
-On most systems, this is <code>/usr/local</code>, and most of the time is the same
-as node&#39;s <code>process.installPrefix</code>.</p>
-
-<p>On windows, this is the exact location of the node.exe binary.  On Unix
-systems, it&#39;s one level up, since node is typically installed at
-<code>{prefix}/bin/node</code> rather than <code>{prefix}/node.exe</code>.</p>
-
-<p>When the <code>global</code> flag is set, npm installs things into this prefix.
-When it is not set, it uses the root of the current package, or the
-current working directory if not in a package already.</p>
-
-<h3 id="Node-Modules">Node Modules</h3>
-
-<p>Packages are dropped into the <code>node_modules</code> folder under the <code>prefix</code>.
-When installing locally, this means that you can
-<code>require(&quot;packagename&quot;)</code> to load its main module, or
-<code>require(&quot;packagename/lib/path/to/sub/module&quot;)</code> to load other modules.</p>
-
-<p>Global installs on Unix systems go to <code>{prefix}/lib/node_modules</code>.
-Global installs on Windows go to <code>{prefix}/node_modules</code> (that is, no
-<code>lib</code> folder.)</p>
-
-<p>If you wish to <code>require()</code> a package, then install it locally.</p>
-
-<h3 id="Executables">Executables</h3>
-
-<p>When in global mode, executables are linked into <code>{prefix}/bin</code> on Unix,
-or directly into <code>{prefix}</code> on Windows.</p>
-
-<p>When in local mode, executables are linked into
-<code>./node_modules/.bin</code> so that they can be made available to scripts run
-through npm.  (For example, so that a test runner will be in the path
-when you run <code>npm test</code>.)</p>
-
-<h3 id="Man-Pages">Man Pages</h3>
-
-<p>When in global mode, man pages are linked into <code>{prefix}/share/man</code>.</p>
-
-<p>When in local mode, man pages are not installed.</p>
-
-<p>Man pages are not installed on Windows systems.</p>
-
-<h3 id="Cache">Cache</h3>
-
-<p>See <code><a href="../cli/npm-cache.html">npm-cache(1)</a></code>.  Cache files are stored in <code>~/.npm</code> on Posix, or
-<code>~/npm-cache</code> on Windows.</p>
-
-<p>This is controlled by the <code>cache</code> configuration param.</p>
-
-<h3 id="Temp-Files">Temp Files</h3>
-
-<p>Temporary files are stored by default in the folder specified by the
-<code>tmp</code> config, which defaults to the TMPDIR, TMP, or TEMP environment
-variables, or <code>/tmp</code> on Unix and <code>c:\windows\temp</code> on Windows.</p>
-
-<p>Temp files are given a unique folder under this root for each run of the
-program, and are deleted upon successful exit.</p>
-
-<h2 id="More-Information">More Information</h2>
-
-<p>When installing locally, npm first tries to find an appropriate
-<code>prefix</code> folder.  This is so that <code>npm install foo@1.2.3</code> will install
-to the sensible root of your package, even if you happen to have <code>cd</code>ed
-into some other folder.</p>
-
-<p>Starting at the $PWD, npm will walk up the folder tree checking for a
-folder that contains either a <code>package.json</code> file, or a <code>node_modules</code>
-folder.  If such a thing is found, then that is treated as the effective
-&quot;current directory&quot; for the purpose of running npm commands.  (This
-behavior is inspired by and similar to git&#39;s .git-folder seeking
-logic when running git commands in a working dir.)</p>
-
-<p>If no package root is found, then the current folder is used.</p>
-
-<p>When you run <code>npm install foo@1.2.3</code>, then the package is loaded into
-the cache, and then unpacked into <code>./node_modules/foo</code>.  Then, any of
-foo&#39;s dependencies are similarly unpacked into
-<code>./node_modules/foo/node_modules/...</code>.</p>
-
-<p>Any bin files are symlinked to <code>./node_modules/.bin/</code>, so that they may
-be found by npm scripts when necessary.</p>
-
-<h3 id="Global-Installation">Global Installation</h3>
-
-<p>If the <code>global</code> configuration is set to true, then npm will
-install packages &quot;globally&quot;.</p>
-
-<p>For global installation, packages are installed roughly the same way,
-but using the folders described above.</p>
-
-<h3 id="Cycles-Conflicts-and-Folder-Parsimony">Cycles, Conflicts, and Folder Parsimony</h3>
-
-<p>Cycles are handled using the property of node&#39;s module system that it
-walks up the directories looking for <code>node_modules</code> folders.  So, at every
-stage, if a package is already installed in an ancestor <code>node_modules</code>
-folder, then it is not installed at the current location.</p>
-
-<p>Consider the case above, where <code>foo -&gt; bar -&gt; baz</code>.  Imagine if, in
-addition to that, baz depended on bar, so you&#39;d have:
-<code>foo -&gt; bar -&gt; baz -&gt; bar -&gt; baz ...</code>.  However, since the folder
-structure is: <code>foo/node_modules/bar/node_modules/baz</code>, there&#39;s no need to
-put another copy of bar into <code>.../baz/node_modules</code>, since when it calls
-require(&quot;bar&quot;), it will get the copy that is installed in
-<code>foo/node_modules/bar</code>.</p>
-
-<p>This shortcut is only used if the exact same
-version would be installed in multiple nested <code>node_modules</code> folders.  It
-is still possible to have <code>a/node_modules/b/node_modules/a</code> if the two
-&quot;a&quot; packages are different versions.  However, because the exact same
-package is never repeated at multiple nesting levels, an infinite regress
-is always prevented.</p>
-
-<p>Another optimization can be made by installing dependencies at the
-highest level possible, below the localized &quot;target&quot; folder.</p>
-
-<h4 id="Example">Example</h4>
-
-<p>Consider this dependency graph:</p>
-
-<pre><code>foo
-+-- blerg@1.2.5
-+-- bar@1.2.3
-|   +-- blerg@1.x (latest=1.3.7)
-|   +-- baz@2.x
-|   |   `-- quux@3.x
-|   |       `-- bar@1.2.3 (cycle)
-|   `-- asdf@*
-`-- baz@1.2.3
-    `-- quux@3.x
-        `-- bar</code></pre>
-
-<p>In this case, we might expect a folder structure like this:</p>
-
-<pre><code>foo
-+-- node_modules
-    +-- blerg (1.2.5) &lt;---[A]
-    +-- bar (1.2.3) &lt;---[B]
-    |   `-- node_modules
-    |       +-- baz (2.0.2) &lt;---[C]
-    |       |   `-- node_modules
-    |       |       `-- quux (3.2.0)
-    |       `-- asdf (2.3.4)
-    `-- baz (1.2.3) &lt;---[D]
-        `-- node_modules
-            `-- quux (3.2.0) &lt;---[E]</code></pre>
-
-<p>Since foo depends directly on <code>bar@1.2.3</code> and <code>baz@1.2.3</code>, those are
-installed in foo&#39;s <code>node_modules</code> folder.</p>
-
-<p>Even though the latest copy of blerg is 1.3.7, foo has a specific
-dependency on version 1.2.5.  So, that gets installed at [A].  Since the
-parent installation of blerg satisfies bar&#39;s dependency on <code>blerg@1.x</code>,
-it does not install another copy under [B].</p>
-
-<p>Bar [B] also has dependencies on baz and asdf, so those are installed in
-bar&#39;s <code>node_modules</code> folder.  Because it depends on <code>baz@2.x</code>, it cannot
-re-use the <code>baz@1.2.3</code> installed in the parent <code>node_modules</code> folder [D],
-and must install its own copy [C].</p>
-
-<p>Underneath bar, the <code>baz -&gt; quux -&gt; bar</code> dependency creates a cycle.
-However, because bar is already in quux&#39;s ancestry [B], it does not
-unpack another copy of bar into that folder.</p>
-
-<p>Underneath <code>foo -&gt; baz</code> [D], quux&#39;s [E] folder tree is empty, because its
-dependency on bar is satisfied by the parent folder copy installed at [B].</p>
-
-<p>For a graphical breakdown of what is installed where, use <code>npm ls</code>.</p>
-
-<h3 id="Publishing">Publishing</h3>
-
-<p>Upon publishing, npm will look in the <code>node_modules</code> folder.  If any of
-the items there are not in the <code>bundledDependencies</code> array, then they will
-not be included in the package tarball.</p>
-
-<p>This allows a package maintainer to install all of their dependencies
-(and dev dependencies) locally, but only re-publish those items that
-cannot be found elsewhere.  See <code><a href="../files/package.json.html">package.json(5)</a></code> for more information.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-faq.html">npm-faq(7)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../cli/npm-pack.html">npm-pack(1)</a></li><li><a href="../cli/npm-cache.html">npm-cache(1)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../cli/npm-publish.html">npm-publish(1)</a></li></ul>
-</div>
-<p id="footer">npm-folders &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/files/npm-json.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,584 +0,0 @@
-<!doctype html>
-<html>
-  <title>package.json</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../files/package.json.html">package.json</a></h1> <p>Specifics of npm&#39;s package.json handling</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This document is all you need to know about what&#39;s required in your package.json
-file.  It must be actual JSON, not just a JavaScript object literal.</p>
-
-<p>A lot of the behavior described in this document is affected by the config
-settings described in <code><a href="../misc/npm-config.html">npm-config(7)</a></code>.</p>
-
-<h2 id="DEFAULT-VALUES">DEFAULT VALUES</h2>
-
-<p>npm will default some values based on package contents.</p>
-
-<ul><li><p><code>&quot;scripts&quot;: {&quot;start&quot;: &quot;node server.js&quot;}</code></p><p>If there is a <code>server.js</code> file in the root of your package, then npm
-will default the <code>start</code> command to <code>node server.js</code>.</p></li><li><p><code>&quot;scripts&quot;:{&quot;preinstall&quot;: &quot;node-waf clean || true; node-waf configure build&quot;}</code></p><p>If there is a <code>wscript</code> file in the root of your package, npm will
-default the <code>preinstall</code> command to compile using node-waf.</p></li><li><p><code>&quot;scripts&quot;:{&quot;preinstall&quot;: &quot;node-gyp rebuild&quot;}</code></p><p>If there is a <code>binding.gyp</code> file in the root of your package, npm will
-default the <code>preinstall</code> command to compile using node-gyp.</p></li><li><p><code>&quot;contributors&quot;: [...]</code></p><p>If there is an <code>AUTHORS</code> file in the root of your package, npm will
-treat each line as a <code>Name &lt;email&gt; (url)</code> format, where email and url
-are optional.  Lines which start with a <code>#</code> or are blank will be
-ignored.</p></li></ul>
-
-<h2 id="name">name</h2>
-
-<p>The <em>most</em> important things in your package.json are the name and version fields.
-Those are actually required, and your package won&#39;t install without
-them.  The name and version together form an identifier that is assumed
-to be completely unique.  Changes to the package should come along with
-changes to the version.</p>
-
-<p>The name is what your thing is called.  Some tips:</p>
-
-<ul><li>Don&#39;t put &quot;js&quot; or &quot;node&quot; in the name.  It&#39;s assumed that it&#39;s js, since you&#39;re
-writing a package.json file, and you can specify the engine using the &quot;engines&quot;
-field.  (See below.)</li><li>The name ends up being part of a URL, an argument on the command line, and a
-folder name. Any name with non-url-safe characters will be rejected.
-Also, it can&#39;t start with a dot or an underscore.</li><li>The name will probably be passed as an argument to require(), so it should
-be something short, but also reasonably descriptive.</li><li>You may want to check the npm registry to see if there&#39;s something by that name
-already, before you get too attached to it.  http://registry.npmjs.org/</li></ul>
-
-<h2 id="version">version</h2>
-
-<p>The <em>most</em> important things in your package.json are the name and version fields.
-Those are actually required, and your package won&#39;t install without
-them.  The name and version together form an identifier that is assumed
-to be completely unique.  Changes to the package should come along with
-changes to the version.</p>
-
-<p>Version must be parseable by
-<a href="https://github.com/isaacs/node-semver">node-semver</a>, which is bundled
-with npm as a dependency.  (<code>npm install semver</code> to use it yourself.)</p>
-
-<p>More on version numbers and ranges at <a href="../misc/semver.html">semver(7)</a>.</p>
-
-<h2 id="description">description</h2>
-
-<p>Put a description in it.  It&#39;s a string.  This helps people discover your
-package, as it&#39;s listed in <code>npm search</code>.</p>
-
-<h2 id="keywords">keywords</h2>
-
-<p>Put keywords in it.  It&#39;s an array of strings.  This helps people
-discover your package as it&#39;s listed in <code>npm search</code>.</p>
-
-<h2 id="homepage">homepage</h2>
-
-<p>The url to the project homepage.</p>
-
-<p><strong>NOTE</strong>: This is <em>not</em> the same as &quot;url&quot;.  If you put a &quot;url&quot; field,
-then the registry will think it&#39;s a redirection to your package that has
-been published somewhere else, and spit at you.</p>
-
-<p>Literally.  Spit.  I&#39;m so not kidding.</p>
-
-<h2 id="bugs">bugs</h2>
-
-<p>The url to your project&#39;s issue tracker and / or the email address to which
-issues should be reported. These are helpful for people who encounter issues
-with your package.</p>
-
-<p>It should look like this:</p>
-
-<pre><code>{ &quot;url&quot; : &quot;http://github.com/owner/project/issues&quot;
-, &quot;email&quot; : &quot;project@hostname.com&quot;
-}</code></pre>
-
-<p>You can specify either one or both values. If you want to provide only a url,
-you can specify the value for &quot;bugs&quot; as a simple string instead of an object.</p>
-
-<p>If a url is provided, it will be used by the <code>npm bugs</code> command.</p>
-
-<h2 id="license">license</h2>
-
-<p>You should specify a license for your package so that people know how they are
-permitted to use it, and any restrictions you&#39;re placing on it.</p>
-
-<p>The simplest way, assuming you&#39;re using a common license such as BSD or MIT, is
-to just specify the name of the license you&#39;re using, like this:</p>
-
-<pre><code>{ &quot;license&quot; : &quot;BSD&quot; }</code></pre>
-
-<p>If you have more complex licensing terms, or you want to provide more detail
-in your package.json file, you can use the more verbose plural form, like this:</p>
-
-<pre><code>&quot;licenses&quot; : [
-  { &quot;type&quot; : &quot;MyLicense&quot;
-  , &quot;url&quot; : &quot;http://github.com/owner/project/path/to/license&quot;
-  }
-]</code></pre>
-
-<p>It&#39;s also a good idea to include a license file at the top level in your package.</p>
-
-<h2 id="people-fields-author-contributors">people fields: author, contributors</h2>
-
-<p>The &quot;author&quot; is one person.  &quot;contributors&quot; is an array of people.  A &quot;person&quot;
-is an object with a &quot;name&quot; field and optionally &quot;url&quot; and &quot;email&quot;, like this:</p>
-
-<pre><code>{ &quot;name&quot; : &quot;Barney Rubble&quot;
-, &quot;email&quot; : &quot;b@rubble.com&quot;
-, &quot;url&quot; : &quot;http://barnyrubble.tumblr.com/&quot;
-}</code></pre>
-
-<p>Or you can shorten that all into a single string, and npm will parse it for you:</p>
-
-<pre><code>&quot;Barney Rubble &lt;b@rubble.com&gt; (http://barnyrubble.tumblr.com/)</code></pre>
-
-<p>Both email and url are optional either way.</p>
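Parsing the single-string form can be sketched like this (hypothetical helper; npm's real parser is more forgiving):

```javascript
// Sketch: split "Name <email> (url)" into a person object.
// The <email> and (url) parts are both optional.
function parsePerson (str) {
  var person = {}
  var email = str.match(/<([^>]+)>/)
  var url = str.match(/\(([^)]+)\)/)
  if (email) person.email = email[1]
  if (url) person.url = url[1]
  person.name = str.replace(/<[^>]*>/, '').replace(/\([^)]*\)/, '').trim()
  return person
}
```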
-
-<p>npm also sets a top-level &quot;maintainers&quot; field with your npm user info.</p>
-
-<h2 id="files">files</h2>
-
-<p>The &quot;files&quot; field is an array of files to include in your project.  If
-you name a folder in the array, then it will also include the files
-inside that folder. (Unless they would be ignored by another rule.)</p>
-
-<p>You can also provide a &quot;.npmignore&quot; file in the root of your package,
-which will keep files from being included, even if they would be picked
-up by the files array.  The &quot;.npmignore&quot; file works just like a
-&quot;.gitignore&quot;.</p>
-
-<h2 id="main">main</h2>
-
-<p>The main field is a module ID that is the primary entry point to your program.
-That is, if your package is named <code>foo</code>, and a user installs it, and then does
-<code>require(&quot;foo&quot;)</code>, then your main module&#39;s exports object will be returned.</p>
-
-<p>This should be a module ID relative to the root of your package folder.</p>
-
-<p>For most modules, it makes the most sense to have a main script and often not
-much else.</p>
-
-<h2 id="bin">bin</h2>
-
-<p>A lot of packages have one or more executable files that they&#39;d like to
-install into the PATH. npm makes this pretty easy (in fact, it uses this
-feature to install the &quot;npm&quot; executable.)</p>
-
-<p>To use this, supply a <code>bin</code> field in your package.json which is a map of
-command name to local file name. On install, npm will symlink that file into
-<code>prefix/bin</code> for global installs, or <code>./node_modules/.bin/</code> for local
-installs.</p>
-
-<p>For example, npm has this:</p>
-
-<pre><code>{ &quot;bin&quot; : { &quot;npm&quot; : &quot;./cli.js&quot; } }</code></pre>
-
-<p>So, when you install npm, it&#39;ll create a symlink from the <code>cli.js</code> script to
-<code>/usr/local/bin/npm</code>.</p>
-
-<p>If you have a single executable, and its name should be the name
-of the package, then you can just supply it as a string.  For example:</p>
-
-<pre><code>{ &quot;name&quot;: &quot;my-program&quot;
-, &quot;version&quot;: &quot;1.2.5&quot;
-, &quot;bin&quot;: &quot;./path/to/program&quot; }</code></pre>
-
-<p>would be the same as this:</p>
-
-<pre><code>{ &quot;name&quot;: &quot;my-program&quot;
-, &quot;version&quot;: &quot;1.2.5&quot;
-, &quot;bin&quot; : { &quot;my-program&quot; : &quot;./path/to/program&quot; } }</code></pre>
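That normalization can be sketched as (hypothetical helper name, not npm's internal code):

```javascript
// Sketch: normalize a string "bin" field into the
// command-name-to-file-path map form.
function normalizeBin (pkg) {
  if (typeof pkg.bin === 'string') {
    var bin = {}
    bin[pkg.name] = pkg.bin
    return bin
  }
  return pkg.bin || {}
}
```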
-
-<h2 id="man">man</h2>
-
-<p>Specify either a single file or an array of filenames to put in place for the
-<code>man</code> program to find.</p>
-
-<p>If only a single file is provided, then it&#39;s installed such that it is the
-result from <code>man &lt;pkgname&gt;</code>, regardless of its actual filename.  For example:</p>
-
-<pre><code>{ &quot;name&quot; : &quot;foo&quot;
-, &quot;version&quot; : &quot;1.2.3&quot;
-, &quot;description&quot; : &quot;A packaged foo fooer for fooing foos&quot;
-, &quot;main&quot; : &quot;foo.js&quot;
-, &quot;man&quot; : &quot;./man/doc.1&quot;
-}</code></pre>
-
-<p>would link in the <code>./man/doc.1</code> file such that it is the target for <code>man foo</code>.</p>
-
-<p>If the filename doesn&#39;t start with the package name, then it&#39;s prefixed.
-So, this:</p>
-
-<pre><code>{ &quot;name&quot; : &quot;foo&quot;
-, &quot;version&quot; : &quot;1.2.3&quot;
-, &quot;description&quot; : &quot;A packaged foo fooer for fooing foos&quot;
-, &quot;main&quot; : &quot;foo.js&quot;
-, &quot;man&quot; : [ &quot;./man/foo.1&quot;, &quot;./man/bar.1&quot; ]
-}</code></pre>
-
-<p>will create files to do <code>man foo</code> and <code>man foo-bar</code>.</p>
-
-<p>Man files must end with a number, and optionally a <code>.gz</code> suffix if they are
-compressed.  The number dictates which man section the file is installed into.</p>
-
-<pre><code>{ &quot;name&quot; : &quot;foo&quot;
-, &quot;version&quot; : &quot;1.2.3&quot;
-, &quot;description&quot; : &quot;A packaged foo fooer for fooing foos&quot;
-, &quot;main&quot; : &quot;foo.js&quot;
-, &quot;man&quot; : [ &quot;./man/foo.1&quot;, &quot;./man/foo.2&quot; ]
-}</code></pre>
-
-<p>will create entries for <code>man foo</code> and <code>man 2 foo</code>.</p>
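Pulling the section number out of a filename can be sketched as (hypothetical helper):

```javascript
// Sketch: derive the man section from a filename like
// "./man/foo.1" or "./man/foo.1.gz".
function manSection (file) {
  var m = file.match(/\.([0-9]+)(\.gz)?$/)
  return m ? parseInt(m[1], 10) : null
}
```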
-
-<h2 id="directories">directories</h2>
-
-<p>The CommonJS <a href="http://wiki.commonjs.org/wiki/Packages/1.0">Packages</a> spec details a
-few ways that you can indicate the structure of your package using a <code>directories</code>
-hash. If you look at <a href="http://registry.npmjs.org/npm/latest">npm&#39;s package.json</a>,
-you&#39;ll see that it has directories for doc, lib, and man.</p>
-
-<p>In the future, this information may be used in other creative ways.</p>
-
-<h3 id="directories-lib">directories.lib</h3>
-
-<p>Tell people where the bulk of your library is.  Nothing special is done
-with the lib folder in any way, but it&#39;s useful meta info.</p>
-
-<h3 id="directories-bin">directories.bin</h3>
-
-<p>If you specify a &quot;bin&quot; directory, then all the files in that folder will
-be used as the &quot;bin&quot; hash.</p>
-
-<p>If you have a &quot;bin&quot; hash already, then this has no effect.</p>
-
-<h3 id="directories-man">directories.man</h3>
-
-<p>A folder that is full of man pages.  Sugar to generate a &quot;man&quot; array by
-walking the folder.</p>
-
-<h3 id="directories-doc">directories.doc</h3>
-
-<p>Put markdown files in here.  Eventually, these will be displayed nicely,
-maybe, someday.</p>
-
-<h3 id="directories-example">directories.example</h3>
-
-<p>Put example scripts in here.  Someday, it might be exposed in some clever way.</p>
-
-<h2 id="repository">repository</h2>
-
-<p>Specify the place where your code lives. This is helpful for people who
-want to contribute.  If the git repo is on github, then the <code>npm docs</code>
-command will be able to find you.</p>
-
-<p>Do it like this:</p>
-
-<pre><code>&quot;repository&quot; :
-  { &quot;type&quot; : &quot;git&quot;
-  , &quot;url&quot; : &quot;http://github.com/isaacs/npm.git&quot;
-  }
-
-&quot;repository&quot; :
-  { &quot;type&quot; : &quot;svn&quot;
-  , &quot;url&quot; : &quot;http://v8.googlecode.com/svn/trunk/&quot;
-  }</code></pre>
-
-<p>The URL should be a publicly available (perhaps read-only) url that can be handed
-directly to a VCS program without any modification.  It should not be a url to an
-html project page that you put in your browser.  It&#39;s for computers.</p>
-
-<h2 id="scripts">scripts</h2>
-
-<p>The &quot;scripts&quot; member is an object hash of script commands that are run
-at various times in the lifecycle of your package.  The key is the lifecycle
-event, and the value is the command to run at that point.</p>
-
-<p>See <code><a href="../misc/npm-scripts.html">npm-scripts(7)</a></code> to find out more about writing package scripts.</p>
-
-<h2 id="config">config</h2>
-
-<p>A &quot;config&quot; hash can be used to set configuration
-parameters used in package scripts that persist across upgrades.  For
-instance, if a package had the following:</p>
-
-<pre><code>{ &quot;name&quot; : &quot;foo&quot;
-, &quot;config&quot; : { &quot;port&quot; : &quot;8080&quot; } }</code></pre>
-
-<p>and then had a &quot;start&quot; command that then referenced the
-<code>npm_package_config_port</code> environment variable, then the user could
-override that by doing <code>npm config set foo:port 8001</code>.</p>
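A start script can then pick the value up from the environment. A minimal sketch (assuming npm has set `npm_package_config_port` when running the script):

```javascript
// Sketch: resolve the port a "start" script should listen on,
// preferring the npm_package_config_port variable that npm sets
// from the "config" hash (or from an "npm config set foo:port" override).
function resolvePort (env) {
  return parseInt(env.npm_package_config_port || '8080', 10)
}

// e.g. http.createServer(handler).listen(resolvePort(process.env))
```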
-
-<p>See <code><a href="../misc/npm-config.html">npm-config(7)</a></code> and <code><a href="../misc/npm-scripts.html">npm-scripts(7)</a></code> for more on package
-configs.</p>
-
-<h2 id="dependencies">dependencies</h2>
-
-<p>Dependencies are specified with a simple hash of package name to
-version range. The version range is a string which has one or more
-space-separated descriptors.  Dependencies can also be identified with
-a tarball or git URL.</p>
-
-<p><strong>Please do not put test harnesses or transpilers in your
-<code>dependencies</code> hash.</strong>  See <code>devDependencies</code>, below.</p>
-
-<p>See <a href="../misc/semver.html">semver(7)</a> for more details about specifying version ranges.</p>
-
-<ul><li><code>version</code> Must match <code>version</code> exactly</li><li><code>&gt;version</code> Must be greater than <code>version</code></li><li><code>&gt;=version</code> etc</li><li><code>&lt;version</code></li><li><code>&lt;=version</code></li><li><code>~version</code> &quot;Approximately equivalent to version&quot;  See <a href="../misc/semver.html">semver(7)</a></li><li><code>1.2.x</code> 1.2.0, 1.2.1, etc., but not 1.3.0</li><li><code>http://...</code> See &#39;URLs as Dependencies&#39; below</li><li><code>*</code> Matches any version</li><li><code>&quot;&quot;</code> (just an empty string) Same as <code>*</code></li><li><code>version1 - version2</code> Same as <code>&gt;=version1 &lt;=version2</code>.</li><li><code>range1 || range2</code> Passes if either range1 or range2 are satisfied.</li><li><code>git...</code> See &#39;Git URLs as Dependencies&#39; below</li><li><code>user/repo</code> See &#39;GitHub URLs&#39; below</li></ul>
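For instance, the `1.2.x` style above can be checked with a tiny matcher (a simplified sketch only; npm itself uses the bundled node-semver for the full range grammar):

```javascript
// Sketch: match a version against an "x-range" like "1.2.x",
// where "x" or "*" components match anything.
function matchesXRange (version, range) {
  var want = range.split('.')
  var have = version.split('.')
  for (var i = 0; i < want.length; i++) {
    if (want[i] === 'x' || want[i] === '*') continue
    if (want[i] !== have[i]) return false
  }
  return true
}
```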
-
-<p>For example, these are all valid:</p>
-
-<pre><code>{ &quot;dependencies&quot; :
-  { &quot;foo&quot; : &quot;1.0.0 - 2.9999.9999&quot;
-  , &quot;bar&quot; : &quot;&gt;=1.0.2 &lt;2.1.2&quot;
-  , &quot;baz&quot; : &quot;&gt;1.0.2 &lt;=2.3.4&quot;
-  , &quot;boo&quot; : &quot;2.0.1&quot;
-  , &quot;qux&quot; : &quot;&lt;1.0.0 || &gt;=2.3.1 &lt;2.4.5 || &gt;=2.5.2 &lt;3.0.0&quot;
-  , &quot;asd&quot; : &quot;http://asdf.com/asdf.tar.gz&quot;
-  , &quot;til&quot; : &quot;~1.2&quot;
-  , &quot;elf&quot; : &quot;~1.2.3&quot;
-  , &quot;two&quot; : &quot;2.x&quot;
-  , &quot;thr&quot; : &quot;3.3.x&quot;
-  }
-}</code></pre>
-
-<h3 id="URLs-as-Dependencies">URLs as Dependencies</h3>
-
-<p>You may specify a tarball URL in place of a version range.</p>
-
-<p>This tarball will be downloaded and installed locally to your package at
-install time.</p>
-
-<h3 id="Git-URLs-as-Dependencies">Git URLs as Dependencies</h3>
-
-<p>Git urls can be of the form:</p>
-
-<pre><code>git://github.com/user/project.git#commit-ish
-git+ssh://user@hostname:project.git#commit-ish
-git+ssh://user@hostname/project.git#commit-ish
-git+http://user@hostname/project/blah.git#commit-ish
-git+https://user@hostname/project/blah.git#commit-ish</code></pre>
-
-<p>The <code>commit-ish</code> can be any tag, sha, or branch which can be supplied as
-an argument to <code>git checkout</code>.  The default is <code>master</code>.</p>
-
-<h2 id="GitHub-URLs">GitHub URLs</h2>
-
-<p>As of version 1.1.65, you can refer to GitHub urls as just &quot;foo&quot;: &quot;user/foo-project&quot;. For example:</p>
-
-<pre><code>{
-  &quot;name&quot;: &quot;foo&quot;,
-  &quot;version&quot;: &quot;0.0.0&quot;,
-  &quot;dependencies&quot;: {
-    &quot;express&quot;: &quot;visionmedia/express&quot;
-  }
-}</code></pre>
-
-<h2 id="devDependencies">devDependencies</h2>
-
-<p>If someone is planning on downloading and using your module in their
-program, then they probably don&#39;t want or need to download and build
-the external test or documentation framework that you use.</p>
-
-<p>In this case, it&#39;s best to list these additional items in a
-<code>devDependencies</code> hash.</p>
-
-<p>These things will be installed when doing <code>npm link</code> or <code>npm install</code>
-from the root of a package, and can be managed like any other npm
-configuration param.  See <code><a href="../misc/npm-config.html">npm-config(7)</a></code> for more on the topic.</p>
-
-<p>For build steps that are not platform-specific, such as compiling
-CoffeeScript or other languages to JavaScript, use the <code>prepublish</code>
-script to do this, and make the required package a devDependency.</p>
-
-<p>For example:</p>
-
-<pre><code>{ &quot;name&quot;: &quot;ethopia-waza&quot;,
-  &quot;description&quot;: &quot;a delightfully fruity coffee varietal&quot;,
-  &quot;version&quot;: &quot;1.2.3&quot;,
-  &quot;devDependencies&quot;: {
-    &quot;coffee-script&quot;: &quot;~1.6.3&quot;
-  },
-  &quot;scripts&quot;: {
-    &quot;prepublish&quot;: &quot;coffee -o lib/ -c src/waza.coffee&quot;
-  },
-  &quot;main&quot;: &quot;lib/waza.js&quot;
-}</code></pre>
-
-<p>The <code>prepublish</code> script will be run before publishing, so that users
-can consume the functionality without requiring them to compile it
-themselves.  In dev mode (ie, locally running <code>npm install</code>), it&#39;ll
-run this script as well, so that you can test it easily.</p>
-
-<h2 id="bundledDependencies">bundledDependencies</h2>
-
-<p>Array of package names that will be bundled when publishing the package.</p>
-
-<p>If this is spelled <code>&quot;bundleDependencies&quot;</code>, then that is also honorable.</p>
-
-<h2 id="optionalDependencies">optionalDependencies</h2>
-
-<p>If a dependency can be used, but you would like npm to proceed if it
-cannot be found or fails to install, then you may put it in the
-<code>optionalDependencies</code> hash.  This is a map of package name to version
-or url, just like the <code>dependencies</code> hash.  The difference is that
-failure is tolerated.</p>
-
-<p>It is still your program&#39;s responsibility to handle the lack of the
-dependency.  For example, something like this:</p>
-
-<pre><code>try {
-  var foo = require(&#39;foo&#39;)
-  var fooVersion = require(&#39;foo/package.json&#39;).version
-} catch (er) {
-  foo = null
-}
-if ( notGoodFooVersion(fooVersion) ) {
-  foo = null
-}
-
-// .. then later in your program ..
-
-if (foo) {
-  foo.doFooThings()
-}</code></pre>
-
-<p>Entries in <code>optionalDependencies</code> will override entries of the same name in
-<code>dependencies</code>, so it&#39;s usually best to only put it in one place.</p>
-
-<h2 id="engines">engines</h2>
-
-<p>You can specify the version of node that your stuff works on:</p>
-
-<pre><code>{ &quot;engines&quot; : { &quot;node&quot; : &quot;&gt;=0.10.3 &lt;0.12&quot; } }</code></pre>
-
-<p>And, like with dependencies, if you don&#39;t specify the version (or if you
-specify &quot;*&quot; as the version), then any version of node will do.</p>
-
-<p>If you specify an &quot;engines&quot; field, then npm will require that &quot;node&quot; be
-somewhere on that list. If &quot;engines&quot; is omitted, then npm will just assume
-that it works on node.</p>
-
-<p>You can also use the &quot;engines&quot; field to specify which versions of npm
-are capable of properly installing your program.  For example:</p>
-
-<pre><code>{ &quot;engines&quot; : { &quot;npm&quot; : &quot;~1.0.20&quot; } }</code></pre>
-
-<p>Note that, unless the user has set the <code>engine-strict</code> config flag, this
-field is advisory only.</p>
-
-<h2 id="engineStrict">engineStrict</h2>
-
-<p>If you are sure that your module will <em>definitely not</em> run properly on
-versions of Node/npm other than those specified in the <code>engines</code> hash,
-then you can set <code>&quot;engineStrict&quot;: true</code> in your package.json file.
-This will override the user&#39;s <code>engine-strict</code> config setting.</p>
-
-<p>Please do not do this unless you are really very very sure.  If your
-engines hash is something overly restrictive, you can quite easily and
-inadvertently lock yourself into obscurity and prevent your users from
-updating to new versions of Node.  Consider this choice carefully.  If
-people abuse it, it will be removed in a future version of npm.</p>
-
-<h2 id="os">os</h2>
-
-<p>You can specify which operating systems your
-module will run on:</p>
-
-<pre><code>&quot;os&quot; : [ &quot;darwin&quot;, &quot;linux&quot; ]</code></pre>
-
-<p>You can also blacklist instead of whitelist operating systems;
-just prefix the blacklisted os with a &#39;!&#39;:</p>
-
-<pre><code>&quot;os&quot; : [ &quot;!win32&quot; ]</code></pre>
-
-<p>The host operating system is determined by <code>process.platform</code>.</p>
-
-<p>You may combine blacklisting and whitelisting, although there isn&#39;t any
-good reason to do so.</p>
-
-<h2 id="cpu">cpu</h2>
-
-<p>If your code only runs on certain cpu architectures,
-you can specify which ones.</p>
-
-<pre><code>&quot;cpu&quot; : [ &quot;x64&quot;, &quot;ia32&quot; ]</code></pre>
-
-<p>Like the <code>os</code> option, you can also blacklist architectures:</p>
-
-<pre><code>&quot;cpu&quot; : [ &quot;!arm&quot;, &quot;!mips&quot; ]</code></pre>
-
-<p>The host architecture is determined by <code>process.arch</code>.</p>
-
-<h2 id="preferGlobal">preferGlobal</h2>
-
-<p>If your package is primarily a command-line application that should be
-installed globally, then set this value to <code>true</code> to provide a warning
-if it is installed locally.</p>
-
-<p>It doesn&#39;t actually prevent users from installing it locally, but it
-does help prevent some confusion if it doesn&#39;t work as expected.</p>
-
-<h2 id="private">private</h2>
-
-<p>If you set <code>&quot;private&quot;: true</code> in your package.json, then npm will refuse
-to publish it.</p>
-
-<p>This is a way to prevent accidental publication of private repositories.
-If you would like to ensure that a given package is only ever published
-to a specific registry (for example, an internal registry),
-then use the <code>publishConfig</code> hash described below
-to override the <code>registry</code> config param at publish-time.</p>
-
-<h2 id="publishConfig">publishConfig</h2>
-
-<p>This is a set of config values that will be used at publish-time.  It&#39;s
-especially handy if you want to set the tag or registry, so that you can
-ensure that a given package is not tagged with &quot;latest&quot; or published to
-the global public registry by default.</p>
-
-<p>Any config values can be overridden, but of course only &quot;tag&quot; and
-&quot;registry&quot; probably matter for the purposes of publishing.</p>
-
-<p>See <code><a href="../misc/npm-config.html">npm-config(7)</a></code> to see the list of config options that can be
-overridden.</p>
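For example, a package intended only for an internal registry might carry something like the following (the registry URL and tag are illustrative, not real endpoints):

```json
{
  "publishConfig": {
    "registry": "http://registry.internal.example.com/",
    "tag": "beta"
  }
}
```

With this in place, `npm publish` sends the package to the internal registry and tags it `beta` instead of `latest`, without the user having to pass those config values on the command line.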
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/semver.html">semver(7)</a></li><li><a href="../cli/npm-init.html">npm-init(1)</a></li><li><a href="../cli/npm-version.html">npm-version(1)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../cli/npm-help.html">npm-help(1)</a></li><li><a href="../misc/npm-faq.html">npm-faq(7)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../cli/npm-publish.html">npm-publish(1)</a></li><li><a href="../cli/npm-rm.html">npm-rm(1)</a></li></ul>
-</div>
-<p id="footer">package.json &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/files/npmrc.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,93 +0,0 @@
-<!doctype html>
-<html>
-  <title>npmrc</title>
-  <meta http-equiv="content-type" content="text/html;charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../files/npmrc.html">npmrc</a></h1> <p>The npm config files</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>npm gets its config settings from the command line, environment
-variables, and <code>npmrc</code> files.</p>
-
-<p>The <code>npm config</code> command can be used to update and edit the contents
-of the user and global npmrc files.</p>
-
-<p>For a list of available configuration options, see <a href="../misc/npm-config.html">npm-config(7)</a>.</p>
-
-<h2 id="FILES">FILES</h2>
-
-<p>The three relevant files are:</p>
-
-<ul><li>per-user config file (~/.npmrc)</li><li>global config file ($PREFIX/npmrc)</li><li>npm builtin config file (/path/to/npm/npmrc)</li></ul>
-
-<p>All npm config files are an ini-formatted list of <code>key = value</code>
-parameters.  Environment variables can be replaced using
-<code>${VARIABLE_NAME}</code>. For example:</p>
-
-<pre><code>prefix = ${HOME}/.npm-packages</code></pre>
-
-<p>Each of these files is loaded, and config options are resolved in
-priority order.  For example, a setting in the userconfig file would
-override the setting in the globalconfig file.</p>
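For example, with these (illustrative) contents, the per-user value wins because userconfig is resolved at higher priority than globalconfig:

```ini
; $PREFIX/etc/npmrc (globalconfig)
registry = http://registry.internal.example.com/

; ~/.npmrc (userconfig) -- this value takes priority
registry = https://registry.npmjs.org/
```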
-
-<h3 id="Per-user-config-file">Per-user config file</h3>
-
-<p><code>$HOME/.npmrc</code> (or the <code>userconfig</code> param, if set in the environment
-or on the command line)</p>
-
-<h3 id="Global-config-file">Global config file</h3>
-
-<p><code>$PREFIX/etc/npmrc</code> (or the <code>globalconfig</code> param, if set above):
-This file is an ini-file formatted list of <code>key = value</code> parameters.
-Environment variables can be replaced as above.</p>
-
-<h3 id="Built-in-config-file">Built-in config file</h3>
-
-<p><code>path/to/npm/itself/npmrc</code></p>
-
-<p>This is an unchangeable &quot;builtin&quot; configuration file that npm keeps
-consistent across updates.  Set fields in here using the <code>./configure</code>
-script that comes with npm.  This is primarily for distribution
-maintainers to override default configs in a standard and consistent
-manner.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li><li><a href="../cli/npm.html">npm(1)</a></li></ul>
-</div>
-<p id="footer">npmrc &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/files/package.json.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,584 +0,0 @@
-<!doctype html>
-<html>
-  <title>package.json</title>
-  <meta http-equiv="content-type" content="text/html;charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../files/package.json.html">package.json</a></h1> <p>Specifics of npm&#39;s package.json handling</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>This document is all you need to know about what&#39;s required in your package.json
-file.  It must be actual JSON, not just a JavaScript object literal.</p>
-
-<p>A lot of the behavior described in this document is affected by the config
-settings described in <code><a href="../misc/npm-config.html">npm-config(7)</a></code>.</p>
-
-<h2 id="DEFAULT-VALUES">DEFAULT VALUES</h2>
-
-<p>npm will default some values based on package contents.</p>
-
-<ul><li><p><code>&quot;scripts&quot;: {&quot;start&quot;: &quot;node server.js&quot;}</code></p><p>If there is a <code>server.js</code> file in the root of your package, then npm
-will default the <code>start</code> command to <code>node server.js</code>.</p></li><li><p><code>&quot;scripts&quot;:{&quot;preinstall&quot;: &quot;node-waf clean || true; node-waf configure build&quot;}</code></p><p>If there is a <code>wscript</code> file in the root of your package, npm will
-default the <code>preinstall</code> command to compile using node-waf.</p></li><li><p><code>&quot;scripts&quot;:{&quot;preinstall&quot;: &quot;node-gyp rebuild&quot;}</code></p><p>If there is a <code>binding.gyp</code> file in the root of your package, npm will
-default the <code>preinstall</code> command to compile using node-gyp.</p></li><li><p><code>&quot;contributors&quot;: [...]</code></p><p>If there is an <code>AUTHORS</code> file in the root of your package, npm will
-treat each line as a <code>Name &lt;email&gt; (url)</code> format, where email and url
-are optional.  Lines which start with a <code>#</code> or are blank will be
-ignored.</p></li></ul>
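The AUTHORS-file default above can be sketched as follows. This is a hypothetical simplification, not npm's actual implementation: it skips blank and `#` lines and pulls the optional email and url out of each `Name <email> (url)` line.

```javascript
// Sketch of the AUTHORS-file parsing described above (illustrative only).
// Each remaining line is treated as "Name <email> (url)", where the
// <email> and (url) parts are optional.
function parseAuthors(text) {
  return text.split('\n')
    .filter(function (line) {
      // drop blank lines and comment lines starting with '#'
      return line.trim() && line.trim().charAt(0) !== '#'
    })
    .map(function (line) {
      var m = line.match(/^([^<(]+?)\s*(?:<([^>]+)>)?\s*(?:\(([^)]+)\))?\s*$/)
      var person = { name: m[1] }
      if (m[2]) person.email = m[2]
      if (m[3]) person.url = m[3]
      return person
    })
}
```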
-
-<h2 id="name">name</h2>
-
-<p>The <em>most</em> important things in your package.json are the name and version fields.
-Those are actually required, and your package won&#39;t install without
-them.  The name and version together form an identifier that is assumed
-to be completely unique.  Changes to the package should come along with
-changes to the version.</p>
-
-<p>The name is what your thing is called.  Some tips:</p>
-
-<ul><li>Don&#39;t put &quot;js&quot; or &quot;node&quot; in the name.  It&#39;s assumed that it&#39;s js, since you&#39;re
-writing a package.json file, and you can specify the engine using the &quot;engines&quot;
-field.  (See below.)</li><li>The name ends up being part of a URL, an argument on the command line, and a
-folder name. Any name with non-url-safe characters will be rejected.
-Also, it can&#39;t start with a dot or an underscore.</li><li>The name will probably be passed as an argument to require(), so it should
-be something short, but also reasonably descriptive.</li><li>You may want to check the npm registry to see if there&#39;s something by that name
-already, before you get too attached to it.  http://registry.npmjs.org/</li></ul>
-
-<h2 id="version">version</h2>
-
-<p>The <em>most</em> important things in your package.json are the name and version fields.
-Those are actually required, and your package won&#39;t install without
-them.  The name and version together form an identifier that is assumed
-to be completely unique.  Changes to the package should come along with
-changes to the version.</p>
-
-<p>Version must be parseable by
-<a href="https://github.com/isaacs/node-semver">node-semver</a>, which is bundled
-with npm as a dependency.  (<code>npm install semver</code> to use it yourself.)</p>
-
-<p>More on version numbers and ranges at <a href="../misc/semver.html">semver(7)</a>.</p>
-
-<h2 id="description">description</h2>
-
-<p>Put a description in it.  It&#39;s a string.  This helps people discover your
-package, as it&#39;s listed in <code>npm search</code>.</p>
-
-<h2 id="keywords">keywords</h2>
-
-<p>Put keywords in it.  It&#39;s an array of strings.  This helps people
-discover your package as it&#39;s listed in <code>npm search</code>.</p>
-
-<h2 id="homepage">homepage</h2>
-
-<p>The url to the project homepage.</p>
-
-<p><strong>NOTE</strong>: This is <em>not</em> the same as &quot;url&quot;.  If you put a &quot;url&quot; field,
-then the registry will think it&#39;s a redirection to your package that has
-been published somewhere else, and spit at you.</p>
-
-<p>Literally.  Spit.  I&#39;m so not kidding.</p>
-
-<h2 id="bugs">bugs</h2>
-
-<p>The url to your project&#39;s issue tracker and / or the email address to which
-issues should be reported. These are helpful for people who encounter issues
-with your package.</p>
-
-<p>It should look like this:</p>
-
-<pre><code>{ &quot;url&quot; : &quot;http://github.com/owner/project/issues&quot;
-, &quot;email&quot; : &quot;project@hostname.com&quot;
-}</code></pre>
-
-<p>You can specify either one or both values. If you want to provide only a url,
-you can specify the value for &quot;bugs&quot; as a simple string instead of an object.</p>
-
-<p>If a url is provided, it will be used by the <code>npm bugs</code> command.</p>
-
-<h2 id="license">license</h2>
-
-<p>You should specify a license for your package so that people know how they are
-permitted to use it, and any restrictions you&#39;re placing on it.</p>
-
-<p>The simplest way, assuming you&#39;re using a common license such as BSD or MIT, is
-to just specify the name of the license you&#39;re using, like this:</p>
-
-<pre><code>{ &quot;license&quot; : &quot;BSD&quot; }</code></pre>
-
-<p>If you have more complex licensing terms, or you want to provide more detail
-in your package.json file, you can use the more verbose plural form, like this:</p>
-
-<pre><code>&quot;licenses&quot; : [
-  { &quot;type&quot; : &quot;MyLicense&quot;
-  , &quot;url&quot; : &quot;http://github.com/owner/project/path/to/license&quot;
-  }
-]</code></pre>
-
-<p>It&#39;s also a good idea to include a license file at the top level in your package.</p>
-
-<h2 id="people-fields-author-contributors">people fields: author, contributors</h2>
-
-<p>The &quot;author&quot; is one person.  &quot;contributors&quot; is an array of people.  A &quot;person&quot;
-is an object with a &quot;name&quot; field and optionally &quot;url&quot; and &quot;email&quot;, like this:</p>
-
-<pre><code>{ &quot;name&quot; : &quot;Barney Rubble&quot;
-, &quot;email&quot; : &quot;b@rubble.com&quot;
-, &quot;url&quot; : &quot;http://barnyrubble.tumblr.com/&quot;
-}</code></pre>
-
-<p>Or you can shorten that all into a single string, and npm will parse it for you:</p>
-
-<pre><code>&quot;Barney Rubble &lt;b@rubble.com&gt; (http://barnyrubble.tumblr.com/)&quot;</code></pre>
-
-<p>Both email and url are optional either way.</p>
-
-<p>npm also sets a top-level &quot;maintainers&quot; field with your npm user info.</p>
-
-<h2 id="files">files</h2>
-
-<p>The &quot;files&quot; field is an array of files to include in your project.  If
-you name a folder in the array, then it will also include the files
-inside that folder. (Unless they would be ignored by another rule.)</p>
-
-<p>You can also provide a &quot;.npmignore&quot; file in the root of your package,
-which will keep files from being included, even if they would be picked
-up by the files array.  The &quot;.npmignore&quot; file works just like a
-&quot;.gitignore&quot;.</p>
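For example, a package might limit what gets published to just its code and entry point (the folder names here are illustrative):

```json
{
  "files": [ "lib", "bin", "index.js" ]
}
```

Everything under `lib/` and `bin/` is included, minus anything a `.npmignore` rule excludes.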
-
-<h2 id="main">main</h2>
-
-<p>The main field is a module ID that is the primary entry point to your program.
-That is, if your package is named <code>foo</code>, and a user installs it, and then does
-<code>require(&quot;foo&quot;)</code>, then your main module&#39;s exports object will be returned.</p>
-
-<p>This should be a module ID relative to the root of your package folder.</p>
-
-<p>For most modules, it makes the most sense to have a main script and often not
-much else.</p>
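For example, a package whose entry point lives under `lib/` might declare (paths illustrative):

```json
{
  "name": "foo",
  "version": "1.2.3",
  "main": "./lib/foo.js"
}
```

With this, `require("foo")` returns the exports object of `./lib/foo.js`.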
-
-<h2 id="bin">bin</h2>
-
-<p>A lot of packages have one or more executable files that they&#39;d like to
-install into the PATH. npm makes this pretty easy (in fact, it uses this
-feature to install the &quot;npm&quot; executable.)</p>
-
-<p>To use this, supply a <code>bin</code> field in your package.json which is a map of
-command name to local file name. On install, npm will symlink that file into
-<code>prefix/bin</code> for global installs, or <code>./node_modules/.bin/</code> for local
-installs.</p>
-
-<p>For example, npm has this:</p>
-
-<pre><code>{ &quot;bin&quot; : { &quot;npm&quot; : &quot;./cli.js&quot; } }</code></pre>
-
-<p>So, when you install npm, it&#39;ll create a symlink from the <code>cli.js</code> script to
-<code>/usr/local/bin/npm</code>.</p>
-
-<p>If you have a single executable, and its name should be the name
-of the package, then you can just supply it as a string.  For example:</p>
-
-<pre><code>{ &quot;name&quot;: &quot;my-program&quot;
-, &quot;version&quot;: &quot;1.2.5&quot;
-, &quot;bin&quot;: &quot;./path/to/program&quot; }</code></pre>
-
-<p>would be the same as this:</p>
-
-<pre><code>{ &quot;name&quot;: &quot;my-program&quot;
-, &quot;version&quot;: &quot;1.2.5&quot;
-, &quot;bin&quot; : { &quot;my-program&quot; : &quot;./path/to/program&quot; } }</code></pre>
-
-<h2 id="man">man</h2>
-
-<p>Specify either a single file or an array of filenames to put in place for the
-<code>man</code> program to find.</p>
-
-<p>If only a single file is provided, then it&#39;s installed such that it is the
-result from <code>man &lt;pkgname&gt;</code>, regardless of its actual filename.  For example:</p>
-
-<pre><code>{ &quot;name&quot; : &quot;foo&quot;
-, &quot;version&quot; : &quot;1.2.3&quot;
-, &quot;description&quot; : &quot;A packaged foo fooer for fooing foos&quot;
-, &quot;main&quot; : &quot;foo.js&quot;
-, &quot;man&quot; : &quot;./man/doc.1&quot;
-}</code></pre>
-
-<p>would link the <code>./man/doc.1</code> file such that it is the target for <code>man foo</code>.</p>
-
-<p>If the filename doesn&#39;t start with the package name, then it&#39;s prefixed.
-So, this:</p>
-
-<pre><code>{ &quot;name&quot; : &quot;foo&quot;
-, &quot;version&quot; : &quot;1.2.3&quot;
-, &quot;description&quot; : &quot;A packaged foo fooer for fooing foos&quot;
-, &quot;main&quot; : &quot;foo.js&quot;
-, &quot;man&quot; : [ &quot;./man/foo.1&quot;, &quot;./man/bar.1&quot; ]
-}</code></pre>
-
-<p>will create files to do <code>man foo</code> and <code>man foo-bar</code>.</p>
-
-<p>Man files must end with a number, and optionally a <code>.gz</code> suffix if they are
-compressed.  The number dictates which man section the file is installed into.</p>
-
-<pre><code>{ &quot;name&quot; : &quot;foo&quot;
-, &quot;version&quot; : &quot;1.2.3&quot;
-, &quot;description&quot; : &quot;A packaged foo fooer for fooing foos&quot;
-, &quot;main&quot; : &quot;foo.js&quot;
-, &quot;man&quot; : [ &quot;./man/foo.1&quot;, &quot;./man/foo.2&quot; ]
-}</code></pre>
-
-<p>will create entries for <code>man foo</code> and <code>man 2 foo</code>.</p>
-
-<h2 id="directories">directories</h2>
-
-<p>The CommonJS <a href="http://wiki.commonjs.org/wiki/Packages/1.0">Packages</a> spec details a
-few ways that you can indicate the structure of your package using a <code>directories</code>
-hash. If you look at <a href="http://registry.npmjs.org/npm/latest">npm&#39;s package.json</a>,
-you&#39;ll see that it has directories for doc, lib, and man.</p>
-
-<p>In the future, this information may be used in other creative ways.</p>
-
-<h3 id="directories-lib">directories.lib</h3>
-
-<p>Tell people where the bulk of your library is.  Nothing special is done
-with the lib folder in any way, but it&#39;s useful meta info.</p>
-
-<h3 id="directories-bin">directories.bin</h3>
-
-<p>If you specify a &quot;bin&quot; directory, then all the files in that folder will
-be used as the &quot;bin&quot; hash.</p>
-
-<p>If you have a &quot;bin&quot; hash already, then this has no effect.</p>
-
-<h3 id="directories-man">directories.man</h3>
-
-<p>A folder that is full of man pages.  Sugar to generate a &quot;man&quot; array by
-walking the folder.</p>
-
-<h3 id="directories-doc">directories.doc</h3>
-
-<p>Put markdown files in here.  Eventually, these will be displayed nicely,
-maybe, someday.</p>
-
-<h3 id="directories-example">directories.example</h3>
-
-<p>Put example scripts in here.  Someday, it might be exposed in some clever way.</p>
-
-<h2 id="repository">repository</h2>
-
-<p>Specify the place where your code lives. This is helpful for people who
-want to contribute.  If the git repo is on github, then the <code>npm docs</code>
-command will be able to find you.</p>
-
-<p>Do it like this:</p>
-
-<pre><code>&quot;repository&quot; :
-  { &quot;type&quot; : &quot;git&quot;
-  , &quot;url&quot; : &quot;http://github.com/isaacs/npm.git&quot;
-  }
-
-&quot;repository&quot; :
-  { &quot;type&quot; : &quot;svn&quot;
-  , &quot;url&quot; : &quot;http://v8.googlecode.com/svn/trunk/&quot;
-  }</code></pre>
-
-<p>The URL should be a publicly available (perhaps read-only) url that can be handed
-directly to a VCS program without any modification.  It should not be a url to an
-html project page that you put in your browser.  It&#39;s for computers.</p>
-
-<h2 id="scripts">scripts</h2>
-
-<p>The &quot;scripts&quot; member is an object hash of script commands that are run
-at various times in the lifecycle of your package.  The key is the lifecycle
-event, and the value is the command to run at that point.</p>
-
-<p>See <code><a href="../misc/npm-scripts.html">npm-scripts(7)</a></code> to find out more about writing package scripts.</p>
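For example, a scripts hash might wire up a few lifecycle events (the commands here are illustrative):

```json
{
  "scripts": {
    "preinstall": "node-gyp rebuild",
    "test": "node test.js",
    "start": "node server.js"
  }
}
```

Each key is a lifecycle event (`preinstall`, `test`, `start`, ...), and each value is the shell command npm runs when that event fires.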
-
-<h2 id="config">config</h2>
-
-<p>A &quot;config&quot; hash can be used to set configuration
-parameters used in package scripts that persist across upgrades.  For
-instance, if a package had the following:</p>
-
-<pre><code>{ &quot;name&quot; : &quot;foo&quot;
-, &quot;config&quot; : { &quot;port&quot; : &quot;8080&quot; } }</code></pre>
-
-<p>and then had a &quot;start&quot; command that then referenced the
-<code>npm_package_config_port</code> environment variable, then the user could
-override that by doing <code>npm config set foo:port 8001</code>.</p>
-
-<p>See <code><a href="../misc/npm-config.html">npm-config(7)</a></code> and <code><a href="../misc/npm-scripts.html">npm-scripts(7)</a></code> for more on package
-configs.</p>
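A package script can read the config value above through its environment. The wrapper below is a hypothetical sketch showing the fallback to the package default, not npm's own code:

```javascript
// Sketch: read the "port" config from the environment npm provides to
// package scripts. npm exposes "config": { "port": "8080" } as the
// npm_package_config_port environment variable; if a user has run
// `npm config set foo:port 8001`, that value appears here instead.
function getPort(env) {
  return env.npm_package_config_port || '8080'
}

// e.g. inside the "start" script:
// var port = getPort(process.env)
```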
-
-<h2 id="dependencies">dependencies</h2>
-
-<p>Dependencies are specified with a simple hash of package name to
-version range. The version range is a string which has one or more
-space-separated descriptors.  Dependencies can also be identified with
-a tarball or git URL.</p>
-
-<p><strong>Please do not put test harnesses or transpilers in your
-<code>dependencies</code> hash.</strong>  See <code>devDependencies</code>, below.</p>
-
-<p>See <a href="../misc/semver.html">semver(7)</a> for more details about specifying version ranges.</p>
-
-<ul><li><code>version</code> Must match <code>version</code> exactly</li><li><code>&gt;version</code> Must be greater than <code>version</code></li><li><code>&gt;=version</code> etc</li><li><code>&lt;version</code></li><li><code>&lt;=version</code></li><li><code>~version</code> &quot;Approximately equivalent to version&quot;  See <a href="../misc/semver.html">semver(7)</a></li><li><code>1.2.x</code> 1.2.0, 1.2.1, etc., but not 1.3.0</li><li><code>http://...</code> See &#39;URLs as Dependencies&#39; below</li><li><code>*</code> Matches any version</li><li><code>&quot;&quot;</code> (just an empty string) Same as <code>*</code></li><li><code>version1 - version2</code> Same as <code>&gt;=version1 &lt;=version2</code>.</li><li><code>range1 || range2</code> Passes if either range1 or range2 are satisfied.</li><li><code>git...</code> See &#39;Git URLs as Dependencies&#39; below</li><li><code>user/repo</code> See &#39;GitHub URLs&#39; below</li></ul>
-
-<p>For example, these are all valid:</p>
-
-<pre><code>{ &quot;dependencies&quot; :
-  { &quot;foo&quot; : &quot;1.0.0 - 2.9999.9999&quot;
-  , &quot;bar&quot; : &quot;&gt;=1.0.2 &lt;2.1.2&quot;
-  , &quot;baz&quot; : &quot;&gt;1.0.2 &lt;=2.3.4&quot;
-  , &quot;boo&quot; : &quot;2.0.1&quot;
-  , &quot;qux&quot; : &quot;&lt;1.0.0 || &gt;=2.3.1 &lt;2.4.5 || &gt;=2.5.2 &lt;3.0.0&quot;
-  , &quot;asd&quot; : &quot;http://asdf.com/asdf.tar.gz&quot;
-  , &quot;til&quot; : &quot;~1.2&quot;
-  , &quot;elf&quot; : &quot;~1.2.3&quot;
-  , &quot;two&quot; : &quot;2.x&quot;
-  , &quot;thr&quot; : &quot;3.3.x&quot;
-  }
-}</code></pre>
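The `~version` descriptor above can be illustrated with a small sketch. This is a hypothetical simplification for exposition only; real range resolution is done by the bundled node-semver:

```javascript
// Illustrative check for a tilde range like "~1.2.3", which means
// ">=1.2.3 <1.3.0": same major and minor, patch at least the one given.
// Not npm's actual implementation -- use the semver package for real work.
function satisfiesTilde(version, range) {
  var want = range.slice(1).split('.').map(Number) // "~1.2.3" -> [1, 2, 3]
  var got = version.split('.').map(Number)
  if (got[0] !== want[0] || got[1] !== want[1]) return false
  return got[2] >= (want[2] || 0)
}
```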
-
-<h3 id="URLs-as-Dependencies">URLs as Dependencies</h3>
-
-<p>You may specify a tarball URL in place of a version range.</p>
-
-<p>This tarball will be downloaded and installed locally to your package at
-install time.</p>
-
-<h3 id="Git-URLs-as-Dependencies">Git URLs as Dependencies</h3>
-
-<p>Git urls can be of the form:</p>
-
-<pre><code>git://github.com/user/project.git#commit-ish
-git+ssh://user@hostname:project.git#commit-ish
-git+ssh://user@hostname/project.git#commit-ish
-git+http://user@hostname/project/blah.git#commit-ish
-git+https://user@hostname/project/blah.git#commit-ish</code></pre>
-
-<p>The <code>commit-ish</code> can be any tag, sha, or branch which can be supplied as
-an argument to <code>git checkout</code>.  The default is <code>master</code>.</p>
-
-<h2 id="GitHub-URLs">GitHub URLs</h2>
-
-<p>As of version 1.1.65, you can refer to GitHub urls as just &quot;foo&quot;: &quot;user/foo-project&quot;. For example:</p>
-
-<pre><code>{
-  &quot;name&quot;: &quot;foo&quot;,
-  &quot;version&quot;: &quot;0.0.0&quot;,
-  &quot;dependencies&quot;: {
-    &quot;express&quot;: &quot;visionmedia/express&quot;
-  }
-}</code></pre>
-
-<h2 id="devDependencies">devDependencies</h2>
-
-<p>If someone is planning on downloading and using your module in their
-program, then they probably don&#39;t want or need to download and build
-the external test or documentation framework that you use.</p>
-
-<p>In this case, it&#39;s best to list these additional items in a
-<code>devDependencies</code> hash.</p>
-
-<p>These things will be installed when doing <code>npm link</code> or <code>npm install</code>
-from the root of a package, and can be managed like any other npm
-configuration param.  See <code><a href="../misc/npm-config.html">npm-config(7)</a></code> for more on the topic.</p>
-
-<p>For build steps that are not platform-specific, such as compiling
-CoffeeScript or other languages to JavaScript, use the <code>prepublish</code>
-script to do this, and make the required package a devDependency.</p>
-
-<p>For example:</p>
-
-<pre><code>{ &quot;name&quot;: &quot;ethopia-waza&quot;,
-  &quot;description&quot;: &quot;a delightfully fruity coffee varietal&quot;,
-  &quot;version&quot;: &quot;1.2.3&quot;,
-  &quot;devDependencies&quot;: {
-    &quot;coffee-script&quot;: &quot;~1.6.3&quot;
-  },
-  &quot;scripts&quot;: {
-    &quot;prepublish&quot;: &quot;coffee -o lib/ -c src/waza.coffee&quot;
-  },
-  &quot;main&quot;: &quot;lib/waza.js&quot;
-}</code></pre>
-
-<p>The <code>prepublish</code> script will be run before publishing, so that users
-can consume the functionality without requiring them to compile it
-themselves.  In dev mode (i.e., locally running <code>npm install</code>), it&#39;ll
-run this script as well, so that you can test it easily.</p>
-
-<h2 id="bundledDependencies">bundledDependencies</h2>
-
-<p>Array of package names that will be bundled when publishing the package.</p>
-
-<p>If this is spelled <code>&quot;bundleDependencies&quot;</code>, then that is also honorable.</p>
-
-<h2 id="optionalDependencies">optionalDependencies</h2>
-
-<p>If a dependency can be used, but you would like npm to proceed if it
-cannot be found or fails to install, then you may put it in the
-<code>optionalDependencies</code> hash.  This is a map of package name to version
-or url, just like the <code>dependencies</code> hash.  The difference is that
-failure is tolerated.</p>
-
-<p>It is still your program&#39;s responsibility to handle the lack of the
-dependency.  For example, something like this:</p>
-
-<pre><code>try {
-  var foo = require(&#39;foo&#39;)
-  var fooVersion = require(&#39;foo/package.json&#39;).version
-} catch (er) {
-  foo = null
-}
-if ( notGoodFooVersion(fooVersion) ) {
-  foo = null
-}
-
-// .. then later in your program ..
-
-if (foo) {
-  foo.doFooThings()
-}</code></pre>
-
-<p>Entries in <code>optionalDependencies</code> will override entries of the same name in
-<code>dependencies</code>, so it&#39;s usually best to put each dependency in only one place.</p>
-
-<h2 id="engines">engines</h2>
-
-<p>You can specify the version of node that your stuff works on:</p>
-
-<pre><code>{ &quot;engines&quot; : { &quot;node&quot; : &quot;&gt;=0.10.3 &lt;0.12&quot; } }</code></pre>
-
-<p>As with dependencies, if you don&#39;t specify the version (or if you
-specify &quot;*&quot; as the version), then any version of node will do.</p>
-
-<p>If you specify an &quot;engines&quot; field, then npm will require that &quot;node&quot; be
-somewhere on that list. If &quot;engines&quot; is omitted, then npm will just assume
-that it works on node.</p>
-
-<p>You can also use the &quot;engines&quot; field to specify which versions of npm
-are capable of properly installing your program.  For example:</p>
-
-<pre><code>{ &quot;engines&quot; : { &quot;npm&quot; : &quot;~1.0.20&quot; } }</code></pre>
-
-<p>Note that, unless the user has set the <code>engine-strict</code> config flag, this
-field is advisory only.</p>
-
-<h2 id="engineStrict">engineStrict</h2>
-
-<p>If you are sure that your module will <em>definitely not</em> run properly on
-versions of Node/npm other than those specified in the <code>engines</code> hash,
-then you can set <code>&quot;engineStrict&quot;: true</code> in your package.json file.
-This will override the user&#39;s <code>engine-strict</code> config setting.</p>
-
-<p>Please do not do this unless you are really very very sure.  If your
-engines hash is something overly restrictive, you can quite easily and
-inadvertently lock yourself into obscurity and prevent your users from
-updating to new versions of Node.  Consider this choice carefully.  If
-people abuse it, it will be removed in a future version of npm.</p>
-
-<h2 id="os">os</h2>
-
-<p>You can specify which operating systems your
-module will run on:</p>
-
-<pre><code>&quot;os&quot; : [ &quot;darwin&quot;, &quot;linux&quot; ]</code></pre>
-
-<p>You can also blacklist instead of whitelist operating systems;
-just prefix the blacklisted os with a &#39;!&#39;:</p>
-
-<pre><code>&quot;os&quot; : [ &quot;!win32&quot; ]</code></pre>
-
-<p>The host operating system is determined by <code>process.platform</code>.</p>
-
-<p>You may combine blacklisting and whitelisting, although there isn&#39;t any
-good reason to do so.</p>
-
-<h2 id="cpu">cpu</h2>
-
-<p>If your code only runs on certain cpu architectures,
-you can specify which ones.</p>
-
-<pre><code>&quot;cpu&quot; : [ &quot;x64&quot;, &quot;ia32&quot; ]</code></pre>
-
-<p>Like the <code>os</code> option, you can also blacklist architectures:</p>
-
-<pre><code>&quot;cpu&quot; : [ &quot;!arm&quot;, &quot;!mips&quot; ]</code></pre>
-
-<p>The host architecture is determined by <code>process.arch</code>.</p>
-
-<h2 id="preferGlobal">preferGlobal</h2>
-
-<p>If your package is primarily a command-line application that should be
-installed globally, then set this value to <code>true</code> to provide a warning
-if it is installed locally.</p>
-
-<p>It doesn&#39;t actually prevent users from installing it locally, but it
-does help prevent some confusion if it doesn&#39;t work as expected.</p>
-
-<h2 id="private">private</h2>
-
-<p>If you set <code>&quot;private&quot;: true</code> in your package.json, then npm will refuse
-to publish it.</p>
-
-<p>This is a way to prevent accidental publication of private repositories.
-If you would like to ensure that a given package is only ever published
-to a specific registry (for example, an internal registry),
-then use the <code>publishConfig</code> hash described below
-to override the <code>registry</code> config param at publish-time.</p>
-
-<h2 id="publishConfig">publishConfig</h2>
-
-<p>This is a set of config values that will be used at publish-time.  It&#39;s
-especially handy if you want to set the tag or registry, so that you can
-ensure that a given package is not tagged with &quot;latest&quot; or published to
-the global public registry by default.</p>
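-
-<p>For example, a hypothetical internal package might pin both values:</p>
-
-<pre><code>{ &quot;name&quot; : &quot;my-internal-pkg&quot;
-, &quot;publishConfig&quot; : { &quot;registry&quot; : &quot;https://registry.internal.example/&quot;
-                    , &quot;tag&quot; : &quot;beta&quot; } }</code></pre>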
-
-<p>Any config values can be overridden, but of course only &quot;tag&quot; and
-&quot;registry&quot; probably matter for the purposes of publishing.</p>
-
-<p>See <code><a href="../misc/npm-config.html">npm-config(7)</a></code> to see the list of config options that can be
-overridden.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/semver.html">semver(7)</a></li><li><a href="../cli/npm-init.html">npm-init(1)</a></li><li><a href="../cli/npm-version.html">npm-version(1)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../cli/npm-help.html">npm-help(1)</a></li><li><a href="../misc/npm-faq.html">npm-faq(7)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li><li><a href="../cli/npm-publish.html">npm-publish(1)</a></li><li><a href="../cli/npm-rm.html">npm-rm(1)</a></li></ul>
-</div>
-<p id="footer">package.json &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/index.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,450 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-index</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="misc/npm-index.html">npm-index</a></h1> <p>Index of all npm documentation</p>
-
-<h2 id="README-1"><a href="../doc/README.html">README</a></h2>
-
-<p>node package manager</p>
-
-<h1>Command Line Documentation</h1>
-
-<h2 id="npm-1"><a href="cli/npm.html">npm(1)</a></h2>
-
-<p>node package manager</p>
-
-<h2 id="npm-adduser-1"><a href="cli/npm-adduser.html">npm-adduser(1)</a></h2>
-
-<p>Add a registry user account</p>
-
-<h2 id="npm-bin-1"><a href="cli/npm-bin.html">npm-bin(1)</a></h2>
-
-<p>Display npm bin folder</p>
-
-<h2 id="npm-bugs-1"><a href="cli/npm-bugs.html">npm-bugs(1)</a></h2>
-
-<p>Bugs for a package in a web browser maybe</p>
-
-<h2 id="npm-build-1"><a href="cli/npm-build.html">npm-build(1)</a></h2>
-
-<p>Build a package</p>
-
-<h2 id="npm-bundle-1"><a href="cli/npm-bundle.html">npm-bundle(1)</a></h2>
-
-<p>REMOVED</p>
-
-<h2 id="npm-cache-1"><a href="cli/npm-cache.html">npm-cache(1)</a></h2>
-
-<p>Manipulates packages cache</p>
-
-<h2 id="npm-completion-1"><a href="cli/npm-completion.html">npm-completion(1)</a></h2>
-
-<p>Tab Completion for npm</p>
-
-<h2 id="npm-config-1"><a href="cli/npm-config.html">npm-config(1)</a></h2>
-
-<p>Manage the npm configuration files</p>
-
-<h2 id="npm-dedupe-1"><a href="cli/npm-dedupe.html">npm-dedupe(1)</a></h2>
-
-<p>Reduce duplication</p>
-
-<h2 id="npm-deprecate-1"><a href="cli/npm-deprecate.html">npm-deprecate(1)</a></h2>
-
-<p>Deprecate a version of a package</p>
-
-<h2 id="npm-docs-1"><a href="cli/npm-docs.html">npm-docs(1)</a></h2>
-
-<p>Docs for a package in a web browser maybe</p>
-
-<h2 id="npm-edit-1"><a href="cli/npm-edit.html">npm-edit(1)</a></h2>
-
-<p>Edit an installed package</p>
-
-<h2 id="npm-explore-1"><a href="cli/npm-explore.html">npm-explore(1)</a></h2>
-
-<p>Browse an installed package</p>
-
-<h2 id="npm-help-search-1"><a href="cli/npm-help-search.html">npm-help-search(1)</a></h2>
-
-<p>Search npm help documentation</p>
-
-<h2 id="npm-help-1"><a href="cli/npm-help.html">npm-help(1)</a></h2>
-
-<p>Get help on npm</p>
-
-<h2 id="npm-init-1"><a href="cli/npm-init.html">npm-init(1)</a></h2>
-
-<p>Interactively create a package.json file</p>
-
-<h2 id="npm-install-1"><a href="cli/npm-install.html">npm-install(1)</a></h2>
-
-<p>Install a package</p>
-
-<h2 id="npm-link-1"><a href="cli/npm-link.html">npm-link(1)</a></h2>
-
-<p>Symlink a package folder</p>
-
-<h2 id="npm-ls-1"><a href="cli/npm-ls.html">npm-ls(1)</a></h2>
-
-<p>List installed packages</p>
-
-<h2 id="npm-outdated-1"><a href="cli/npm-outdated.html">npm-outdated(1)</a></h2>
-
-<p>Check for outdated packages</p>
-
-<h2 id="npm-owner-1"><a href="cli/npm-owner.html">npm-owner(1)</a></h2>
-
-<p>Manage package owners</p>
-
-<h2 id="npm-pack-1"><a href="cli/npm-pack.html">npm-pack(1)</a></h2>
-
-<p>Create a tarball from a package</p>
-
-<h2 id="npm-prefix-1"><a href="cli/npm-prefix.html">npm-prefix(1)</a></h2>
-
-<p>Display prefix</p>
-
-<h2 id="npm-prune-1"><a href="cli/npm-prune.html">npm-prune(1)</a></h2>
-
-<p>Remove extraneous packages</p>
-
-<h2 id="npm-publish-1"><a href="cli/npm-publish.html">npm-publish(1)</a></h2>
-
-<p>Publish a package</p>
-
-<h2 id="npm-rebuild-1"><a href="cli/npm-rebuild.html">npm-rebuild(1)</a></h2>
-
-<p>Rebuild a package</p>
-
-<h2 id="npm-restart-1"><a href="cli/npm-restart.html">npm-restart(1)</a></h2>
-
-<p>Restart a package</p>
-
-<h2 id="npm-rm-1"><a href="cli/npm-rm.html">npm-rm(1)</a></h2>
-
-<p>Remove a package</p>
-
-<h2 id="npm-root-1"><a href="cli/npm-root.html">npm-root(1)</a></h2>
-
-<p>Display npm root</p>
-
-<h2 id="npm-run-script-1"><a href="cli/npm-run-script.html">npm-run-script(1)</a></h2>
-
-<p>Run arbitrary package scripts</p>
-
-<h2 id="npm-search-1"><a href="cli/npm-search.html">npm-search(1)</a></h2>
-
-<p>Search for packages</p>
-
-<h2 id="npm-shrinkwrap-1"><a href="cli/npm-shrinkwrap.html">npm-shrinkwrap(1)</a></h2>
-
-<p>Lock down dependency versions</p>
-
-<h2 id="npm-star-1"><a href="cli/npm-star.html">npm-star(1)</a></h2>
-
-<p>Mark your favorite packages</p>
-
-<h2 id="npm-stars-1"><a href="cli/npm-stars.html">npm-stars(1)</a></h2>
-
-<p>View packages marked as favorites</p>
-
-<h2 id="npm-start-1"><a href="cli/npm-start.html">npm-start(1)</a></h2>
-
-<p>Start a package</p>
-
-<h2 id="npm-stop-1"><a href="cli/npm-stop.html">npm-stop(1)</a></h2>
-
-<p>Stop a package</p>
-
-<h2 id="npm-submodule-1"><a href="cli/npm-submodule.html">npm-submodule(1)</a></h2>
-
-<p>Add a package as a git submodule</p>
-
-<h2 id="npm-tag-1"><a href="cli/npm-tag.html">npm-tag(1)</a></h2>
-
-<p>Tag a published version</p>
-
-<h2 id="npm-test-1"><a href="cli/npm-test.html">npm-test(1)</a></h2>
-
-<p>Test a package</p>
-
-<h2 id="npm-uninstall-1"><a href="cli/npm-uninstall.html">npm-uninstall(1)</a></h2>
-
-<p>Remove a package</p>
-
-<h2 id="npm-unpublish-1"><a href="cli/npm-unpublish.html">npm-unpublish(1)</a></h2>
-
-<p>Remove a package from the registry</p>
-
-<h2 id="npm-update-1"><a href="cli/npm-update.html">npm-update(1)</a></h2>
-
-<p>Update a package</p>
-
-<h2 id="npm-version-1"><a href="cli/npm-version.html">npm-version(1)</a></h2>
-
-<p>Bump a package version</p>
-
-<h2 id="npm-view-1"><a href="cli/npm-view.html">npm-view(1)</a></h2>
-
-<p>View registry info</p>
-
-<h2 id="npm-whoami-1"><a href="cli/npm-whoami.html">npm-whoami(1)</a></h2>
-
-<p>Display npm username</p>
-
-<h2 id="repo-1"><a href="cli/repo.html">repo(1)</a></h2>
-
-<p>Open package repository page in the browser</p>
-
-<h1>API Documentation</h1>
-
-<h2 id="npm-3"><a href="api/npm.html">npm(3)</a></h2>
-
-<p>node package manager</p>
-
-<h2 id="npm-bin-3"><a href="api/npm-bin.html">npm-bin(3)</a></h2>
-
-<p>Display npm bin folder</p>
-
-<h2 id="npm-bugs-3"><a href="api/npm-bugs.html">npm-bugs(3)</a></h2>
-
-<p>Bugs for a package in a web browser maybe</p>
-
-<h2 id="npm-commands-3"><a href="api/npm-commands.html">npm-commands(3)</a></h2>
-
-<p>npm commands</p>
-
-<h2 id="npm-config-3"><a href="api/npm-config.html">npm-config(3)</a></h2>
-
-<p>Manage the npm configuration files</p>
-
-<h2 id="npm-deprecate-3"><a href="api/npm-deprecate.html">npm-deprecate(3)</a></h2>
-
-<p>Deprecate a version of a package</p>
-
-<h2 id="npm-docs-3"><a href="api/npm-docs.html">npm-docs(3)</a></h2>
-
-<p>Docs for a package in a web browser maybe</p>
-
-<h2 id="npm-edit-3"><a href="api/npm-edit.html">npm-edit(3)</a></h2>
-
-<p>Edit an installed package</p>
-
-<h2 id="npm-explore-3"><a href="api/npm-explore.html">npm-explore(3)</a></h2>
-
-<p>Browse an installed package</p>
-
-<h2 id="npm-help-search-3"><a href="api/npm-help-search.html">npm-help-search(3)</a></h2>
-
-<p>Search the help pages</p>
-
-<h2 id="npm-init-3"><a href="api/npm-init.html">npm-init(3)</a></h2>
-
-<p>Interactively create a package.json file</p>
-
-<h2 id="npm-install-3"><a href="api/npm-install.html">npm-install(3)</a></h2>
-
-<p>Install a package programmatically</p>
-
-<h2 id="npm-link-3"><a href="api/npm-link.html">npm-link(3)</a></h2>
-
-<p>Symlink a package folder</p>
-
-<h2 id="npm-load-3"><a href="api/npm-load.html">npm-load(3)</a></h2>
-
-<p>Load config settings</p>
-
-<h2 id="npm-ls-3"><a href="api/npm-ls.html">npm-ls(3)</a></h2>
-
-<p>List installed packages</p>
-
-<h2 id="npm-outdated-3"><a href="api/npm-outdated.html">npm-outdated(3)</a></h2>
-
-<p>Check for outdated packages</p>
-
-<h2 id="npm-owner-3"><a href="api/npm-owner.html">npm-owner(3)</a></h2>
-
-<p>Manage package owners</p>
-
-<h2 id="npm-pack-3"><a href="api/npm-pack.html">npm-pack(3)</a></h2>
-
-<p>Create a tarball from a package</p>
-
-<h2 id="npm-prefix-3"><a href="api/npm-prefix.html">npm-prefix(3)</a></h2>
-
-<p>Display prefix</p>
-
-<h2 id="npm-prune-3"><a href="api/npm-prune.html">npm-prune(3)</a></h2>
-
-<p>Remove extraneous packages</p>
-
-<h2 id="npm-publish-3"><a href="api/npm-publish.html">npm-publish(3)</a></h2>
-
-<p>Publish a package</p>
-
-<h2 id="npm-rebuild-3"><a href="api/npm-rebuild.html">npm-rebuild(3)</a></h2>
-
-<p>Rebuild a package</p>
-
-<h2 id="npm-restart-3"><a href="api/npm-restart.html">npm-restart(3)</a></h2>
-
-<p>Restart a package</p>
-
-<h2 id="npm-root-3"><a href="api/npm-root.html">npm-root(3)</a></h2>
-
-<p>Display npm root</p>
-
-<h2 id="npm-run-script-3"><a href="api/npm-run-script.html">npm-run-script(3)</a></h2>
-
-<p>Run arbitrary package scripts</p>
-
-<h2 id="npm-search-3"><a href="api/npm-search.html">npm-search(3)</a></h2>
-
-<p>Search for packages</p>
-
-<h2 id="npm-shrinkwrap-3"><a href="api/npm-shrinkwrap.html">npm-shrinkwrap(3)</a></h2>
-
-<p>Programmatically generate a package shrinkwrap file</p>
-
-<h2 id="npm-start-3"><a href="api/npm-start.html">npm-start(3)</a></h2>
-
-<p>Start a package</p>
-
-<h2 id="npm-stop-3"><a href="api/npm-stop.html">npm-stop(3)</a></h2>
-
-<p>Stop a package</p>
-
-<h2 id="npm-submodule-3"><a href="api/npm-submodule.html">npm-submodule(3)</a></h2>
-
-<p>Add a package as a git submodule</p>
-
-<h2 id="npm-tag-3"><a href="api/npm-tag.html">npm-tag(3)</a></h2>
-
-<p>Tag a published version</p>
-
-<h2 id="npm-test-3"><a href="api/npm-test.html">npm-test(3)</a></h2>
-
-<p>Test a package</p>
-
-<h2 id="npm-uninstall-3"><a href="api/npm-uninstall.html">npm-uninstall(3)</a></h2>
-
-<p>Uninstall a package programmatically</p>
-
-<h2 id="npm-unpublish-3"><a href="api/npm-unpublish.html">npm-unpublish(3)</a></h2>
-
-<p>Remove a package from the registry</p>
-
-<h2 id="npm-update-3"><a href="api/npm-update.html">npm-update(3)</a></h2>
-
-<p>Update a package</p>
-
-<h2 id="npm-version-3"><a href="api/npm-version.html">npm-version(3)</a></h2>
-
-<p>Bump a package version</p>
-
-<h2 id="npm-view-3"><a href="api/npm-view.html">npm-view(3)</a></h2>
-
-<p>View registry info</p>
-
-<h2 id="npm-whoami-3"><a href="api/npm-whoami.html">npm-whoami(3)</a></h2>
-
-<p>Display npm username</p>
-
-<h2 id="repo-3"><a href="api/repo.html">repo(3)</a></h2>
-
-<p>Open package repository page in the browser</p>
-
-<h1>Files</h1>
-
-<h2 id="npm-folders-5"><a href="files/npm-folders.html">npm-folders(5)</a></h2>
-
-<p>Folder Structures Used by npm</p>
-
-<h2 id="npmrc-5"><a href="files/npmrc.html">npmrc(5)</a></h2>
-
-<p>The npm config files</p>
-
-<h2 id="package-json-5"><a href="files/package.json.html">package.json(5)</a></h2>
-
-<p>Specifics of npm&#39;s package.json handling</p>
-
-<h1>Misc</h1>
-
-<h2 id="npm-coding-style-7"><a href="misc/npm-coding-style.html">npm-coding-style(7)</a></h2>
-
-<p>npm&#39;s &quot;funny&quot; coding style</p>
-
-<h2 id="npm-config-7"><a href="misc/npm-config.html">npm-config(7)</a></h2>
-
-<p>More than you probably want to know about npm configuration</p>
-
-<h2 id="npm-developers-7"><a href="misc/npm-developers.html">npm-developers(7)</a></h2>
-
-<p>Developer Guide</p>
-
-<h2 id="npm-disputes-7"><a href="misc/npm-disputes.html">npm-disputes(7)</a></h2>
-
-<p>Handling Module Name Disputes</p>
-
-<h2 id="npm-faq-7"><a href="misc/npm-faq.html">npm-faq(7)</a></h2>
-
-<p>Frequently Asked Questions</p>
-
-<h2 id="npm-index-7"><a href="misc/npm-index.html">npm-index(7)</a></h2>
-
-<p>Index of all npm documentation</p>
-
-<h2 id="npm-registry-7"><a href="misc/npm-registry.html">npm-registry(7)</a></h2>
-
-<p>The JavaScript Package Registry</p>
-
-<h2 id="npm-scripts-7"><a href="misc/npm-scripts.html">npm-scripts(7)</a></h2>
-
-<p>How npm handles the &quot;scripts&quot; field</p>
-
-<h2 id="removing-npm-7"><a href="misc/removing-npm.html">removing-npm(7)</a></h2>
-
-<p>Cleaning the Slate</p>
-
-<h2 id="semver-7"><a href="misc/semver.html">semver(7)</a></h2>
-
-<p>The semantic versioner for npm</p>
-</div>
-<p id="footer">npm-index &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-coding-style.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,216 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-coding-style</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../misc/npm-coding-style.html">npm-coding-style</a></h1> <p>npm&#39;s &quot;funny&quot; coding style</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>npm&#39;s coding style is a bit unconventional.  It is not different for
-difference&#39;s sake, but rather a carefully crafted style that is
-designed to reduce visual clutter and make bugs more apparent.</p>
-
-<p>If you want to contribute to npm (which is very encouraged), you should
-make your code conform to npm&#39;s style.</p>
-
-<p>Note: this concerns npm&#39;s code, not the specific packages at npmjs.org.</p>
-
-<h2 id="Line-Length">Line Length</h2>
-
-<p>Keep lines shorter than 80 characters.  It&#39;s better for lines to be
-too short than to be too long.  Break up long lists, objects, and other
-statements onto multiple lines.</p>
-
-<h2 id="Indentation">Indentation</h2>
-
-<p>Two-spaces.  Tabs are better, but they look like hell in web browsers
-(and on github), and node uses 2 spaces, so that&#39;s that.</p>
-
-<p>Configure your editor appropriately.</p>
-
-<h2 id="Curly-braces">Curly braces</h2>
-
-<p>Curly braces belong on the same line as the thing that necessitates them.</p>
-
-<p>Bad:</p>
-
-<pre><code>function ()
-{</code></pre>
-
-<p>Good:</p>
-
-<pre><code>function () {</code></pre>
-
-<p>If a block needs to wrap to the next line, use a curly brace.  Don&#39;t
-use it if it doesn&#39;t.</p>
-
-<p>Bad:</p>
-
-<pre><code>if (foo) { bar() }
-while (foo)
-  bar()</code></pre>
-
-<p>Good:</p>
-
-<pre><code>if (foo) bar()
-while (foo) {
-  bar()
-}</code></pre>
-
-<h2 id="Semicolons">Semicolons</h2>
-
-<p>Don&#39;t use them except in four situations:</p>
-
-<ul><li><code>for (;;)</code> loops.  They&#39;re actually required.</li><li>null loops like: <code>while (something) ;</code> (But you&#39;d better have a good
-reason for doing that.)</li><li><code>case &quot;foo&quot;: doSomething(); break</code></li><li>In front of a leading <code>(</code> or <code>[</code> at the start of the line.
-This prevents the expression from being interpreted
-as a function call or property access, respectively.</li></ul>
-
-<p>Some examples of good semicolon usage:</p>
-
-<pre><code>;(x || y).doSomething()
-;[a, b, c].forEach(doSomething)
-for (var i = 0; i &lt; 10; i ++) {
-  switch (state) {
-    case &quot;begin&quot;: start(); continue
-    case &quot;end&quot;: finish(); break
-    default: throw new Error(&quot;unknown state&quot;)
-  }
-  end()
-}</code></pre>
-
-<p>Note that lines starting with <code>-</code> and <code>+</code> should also be
-prefixed with a semicolon, but this is much less common.</p>
-
-<h2 id="Comma-First">Comma First</h2>
-
-<p>If there is a list of things separated by commas, and it wraps
-across multiple lines, put the comma at the start of the next
-line, directly below the token that starts the list.  Put the
-final token in the list on a line by itself.  For example:</p>
-
-<pre><code>var magicWords = [ &quot;abracadabra&quot;
-                 , &quot;gesundheit&quot;
-                 , &quot;ventrilo&quot;
-                 ]
-  , spells = { &quot;fireball&quot; : function () { setOnFire() }
-             , &quot;water&quot; : function () { putOut() }
-             }
-  , a = 1
-  , b = &quot;abc&quot;
-  , etc
-  , somethingElse</code></pre>
-
-<h2 id="Whitespace">Whitespace</h2>
-
-<p>Put a single space in front of <code>(</code> for anything other than a function
-call.  Also use a single space wherever it makes things more readable.</p>
-
-<p>Don&#39;t leave trailing whitespace at the end of lines.  Don&#39;t indent empty
-lines.  Don&#39;t use more spaces than are helpful.</p>
-
-<h2 id="Functions">Functions</h2>
-
-<p>Use named functions.  They make stack traces a lot easier to read.</p>
-
-<h2 id="Callbacks-Sync-async-Style">Callbacks, Sync/async Style</h2>
-
-<p>Use the asynchronous/non-blocking versions of things as much as possible.
-It might make more sense for npm to use the synchronous fs APIs, but this
-way, the fs and http and child process stuff all uses the same callback-passing
-methodology.</p>
-
-<p>The callback should always be the last argument in the list.  Its first
-argument is the Error or null.</p>
-
-<p>Be very careful never to ever ever throw anything.  It&#39;s worse than useless.
-Just send the error message back as the first argument to the callback.</p>
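-
-<p>A minimal sketch of this style (the function name is made up):</p>
-
-<pre><code>function readPackage (file, cb) {
-  fs.readFile(file, &quot;utf8&quot;, function (er, data) {
-    if (er) return cb(er)
-    cb(null, data)
-  })
-}</code></pre>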
-
-<h2 id="Errors">Errors</h2>
-
-<p>Always create a new Error object with your message.  Don&#39;t just return a
-string message to the callback.  Stack traces are handy.</p>
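-
-<p>Bad:</p>
-
-<pre><code>cb(&quot;could not read package.json&quot;)</code></pre>
-
-<p>Good:</p>
-
-<pre><code>cb(new Error(&quot;could not read package.json&quot;))</code></pre>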
-
-<h2 id="Logging">Logging</h2>
-
-<p>Logging is done using the <a href="https://github.com/isaacs/npmlog">npmlog</a>
-utility.</p>
-
-<p>Please clean up logs when they are no longer helpful.  In particular,
-logging the same object over and over again is not helpful.  Logs should
-report what&#39;s happening so that it&#39;s easier to track down where a fault
-occurs.</p>
-
-<p>Use appropriate log levels.  See <code><a href="../misc/npm-config.html">npm-config(7)</a></code> and search for
-&quot;loglevel&quot;.</p>
-
-<h2 id="Case-naming-etc">Case, naming, etc.</h2>
-
-<p>Use <code>lowerCamelCase</code> for multiword identifiers when they refer to objects,
-functions, methods, members, or anything not specified in this section.</p>
-
-<p>Use <code>UpperCamelCase</code> for class names (things that you&#39;d pass to &quot;new&quot;).</p>
-
-<p>Use <code>all-lower-hyphen-css-case</code> for multiword filenames and config keys.</p>
-
-<p>Use named functions.  They make stack traces easier to follow.</p>
-
-<p>Use <code>CAPS_SNAKE_CASE</code> for constants, things that should never change
-and are rarely used.</p>
-
-<p>Use a single uppercase letter for function names where the function
-would normally be anonymous, but needs to call itself recursively.  It
-makes it clear that it&#39;s a &quot;throwaway&quot; function.</p>
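-
-<p>For example (an illustrative sketch):</p>
-
-<pre><code>function flatten (list) {
-  var out = []
-  ;(function F (l) {
-    l.forEach(function (item) {
-      if (Array.isArray(item)) F(item)
-      else out.push(item)
-    })
-  })(list)
-  return out
-}</code></pre>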
-
-<h2 id="null-undefined-false-0">null, undefined, false, 0</h2>
-
-<p>Boolean variables and functions should always be either <code>true</code> or
-<code>false</code>.  Don&#39;t set them to 0 unless they&#39;re supposed to be numbers.</p>
-
-<p>When something is intentionally missing or removed, set it to <code>null</code>.</p>
-
-<p>Don&#39;t set things to <code>undefined</code>.  Reserve that value to mean &quot;not yet
-set to anything.&quot;</p>
-
-<p>Boolean objects are verboten.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-developers.html">npm-developers(7)</a></li><li><a href="../misc/npm-faq.html">npm-faq(7)</a></li><li><a href="../cli/npm.html">npm(1)</a></li></ul>
-</div>
-<p id="footer">npm-coding-style &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-config.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,751 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-config</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../misc/npm-config.html">npm-config</a></h1> <p>More than you probably want to know about npm configuration</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>npm gets its configuration values from 6 sources, in this priority:</p>
-
-<h3 id="Command-Line-Flags">Command Line Flags</h3>
-
-<p>Putting <code>--foo bar</code> on the command line sets the <code>foo</code> configuration
-parameter to <code>&quot;bar&quot;</code>.  A <code>--</code> argument tells the CLI parser to stop
-reading flags.  A <code>--flag</code> parameter that is at the <em>end</em> of the
-command will be given the value of <code>true</code>.</p>
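-
-<p>For example (<code>foo</code> is a made-up parameter):</p>
-
-<pre><code>npm install --foo bar    # sets foo to &quot;bar&quot;
-npm install --foo        # trailing flag: sets foo to true
-npm install -- --foo     # after --, --foo is an argument, not a flag</code></pre>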
-
-<h3 id="Environment-Variables">Environment Variables</h3>
-
-<p>Any environment variables that start with <code>npm_config_</code> will be
-interpreted as a configuration parameter.  For example, putting
-<code>npm_config_foo=bar</code> in your environment will set the <code>foo</code>
-configuration parameter to <code>bar</code>.  Any environment configurations that
-are not given a value will be given the value of <code>true</code>.  Config
-values are case-insensitive, so <code>NPM_CONFIG_FOO=bar</code> will work the
-same.</p>
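-
-<p>For example, these two invocations are equivalent:</p>
-
-<pre><code>npm_config_loglevel=verbose npm install
-NPM_CONFIG_LOGLEVEL=verbose npm install</code></pre>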
-
-<h3 id="npmrc-Files">npmrc Files</h3>
-
-<p>The three relevant files are:</p>
-
-<ul><li>per-user config file (~/.npmrc)</li><li>global config file ($PREFIX/npmrc)</li><li>npm builtin config file (/path/to/npm/npmrc)</li></ul>
-
-<p>See <a href="../files/npmrc.html">npmrc(5)</a> for more details.</p>
-
-<h3 id="Default-Configs">Default Configs</h3>
-
-<p>A set of configuration parameters that are internal to npm, and are
-defaults if nothing else is specified.</p>
-
-<h2 id="Shorthands-and-Other-CLI-Niceties">Shorthands and Other CLI Niceties</h2>
-
-<p>The following shorthands are parsed on the command-line:</p>
-
-<ul><li><code>-v</code>: <code>--version</code></li><li><code>-h</code>, <code>-?</code>, <code>--help</code>, <code>-H</code>: <code>--usage</code></li><li><code>-s</code>, <code>--silent</code>: <code>--loglevel silent</code></li><li><code>-q</code>, <code>--quiet</code>: <code>--loglevel warn</code></li><li><code>-d</code>: <code>--loglevel info</code></li><li><code>-dd</code>, <code>--verbose</code>: <code>--loglevel verbose</code></li><li><code>-ddd</code>: <code>--loglevel silly</code></li><li><code>-g</code>: <code>--global</code></li><li><code>-l</code>: <code>--long</code></li><li><code>-m</code>: <code>--message</code></li><li><code>-p</code>, <code>--porcelain</code>: <code>--parseable</code></li><li><code>-reg</code>: <code>--registry</code></li><li><code>-f</code>: <code>--force</code></li><li><code>-desc</code>: <code>--description</code></li><li><code>-S</code>: <code>--save</code></li><li><code>-D</code>: <code>--save-dev</code></li><li><code>-O</code>: <code>--save-optional</code></li><li><code>-B</code>: <code>--save-bundle</code></li><li><code>-y</code>: <code>--yes</code></li><li><code>-n</code>: <code>--yes false</code></li><li><code>ll</code> and <code>la</code> commands: <code>ls --long</code></li></ul>
-
-<p>If the specified configuration param resolves unambiguously to a known
-configuration parameter, then it is expanded to that configuration
-parameter.  For example:</p>
-
-<pre><code>npm ls --par
-# same as:
-npm ls --parseable</code></pre>
-
-<p>If multiple single-character shorthands are strung together, and the
-resulting combination is unambiguously not some other configuration
-param, then it is expanded to its various component pieces.  For
-example:</p>
-
-<pre><code>npm ls -gpld
-# same as:
-npm ls --global --parseable --long --loglevel info</code></pre>
-
-<h2 id="Per-Package-Config-Settings">Per-Package Config Settings</h2>
-
-<p>When running scripts (see <code><a href="../misc/npm-scripts.html">npm-scripts(7)</a></code>) the package.json &quot;config&quot;
-keys are overwritten in the environment if there is a config param of
-<code>&lt;name&gt;[@&lt;version&gt;]:&lt;key&gt;</code>.  For example, if the package.json has
-this:</p>
-
-<pre><code>{ &quot;name&quot; : &quot;foo&quot;
-, &quot;config&quot; : { &quot;port&quot; : &quot;8080&quot; }
-, &quot;scripts&quot; : { &quot;start&quot; : &quot;node server.js&quot; } }</code></pre>
-
-<p>and the server.js is this:</p>
-
-<pre><code>http.createServer(...).listen(process.env.npm_package_config_port)</code></pre>
-
-<p>then the user could change the behavior by doing:</p>
-
-<pre><code>npm config set foo:port 80</code></pre>
-
-<p>See <a href="../files/package.json.html">package.json(5)</a> for more information.</p>
-
-<h2 id="Config-Settings">Config Settings</h2>
-
-<h3 id="always-auth">always-auth</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Force npm to always require authentication when accessing the registry,
-even for <code>GET</code> requests.</p>
-
-<h3 id="bin-links">bin-links</h3>
-
-<ul><li>Default: <code>true</code></li><li>Type: Boolean</li></ul>
-
-<p>Tells npm to create symlinks (or <code>.cmd</code> shims on Windows) for package
-executables.</p>
-
-<p>Set to false to have it not do this.  This can be used to work around
-the fact that some file systems don&#39;t support symlinks, even on
-ostensibly Unix systems.</p>
-
-<h3 id="browser">browser</h3>
-
-<ul><li>Default: OS X: <code>&quot;open&quot;</code>, Windows: <code>&quot;start&quot;</code>, Others: <code>&quot;xdg-open&quot;</code></li><li>Type: String</li></ul>
-
-<p>The browser that is called by the <code>npm docs</code> command to open websites.</p>
-
-<h3 id="ca">ca</h3>
-
-<ul><li>Default: The npm CA certificate</li><li>Type: String or null</li></ul>
-
-<p>The Certificate Authority signing certificate that is trusted for SSL
-connections to the registry.</p>
-
-<p>Set to <code>null</code> to only allow &quot;known&quot; registrars, or to a specific CA cert
-to trust only that specific signing authority.</p>
-
-<p>See also the <code>strict-ssl</code> config.</p>
-
-<h3 id="cache">cache</h3>
-
-<ul><li>Default: Windows: <code>%AppData%\npm-cache</code>, Posix: <code>~/.npm</code></li><li>Type: path</li></ul>
-
-<p>The location of npm&#39;s cache directory.  See <code><a href="../cli/npm-cache.html">npm-cache(1)</a></code></p>
-
-<h3 id="cache-lock-stale">cache-lock-stale</h3>
-
-<ul><li>Default: 60000 (1 minute)</li><li>Type: Number</li></ul>
-
-<p>The number of ms before cache folder lockfiles are considered stale.</p>
-
-<h3 id="cache-lock-retries">cache-lock-retries</h3>
-
-<ul><li>Default: 10</li><li>Type: Number</li></ul>
-
-<p>Number of times to retry to acquire a lock on cache folder lockfiles.</p>
-
-<h3 id="cache-lock-wait">cache-lock-wait</h3>
-
-<ul><li>Default: 10000 (10 seconds)</li><li>Type: Number</li></ul>
-
-<p>Number of ms to wait for cache lock files to expire.</p>
-
-<h3 id="cache-max">cache-max</h3>
-
-<ul><li>Default: Infinity</li><li>Type: Number</li></ul>
-
-<p>The maximum time (in seconds) to keep items in the registry cache before
-re-checking against the registry.</p>
-
-<p>Note that no purging is done unless the <code>npm cache clean</code> command is
-explicitly used, and that only GET requests use the cache.</p>
-
-<h3 id="cache-min">cache-min</h3>
-
-<ul><li>Default: 10</li><li>Type: Number</li></ul>
-
-<p>The minimum time (in seconds) to keep items in the registry cache before
-re-checking against the registry.</p>
-
-<p>Note that no purging is done unless the <code>npm cache clean</code> command is
-explicitly used, and that only GET requests use the cache.</p>
-
-<h3 id="color">color</h3>
-
-<ul><li>Default: true on Posix, false on Windows</li><li>Type: Boolean or <code>&quot;always&quot;</code></li></ul>
-
-<p>If false, never shows colors.  If <code>&quot;always&quot;</code> then always shows colors.
-If true, then only prints color codes for tty file descriptors.</p>
-
-<h3 id="coverage">coverage</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>A flag to tell test harnesses to run with their coverage options enabled,
-if they respond to the <code>npm_config_coverage</code> environment variable.</p>
-
-<h3 id="depth">depth</h3>
-
-<ul><li>Default: Infinity</li><li>Type: Number</li></ul>
-
-<p>The depth to go when recursing directories for <code>npm ls</code> and
-<code>npm cache ls</code>.</p>
-
-<h3 id="description">description</h3>
-
-<ul><li>Default: true</li><li>Type: Boolean</li></ul>
-
-<p>Show the description in <code>npm search</code></p>
-
-<h3 id="dev">dev</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Install <code>dev-dependencies</code> along with packages.</p>
-
-<p>Note that <code>dev-dependencies</code> are also installed if the <code>npat</code> flag is
-set.</p>
-
-<h3 id="editor">editor</h3>
-
-<ul><li>Default: <code>EDITOR</code> environment variable if set, or <code>&quot;vi&quot;</code> on Posix,
-or <code>&quot;notepad&quot;</code> on Windows.</li><li>Type: path</li></ul>
-
-<p>The command to run for <code>npm edit</code> or <code>npm config edit</code>.</p>
-
-<h3 id="engine-strict">engine-strict</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>If set to true, then npm will stubbornly refuse to install (or even
-consider installing) any package that claims to not be compatible with
-the current Node.js version.</p>
-
-<h3 id="force">force</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Makes various commands more forceful.</p>
-
-<ul><li>lifecycle script failure does not block progress.</li><li>publishing clobbers previously published versions.</li><li>skips cache when requesting from the registry.</li><li>prevents checks against clobbering non-npm files.</li></ul>
-
-<h3 id="fetch-retries">fetch-retries</h3>
-
-<ul><li>Default: 2</li><li>Type: Number</li></ul>
-
-<p>The &quot;retries&quot; config for the <code>retry</code> module to use when fetching
-packages from the registry.</p>
-
-<h3 id="fetch-retry-factor">fetch-retry-factor</h3>
-
-<ul><li>Default: 10</li><li>Type: Number</li></ul>
-
-<p>The &quot;factor&quot; config for the <code>retry</code> module to use when fetching
-packages.</p>
-
-<h3 id="fetch-retry-mintimeout">fetch-retry-mintimeout</h3>
-
-<ul><li>Default: 10000 (10 seconds)</li><li>Type: Number</li></ul>
-
-<p>The &quot;minTimeout&quot; config for the <code>retry</code> module to use when fetching
-packages.</p>
-
-<h3 id="fetch-retry-maxtimeout">fetch-retry-maxtimeout</h3>
-
-<ul><li>Default: 60000 (1 minute)</li><li>Type: Number</li></ul>
-
-<p>The &quot;maxTimeout&quot; config for the <code>retry</code> module to use when fetching
-packages.</p>
-
-<h3 id="git">git</h3>
-
-<ul><li>Default: <code>&quot;git&quot;</code></li><li>Type: String</li></ul>
-
-<p>The command to use for git commands.  If git is installed on the
-computer, but is not in the <code>PATH</code>, then set this to the full path to
-the git binary.</p>
-
-<h3 id="global">global</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Operates in &quot;global&quot; mode, so that packages are installed into the
-<code>prefix</code> folder instead of the current working directory.  See
-<code><a href="../files/npm-folders.html">npm-folders(5)</a></code> for more on the differences in behavior.</p>
-
-<ul><li>packages are installed into the <code>{prefix}/lib/node_modules</code> folder, instead of the
-current working directory.</li><li>bin files are linked to <code>{prefix}/bin</code></li><li>man pages are linked to <code>{prefix}/share/man</code></li></ul>
-
-<h3 id="globalconfig">globalconfig</h3>
-
-<ul><li>Default: {prefix}/etc/npmrc</li><li>Type: path</li></ul>
-
-<p>The config file to read for global config options.</p>
-
-<h3 id="globalignorefile">globalignorefile</h3>
-
-<ul><li>Default: {prefix}/etc/npmignore</li><li>Type: path</li></ul>
-
-<p>The config file to read for global ignore patterns to apply to all users
-and all projects.</p>
-
-<p>If not found, but there is a &quot;gitignore&quot; file in the
-same directory, then that will be used instead.</p>
-
-<h3 id="group">group</h3>
-
-<ul><li>Default: GID of the current process</li><li>Type: String or Number</li></ul>
-
-<p>The group to use when running package scripts in global mode as the root
-user.</p>
-
-<h3 id="https-proxy">https-proxy</h3>
-
-<ul><li>Default: the <code>HTTPS_PROXY</code> or <code>https_proxy</code> or <code>HTTP_PROXY</code> or
-<code>http_proxy</code> environment variables.</li><li>Type: url</li></ul>
-
-<p>A proxy to use for outgoing https requests.</p>
-
-<h3 id="user-agent">user-agent</h3>
-
-<ul><li>Default: node/{process.version} {process.platform} {process.arch}</li><li>Type: String</li></ul>
-
-<p>Sets the User-Agent request header.</p>
-
-<h3 id="ignore">ignore</h3>
-
-<ul><li>Default: &quot;&quot;</li><li>Type: string</li></ul>
-
-<p>A whitespace-separated list of glob patterns of files to always exclude
-from packages when building tarballs.</p>
-
-<h3 id="init-module">init-module</h3>
-
-<ul><li>Default: ~/.npm-init.js</li><li>Type: path</li></ul>
-
-<p>A module that will be loaded by the <code>npm init</code> command.  See the
-documentation for the
-<a href="https://github.com/isaacs/init-package-json">init-package-json</a> module
-for more information, or <a href="../cli/npm-init.html">npm-init(1)</a>.</p>
-
-<h3 id="init-version">init.version</h3>
-
-<ul><li>Default: &quot;0.0.0&quot;</li><li>Type: semver</li></ul>
-
-<p>The value <code>npm init</code> should use by default for the package version.</p>
-
-<h3 id="init-author-name">init.author.name</h3>
-
-<ul><li>Default: &quot;&quot;</li><li>Type: String</li></ul>
-
-<p>The value <code>npm init</code> should use by default for the package author&#39;s name.</p>
-
-<h3 id="init-author-email">init.author.email</h3>
-
-<ul><li>Default: &quot;&quot;</li><li>Type: String</li></ul>
-
-<p>The value <code>npm init</code> should use by default for the package author&#39;s email.</p>
-
-<h3 id="init-author-url">init.author.url</h3>
-
-<ul><li>Default: &quot;&quot;</li><li>Type: String</li></ul>
-
-<p>The value <code>npm init</code> should use by default for the package author&#39;s homepage.</p>
-
-<h3 id="json">json</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Whether or not to output JSON data, rather than the normal output.</p>
-
-<p>This feature is currently experimental, and the output data structures
-for many commands is either not implemented in JSON yet, or subject to
-change.  Only the output from <code>npm ls --json</code> is currently valid.</p>
-
-<h3 id="link">link</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>If true, then local installs will link if there is a suitable globally
-installed package.</p>
-
-<p>Note that this means that local installs can cause things to be
-installed into the global space at the same time.  The link is only done
-if one of the two conditions is met:</p>
-
-<ul><li>The package is not already installed globally, or</li><li>the globally installed version is identical to the version that is
-being installed locally.</li></ul>
-
-<h3 id="loglevel">loglevel</h3>
-
-<ul><li>Default: &quot;http&quot;</li><li>Type: String</li><li>Values: &quot;silent&quot;, &quot;win&quot;, &quot;error&quot;, &quot;warn&quot;, &quot;http&quot;, &quot;info&quot;, &quot;verbose&quot;, &quot;silly&quot;</li></ul>
-
-<p>What level of logs to report.  On failure, <em>all</em> logs are written to
-<code>npm-debug.log</code> in the current working directory.</p>
-
-<p>Any logs of a higher level than the setting are shown.
-The default is &quot;http&quot;, which shows http, warn, and error output.</p>
-
-<h3 id="logstream">logstream</h3>
-
-<ul><li>Default: process.stderr</li><li>Type: Stream</li></ul>
-
-<p>This is the stream that is passed to the
-<a href="https://github.com/isaacs/npmlog">npmlog</a> module at run time.</p>
-
-<p>It cannot be set from the command line, but if you are using npm
-programmatically, you may wish to send logs to somewhere other than
-stderr.</p>
-
-<p>If the <code>color</code> config is set to true, then this stream will receive
-colored output if it is a TTY.</p>
-
-<h3 id="long">long</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Show extended information in <code>npm ls</code></p>
-
-<h3 id="message">message</h3>
-
-<ul><li>Default: &quot;%s&quot;</li><li>Type: String</li></ul>
-
-<p>Commit message which is used by <code>npm version</code> when creating version commit.</p>
-
-<p>Any &quot;%s&quot; in the message will be replaced with the version number.</p>
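The substitution described above is ordinary printf-style replacement, so it can be previewed in a shell (the version string here is a made-up example, not one npm produces):

```shell
# npm version replaces each %s in the commit message with the new
# version number; printf performs the same substitution:
printf 'Upgrade to %s for reasons\n' "1.2.3"
# prints: Upgrade to 1.2.3 for reasons
```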
-
-<h3 id="node-version">node-version</h3>
-
-<ul><li>Default: process.version</li><li>Type: semver or false</li></ul>
-
-<p>The node version to use when checking a package&#39;s &quot;engines&quot; hash.</p>
-
-<h3 id="npat">npat</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Run tests on installation and report results to the
-<code>npaturl</code>.</p>
-
-<h3 id="npaturl">npaturl</h3>
-
-<ul><li>Default: Not yet implemented</li><li>Type: url</li></ul>
-
-<p>The url to report npat test results.</p>
-
-<h3 id="onload-script">onload-script</h3>
-
-<ul><li>Default: false</li><li>Type: path</li></ul>
-
-<p>A node module to <code>require()</code> when npm loads.  Useful for programmatic
-usage.</p>
-
-<h3 id="optional">optional</h3>
-
-<ul><li>Default: true</li><li>Type: Boolean</li></ul>
-
-<p>Attempt to install packages in the <code>optionalDependencies</code> hash.  Note
-that if these packages fail to install, the overall installation
-process is not aborted.</p>
-
-<h3 id="parseable">parseable</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Output parseable results from commands that write to
-standard output.</p>
-
-<h3 id="prefix">prefix</h3>
-
-<ul><li>Default: see <a href="../files/npm-folders.html">npm-folders(5)</a></li><li>Type: path</li></ul>
-
-<p>The location to install global items.  If set on the command line, then
-it forces non-global commands to run in the specified folder.</p>
-
-<h3 id="production">production</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Set to true to run in &quot;production&quot; mode.</p>
-
-<ol><li>devDependencies are not installed at the topmost level when running
-local <code>npm install</code> without any arguments.</li><li><code>NODE_ENV</code> is set to &quot;production&quot; for lifecycle scripts.</li></ol>
-
-<h3 id="proprietary-attribs">proprietary-attribs</h3>
-
-<ul><li>Default: true</li><li>Type: Boolean</li></ul>
-
-<p>Whether or not to include proprietary extended attributes in the
-tarballs created by npm.</p>
-
-<p>Unless you are expecting to unpack package tarballs with something other
-than npm -- particularly a very outdated tar implementation -- leave
-this as true.</p>
-
-<h3 id="proxy">proxy</h3>
-
-<ul><li>Default: <code>HTTP_PROXY</code> or <code>http_proxy</code> environment variable, or null</li><li>Type: url</li></ul>
-
-<p>A proxy to use for outgoing http requests.</p>
-
-<h3 id="rebuild-bundle">rebuild-bundle</h3>
-
-<ul><li>Default: true</li><li>Type: Boolean</li></ul>
-
-<p>Rebuild bundled dependencies after installation.</p>
-
-<h3 id="registry">registry</h3>
-
-<ul><li>Default: https://registry.npmjs.org/</li><li>Type: url</li></ul>
-
-<p>The base URL of the npm package registry.</p>
-
-<h3 id="rollback">rollback</h3>
-
-<ul><li>Default: true</li><li>Type: Boolean</li></ul>
-
-<p>Remove failed installs.</p>
-
-<h3 id="save">save</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Save installed packages to a package.json file as dependencies.</p>
-
-<p>When used with the <code>npm rm</code> command, it removes it from the dependencies
-hash.</p>
-
-<p>Only works if there is already a package.json file present.</p>
-
-<h3 id="save-bundle">save-bundle</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>If a package would be saved at install time by the use of <code>--save</code>,
-<code>--save-dev</code>, or <code>--save-optional</code>, then also put it in the
-<code>bundleDependencies</code> list.</p>
-
-<p>When used with the <code>npm rm</code> command, it removes it from the
-<code>bundleDependencies</code> list.</p>
-
-<h3 id="save-dev">save-dev</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Save installed packages to a package.json file as devDependencies.</p>
-
-<p>When used with the <code>npm rm</code> command, it removes it from the devDependencies
-hash.</p>
-
-<p>Only works if there is already a package.json file present.</p>
-
-<h3 id="save-optional">save-optional</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Save installed packages to a package.json file as optionalDependencies.</p>
-
-<p>When used with the <code>npm rm</code> command, it removes it from the optionalDependencies
-hash.</p>
-
-<p>Only works if there is already a package.json file present.</p>
-
-<h3 id="searchopts">searchopts</h3>
-
-<ul><li>Default: &quot;&quot;</li><li>Type: String</li></ul>
-
-<p>Space-separated options that are always passed to search.</p>
-
-<h3 id="searchexclude">searchexclude</h3>
-
-<ul><li>Default: &quot;&quot;</li><li>Type: String</li></ul>
-
-<p>Space-separated options that limit the results from search.</p>
-
-<h3 id="searchsort">searchsort</h3>
-
-<ul><li>Default: &quot;name&quot;</li><li>Type: String</li><li>Values: &quot;name&quot;, &quot;-name&quot;, &quot;date&quot;, &quot;-date&quot;, &quot;description&quot;,
-&quot;-description&quot;, &quot;keywords&quot;, &quot;-keywords&quot;</li></ul>
-
-<p>Indication of which field to sort search results by.  Prefix with a <code>-</code>
-character to indicate reverse sort.</p>
-
-<h3 id="shell">shell</h3>
-
-<ul><li>Default: SHELL environment variable, or &quot;bash&quot; on Posix, or &quot;cmd&quot; on
-Windows</li><li>Type: path</li></ul>
-
-<p>The shell to run for the <code>npm explore</code> command.</p>
-
-<h3 id="shrinkwrap">shrinkwrap</h3>
-
-<ul><li>Default: true</li><li>Type: Boolean</li></ul>
-
-<p>If set to false, then ignore <code>npm-shrinkwrap.json</code> files when
-installing.</p>
-
-<h3 id="sign-git-tag">sign-git-tag</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>If set to true, then the <code>npm version</code> command will tag the version
-using <code>-s</code> to add a signature.</p>
-
-<p>Note that git requires you to have set up GPG keys in your git configs
-for this to work properly.</p>
-
-<h3 id="strict-ssl">strict-ssl</h3>
-
-<ul><li>Default: true</li><li>Type: Boolean</li></ul>
-
-<p>Whether or not to do SSL key validation when making requests to the
-registry via https.</p>
-
-<p>See also the <code>ca</code> config.</p>
-
-<h3 id="tag">tag</h3>
-
-<ul><li>Default: latest</li><li>Type: String</li></ul>
-
-<p>If you ask npm to install a package and don&#39;t tell it a specific version, then
-it will install the version published under this tag.</p>
-
-<p>Also the tag that is added to the package@version specified by the <code>npm
-tag</code> command, if no explicit tag is given.</p>
-
-<h3 id="tmp">tmp</h3>
-
-<ul><li>Default: TMPDIR environment variable, or &quot;/tmp&quot;</li><li>Type: path</li></ul>
-
-<p>Where to store temporary files and folders.  All temp files are deleted
-on success, but left behind on failure for forensic purposes.</p>
-
-<h3 id="unicode">unicode</h3>
-
-<ul><li>Default: true</li><li>Type: Boolean</li></ul>
-
-<p>When set to true, npm uses Unicode characters in the tree output.  When
-false, it uses ASCII characters to draw trees.</p>
-
-<h3 id="unsafe-perm">unsafe-perm</h3>
-
-<ul><li>Default: false if running as root, true otherwise</li><li>Type: Boolean</li></ul>
-
-<p>Set to true to suppress the UID/GID switching when running package
-scripts.  If set explicitly to false, then installing as a non-root user
-will fail.</p>
-
-<h3 id="usage">usage</h3>
-
-<ul><li>Default: false</li><li>Type: Boolean</li></ul>
-
-<p>Set to show short usage output (like the -H output)
-instead of complete help when doing <code><a href="../cli/npm-help.html">npm-help(1)</a></code>.</p>
-
-<h3 id="user">user</h3>
-
-<ul><li>Default: &quot;nobody&quot;</li><li>Type: String or Number</li></ul>
-
-<p>The UID to set to when running package scripts as root.</p>
-
-<h3 id="username">username</h3>
-
-<ul><li>Default: null</li><li>Type: String</li></ul>
-
-<p>The username on the npm registry.  Set with <code>npm adduser</code></p>
-
-<h3 id="userconfig">userconfig</h3>
-
-<ul><li>Default: ~/.npmrc</li><li>Type: path</li></ul>
-
-<p>The location of user-level configuration settings.</p>
-
-<h3 id="userignorefile">userignorefile</h3>
-
-<ul><li>Default: ~/.npmignore</li><li>Type: path</li></ul>
-
-<p>The location of a user-level ignore file to apply to all packages.</p>
-
-<p>If not found, but there is a .gitignore file in the same directory, then
-that will be used instead.</p>
-
-<h3 id="umask">umask</h3>
-
-<ul><li>Default: 022</li><li>Type: Octal numeric string</li></ul>
-
-<p>The &quot;umask&quot; value to use when setting the file creation mode on files
-and folders.</p>
-
-<p>Folders and executables are given a mode which is <code>0777</code> masked against
-this value.  Other files are given a mode which is <code>0666</code> masked against
-this value.  Thus, the defaults are <code>0755</code> and <code>0644</code> respectively.</p>
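The masking arithmetic above can be checked directly with POSIX shell arithmetic (assuming the default umask of <code>022</code>):

```shell
# Folders and executables: 0777 masked against the umask -> 0755.
printf '%o\n' "$(( 0777 & ~022 ))"   # prints 755
# Other files: 0666 masked against the umask -> 0644.
printf '%o\n' "$(( 0666 & ~022 ))"   # prints 644
```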
-
-<h3 id="version">version</h3>
-
-<ul><li>Default: false</li><li>Type: boolean</li></ul>
-
-<p>If true, output the npm version and exit successfully.</p>
-
-<p>Only relevant when specified explicitly on the command line.</p>
-
-<h3 id="versions">versions</h3>
-
-<ul><li>Default: false</li><li>Type: boolean</li></ul>
-
-<p>If true, output the npm version as well as node&#39;s <code>process.versions</code>
-hash, and exit successfully.</p>
-
-<p>Only relevant when specified explicitly on the command line.</p>
-
-<h3 id="viewer">viewer</h3>
-
-<ul><li>Default: &quot;man&quot; on Posix, &quot;browser&quot; on Windows</li><li>Type: path</li></ul>
-
-<p>The program to use to view help content.</p>
-
-<p>Set to <code>&quot;browser&quot;</code> to view html help content in the default web browser.</p>
-
-<h3 id="yes">yes</h3>
-
-<ul><li>Default: null</li><li>Type: Boolean or null</li></ul>
-
-<p>If set to <code>null</code>, then prompt the user for responses in some
-circumstances.</p>
-
-<p>If set to <code>true</code>, then answer &quot;yes&quot; to any prompt.  If set to <code>false</code>
-then answer &quot;no&quot; to any prompt.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../misc/npm-scripts.html">npm-scripts(7)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li><li><a href="../cli/npm.html">npm(1)</a></li></ul>
-</div>
-<p id="footer">npm-config &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-developers.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,208 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-developers</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../misc/npm-developers.html">npm-developers</a></h1> <p>Developer Guide</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>So, you&#39;ve decided to use npm to develop (and maybe publish/deploy)
-your project.</p>
-
-<p>Fantastic!</p>
-
-<p>There are a few things that you need to do above the simple steps
-that your users will do to install your program.</p>
-
-<h2 id="About-These-Documents">About These Documents</h2>
-
-<p>These are man pages.  If you install npm, you should be able to
-then do <code>man npm-thing</code> to get the documentation on a particular
-topic, or <code>npm help thing</code> to see the same information.</p>
-
-<h2 id="What-is-a-package">What is a <code>package</code></h2>
-
-<p>A package is:</p>
-
-<ul><li>a) a folder containing a program described by a package.json file</li><li>b) a gzipped tarball containing (a)</li><li>c) a url that resolves to (b)</li><li>d) a <code>&lt;name&gt;@&lt;version&gt;</code> that is published on the registry with (c)</li><li>e) a <code>&lt;name&gt;@&lt;tag&gt;</code> that points to (d)</li><li>f) a <code>&lt;name&gt;</code> that has a &quot;latest&quot; tag satisfying (e)</li><li>g) a <code>git</code> url that, when cloned, results in (a).</li></ul>
-
-<p>Even if you never publish your package, you can still get a lot of
-benefits of using npm if you just want to write a node program (a), and
-perhaps if you also want to be able to easily install it elsewhere
-after packing it up into a tarball (b).</p>
-
-<p>Git urls can be of the form:</p>
-
-<pre><code>git://github.com/user/project.git#commit-ish
-git+ssh://user@hostname:project.git#commit-ish
-git+http://user@hostname/project/blah.git#commit-ish
-git+https://user@hostname/project/blah.git#commit-ish</code></pre>
-
-<p>The <code>commit-ish</code> can be any tag, sha, or branch which can be supplied as
-an argument to <code>git checkout</code>.  The default is <code>master</code>.</p>
-
-<h2 id="The-package-json-File">The package.json File</h2>
-
-<p>You need to have a <code>package.json</code> file in the root of your project to do
-much of anything with npm.  That is basically the whole interface.</p>
-
-<p>See <code><a href="../files/package.json.html">package.json(5)</a></code> for details about what goes in that file.  At the very
-least, you need:</p>
-
-<ul><li><p>name:
-This should be a string that identifies your project.  Please do not
-use the name to specify that it runs on node, or is in JavaScript.
-You can use the &quot;engines&quot; field to explicitly state the versions of
-node (or whatever else) that your program requires, and it&#39;s pretty
-well assumed that it&#39;s javascript.</p><p>It does not necessarily need to match your github repository name.</p><p>So, <code>node-foo</code> and <code>bar-js</code> are bad names.  <code>foo</code> or <code>bar</code> are better.</p></li><li><p>version:
-A semver-compatible version.</p></li><li><p>engines:
-Specify the versions of node (or whatever else) that your program
-runs on.  The node API changes a lot, and there may be bugs or new
-functionality that you depend on.  Be explicit.</p></li><li><p>author:
-Take some credit.</p></li><li><p>scripts:
-If you have a special compilation or installation script, then you
-should put it in the <code>scripts</code> hash.  You should definitely have at
-least a basic smoke-test command as the &quot;scripts.test&quot; field.
-See <a href="../misc/npm-scripts.html">npm-scripts(7)</a>.</p></li><li><p>main:
-If you have a single module that serves as the entry point to your
-program (like what the &quot;foo&quot; package gives you at require(&quot;foo&quot;)),
-then you need to specify that in the &quot;main&quot; field.</p></li><li><p>directories:
-This is a hash of folders.  The best ones to include are &quot;lib&quot; and
-&quot;doc&quot;, but if you specify a folder full of man pages in &quot;man&quot;, then
-they&#39;ll get installed just like these ones.</p></li></ul>
-
-<p>You can use <code>npm init</code> in the root of your package in order to get you
-started with a pretty basic package.json file.  See <code><a href="../cli/npm-init.html">npm-init(1)</a></code> for
-more info.</p>
-
-<h2 id="Keeping-files-out-of-your-package">Keeping files <em>out</em> of your package</h2>
-
-<p>Use a <code>.npmignore</code> file to keep stuff out of your package.  If there&#39;s
-no <code>.npmignore</code> file, but there <em>is</em> a <code>.gitignore</code> file, then npm will
-ignore the stuff matched by the <code>.gitignore</code> file.  If you <em>want</em> to
-include something that is excluded by your <code>.gitignore</code> file, you can
-create an empty <code>.npmignore</code> file to override it.</p>
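The lookup order described above (an <code>.npmignore</code> wins, even an empty one; otherwise <code>.gitignore</code> is consulted) can be sketched as a small shell function; <code>pick_ignore</code> is a hypothetical helper for illustration, not part of npm:

```shell
# Hypothetical sketch of npm's ignore-file fallback: prefer
# .npmignore when it exists (even if empty), else .gitignore.
pick_ignore() {
  if [ -e "$1/.npmignore" ]; then echo ".npmignore"
  elif [ -e "$1/.gitignore" ]; then echo ".gitignore"
  else echo "(none)"
  fi
}
d=$(mktemp -d)
touch "$d/.gitignore"
pick_ignore "$d"        # .gitignore
touch "$d/.npmignore"   # empty file overrides .gitignore
pick_ignore "$d"        # .npmignore
rm -r "$d"
```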
-
-<p>By default, the following paths and files are ignored, so there&#39;s no
-need to add them to <code>.npmignore</code> explicitly:</p>
-
-<ul><li><code>.*.swp</code></li><li><code>._*</code></li><li><code>.DS_Store</code></li><li><code>.git</code></li><li><code>.hg</code></li><li><code>.lock-wscript</code></li><li><code>.svn</code></li><li><code>.wafpickle-*</code></li><li><code>CVS</code></li><li><code>npm-debug.log</code></li></ul>
-
-<p>Additionally, everything in <code>node_modules</code> is ignored, except for
-bundled dependencies. npm automatically handles this for you, so don&#39;t
-bother adding <code>node_modules</code> to <code>.npmignore</code>.</p>
-
-<p>The following paths and files are never ignored, so adding them to
-<code>.npmignore</code> is pointless:</p>
-
-<ul><li><code>package.json</code></li><li><code><a href="../../doc/README.html">README</a>.*</code></li></ul>
-
-<h2 id="Link-Packages">Link Packages</h2>
-
-<p><code>npm link</code> is designed to install a development package and see the
-changes in real time without having to keep re-installing it.  (You do
-need to either re-link or <code>npm rebuild -g</code> to update compiled packages,
-of course.)</p>
-
-<p>More info at <code><a href="../cli/npm-link.html">npm-link(1)</a></code>.</p>
-
-<h2 id="Before-Publishing-Make-Sure-Your-Package-Installs-and-Works">Before Publishing: Make Sure Your Package Installs and Works</h2>
-
-<p><strong>This is important.</strong></p>
-
-<p>If you cannot install it locally, you&#39;ll have
-problems trying to publish it.  Or, worse yet, you&#39;ll be able to
-publish it, but you&#39;ll be publishing a broken or pointless package.
-So don&#39;t do that.</p>
-
-<p>In the root of your package, do this:</p>
-
-<pre><code>npm install . -g</code></pre>
-
-<p>That&#39;ll show you that it&#39;s working.  If you&#39;d rather just create a symlink
-package that points to your working directory, then do this:</p>
-
-<pre><code>npm link</code></pre>
-
-<p>Use <code>npm ls -g</code> to see if it&#39;s there.</p>
-
-<p>To test a local install, go into some other folder, and then do:</p>
-
-<pre><code>cd ../some-other-folder
-npm install ../my-package</code></pre>
-
-<p>to install it locally into the node_modules folder in that other place.</p>
-
-<p>Then go into the node-repl, and try using require(&quot;my-thing&quot;) to
-bring in your module&#39;s main module.</p>
-
-<h2 id="Create-a-User-Account">Create a User Account</h2>
-
-<p>Create a user with the adduser command.  It works like this:</p>
-
-<pre><code>npm adduser</code></pre>
-
-<p>and then follow the prompts.</p>
-
-<p>This is documented better in <a href="../cli/npm-adduser.html">npm-adduser(1)</a>.</p>
-
-<h2 id="Publish-your-package">Publish your package</h2>
-
-<p>This part&#39;s easy.  In the root of your folder, do this:</p>
-
-<pre><code>npm publish</code></pre>
-
-<p>You can give publish a url to a tarball, or a filename of a tarball,
-or a path to a folder.</p>
-
-<p>Note that pretty much <strong>everything in that folder will be exposed</strong>
-by default.  So, if you have secret stuff in there, use a
-<code>.npmignore</code> file to list out the globs to ignore, or publish
-from a fresh checkout.</p>
-
-<h2 id="Brag-about-it">Brag about it</h2>
-
-<p>Send emails, write blogs, blab in IRC.</p>
-
-<p>Tell the world how easy it is to install your program!</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-faq.html">npm-faq(7)</a></li><li><a href="../cli/npm.html">npm(1)</a></li><li><a href="../cli/npm-init.html">npm-init(1)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li><li><a href="../misc/npm-scripts.html">npm-scripts(7)</a></li><li><a href="../cli/npm-publish.html">npm-publish(1)</a></li><li><a href="../cli/npm-adduser.html">npm-adduser(1)</a></li><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li></ul>
-</div>
-<p id="footer">npm-developers &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-disputes.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,125 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-disputes</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../misc/npm-disputes.html">npm-disputes</a></h1> <p>Handling Module Name Disputes</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<ol><li>Get the author email with <code>npm owner ls &lt;pkgname&gt;</code></li><li>Email the author, CC <a href="mailto:i@izs.me">i@izs.me</a>.</li><li>After a few weeks, if there&#39;s no resolution, we&#39;ll sort it out.</li></ol>
-
-<p>Don&#39;t squat on package names.  Publish code or move out of the way.</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>There sometimes arise cases where a user publishes a module, and then
-later, some other user wants to use that name.  Here are some common
-ways that happens (each of these is based on actual events.)</p>
-
-<ol><li>Joe writes a JavaScript module <code>foo</code>, which is not node-specific.
-Joe doesn&#39;t use node at all.  Bob wants to use <code>foo</code> in node, so he
-wraps it in an npm module.  Some time later, Joe starts using node,
-and wants to take over management of his program.</li><li>Bob writes an npm module <code>foo</code>, and publishes it.  Perhaps much
-later, Joe finds a bug in <code>foo</code>, and fixes it.  He sends a pull
-request to Bob, but Bob doesn&#39;t have the time to deal with it,
-because he has a new job and a new baby and is focused on his new
-erlang project, and kind of not involved with node any more.  Joe
-would like to publish a new <code>foo</code>, but can&#39;t, because the name is
-taken.</li><li>Bob writes a 10-line flow-control library, and calls it <code>foo</code>, and
-publishes it to the npm registry.  Being a simple little thing, it
-never really has to be updated.  Joe works for Foo Inc, the makers
-of the critically acclaimed and widely-marketed <code>foo</code> JavaScript
-toolkit framework.  They publish it to npm as <code>foojs</code>, but people are
-routinely confused when <code>npm install foo</code> is some different thing.</li><li>Bob writes a parser for the widely-known <code>foo</code> file format, because
-he needs it for work.  Then, he gets a new job, and never updates the
-prototype.  Later on, Joe writes a much more complete <code>foo</code> parser,
-but can&#39;t publish, because Bob&#39;s <code>foo</code> is in the way.</li></ol>
-
-<p>The validity of Joe&#39;s claim in each situation can be debated.  However,
-Joe&#39;s appropriate course of action in each case is the same.</p>
-
-<ol><li><code>npm owner ls foo</code>.  This will tell Joe the email address of the
-owner (Bob).</li><li>Joe emails Bob, explaining the situation <strong>as respectfully as possible</strong>,
-and what he would like to do with the module name.  He adds
-isaacs <a href="mailto:i@izs.me">i@izs.me</a> to the CC list of the email.  Mention in the email
-that Bob can run <code>npm owner add joe foo</code> to add Joe as an owner of
-the <code>foo</code> package.</li><li>After a reasonable amount of time, if Bob has not responded, or if
-Bob and Joe can&#39;t come to any sort of resolution, email isaacs
-<a href="mailto:i@izs.me">i@izs.me</a> and we&#39;ll sort it out.  (&quot;Reasonable&quot; is usually about 4
-weeks, but extra time is allowed around common holidays.)</li></ol>
-
-<h2 id="REASONING">REASONING</h2>
-
-<p>In almost every case so far, the parties involved have been able to reach
-an amicable resolution without any major intervention.  Most people
-really do want to be reasonable, and are probably not even aware that
-they&#39;re in your way.</p>
-
-<p>Module ecosystems are most vibrant and powerful when they are as
-self-directed as possible.  If an admin one day deletes something you
-had worked on, then that is going to make most people quite upset,
-regardless of the justification.  When humans solve their problems by
-talking to other humans with respect, everyone has the chance to end up
-feeling good about the interaction.</p>
-
-<h2 id="EXCEPTIONS">EXCEPTIONS</h2>
-
-<p>Some things are not allowed, and will be removed without discussion if
-they are brought to the attention of the npm registry admins, including
-but not limited to:</p>
-
-<ol><li>Malware (that is, a package designed to exploit or harm the machine on
-which it is installed).</li><li>Violations of copyright or licenses (for example, cloning an
-MIT-licensed program, and then removing or changing the copyright and
-license statement).</li><li>Illegal content.</li><li>&quot;Squatting&quot; on a package name that you <em>plan</em> to use, but aren&#39;t
-actually using.  Sorry, I don&#39;t care how great the name is, or how
-perfect a fit it is for the thing that someday might happen.  If
-someone wants to use it today, and you&#39;re just taking up space with
-an empty tarball, you&#39;re going to be evicted.</li><li>Putting empty packages in the registry.  Packages must have SOME
-functionality.  It can be silly, but it can&#39;t be <em>nothing</em>.  (See
-also: squatting.)</li><li>Doing weird things with the registry, like using it as your own
-personal application database or otherwise putting non-packagey
-things into it.</li></ol>
-
-<p>If you see bad behavior like this, please report it right away.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../misc/npm-registry.html">npm-registry(7)</a></li><li><a href="../cli/npm-owner.html">npm-owner(1)</a></li></ul>
-</div>
-<p id="footer">npm-disputes &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-faq.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,374 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-faq</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../misc/npm-faq.html">npm-faq</a></h1> <p>Frequently Asked Questions</p>
-
-<h2 id="Where-can-I-find-these-docs-in-HTML">Where can I find these docs in HTML?</h2>
-
-<p><a href="https://npmjs.org/doc/">https://npmjs.org/doc/</a>, or run:</p>
-
-<pre><code>npm config set viewer browser</code></pre>
-
-<p>to open these documents in your default web browser rather than <code>man</code>.</p>
-
-<h2 id="It-didn-t-work">It didn&#39;t work.</h2>
-
-<p>That&#39;s not really a question.</p>
-
-<h2 id="Why-didn-t-it-work">Why didn&#39;t it work?</h2>
-
-<p>I don&#39;t know yet.</p>
-
-<p>Read the error output, and if you can&#39;t figure out what it means,
-do what it says and post a bug with all the information it asks for.</p>
-
-<h2 id="Where-does-npm-put-stuff">Where does npm put stuff?</h2>
-
-<p>See <code><a href="../files/npm-folders.html">npm-folders(5)</a></code></p>
-
-<p>tl;dr:</p>
-
-<ul><li>Use the <code>npm root</code> command to see where modules go, and the <code>npm bin</code>
-command to see where executables go</li><li>Global installs are different from local installs.  If you install
-something with the <code>-g</code> flag, then its executables go in the folder shown
-by <code>npm bin -g</code> and its modules go in the folder shown by <code>npm root -g</code>.</li></ul>
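-
-<p>For example (the paths shown here are typical Unix defaults; yours may
-differ depending on your prefix configuration):</p>
-
-<pre><code>$ npm root -g
-/usr/local/lib/node_modules
-$ npm bin -g
-/usr/local/bin</code></pre>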
-
-<h2 id="How-do-I-install-something-on-my-computer-in-a-central-location">How do I install something on my computer in a central location?</h2>
-
-<p>Install it globally by tacking <code>-g</code> or <code>--global</code> onto the command.  (This
-is especially important for command-line utilities that need to add
-their bins to the global system <code>PATH</code>.)</p>
-
-<h2 id="I-installed-something-globally-but-I-can-t-require-it">I installed something globally, but I can&#39;t <code>require()</code> it</h2>
-
-<p>Install it locally.</p>
-
-<p>The global install location is a place for command-line utilities
-to put their bins in the system <code>PATH</code>.  It&#39;s not for use with <code>require()</code>.</p>
-
-<p>If you <code>require()</code> a module in your code, then that means it&#39;s a
-dependency, and a part of your program.  You need to install it locally
-in your program.</p>
-
-<h2 id="Why-can-t-npm-just-put-everything-in-one-place-like-other-package-managers">Why can&#39;t npm just put everything in one place, like other package managers?</h2>
-
-<p>Not every change is an improvement, but every improvement is a change.
-This would be like asking git to do network IO for every commit.  It&#39;s
-not going to happen, because it&#39;s a terrible idea that causes more
-problems than it solves.</p>
-
-<p>It is much harder to avoid dependency conflicts without nesting
-dependencies.  This is fundamental to the way that npm works, and has
-proven to be an extremely successful approach.  See <code><a href="../files/npm-folders.html">npm-folders(5)</a></code> for
-more details.</p>
-
-<p>If you want a package to be installed in one place, and have all your
-programs reference the same copy of it, then use the <code>npm link</code> command.
-That&#39;s what it&#39;s for.  Install it globally, then link it into each
-program that uses it.</p>
-
-<h2 id="Whatever-I-really-want-the-old-style-everything-global-style">Whatever, I really want the old &#39;everything global&#39; style.</h2>
-
-<p>Write your own package manager, then.  It&#39;s not that hard.</p>
-
-<p>npm will not help you do something that is known to be a bad idea.</p>
-
-<h2 id="Should-I-check-my-node_modules-folder-into-git">Should I check my <code>node_modules</code> folder into git?</h2>
-
-<p>Mikeal Rogers answered this question very well:</p>
-
-<p><a href="http://www.mikealrogers.com/posts/nodemodules-in-git.html">http://www.mikealrogers.com/posts/nodemodules-in-git.html</a></p>
-
-<p>tl;dr</p>
-
-<ul><li>Check <code>node_modules</code> into git for things you <strong>deploy</strong>, such as
-websites and apps.</li><li>Do not check <code>node_modules</code> into git for libraries and modules
-intended to be reused.</li><li>Use npm to manage dependencies in your dev environment, but not in
-your deployment scripts.</li></ul>
-
-<h2 id="Is-it-npm-or-NPM-or-Npm">Is it &#39;npm&#39; or &#39;NPM&#39; or &#39;Npm&#39;?</h2>
-
-<p>npm should never be capitalized unless it is being displayed in a
-location that is customarily all-caps (such as the title of man pages).</p>
-
-<h2 id="If-npm-is-an-acronym-why-is-it-never-capitalized">If &#39;npm&#39; is an acronym, why is it never capitalized?</h2>
-
-<p>Contrary to the belief of many, &quot;npm&quot; is not in fact an abbreviation for
-&quot;Node Package Manager&quot;.  It is a recursive bacronymic abbreviation for
-&quot;npm is not an acronym&quot;.  (If it were &quot;ninaa&quot;, then it would be an
-acronym, and thus incorrectly named.)</p>
-
-<p>&quot;NPM&quot;, however, <em>is</em> an acronym (more precisely, a capitonym) for the
-National Association of Pastoral Musicians.  You can learn more
-about them at <a href="http://npm.org/">http://npm.org/</a>.</p>
-
-<p>In software, &quot;NPM&quot; is a Non-Parametric Mapping utility written by
-Chris Rorden.  You can analyze pictures of brains with it.  Learn more
-about the (capitalized) NPM program at <a href="http://www.cabiatl.com/mricro/npm/">http://www.cabiatl.com/mricro/npm/</a>.</p>
-
-<p>The first seed that eventually grew into this flower was a bash utility
-named &quot;pm&quot;, which was a shortened descendant of &quot;pkgmakeinst&quot;, a
-bash function that was used to install various things on different
-platforms, most often using Yahoo&#39;s <code>yinst</code>.  If <code>npm</code> were ever an
-acronym for anything, it was <code>node pm</code> or maybe <code>new pm</code>.</p>
-
-<p>So, in all seriousness, the &quot;npm&quot; project is named after its command-line
-utility, which was organically selected to be easily typed by a right-handed
-programmer using a US QWERTY keyboard layout, ending with the
-right-ring-finger in a position to type the <code>-</code> key for flags and
-other command-line arguments.  That command-line utility is always
-lower-case, even when it starts a sentence.</p>
-
-<h2 id="How-do-I-list-installed-packages">How do I list installed packages?</h2>
-
-<p><code>npm ls</code></p>
-
-<h2 id="How-do-I-search-for-packages">How do I search for packages?</h2>
-
-<p><code>npm search</code></p>
-
-<p>Arguments are greps.  <code>npm search jsdom</code> shows jsdom packages.</p>
-
-<h2 id="How-do-I-update-npm">How do I update npm?</h2>
-
-<pre><code>npm update npm -g</code></pre>
-
-<p>You can also update all outdated local packages by doing <code>npm update</code> without
-any arguments, or global packages by doing <code>npm update -g</code>.</p>
-
-<p>Occasionally, the version of npm will progress such that the current
-version cannot be properly installed with the version that you have
-installed already.  (Consider what would happen if there were ever a bug
-in the <code>update</code> command.)</p>
-
-<p>In those cases, you can do this:</p>
-
-<pre><code>curl https://npmjs.org/install.sh | sh</code></pre>
-
-<h2 id="What-is-a-package">What is a <code>package</code>?</h2>
-
-<p>A package is:</p>
-
-<ul><li>a) a folder containing a program described by a package.json file</li><li>b) a gzipped tarball containing (a)</li><li>c) a url that resolves to (b)</li><li>d) a <code>&lt;name&gt;@&lt;version&gt;</code> that is published on the registry with (c)</li><li>e) a <code>&lt;name&gt;@&lt;tag&gt;</code> that points to (d)</li><li>f) a <code>&lt;name&gt;</code> that has a &quot;latest&quot; tag satisfying (e)</li><li>g) a <code>git</code> url that, when cloned, results in (a).</li></ul>
-
-<p>Even if you never publish your package, you can still get a lot of the
-benefits of using npm if you just want to write a node program (a), and
-perhaps if you also want to be able to easily install it elsewhere
-after packing it up into a tarball (b).</p>
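-
-<p>A sketch of that pack-and-install-elsewhere flow (the tarball name follows
-npm&#39;s usual <code>&lt;name&gt;-&lt;version&gt;.tgz</code> pattern; <code>foo</code> is a hypothetical
-package):</p>
-
-<pre><code>$ npm pack                      # creates foo-1.0.0.tgz
-$ npm install ./foo-1.0.0.tgz   # install from the tarball, anywhere</code></pre>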
-
-<p>Git urls can be of the form:</p>
-
-<pre><code>git://github.com/user/project.git#commit-ish
-git+ssh://user@hostname:project.git#commit-ish
-git+http://user@hostname/project/blah.git#commit-ish
-git+https://user@hostname/project/blah.git#commit-ish</code></pre>
-
-<p>The <code>commit-ish</code> can be any tag, sha, or branch which can be supplied as
-an argument to <code>git checkout</code>.  The default is <code>master</code>.</p>
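-
-<p>For example, a <code>package.json</code> dependency can point straight at such a
-git url (the package name and tag here are hypothetical):</p>
-
-<pre><code>{
-  &quot;dependencies&quot;: {
-    &quot;foo&quot;: &quot;git://github.com/user/foo.git#v1.0.2&quot;
-  }
-}</code></pre>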
-
-<h2 id="What-is-a-module">What is a <code>module</code>?</h2>
-
-<p>A module is anything that can be loaded with <code>require()</code> in a Node.js
-program.  The following things are all examples of things that can be
-loaded as modules:</p>
-
-<ul><li>A folder with a <code>package.json</code> file containing a <code>main</code> field.</li><li>A folder with an <code>index.js</code> file in it.</li><li>A JavaScript file.</li></ul>
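-
-<p>A minimal sketch of the folder-with-<code>index.js</code> case (the file and folder
-names here are hypothetical):</p>
-
-<pre><code>// node_modules/adder/index.js
-module.exports = function (a, b) { return a + b }
-
-// app.js
-var add = require(&#39;adder&#39;)
-console.log(add(2, 3))   // 5</code></pre>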
-
-<p>Most npm packages are modules, because they are libraries that you
-load with <code>require</code>.  However, there&#39;s no requirement that an npm
-package be a module!  Some only contain an executable command-line
-interface, and don&#39;t provide a <code>main</code> field for use in Node programs.</p>
-
-<p>Almost all npm packages (at least, those that are Node programs)
-<em>contain</em> many modules within them (because every file they load with
-<code>require()</code> is a module).</p>
-
-<p>In the context of a Node program, the <code>module</code> is also the thing that
-was loaded <em>from</em> a file.  For example, in the following program:</p>
-
-<pre><code>var req = require(&#39;request&#39;)</code></pre>
-
-<p>we might say that &quot;The variable <code>req</code> refers to the <code>request</code> module&quot;.</p>
-
-<h2 id="So-why-is-it-the-node_modules-folder-but-package-json-file-Why-not-node_packages-or-module-json">So, why is it the &quot;<code>node_modules</code>&quot; folder, but &quot;<code>package.json</code>&quot; file?  Why not <code>node_packages</code> or <code>module.json</code>?</h2>
-
-<p>The <code>package.json</code> file defines the package.  (See &quot;What is a
-package?&quot; above.)</p>
-
-<p>The <code>node_modules</code> folder is the place Node.js looks for modules.
-(See &quot;What is a module?&quot; above.)</p>
-
-<p>For example, if you create a file at <code>node_modules/foo.js</code> and then
-have a program that does <code>var f = require(&#39;foo.js&#39;)</code>, then it will load
-the module.  However, <code>foo.js</code> is not a &quot;package&quot; in this case,
-because it does not have a package.json.</p>
-
-<p>Alternatively, if you create a package which does not have an
-<code>index.js</code> or a <code>&quot;main&quot;</code> field in the <code>package.json</code> file, then it is
-not a module.  Even if it&#39;s installed in <code>node_modules</code>, it can&#39;t be
-an argument to <code>require()</code>.</p>
-
-<h2 id="node_modules-is-the-name-of-my-deity-s-arch-rival-and-a-Forbidden-Word-in-my-religion-Can-I-configure-npm-to-use-a-different-folder"><code>&quot;node_modules&quot;</code> is the name of my deity&#39;s arch-rival, and a Forbidden Word in my religion.  Can I configure npm to use a different folder?</h2>
-
-<p>No.  This will never happen.  This question comes up sometimes,
-because it seems silly from the outside that npm couldn&#39;t just be
-configured to put stuff somewhere else, and then npm could load them
-from there.  It&#39;s an arbitrary spelling choice, right?  What&#39;s the big
-deal?</p>
-
-<p>At the time of this writing, the string <code>&#39;node_modules&#39;</code> appears 151
-times in 53 separate files in npm and node core (excluding tests and
-documentation).</p>
-
-<p>Some of these references are in node&#39;s built-in module loader.  Since
-npm is not involved <strong>at all</strong> at run-time, node itself would have to
-be configured to know where you&#39;ve decided to stick stuff.  Complexity
-hurdle #1.  Since the Node module system is locked, this cannot be
-changed, and is enough to kill this request.  But I&#39;ll continue, in
-deference to your deity&#39;s delicate feelings regarding spelling.</p>
-
-<p>Many of the others are in dependencies that npm uses, which are not
-necessarily tightly coupled to npm (in the sense that they do not read
-npm&#39;s configuration files, etc.)  Each of these would have to be
-configured to take the name of the <code>node_modules</code> folder as a
-parameter.  Complexity hurdle #2.</p>
-
-<p>Furthermore, npm has the ability to &quot;bundle&quot; dependencies by adding
-the dep names to the <code>&quot;bundledDependencies&quot;</code> list in package.json,
-which causes the folder to be included in the package tarball.  What
-if the author of a module bundles its dependencies, and they use a
-different spelling for <code>node_modules</code>?  npm would have to rename the
-folder at publish time, and then be smart enough to unpack it using
-your locally configured name.  Complexity hurdle #3.</p>
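-
-<p>For reference, bundling is declared like this in <code>package.json</code> (the
-dependency name is hypothetical):</p>
-
-<pre><code>{
-  &quot;dependencies&quot;: { &quot;foo&quot;: &quot;1.x&quot; },
-  &quot;bundledDependencies&quot;: [&quot;foo&quot;]
-}</code></pre>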
-
-<p>Furthermore, what happens when you <em>change</em> this name?  Fine, it&#39;s
-easy enough the first time, just rename the <code>node_modules</code> folders to
-<code>./blergyblerp/</code> or whatever name you choose.  But what about when you
-change it again?  npm doesn&#39;t currently track any state about past
-configuration settings, so this would be rather difficult to do
-properly.  It would have to track every previous value for this
-config, and always accept any of them, or else yesterday&#39;s install may
-be broken tomorrow.  Complexity hurdle #4.</p>
-
-<p>Never going to happen.  The folder is named <code>node_modules</code>.  It is
-written indelibly in the Node Way, handed down from the ancient times
-of Node 0.3.</p>
-
-<h2 id="How-do-I-install-node-with-npm">How do I install node with npm?</h2>
-
-<p>You don&#39;t.  Try one of these node version managers:</p>
-
-<p>Unix:</p>
-
-<ul><li><a href="http://github.com/isaacs/nave">http://github.com/isaacs/nave</a></li><li><a href="http://github.com/visionmedia/n">http://github.com/visionmedia/n</a></li><li><a href="http://github.com/creationix/nvm">http://github.com/creationix/nvm</a></li></ul>
-
-<p>Windows:</p>
-
-<ul><li><a href="http://github.com/marcelklehr/nodist">http://github.com/marcelklehr/nodist</a></li><li><a href="https://github.com/hakobera/nvmw">https://github.com/hakobera/nvmw</a></li></ul>
-
-<h2 id="How-can-I-use-npm-for-development">How can I use npm for development?</h2>
-
-<p>See <code><a href="../misc/npm-developers.html">npm-developers(7)</a></code> and <code><a href="../files/package.json.html">package.json(5)</a></code>.</p>
-
-<p>You&#39;ll most likely want to <code>npm link</code> your development folder.  That&#39;s
-awesomely handy.</p>
-
-<p>To set up your own private registry, check out <code><a href="../misc/npm-registry.html">npm-registry(7)</a></code>.</p>
-
-<h2 id="Can-I-list-a-url-as-a-dependency">Can I list a url as a dependency?</h2>
-
-<p>Yes.  It should be a url to a gzipped tarball containing a single folder
-that has a package.json in its root, or a git url.
-(See &quot;what is a package?&quot; above.)</p>
-
-<h2 id="How-do-I-symlink-to-a-dev-folder-so-I-don-t-have-to-keep-re-installing">How do I symlink to a dev folder so I don&#39;t have to keep re-installing?</h2>
-
-<p>See <code><a href="../cli/npm-link.html">npm-link(1)</a></code></p>
-
-<h2 id="The-package-registry-website-What-is-that-exactly">The package registry website.  What is that exactly?</h2>
-
-<p>See <code><a href="../misc/npm-registry.html">npm-registry(7)</a></code>.</p>
-
-<h2 id="I-forgot-my-password-and-can-t-publish-How-do-I-reset-it">I forgot my password, and can&#39;t publish.  How do I reset it?</h2>
-
-<p>Go to <a href="https://npmjs.org/forgot">https://npmjs.org/forgot</a>.</p>
-
-<h2 id="I-get-ECONNREFUSED-a-lot-What-s-up">I get ECONNREFUSED a lot.  What&#39;s up?</h2>
-
-<p>Either the registry is down, or node&#39;s DNS resolution is failing.</p>
-
-<p>To check if the registry is down, open up <a href="http://registry.npmjs.org/">http://registry.npmjs.org/</a>
-in a web browser.  This will also tell you if you are just unable to
-access the internet for some reason.</p>
-
-<p>If the registry IS down, let me know by emailing <a href="mailto:i@izs.me">i@izs.me</a> or posting
-an issue at <a href="https://github.com/isaacs/npm/issues">https://github.com/isaacs/npm/issues</a>.  We&#39;ll have
-someone kick it or something.</p>
-
-<h2 id="Why-no-namespaces">Why no namespaces?</h2>
-
-<p>Please see this discussion: <a href="https://github.com/isaacs/npm/issues/798">https://github.com/isaacs/npm/issues/798</a></p>
-
-<p>tl;dr - It doesn&#39;t actually make things better, and can make them worse.</p>
-
-<p>If you want to namespace your own packages, simply use the
-<code>-</code> character to separate the names.  npm is a mostly anarchic system.
-There is not sufficient need to impose namespace rules on everyone.</p>
-
-<h2 id="Who-does-npm">Who does npm?</h2>
-
-<p><code>npm view npm author</code></p>
-
-<p><code>npm view npm contributors</code></p>
-
-<h2 id="I-have-a-question-or-request-not-addressed-here-Where-should-I-put-it">I have a question or request not addressed here. Where should I put it?</h2>
-
-<p>Post an issue on the github project:</p>
-
-<ul><li><a href="https://github.com/isaacs/npm/issues">https://github.com/isaacs/npm/issues</a></li></ul>
-
-<h2 id="Why-does-npm-hate-me">Why does npm hate me?</h2>
-
-<p>npm is not capable of hatred.  It loves everyone, especially you.</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm.html">npm(1)</a></li><li><a href="../misc/npm-developers.html">npm-developers(7)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npm-folders.html">npm-folders(5)</a></li></ul>
-</div>
-<p id="footer">npm-faq &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-index.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,450 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-index</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../misc/npm-index.html">npm-index</a></h1> <p>Index of all npm documentation</p>
-
-<h2 id="README-1"><a href="../../doc/README.html">README</a></h2>
-
-<p>node package manager</p>
-
-<h1>Command Line Documentation</h1>
-
-<h2 id="npm-1"><a href="../cli/npm.html">npm(1)</a></h2>
-
-<p>node package manager</p>
-
-<h2 id="npm-adduser-1"><a href="../cli/npm-adduser.html">npm-adduser(1)</a></h2>
-
-<p>Add a registry user account</p>
-
-<h2 id="npm-bin-1"><a href="../cli/npm-bin.html">npm-bin(1)</a></h2>
-
-<p>Display npm bin folder</p>
-
-<h2 id="npm-bugs-1"><a href="../cli/npm-bugs.html">npm-bugs(1)</a></h2>
-
-<p>Bugs for a package in a web browser maybe</p>
-
-<h2 id="npm-build-1"><a href="../cli/npm-build.html">npm-build(1)</a></h2>
-
-<p>Build a package</p>
-
-<h2 id="npm-bundle-1"><a href="../cli/npm-bundle.html">npm-bundle(1)</a></h2>
-
-<p>REMOVED</p>
-
-<h2 id="npm-cache-1"><a href="../cli/npm-cache.html">npm-cache(1)</a></h2>
-
-<p>Manipulate the packages cache</p>
-
-<h2 id="npm-completion-1"><a href="../cli/npm-completion.html">npm-completion(1)</a></h2>
-
-<p>Tab Completion for npm</p>
-
-<h2 id="npm-config-1"><a href="../cli/npm-config.html">npm-config(1)</a></h2>
-
-<p>Manage the npm configuration files</p>
-
-<h2 id="npm-dedupe-1"><a href="../cli/npm-dedupe.html">npm-dedupe(1)</a></h2>
-
-<p>Reduce duplication</p>
-
-<h2 id="npm-deprecate-1"><a href="../cli/npm-deprecate.html">npm-deprecate(1)</a></h2>
-
-<p>Deprecate a version of a package</p>
-
-<h2 id="npm-docs-1"><a href="../cli/npm-docs.html">npm-docs(1)</a></h2>
-
-<p>Docs for a package in a web browser maybe</p>
-
-<h2 id="npm-edit-1"><a href="../cli/npm-edit.html">npm-edit(1)</a></h2>
-
-<p>Edit an installed package</p>
-
-<h2 id="npm-explore-1"><a href="../cli/npm-explore.html">npm-explore(1)</a></h2>
-
-<p>Browse an installed package</p>
-
-<h2 id="npm-help-search-1"><a href="../cli/npm-help-search.html">npm-help-search(1)</a></h2>
-
-<p>Search npm help documentation</p>
-
-<h2 id="npm-help-1"><a href="../cli/npm-help.html">npm-help(1)</a></h2>
-
-<p>Get help on npm</p>
-
-<h2 id="npm-init-1"><a href="../cli/npm-init.html">npm-init(1)</a></h2>
-
-<p>Interactively create a package.json file</p>
-
-<h2 id="npm-install-1"><a href="../cli/npm-install.html">npm-install(1)</a></h2>
-
-<p>Install a package</p>
-
-<h2 id="npm-link-1"><a href="../cli/npm-link.html">npm-link(1)</a></h2>
-
-<p>Symlink a package folder</p>
-
-<h2 id="npm-ls-1"><a href="../cli/npm-ls.html">npm-ls(1)</a></h2>
-
-<p>List installed packages</p>
-
-<h2 id="npm-outdated-1"><a href="../cli/npm-outdated.html">npm-outdated(1)</a></h2>
-
-<p>Check for outdated packages</p>
-
-<h2 id="npm-owner-1"><a href="../cli/npm-owner.html">npm-owner(1)</a></h2>
-
-<p>Manage package owners</p>
-
-<h2 id="npm-pack-1"><a href="../cli/npm-pack.html">npm-pack(1)</a></h2>
-
-<p>Create a tarball from a package</p>
-
-<h2 id="npm-prefix-1"><a href="../cli/npm-prefix.html">npm-prefix(1)</a></h2>
-
-<p>Display prefix</p>
-
-<h2 id="npm-prune-1"><a href="../cli/npm-prune.html">npm-prune(1)</a></h2>
-
-<p>Remove extraneous packages</p>
-
-<h2 id="npm-publish-1"><a href="../cli/npm-publish.html">npm-publish(1)</a></h2>
-
-<p>Publish a package</p>
-
-<h2 id="npm-rebuild-1"><a href="../cli/npm-rebuild.html">npm-rebuild(1)</a></h2>
-
-<p>Rebuild a package</p>
-
-<h2 id="npm-restart-1"><a href="../cli/npm-restart.html">npm-restart(1)</a></h2>
-
-<p>Restart a package</p>
-
-<h2 id="npm-rm-1"><a href="../cli/npm-rm.html">npm-rm(1)</a></h2>
-
-<p>Remove a package</p>
-
-<h2 id="npm-root-1"><a href="../cli/npm-root.html">npm-root(1)</a></h2>
-
-<p>Display npm root</p>
-
-<h2 id="npm-run-script-1"><a href="../cli/npm-run-script.html">npm-run-script(1)</a></h2>
-
-<p>Run arbitrary package scripts</p>
-
-<h2 id="npm-search-1"><a href="../cli/npm-search.html">npm-search(1)</a></h2>
-
-<p>Search for packages</p>
-
-<h2 id="npm-shrinkwrap-1"><a href="../cli/npm-shrinkwrap.html">npm-shrinkwrap(1)</a></h2>
-
-<p>Lock down dependency versions</p>
-
-<h2 id="npm-star-1"><a href="../cli/npm-star.html">npm-star(1)</a></h2>
-
-<p>Mark your favorite packages</p>
-
-<h2 id="npm-stars-1"><a href="../cli/npm-stars.html">npm-stars(1)</a></h2>
-
-<p>View packages marked as favorites</p>
-
-<h2 id="npm-start-1"><a href="../cli/npm-start.html">npm-start(1)</a></h2>
-
-<p>Start a package</p>
-
-<h2 id="npm-stop-1"><a href="../cli/npm-stop.html">npm-stop(1)</a></h2>
-
-<p>Stop a package</p>
-
-<h2 id="npm-submodule-1"><a href="../cli/npm-submodule.html">npm-submodule(1)</a></h2>
-
-<p>Add a package as a git submodule</p>
-
-<h2 id="npm-tag-1"><a href="../cli/npm-tag.html">npm-tag(1)</a></h2>
-
-<p>Tag a published version</p>
-
-<h2 id="npm-test-1"><a href="../cli/npm-test.html">npm-test(1)</a></h2>
-
-<p>Test a package</p>
-
-<h2 id="npm-uninstall-1"><a href="../cli/npm-uninstall.html">npm-uninstall(1)</a></h2>
-
-<p>Remove a package</p>
-
-<h2 id="npm-unpublish-1"><a href="../cli/npm-unpublish.html">npm-unpublish(1)</a></h2>
-
-<p>Remove a package from the registry</p>
-
-<h2 id="npm-update-1"><a href="../cli/npm-update.html">npm-update(1)</a></h2>
-
-<p>Update a package</p>
-
-<h2 id="npm-version-1"><a href="../cli/npm-version.html">npm-version(1)</a></h2>
-
-<p>Bump a package version</p>
-
-<h2 id="npm-view-1"><a href="../cli/npm-view.html">npm-view(1)</a></h2>
-
-<p>View registry info</p>
-
-<h2 id="npm-whoami-1"><a href="../cli/npm-whoami.html">npm-whoami(1)</a></h2>
-
-<p>Display npm username</p>
-
-<h2 id="repo-1"><a href="../cli/repo.html">repo(1)</a></h2>
-
-<p>Open package repository page in the browser</p>
-
-<h1>API Documentation</h1>
-
-<h2 id="npm-3"><a href="../api/npm.html">npm(3)</a></h2>
-
-<p>node package manager</p>
-
-<h2 id="npm-bin-3"><a href="../api/npm-bin.html">npm-bin(3)</a></h2>
-
-<p>Display npm bin folder</p>
-
-<h2 id="npm-bugs-3"><a href="../api/npm-bugs.html">npm-bugs(3)</a></h2>
-
-<p>Bugs for a package in a web browser maybe</p>
-
-<h2 id="npm-commands-3"><a href="../api/npm-commands.html">npm-commands(3)</a></h2>
-
-<p>npm commands</p>
-
-<h2 id="npm-config-3"><a href="../api/npm-config.html">npm-config(3)</a></h2>
-
-<p>Manage the npm configuration files</p>
-
-<h2 id="npm-deprecate-3"><a href="../api/npm-deprecate.html">npm-deprecate(3)</a></h2>
-
-<p>Deprecate a version of a package</p>
-
-<h2 id="npm-docs-3"><a href="../api/npm-docs.html">npm-docs(3)</a></h2>
-
-<p>Docs for a package in a web browser maybe</p>
-
-<h2 id="npm-edit-3"><a href="../api/npm-edit.html">npm-edit(3)</a></h2>
-
-<p>Edit an installed package</p>
-
-<h2 id="npm-explore-3"><a href="../api/npm-explore.html">npm-explore(3)</a></h2>
-
-<p>Browse an installed package</p>
-
-<h2 id="npm-help-search-3"><a href="../api/npm-help-search.html">npm-help-search(3)</a></h2>
-
-<p>Search the help pages</p>
-
-<h2 id="npm-init-3"><a href="../api/npm-init.html">npm-init(3)</a></h2>
-
-<p>Interactively create a package.json file</p>
-
-<h2 id="npm-install-3"><a href="../api/npm-install.html">npm-install(3)</a></h2>
-
-<p>Install a package programmatically</p>
-
-<h2 id="npm-link-3"><a href="../api/npm-link.html">npm-link(3)</a></h2>
-
-<p>Symlink a package folder</p>
-
-<h2 id="npm-load-3"><a href="../api/npm-load.html">npm-load(3)</a></h2>
-
-<p>Load config settings</p>
-
-<h2 id="npm-ls-3"><a href="../api/npm-ls.html">npm-ls(3)</a></h2>
-
-<p>List installed packages</p>
-
-<h2 id="npm-outdated-3"><a href="../api/npm-outdated.html">npm-outdated(3)</a></h2>
-
-<p>Check for outdated packages</p>
-
-<h2 id="npm-owner-3"><a href="../api/npm-owner.html">npm-owner(3)</a></h2>
-
-<p>Manage package owners</p>
-
-<h2 id="npm-pack-3"><a href="../api/npm-pack.html">npm-pack(3)</a></h2>
-
-<p>Create a tarball from a package</p>
-
-<h2 id="npm-prefix-3"><a href="../api/npm-prefix.html">npm-prefix(3)</a></h2>
-
-<p>Display prefix</p>
-
-<h2 id="npm-prune-3"><a href="../api/npm-prune.html">npm-prune(3)</a></h2>
-
-<p>Remove extraneous packages</p>
-
-<h2 id="npm-publish-3"><a href="../api/npm-publish.html">npm-publish(3)</a></h2>
-
-<p>Publish a package</p>
-
-<h2 id="npm-rebuild-3"><a href="../api/npm-rebuild.html">npm-rebuild(3)</a></h2>
-
-<p>Rebuild a package</p>
-
-<h2 id="npm-restart-3"><a href="../api/npm-restart.html">npm-restart(3)</a></h2>
-
-<p>Restart a package</p>
-
-<h2 id="npm-root-3"><a href="../api/npm-root.html">npm-root(3)</a></h2>
-
-<p>Display npm root</p>
-
-<h2 id="npm-run-script-3"><a href="../api/npm-run-script.html">npm-run-script(3)</a></h2>
-
-<p>Run arbitrary package scripts</p>
-
-<h2 id="npm-search-3"><a href="../api/npm-search.html">npm-search(3)</a></h2>
-
-<p>Search for packages</p>
-
-<h2 id="npm-shrinkwrap-3"><a href="../api/npm-shrinkwrap.html">npm-shrinkwrap(3)</a></h2>
-
-<p>Programmatically generate a package shrinkwrap file</p>
-
-<h2 id="npm-start-3"><a href="../api/npm-start.html">npm-start(3)</a></h2>
-
-<p>Start a package</p>
-
-<h2 id="npm-stop-3"><a href="../api/npm-stop.html">npm-stop(3)</a></h2>
-
-<p>Stop a package</p>
-
-<h2 id="npm-submodule-3"><a href="../api/npm-submodule.html">npm-submodule(3)</a></h2>
-
-<p>Add a package as a git submodule</p>
-
-<h2 id="npm-tag-3"><a href="../api/npm-tag.html">npm-tag(3)</a></h2>
-
-<p>Tag a published version</p>
-
-<h2 id="npm-test-3"><a href="../api/npm-test.html">npm-test(3)</a></h2>
-
-<p>Test a package</p>
-
-<h2 id="npm-uninstall-3"><a href="../api/npm-uninstall.html">npm-uninstall(3)</a></h2>
-
-<p>Uninstall a package programmatically</p>
-
-<h2 id="npm-unpublish-3"><a href="../api/npm-unpublish.html">npm-unpublish(3)</a></h2>
-
-<p>Remove a package from the registry</p>
-
-<h2 id="npm-update-3"><a href="../api/npm-update.html">npm-update(3)</a></h2>
-
-<p>Update a package</p>
-
-<h2 id="npm-version-3"><a href="../api/npm-version.html">npm-version(3)</a></h2>
-
-<p>Bump a package version</p>
-
-<h2 id="npm-view-3"><a href="../api/npm-view.html">npm-view(3)</a></h2>
-
-<p>View registry info</p>
-
-<h2 id="npm-whoami-3"><a href="../api/npm-whoami.html">npm-whoami(3)</a></h2>
-
-<p>Display npm username</p>
-
-<h2 id="repo-3"><a href="../api/repo.html">repo(3)</a></h2>
-
-<p>Open package repository page in the browser</p>
-
-<h1>Files</h1>
-
-<h2 id="npm-folders-5"><a href="../files/npm-folders.html">npm-folders(5)</a></h2>
-
-<p>Folder Structures Used by npm</p>
-
-<h2 id="npmrc-5"><a href="../files/npmrc.html">npmrc(5)</a></h2>
-
-<p>The npm config files</p>
-
-<h2 id="package-json-5"><a href="../files/package.json.html">package.json(5)</a></h2>
-
-<p>Specifics of npm&#39;s package.json handling</p>
-
-<h1>Misc</h1>
-
-<h2 id="npm-coding-style-7"><a href="../misc/npm-coding-style.html">npm-coding-style(7)</a></h2>
-
-<p>npm&#39;s &quot;funny&quot; coding style</p>
-
-<h2 id="npm-config-7"><a href="../misc/npm-config.html">npm-config(7)</a></h2>
-
-<p>More than you probably want to know about npm configuration</p>
-
-<h2 id="npm-developers-7"><a href="../misc/npm-developers.html">npm-developers(7)</a></h2>
-
-<p>Developer Guide</p>
-
-<h2 id="npm-disputes-7"><a href="../misc/npm-disputes.html">npm-disputes(7)</a></h2>
-
-<p>Handling Module Name Disputes</p>
-
-<h2 id="npm-faq-7"><a href="../misc/npm-faq.html">npm-faq(7)</a></h2>
-
-<p>Frequently Asked Questions</p>
-
-<h2 id="npm-index-7"><a href="../misc/npm-index.html">npm-index(7)</a></h2>
-
-<p>Index of all npm documentation</p>
-
-<h2 id="npm-registry-7"><a href="../misc/npm-registry.html">npm-registry(7)</a></h2>
-
-<p>The JavaScript Package Registry</p>
-
-<h2 id="npm-scripts-7"><a href="../misc/npm-scripts.html">npm-scripts(7)</a></h2>
-
-<p>How npm handles the &quot;scripts&quot; field</p>
-
-<h2 id="removing-npm-7"><a href="../misc/removing-npm.html">removing-npm(7)</a></h2>
-
-<p>Cleaning the Slate</p>
-
-<h2 id="semver-7"><a href="../misc/semver.html">semver(7)</a></h2>
-
-<p>The semantic versioner for npm</p>
-</div>
-<p id="footer">npm-index &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-registry.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,105 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-registry</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../misc/npm-registry.html">npm-registry</a></h1> <p>The JavaScript Package Registry</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>To resolve packages by name and version, npm talks to a registry website
-that implements the CommonJS Package Registry specification for reading
-package info.</p>
-
-<p>Additionally, npm&#39;s package registry implementation supports several
-write APIs as well, to allow for publishing packages and managing user
-account information.</p>
-
-<p>The official public npm registry is at <a href="http://registry.npmjs.org/">http://registry.npmjs.org/</a>.  It
-is powered by a CouchDB database at
-<a href="http://isaacs.iriscouch.com/registry">http://isaacs.iriscouch.com/registry</a>.  The code for the couchapp is
-available at <a href="http://github.com/isaacs/npmjs.org">http://github.com/isaacs/npmjs.org</a>.  npm user accounts
-are CouchDB users, stored in the <a href="http://isaacs.iriscouch.com/_users">http://isaacs.iriscouch.com/_users</a>
-database.</p>
-
-<p>The registry URL is supplied by the <code>registry</code> config parameter.  See
-<code><a href="../cli/npm-config.html">npm-config(1)</a></code>, <code><a href="../files/npmrc.html">npmrc(5)</a></code>, and <code><a href="../misc/npm-config.html">npm-config(7)</a></code> for more on managing
-npm&#39;s configuration.</p>
-
-<h2 id="Can-I-run-my-own-private-registry">Can I run my own private registry?</h2>
-
-<p>Yes!</p>
-
-<p>The easiest way is to replicate the couch database, and use the same (or
-similar) design doc to implement the APIs.</p>
-
-<p>If you set up continuous replication from the official CouchDB, and then
-set your internal CouchDB as the registry config, then you&#39;ll be able
-to read any published packages, in addition to your private ones, and by
-default will only publish internally.  If you then want to publish a
-package for the whole world to see, you can simply override the
-<code>--registry</code> config for that command.</p>
-
-<h2 id="I-don-t-want-my-package-published-in-the-official-registry-It-s-private">I don&#39;t want my package published in the official registry. It&#39;s private.</h2>
-
-<p>Set <code>&quot;private&quot;: true</code> in your package.json to prevent it from being
-published at all, or
-<code>&quot;publishConfig&quot;:{&quot;registry&quot;:&quot;http://my-internal-registry.local&quot;}</code>
-to force it to be published only to your internal registry.</p>
-
-<p>See <code><a href="../files/package.json.html">package.json(5)</a></code> for more info on what goes in the package.json file.</p>
-
-<h2 id="Will-you-replicate-from-my-registry-into-the-public-one">Will you replicate from my registry into the public one?</h2>
-
-<p>No.  If you want things to be public, then publish them into the public
-registry using npm.  What little security there is would be for nought
-otherwise.</p>
-
-<h2 id="Do-I-have-to-use-couchdb-to-build-a-registry-that-npm-can-talk-to">Do I have to use couchdb to build a registry that npm can talk to?</h2>
-
-<p>No, but it&#39;s way easier.  Basically, yes, you do, or you have to
-effectively implement the entire CouchDB API anyway.</p>
-
-<h2 id="Is-there-a-website-or-something-to-see-package-docs-and-such">Is there a website or something to see package docs and such?</h2>
-
-<p>Yes, head over to <a href="https://npmjs.org/">https://npmjs.org/</a></p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-config.html">npm-config(1)</a></li><li><a href="../misc/npm-config.html">npm-config(7)</a></li><li><a href="../files/npmrc.html">npmrc(5)</a></li><li><a href="../misc/npm-developers.html">npm-developers(7)</a></li><li><a href="../misc/npm-disputes.html">npm-disputes(7)</a></li></ul>
-</div>
-<p id="footer">npm-registry &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/npm-scripts.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,257 +0,0 @@
-<!doctype html>
-<html>
-  <title>npm-scripts</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../misc/npm-scripts.html">npm-scripts</a></h1> <p>How npm handles the &quot;scripts&quot; field</p>
-
-<h2 id="DESCRIPTION">DESCRIPTION</h2>
-
-<p>npm supports the &quot;scripts&quot; member of the package.json file, for the
-following scripts:</p>
-
-<ul><li>prepublish:
-Run BEFORE the package is published.  (Also run on local <code>npm
-install</code> without any arguments.)</li><li>publish, postpublish:
-Run AFTER the package is published.</li><li>preinstall:
-Run BEFORE the package is installed.</li><li>install, postinstall:
-Run AFTER the package is installed.</li><li>preuninstall, uninstall:
-Run BEFORE the package is uninstalled.</li><li>postuninstall:
-Run AFTER the package is uninstalled.</li><li>preupdate:
-Run BEFORE the package is updated with the update command.</li><li>update, postupdate:
-Run AFTER the package is updated with the update command.</li><li>pretest, test, posttest:
-Run by the <code>npm test</code> command.</li><li>prestop, stop, poststop:
-Run by the <code>npm stop</code> command.</li><li>prestart, start, poststart:
-Run by the <code>npm start</code> command.</li><li>prerestart, restart, postrestart:
-Run by the <code>npm restart</code> command. Note: <code>npm restart</code> will run the
-stop and start scripts if no <code>restart</code> script is provided.</li></ul>
-
-<p>Additionally, arbitrary scripts can be run by doing
-<code>npm run-script &lt;stage&gt; &lt;pkg&gt;</code>.</p>
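The pre/post naming in the list above is mechanical: every stage `<stage>` is bracketed by `pre<stage>` and `post<stage>`. A minimal sketch of that pattern (the `scriptsFor` helper is hypothetical, not an npm API, and it ignores the special `restart` fallback described above):

```javascript
// For a lifecycle stage, npm runs pre<stage>, <stage>, post<stage> in order.
// scriptsFor is an illustrative helper, not part of npm itself.
function scriptsFor(stage) {
  return ["pre" + stage, stage, "post" + stage];
}

console.log(scriptsFor("install")); // [ 'preinstall', 'install', 'postinstall' ]
console.log(scriptsFor("test"));    // [ 'pretest', 'test', 'posttest' ]
```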
-
-<h2 id="NOTE-INSTALL-SCRIPTS-ARE-AN-ANTIPATTERN">NOTE: INSTALL SCRIPTS ARE AN ANTIPATTERN</h2>
-
-<p><strong>tl;dr</strong> Don&#39;t use <code>install</code>.  Use a <code>.gyp</code> file for compilation, and
-<code>prepublish</code> for anything else.</p>
-
-<p>You should almost never have to explicitly set a <code>preinstall</code> or
-<code>install</code> script.  If you are doing this, please consider if there is
-another option.</p>
-
-<p>The only valid use of <code>install</code> or <code>preinstall</code> scripts is for
-compilation which must be done on the target architecture.  In early
-versions of node, this was often done using the <code>node-waf</code> scripts, or
-a standalone <code>Makefile</code>, and early versions of npm required that it be
-explicitly set in package.json.  This was not portable, and harder to
-do properly.</p>
-
-<p>In the current version of node, the standard way to do this is using a
-<code>.gyp</code> file.  If you have a file with a <code>.gyp</code> extension in the root
-of your package, then npm will run the appropriate <code>node-gyp</code> commands
-automatically at install time.  This is the only officially supported
-method for compiling binary addons, and does not require that you add
-anything to your package.json file.</p>
-
-<p>If you have to do other things before your package is used, in a way
-that is not dependent on the operating system or architecture of the
-target system, then use a <code>prepublish</code> script instead.  This includes
-tasks such as:</p>
-
-<ul><li>Compile CoffeeScript source code into JavaScript.</li><li>Create minified versions of JavaScript source code.</li><li>Fetching remote resources that your package will use.</li></ul>
-
-<p>The advantage of doing these things at <code>prepublish</code> time instead of
-<code>preinstall</code> or <code>install</code> time is that they can be done once, in a
-single place, and thus greatly reduce complexity and variability.
-Additionally, this means that:</p>
-
-<ul><li>You can depend on <code>coffee-script</code> as a <code>devDependency</code>, and thus
-your users don&#39;t need to have it installed.</li><li>You don&#39;t need to include the minifiers in your package, reducing
-the size for your users.</li><li>You don&#39;t need to rely on your users having <code>curl</code> or <code>wget</code> or
-other system tools on the target machines.</li></ul>
-
-<h2 id="DEFAULT-VALUES">DEFAULT VALUES</h2>
-
-<p>npm will default some script values based on package contents.</p>
-
-<ul><li><p><code>&quot;start&quot;: &quot;node server.js&quot;</code>:</p><p>If there is a <code>server.js</code> file in the root of your package, then npm
-will default the <code>start</code> command to <code>node server.js</code>.</p></li><li><p><code>&quot;preinstall&quot;: &quot;node-waf clean || true; node-waf configure build&quot;</code>:</p><p>If there is a <code>wscript</code> file in the root of your package, npm will
-default the <code>preinstall</code> command to compile using node-waf.</p></li></ul>
-
-<h2 id="USER">USER</h2>
-
-<p>If npm was invoked with root privileges, then it will change the uid
-to the user account or uid specified by the <code>user</code> config, which
-defaults to <code>nobody</code>.  Set the <code>unsafe-perm</code> flag to run scripts with
-root privileges.</p>
-
-<h2 id="ENVIRONMENT">ENVIRONMENT</h2>
-
-<p>Package scripts run in an environment where many pieces of information
-are made available regarding the setup of npm and the current state of
-the process.</p>
-
-<h3 id="path">path</h3>
-
-<p>If you depend on modules that define executable scripts, like test
-suites, then those executables will be added to the <code>PATH</code> for
-executing the scripts.  So, if your package.json has this:</p>
-
-<pre><code>{ &quot;name&quot; : &quot;foo&quot;
-, &quot;dependencies&quot; : { &quot;bar&quot; : &quot;0.1.x&quot; }
-, &quot;scripts&quot;: { &quot;start&quot; : &quot;bar ./test&quot; } }</code></pre>
-
-<p>then you could run <code>npm start</code> to execute the <code>bar</code> script, which is
-exported into the <code>node_modules/.bin</code> directory on <code>npm install</code>.</p>
-
-<h3 id="package-json-vars">package.json vars</h3>
-
-<p>The package.json fields are tacked onto the <code>npm_package_</code> prefix. So,
-for instance, if you had <code>{&quot;name&quot;:&quot;foo&quot;, &quot;version&quot;:&quot;1.2.5&quot;}</code> in your
-package.json file, then your package scripts would have the
-<code>npm_package_name</code> environment variable set to &quot;foo&quot;, and the
-<code>npm_package_version</code> set to &quot;1.2.5&quot;.</p>
-
-<h3 id="configuration">configuration</h3>
-
-<p>Configuration parameters are put in the environment with the
-<code>npm_config_</code> prefix. For instance, you can view the effective <code>root</code>
-config by checking the <code>npm_config_root</code> environment variable.</p>
-
-<h3 id="Special-package-json-config-hash">Special: package.json &quot;config&quot; hash</h3>
-
-<p>The package.json &quot;config&quot; keys are overwritten in the environment if
-there is a config param of <code>&lt;name&gt;[@&lt;version&gt;]:&lt;key&gt;</code>.  For example,
-if the package.json has this:</p>
-
-<pre><code>{ &quot;name&quot; : &quot;foo&quot;
-, &quot;config&quot; : { &quot;port&quot; : &quot;8080&quot; }
-, &quot;scripts&quot; : { &quot;start&quot; : &quot;node server.js&quot; } }</code></pre>
-
-<p>and the server.js is this:</p>
-
-<pre><code>http.createServer(...).listen(process.env.npm_package_config_port)</code></pre>
-
-<p>then the user could change the behavior by doing:</p>
-
-<pre><code>npm config set foo:port 80</code></pre>
-
-<h3 id="current-lifecycle-event">current lifecycle event</h3>
-
-<p>Lastly, the <code>npm_lifecycle_event</code> environment variable is set to
-whichever stage of the cycle is being executed. So, you could have a
-single script used for different parts of the process which switches
-based on what&#39;s currently happening.</p>
-
-<p>Objects are flattened following this format, so if you had
-<code>{&quot;scripts&quot;:{&quot;install&quot;:&quot;foo.js&quot;}}</code> in your package.json, then you&#39;d
-see this in the script:</p>
-
-<pre><code>process.env.npm_package_scripts_install === &quot;foo.js&quot;</code></pre>
-
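The flattening described above can be sketched as a small pure function. `flattenPackage` is hypothetical — npm performs the equivalent internally when spawning script processes:

```javascript
// Sketch of how package.json fields become npm_package_* variables:
// nested keys are joined with underscores under the npm_package_ prefix.
// flattenPackage is an illustrative helper, not an npm API.
function flattenPackage(obj, prefix, env) {
  prefix = prefix || "npm_package_";
  env = env || {};
  Object.keys(obj).forEach(function (key) {
    var val = obj[key];
    if (val && typeof val === "object") {
      flattenPackage(val, prefix + key + "_", env);
    } else {
      env[prefix + key] = String(val);
    }
  });
  return env;
}

var env = flattenPackage({ name: "foo", version: "1.2.5",
                           scripts: { install: "foo.js" } });
console.log(env.npm_package_name);            // "foo"
console.log(env.npm_package_scripts_install); // "foo.js"
```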
-<h2 id="EXAMPLES">EXAMPLES</h2>
-
-<p>For example, if your package.json contains this:</p>
-
-<pre><code>{ &quot;scripts&quot; :
-  { &quot;install&quot; : &quot;scripts/install.js&quot;
-  , &quot;postinstall&quot; : &quot;scripts/install.js&quot;
-  , &quot;uninstall&quot; : &quot;scripts/uninstall.js&quot;
-  }
-}</code></pre>
-
-<p>then the <code>scripts/install.js</code> will be called for the install
-and post-install stages of the lifecycle, and the <code>scripts/uninstall.js</code>
-will be called when the package is uninstalled.  Since
-<code>scripts/install.js</code> is running for two different phases, it would
-be wise in this case to look at the <code>npm_lifecycle_event</code> environment
-variable.</p>
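One way to structure such a shared script is to dispatch on the event. A minimal sketch — the `handle` function and its return values are hypothetical, chosen only to illustrate the branching:

```javascript
// scripts/install.js — sketch of one file wired up to several lifecycle
// stages, dispatching on which stage npm is currently running.
function handle(event) {
  switch (event) {
    case "install":
    case "postinstall":
      return "installing";
    case "uninstall":
      return "uninstalling";
    default:
      return "unknown stage: " + event;
  }
}

// When run by npm, the current stage is available in the environment:
// console.log(handle(process.env.npm_lifecycle_event));
```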
-
-<p>If you want to run a make command, you can do so.  This works just
-fine:</p>
-
-<pre><code>{ &quot;scripts&quot; :
-  { &quot;preinstall&quot; : &quot;./configure&quot;
-  , &quot;install&quot; : &quot;make &amp;&amp; make install&quot;
-  , &quot;test&quot; : &quot;make test&quot;
-  }
-}</code></pre>
-
-<h2 id="EXITING">EXITING</h2>
-
-<p>Scripts are run by passing the line as a script argument to <code>sh</code>.</p>
-
-<p>If the script exits with a code other than 0, then this will abort the
-process.</p>
-
-<p>Note that these script files don&#39;t have to be nodejs or even
-javascript programs. They just have to be some kind of executable
-file.</p>
-
-<h2 id="HOOK-SCRIPTS">HOOK SCRIPTS</h2>
-
-<p>If you want to run a specific script at a specific lifecycle event for
-ALL packages, then you can use a hook script.</p>
-
-<p>Place an executable file at <code>node_modules/.hooks/{eventname}</code>, and
-it&#39;ll get run for all packages when they are going through that point
-in the package lifecycle for any packages installed in that root.</p>
-
-<p>Hook scripts are run exactly the same way as package.json scripts.
-That is, they are in a separate child process, with the env described
-above.</p>
-
-<h2 id="BEST-PRACTICES">BEST PRACTICES</h2>
-
-<ul><li>Don&#39;t exit with a non-zero error code unless you <em>really</em> mean it.
-Except for uninstall scripts, this will cause the npm action to
-fail, and potentially be rolled back.  If the failure is minor or
-only will prevent some optional features, then it&#39;s better to just
-print a warning and exit successfully.</li><li>Try not to use scripts to do what npm can do for you.  Read through
-<code><a href="../files/package.json.html">package.json(5)</a></code> to see all the things that you can specify and enable
-by simply describing your package appropriately.  In general, this
-will lead to a more robust and consistent state.</li><li>Inspect the env to determine where to put things.  For instance, if
-the <code>npm_config_binroot</code> environ is set to <code>/home/user/bin</code>, then
-don&#39;t try to install executables into <code>/usr/local/bin</code>.  The user
-probably set it up that way for a reason.</li><li>Don&#39;t prefix your script commands with &quot;sudo&quot;.  If root permissions
-are required for some reason, then it&#39;ll fail with that error, and
-the user will sudo the npm command in question.</li></ul>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../cli/npm-run-script.html">npm-run-script(1)</a></li><li><a href="../files/package.json.html">package.json(5)</a></li><li><a href="../misc/npm-developers.html">npm-developers(7)</a></li><li><a href="../cli/npm-install.html">npm-install(1)</a></li></ul>
-</div>
-<p id="footer">npm-scripts &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/removing-npm.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,92 +0,0 @@
-<!doctype html>
-<html>
-  <title>removing-npm</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../cli/npm-removal.html">npm-removal</a></h1> <p>Cleaning the Slate</p>
-
-<h2 id="SYNOPSIS">SYNOPSIS</h2>
-
-<p>So sad to see you go.</p>
-
-<pre><code>sudo npm uninstall npm -g</code></pre>
-
-<p>Or, if that fails, get the npm source code, and do:</p>
-
-<pre><code>sudo make uninstall</code></pre>
-
-<h2 id="More-Severe-Uninstalling">More Severe Uninstalling</h2>
-
-<p>Usually, the above instructions are sufficient.  That will remove
-npm, but leave behind anything you&#39;ve installed.</p>
-
-<p>If that doesn&#39;t work, or if you require more drastic measures,
-continue reading.</p>
-
-<p>Note that this is only necessary for globally-installed packages.  Local
-installs are completely contained within a project&#39;s <code>node_modules</code>
-folder.  Delete that folder, and everything is gone (unless a package&#39;s
-install script is particularly ill-behaved).</p>
-
-<p>This assumes that you installed node and npm in the default place.  If
-you configured node with a different <code>--prefix</code>, or installed npm with a
-different prefix setting, then adjust the paths accordingly, replacing
-<code>/usr/local</code> with your install prefix.</p>
-
-<p>To remove everything npm-related manually:</p>
-
-<pre><code>rm -rf /usr/local/{lib/node{,/.npm,_modules},bin,share/man}/npm*</code></pre>
-
-<p>If you installed things <em>with</em> npm, then your best bet is to uninstall
-them with npm first, and then install them again once you have a
-proper install.  This can help find any symlinks that are lying
-around:</p>
-
-<pre><code>ls -laF /usr/local/{lib/node{,/.npm},bin,share/man} | grep npm</code></pre>
-
-<p>Prior to version 0.3, npm used shim files for executables and node
-modules.  To track those down, you can do the following:</p>
-
-<pre><code>find /usr/local/{lib/node,bin} -exec grep -l npm \{\} \;</code></pre>
-
-<p>(This is also in the <a href="../../doc/README.html">README</a> file.)</p>
-
-<h2 id="SEE-ALSO">SEE ALSO</h2>
-
-<ul><li><a href="../../doc/README.html">README</a></li><li><a href="../cli/npm-rm.html">npm-rm(1)</a></li><li><a href="../cli/npm-prune.html">npm-prune(1)</a></li></ul>
-</div>
-<p id="footer">removing-npm &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/doc/misc/semver.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,128 +0,0 @@
-<!doctype html>
-<html>
-  <title>semver</title>
-  <meta http-equiv="content-type" content="text/html; charset=utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
-<h1><a href="../misc/semver.html">semver</a></h1> <p>The semantic versioner for npm</p>
-
-<h2 id="Usage">Usage</h2>
-
-<pre><code>$ npm install semver
-
-semver.valid(&#39;1.2.3&#39;) // &#39;1.2.3&#39;
-semver.valid(&#39;a.b.c&#39;) // null
-semver.clean(&#39;  =v1.2.3   &#39;) // &#39;1.2.3&#39;
-semver.satisfies(&#39;1.2.3&#39;, &#39;1.x || &gt;=2.5.0 || 5.0.0 - 7.2.3&#39;) // true
-semver.gt(&#39;1.2.3&#39;, &#39;9.8.7&#39;) // false
-semver.lt(&#39;1.2.3&#39;, &#39;9.8.7&#39;) // true</code></pre>
-
-<p>As a command-line utility:</p>
-
-<pre><code>$ semver -h
-
-Usage: semver &lt;version&gt; [&lt;version&gt; [...]] [-r &lt;range&gt; | -i &lt;inc&gt; | -d &lt;dec&gt;]
-Test if version(s) satisfy the supplied range(s), and sort them.
-
-Multiple versions or ranges may be supplied, unless increment
-or decrement options are specified.  In that case, only a single
-version may be used, and it is incremented by the specified level
-
-Program exits successfully if any valid version satisfies
-all supplied ranges, and prints all satisfying versions.
-
-If no versions are valid, or ranges are not satisfied,
-then exits failure.
-
-Versions are printed in ascending order, so supplying
-multiple versions to the utility will just sort them.</code></pre>
-
-<h2 id="Versions">Versions</h2>
-
-<p>A &quot;version&quot; is described by the v2.0.0 specification found at
-<a href="http://semver.org/">http://semver.org/</a>.</p>
-
-<p>A leading <code>&quot;=&quot;</code> or <code>&quot;v&quot;</code> character is stripped off and ignored.</p>
-
-<h2 id="Ranges">Ranges</h2>
-
-<p>The following range styles are supported:</p>
-
-<ul><li><code>1.2.3</code> A specific version.  When nothing else will do.  Note that
-build metadata is still ignored, so <code>1.2.3+build2012</code> will satisfy
-this range.</li><li><code>&gt;1.2.3</code> Greater than a specific version.</li><li><code>&lt;1.2.3</code> Less than a specific version.  If there is no prerelease
-tag on the version range, then no prerelease version will be allowed
-either, even though these are technically &quot;less than&quot;.</li><li><code>&gt;=1.2.3</code> Greater than or equal to.  Note that prerelease versions
-are NOT equal to their &quot;normal&quot; equivalents, so <code>1.2.3-beta</code> will
-not satisfy this range, but <code>2.3.0-beta</code> will.</li><li><code>&lt;=1.2.3</code> Less than or equal to.  In this case, prerelease versions
-ARE allowed, so <code>1.2.3-beta</code> would satisfy.</li><li><code>1.2.3 - 2.3.4</code> := <code>&gt;=1.2.3 &lt;=2.3.4</code></li><li><code>~1.2.3</code> := <code>&gt;=1.2.3-0 &lt;1.3.0-0</code>  &quot;Reasonably close to 1.2.3&quot;.  When
-using tilde operators, prerelease versions are supported as well,
-but a prerelease of the next significant digit will NOT be
-satisfactory, so <code>1.3.0-beta</code> will not satisfy <code>~1.2.3</code>.</li><li><code>~1.2</code> := <code>&gt;=1.2.0-0 &lt;1.3.0-0</code> &quot;Any version starting with 1.2&quot;</li><li><code>1.2.x</code> := <code>&gt;=1.2.0-0 &lt;1.3.0-0</code> &quot;Any version starting with 1.2&quot;</li><li><code>~1</code> := <code>&gt;=1.0.0-0 &lt;2.0.0-0</code> &quot;Any version starting with 1&quot;</li><li><code>1.x</code> := <code>&gt;=1.0.0-0 &lt;2.0.0-0</code> &quot;Any version starting with 1&quot;</li></ul>
-
-<p>Ranges can be joined with either a space (which implies &quot;and&quot;) or a
-<code>||</code> (which implies &quot;or&quot;).</p>
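The tilde equivalences in the list above can be sketched for full `x.y.z` versions. `expandTilde` is hypothetical, not part of the semver package, and ignores the `~1.2` / `~1` short forms:

```javascript
// Sketch of the tilde expansion table: ~x.y.z allows patch-level changes,
// i.e. >=x.y.z-0 <x.(y+1).0-0.  Illustrative only; not the semver API.
function expandTilde(v) {
  var parts = v.split(".").map(Number); // [major, minor, patch]
  return ">=" + parts.join(".") + "-0" +
         " <" + parts[0] + "." + (parts[1] + 1) + ".0-0";
}

console.log(expandTilde("1.2.3")); // ">=1.2.3-0 <1.3.0-0"
```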
-
-<h2 id="Functions">Functions</h2>
-
-<p>All methods and classes take a final <code>loose</code> boolean argument that, if
-true, will be more forgiving about not-quite-valid semver strings.
-The resulting output will always be 100% strict, of course.</p>
-
-<p>Strict-mode Comparators and Ranges will be strict about the SemVer
-strings that they parse.</p>
-
-<ul><li>valid(v): Return the parsed version, or null if it&#39;s not valid.</li><li>inc(v, release): Return the version incremented by the release type
-(major, minor, patch, or prerelease), or null if it&#39;s not valid.</li></ul>
-
-<h3 id="Comparison">Comparison</h3>
-
-<ul><li>gt(v1, v2): <code>v1 &gt; v2</code></li><li>gte(v1, v2): <code>v1 &gt;= v2</code></li><li>lt(v1, v2): <code>v1 &lt; v2</code></li><li>lte(v1, v2): <code>v1 &lt;= v2</code></li><li>eq(v1, v2): <code>v1 == v2</code> This is true if they&#39;re logically equivalent,
-even if they&#39;re not the exact same string.  You already know how to
-compare strings.</li><li>neq(v1, v2): <code>v1 != v2</code> The opposite of eq.</li><li>cmp(v1, comparator, v2): Pass in a comparison string, and it&#39;ll call
-the corresponding function above.  <code>&quot;===&quot;</code> and <code>&quot;!==&quot;</code> do simple
-string comparison, but are included for completeness.  Throws if an
-invalid comparison string is provided.</li><li>compare(v1, v2): Return 0 if v1 == v2, or 1 if v1 is greater, or -1 if
-v2 is greater.  Sorts in ascending order if passed to Array.sort().</li><li>rcompare(v1, v2): The reverse of compare.  Sorts an array of versions
-in descending order when passed to Array.sort().</li></ul>
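The contract of `compare` can be illustrated with a stripped-down comparator for plain `x.y.z` strings. `cmpSimple` is hypothetical and handles no prerelease tags, unlike the real semver `compare`:

```javascript
// Minimal sketch of compare()'s contract: 1 if a > b, -1 if a < b, 0 if
// equal, comparing major, minor, patch numerically (not lexically).
function cmpSimple(a, b) {
  var pa = a.split(".").map(Number);
  var pb = b.split(".").map(Number);
  for (var i = 0; i < 3; i++) {
    if (pa[i] > pb[i]) return 1;
    if (pa[i] < pb[i]) return -1;
  }
  return 0;
}

console.log(cmpSimple("1.2.3", "1.2.3")); // 0
console.log(["1.10.0", "1.2.0", "0.9.9"].sort(cmpSimple));
// [ '0.9.9', '1.2.0', '1.10.0' ] — ascending, unlike a lexical string sort
```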
-
-<h3 id="Ranges">Ranges</h3>
-
-<ul><li>validRange(range): Return the valid range or null if it&#39;s not valid</li><li>satisfies(version, range): Return true if the version satisfies the
-range.</li><li>maxSatisfying(versions, range): Return the highest version in the list
-that satisfies the range, or null if none of them do.</li></ul>
-</div>
-<p id="footer">semver &mdash; npm@1.3.14</p>
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/docfoot-script.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,31 +0,0 @@
-<script>
-;(function () {
-var wrapper = document.getElementById("wrapper")
-var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0)
-  .filter(function (el) {
-    return el.parentNode === wrapper
-        && el.tagName.match(/H[1-6]/)
-        && el.id
-  })
-var l = 2
-  , toc = document.createElement("ul")
-toc.innerHTML = els.map(function (el) {
-  var i = el.tagName.charAt(1)
-    , out = ""
-  while (i > l) {
-    out += "<ul>"
-    l ++
-  }
-  while (i < l) {
-    out += "</ul>"
-    l --
-  }
-  out += "<li><a href='#" + el.id + "'>" +
-    ( el.innerText || el.text || el.innerHTML)
-    + "</a>"
-  return out
-}).join("\n")
-toc.id = "toc"
-document.body.appendChild(toc)
-})()
-</script>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/docfoot.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2 +0,0 @@
-</div>
-<p id="footer">@NAME@ &mdash; npm@@VERSION@</p>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/dochead.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,8 +0,0 @@
-<!doctype html>
-<html>
-  <title>@NAME@</title>
-  <meta http-equiv="content-type" value="text/html;utf-8">
-  <link rel="stylesheet" type="text/css" href="../../static/style.css">
-
-  <body>
-    <div id="wrapper">
Binary file node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/favicon.ico has changed
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/index.html	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,95 +0,0 @@
-<!doctype html>
-
-<html>
-<head>
-<style>
-  html { background:#202050;
-	font-family:CentSchbook Mono BT, Bitstream Vera Sans Mono, monofont, monospace;
-  }
-body { background:#ddd; width:600px; border:10px solid #fff; margin:2em auto; padding:2em }
-h1 {
-  font-size:200px;
-  line-height:1;
-  font-family:"gubblebum-blocky", monospace;
-  color:#f00;
-  text-align:center;
-  padding:0;
-  margin:0 auto;
-  text-indent:-999em;
-  height:202px;
-  width:519px;
-  background:url(npm.png) center;
-}
-h2 {
-  color:#202050;
-  font-size:100%;
-}
-p, ul, ol { margin:1em 0 0; padding:0 }
-li { list-style-position:inside }
-a { color:#f00; text-decoration:none; }
-a:hover { text-decoration:underline; }
-code { background:#fff ; outline: 1px solid #ccc; padding:0 2px; }
-
-@font-face {
-	font-family:monofont;
-	src: url(http://foohack.com/tpl/fonts/Bitstream-Vera-Sans-Mono/VeraMono.ttf) format("truetype");
-}
-@font-face {
-	font-family:monofont;
-	font-style:italic;
-	src: url(http://foohack.com/tpl/fonts/Bitstream-Vera-Sans-Mono/VeraMoIt.ttf) format("truetype");
-}
-@font-face {
-	font-family:monofont;
-	font-weight:bold;
-	src: url(http://foohack.com/tpl/fonts/Bitstream-Vera-Sans-Mono/VeraMoBd.ttf) format("truetype");
-}
-@font-face {
-	font-family:monofont;
-	font-style:italic;
-	font-weight:bold;
-	src: url(http://foohack.com/tpl/fonts/Bitstream-Vera-Sans-Mono/VeraMoBI.ttf) format("truetype");
-}
-
-</style>
-	<title>npm - Node Package Manager</title>
-</head>
-<h1>npm</h1>
-
-<p>npm is a package manager for <a href="http://nodejs.org/">node</a>.  You can use it to install
-  and publish your node programs.  It manages dependencies and does other cool stuff.</p>
-
-<h2>Easy Zero Line Install</h2>
-
-<p><a href="http://nodejs.org/#download">Install Node.js</a> <br>
-(npm comes with it.)</p>
-
-<p>Because a one-line install is one too many.</p>
-
-<h2>Fancy Install</h2>
-
-<ol>
-  <li><a href="https://github.com/isaacs/npm">Get the code.</a>
-  <li>Do what <a href="https://npmjs.org/doc/README.html">the README</a>
-      says to do.
-</ol>
-
-<p>There's a pretty thorough install script at
-<a href="https://npmjs.org/install.sh">https://npmjs.org/install.sh</a></p>
-
-<p>For maximum security, make sure to thorougly inspect every
-program that you run on your computer!</p>
-
-<h2>Other Cool Stuff</h2>
-
-<ul>
-  <li><a href="https://npmjs.org/doc/README.html">README</a>
-  <li><a href="doc/">Help Documentation</a>
-  <li><a href="doc/faq.html">FAQ</a>
-  <li><a href="https://search.npmjs.org/">Search for Packages</a>
-  <li><a href="https://groups.google.com/group/npm-">Mailing List</a>
-  <li><a href="https://github.com/isaacs/npm/issues">Bugs</a>
-</ul>
-
-</body>
-</html>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/static/style.css	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,350 +0,0 @@
-/*
-
-Webfont: Gubblebum Blocky by Jelloween
-License: http://www.myfonts.com/viewlicense?type=web&buildid=2303021
-Webfonts copyright: Copyright (c) 2007 by Tjarda Koster. All rights reserved.
-
-"Gubblebum Blocky" font
-Copyright (c) 2007 by Tjarda Koster, http://jelloween.deviantart.com
-included for use in the npm website and documentation,
-used with permission.
-
-*/
-
-@font-face {
-  font-family: gubblefont;
-  src: url('webfonts/23242D_3_0.eot');
-  src: url('webfonts/23242D_3_0.eot?#iefix') format('embedded-opentype'),
-       url('webfonts/23242D_3_0.woff') format('woff'),
-       url('webfonts/23242D_3_0.ttf') format('truetype');
-}
-
-/* reset */
-* {
-    margin:0;
-    padding:0;
-    border:none;
-    font-family:inherit;
-    font-size:inherit;
-    font-weight:inherit;
-}
-:target::before {
-  content:" >>> ";
-  position:absolute;
-  display:block;
-  opacity:0.5;
-  color:#f00;
-  margin:0 0 0 -2em;
-}
-abbr, acronym {
-  border-bottom:1px dotted #aaa;
-}
-kbd, code, pre {
-  font-family:monospace;
-    margin:0;
-    font-size:18px;
-    line-height:24px;
-  background:#eee;
-  outline:1px solid #ccc;
-}
-kbd code, kbd pre, kbd kbd,
-pre code, pre pre, pre kbd,
-code code, code pre, code kbd { outline: none }
-.dollar::before {
-  content:"$ ";
-  display:inline;
-}
-p, ul, ol, dl, pre {
-    margin:30px 0;
-    line-height:30px;
-}
-hr {
-    margin:30px auto 29px;
-    width:66%;
-    height:1px;
-    background:#aaa;
-}
-pre {
-    display:block;
-}
-dd :first-child {
-    margin-top:0;
-}
-
-body {
-    quotes:"“" "”" "‘" "’";
-    width:666px;
-    margin:30px auto 120px;
-    font-family:Times New Roman, serif;
-    font-size:20px;
-    background:#fff;
-    line-height:30px;
-    color:#111;
-}
-
-blockquote {
-    position:relative;
-    font-size:16px;
-    line-height:30px;
-    font-weight:bold;
-    width:85%;
-    margin:0 auto;
-}
-blockquote::before {
-    font-size:90px;
-    display:block;
-    position:absolute;
-    top:20px;
-    right:100%;
-    content:"“";
-    padding-right:10px;
-    color:#ccc;
-}
-.source cite::before {
-    content:"— ";
-}
-.source {
-    padding-left:20%;
-    margin-top:30px;
-}
-.source cite span {
-    font-style:normal;
-}
-blockquote p {
-    margin-bottom:0;
-}
-.quote blockquote {
-    font-weight:normal;
-}
-
-h1, h2, h3, h4, h5, h6, dt, #header {
-  font-family:serif;
-  font-size:20px;
-  font-weight:bold;
-}
-h2 {
-  background:#eee;
-}
-h1, h2 {
-  line-height:40px;
-}
-
-i, em, cite {
-    font-style:italic;
-}
-b, strong { 
-    font-weight:bold;
-}
-i, em, cite, b, strong, small {
-    line-height:28px;
-}
-small, .small, .small *, aside {
-    font-style:italic;
-    color:#669;
-    font-size:18px;
-}
-spall a, .small a {
-    text-decoration:underline;
-}
-del {
-    text-decoration:line-through;
-}
-ins {
-    text-decoration:underline;
-}
-.alignright { display:block; float:right; margin-left:1em; }
-.alignleft { display:block; float:left; margin-right:1em; }
-
-q:before, q q q:before, q q q q q:before, q q q q q q q:before { content:"“"; }
-q q:before, q q q q:before, q q q q q q:before, q q q q q q q q:before { content:"‘"; }
-q:after, q q q:after, q q q q q:after, q q q q q q q:after { content:"”"; }
-q q:after, q q q q:after, q q q q q q:after, q q q q q q q q:after { content:"’"; }
-
-a { color:#00f; text-decoration:none; }
-a:visited { color:#636; }
-a:hover, a:active { color:#900!important; text-decoration:underline; }
-
-h1 {
-  font-weight:bold;
-  background:#fff;
-}
-h1 a, h1 a:visited {
-  font-family:gubblefont, Gubblebum-Blocky, GubbleBum Blocky, GubbleBum, monospace;
-  font-size:60px;
-  color:#900;
-  display:block;
-}
-h1 a:focus, h1 a:hover, h1 a:active {
-  color:#f00!important;
-  text-decoration:none;
-}
-
-.navigation {
-    display:table;
-    width:100%;
-    margin:0 0 30px 0;
-    position:relative;
-}
-#nav-above {
-    margin-bottom:0;
-}
-.navigation .nav-previous {
-    display:table-cell;
-    text-align:left;
-    width:50%;
-}
-/* hang the » and « off into the margins */
-.navigation .nav-previous a:before, .navigation .nav-next a:after {
-    content: "«";
-    display:block;
-    height:30px;
-    margin-bottom:-30px;
-    text-decoration:none;
-    margin-left:-15px;
-}
-.navigation .nav-next a:after {
-    content: "»";
-    text-align:right;
-    margin-left:0;
-    margin-top:-30px;
-    margin-right:-15px;
-}
-
-
-.navigation .nav-next {
-    display:table-cell;
-    text-align:right;
-    width:50%;
-}
-.navigation a {
-    display:block;
-    width:100%;
-    height:100%;
-}
-
-input, button, textarea {
-    border:0;
-    line-height:30px;
-}
-textarea {
-    height:300px;
-}
-input {
-    height:30px;
-    line-height:30px;
-}
-input.submit, input#submit, input.button, button, input[type=submit] {
-    cursor:hand; cursor:pointer;
-    outline:1px solid #ccc;
-}
-
-#wrapper {
-    margin-bottom:90px;
-    position:relative;
-    z-index:1;
-    *zoom:1;
-    background:#fff;
-}
-#wrapper:after {
-    display:block;
-    content:".";
-    visibility:hidden;
-    width:0;
-    height:0;
-    clear:both;
-}
-
-.sidebar .xoxo > li {
-    float:left;
-    width:50%;
-}
-.sidebar li {
-    list-style:none;
-}
-.sidebar #elsewhere {
-    margin-left:-10%;
-    margin-right:-10%;
-}
-.sidebar #rss-links, .sidebar #twitter-feeds {
-    float:right;
-    clear:right;
-    width:20%;
-}
-.sidebar #comment {
-  clear:both;
-  float:none;
-  width:100%;
-}
-.sidebar #search {
-    clear:both;
-    float:none;
-    width:100%;
-}
-.sidebar #search h2 {
-    margin-left:40%;
-}
-.sidebar #search #s {
-    width:90%;
-    float:left;
-}
-.sidebar #search #searchsubmit {
-    width:10%;
-    float:right;
-}
-.sidebar * {
-    font-size:15px;
-    line-height:30px;
-}
-
-#footer, #footer * {
-  text-align:right;
-  font-size:16px;
-  color:#ccc;
-  font-style:italic;
-  word-spacing:1em;
-}
-
-#toc {
-  position:absolute;
-  top:0;
-  right:0;
-  padding:40px 0 40px 20px;
-  margin:0;
-  width:200px;
-  opacity:0.2;
-  z-index:-1;
-}
-#toc:hover {
-  opacity:1;
-  background:#fff;
-  z-index:999;
-}
-#toc ul {
-  padding:0;
-  margin:0;
-}
-#toc, #toc li {
-  list-style-type:none;
-  font-size:15px;
-  line-height:15px;
-}
-#toc li {
-  padding:0 0 0 10px;
-}
-#toc li a {
-  position:relative;
-  display:block;
-}
-
-@media print {
-    a[href] {
-        color:inherit;
-    }
-    a[href]:after {
-        white-space:nowrap;
-        content:" " attr(href);
-    }
-    a[href^=\#], .navigation {
-        display:none;
-    }
-}
Binary file node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/static/webfonts/23242D_3_0.eot has changed
Binary file node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/static/webfonts/23242D_3_0.ttf has changed
Binary file node/node-v0.10.22-linux-x86/lib/node_modules/npm/html/static/webfonts/23242D_3_0.woff has changed
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/adduser.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,138 +0,0 @@
-
-module.exports = adduser
-
-var log = require("npmlog")
-  , npm = require("./npm.js")
-  , registry = npm.registry
-  , read = require("read")
-  , userValidate = require("npm-user-validate")
-  , crypto
-
-try {
-  crypto = process.binding("crypto") && require("crypto")
-} catch (ex) {}
-
-adduser.usage = "npm adduser\nThen enter stuff at the prompts"
-
-function adduser (args, cb) {
-  if (!crypto) return cb(new Error(
-    "You must compile node with ssl support to use the adduser feature"))
-
-  var c = { u : npm.config.get("username")
-          , p : npm.config.get("_password")
-          , e : npm.config.get("email")
-          }
-    , changed = false
-    , u = {}
-    , fns = [readUsername, readPassword, readEmail, save]
-
-  loop()
-  function loop (er) {
-    if (er) return cb(er)
-    var fn = fns.shift()
-    if (fn) return fn(c, u, loop)
-    cb()
-  }
-}
-
-function readUsername (c, u, cb) {
-  var v = userValidate.username
-  read({prompt: "Username: ", default: c.u}, function (er, un) {
-    if (er) {
-      return cb(er.message === "cancelled" ? er.message : er)
-    }
-
-    // make sure it's valid.  we have to do this here, because
-    // couchdb will only ever say "bad password" with a 401 when
-    // you try to PUT a _users record that the validate_doc_update
-    // rejects for *any* reason.
-
-    if (!un) {
-      return readUsername(c, u, cb)
-    }
-
-    var error = v(un)
-    if (error) {
-      log.warn(error.message)
-      return readUsername(c, u, cb)
-    }
-
-    c.changed = c.u !== un
-    u.u = un
-    cb(er)
-  })
-}
-
-function readPassword (c, u, cb) {
-  var v = userValidate.pw
-
-  if (!c.changed) {
-    u.p = c.p
-    return cb()
-  }
-  read({prompt: "Password: ", silent: true}, function (er, pw) {
-    if (er) {
-      return cb(er.message === "cancelled" ? er.message : er)
-    }
-
-    if (!pw) {
-      return readPassword(c, u, cb)
-    }
-
-    var error = v(pw)
-    if (error) {
-      log.warn(error.message)
-      return readPassword(c, u, cb)
-    }
-
-    u.p = pw
-    cb(er)
-  })
-}
-
-function readEmail (c, u, cb) {
-  var v = userValidate.email
-
-  read({prompt: "Email: ", default: c.e}, function (er, em) {
-    if (er) {
-      return cb(er.message === "cancelled" ? er.message : er)
-    }
-
-    if (!em) {
-      return readEmail(c, u, cb)
-    }
-
-    var error = v(em)
-    if (error) {
-      log.warn(error.message)
-      return readEmail(c, u, cb)
-    }
-
-    u.e = em
-    cb(er)
-  })
-}
-
-function save (c, u, cb) {
-  if (c.changed) {
-    delete registry.auth
-    delete registry.username
-    delete registry.password
-    registry.username = u.u
-    registry.password = u.p
-  }
-
-  // save existing configs, but yank off for this PUT
-  registry.adduser(u.u, u.p, u.e, function (er) {
-    if (er) return cb(er)
-    registry.username = u.u
-    registry.password = u.p
-    registry.email = u.e
-    npm.config.set("username", u.u, "user")
-    npm.config.set("_password", u.p, "user")
-    npm.config.set("email", u.e, "user")
-    npm.config.del("_token", "user")
-    log.info("adduser", "Authorized user %s", u.u)
-    npm.config.save("user", cb)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/bin.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-module.exports = bin
-
-var npm = require("./npm.js")
-
-bin.usage = "npm bin\nnpm bin -g\n(just prints the bin folder)"
-
-function bin (args, silent, cb) {
-  if (typeof cb !== "function") cb = silent, silent = false
-  var b = npm.bin
-    , PATH = (process.env.PATH || "").split(":")
-
-  if (!silent) console.log(b)
-  process.nextTick(cb.bind(this, null, b))
-
-  if (npm.config.get("global") && PATH.indexOf(b) === -1) {
-    npm.config.get("logstream").write("(not in PATH env variable)\n")
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/bugs.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,43 +0,0 @@
-
-module.exports = bugs
-
-bugs.usage = "npm bugs <pkgname>"
-
-var npm = require("./npm.js")
-  , registry = npm.registry
-  , log = require("npmlog")
-  , opener = require("opener")
-
-bugs.completion = function (opts, cb) {
-  if (opts.conf.argv.remain.length > 2) return cb()
-  registry.get("/-/short", 60000, function (er, list) {
-    return cb(null, list || [])
-  })
-}
-
-function bugs (args, cb) {
-  if (!args.length) return cb(bugs.usage)
-  var n = args[0].split("@").shift()
-  registry.get(n + "/latest", 3600, function (er, d) {
-    if (er) return cb(er)
-    var bugs = d.bugs
-      , repo = d.repository || d.repositories
-      , url
-    if (bugs) {
-      url = (typeof bugs === "string") ? bugs : bugs.url
-    } else if (repo) {
-      if (Array.isArray(repo)) repo = repo.shift()
-      if (repo.hasOwnProperty("url")) repo = repo.url
-      log.verbose("repository", repo)
-      if (repo && repo.match(/^(https?:\/\/|git(:\/\/|@))github.com/)) {
-        url = repo.replace(/^git(@|:\/\/)/, "https://")
-                  .replace(/^https?:\/\/github.com:/, "https://github.com/")
-                  .replace(/\.git$/, '')+"/issues"
-      }
-    }
-    if (!url) {
-      url = "https://npmjs.org/package/" + d.name
-    }
-    opener(url, { command: npm.config.get("browser") }, cb)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/build.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,225 +0,0 @@
-// npm build command
-
-// everything about the installation after the creation of
-// the .npm/{name}/{version}/package folder.
-// linking the modules into the npm.root,
-// resolving dependencies, etc.
-
-// This runs AFTER install or link are completed.
-
-var npm = require("./npm.js")
-  , log = require("npmlog")
-  , chain = require("slide").chain
-  , fs = require("graceful-fs")
-  , path = require("path")
-  , lifecycle = require("./utils/lifecycle.js")
-  , readJson = require("read-package-json")
-  , link = require("./utils/link.js")
-  , linkIfExists = link.ifExists
-  , cmdShim = require("cmd-shim")
-  , cmdShimIfExists = cmdShim.ifExists
-  , asyncMap = require("slide").asyncMap
-
-module.exports = build
-build.usage = "npm build <folder>\n(this is plumbing)"
-
-build._didBuild = {}
-build._noLC = {}
-function build (args, global, didPre, didRB, cb) {
-  if (typeof cb !== "function") cb = didRB, didRB = false
-  if (typeof cb !== "function") cb = didPre, didPre = false
-  if (typeof cb !== "function") {
-    cb = global, global = npm.config.get("global")
-  }
-  // it'd be nice to asyncMap these, but actually, doing them
-  // in parallel generally munges up the output from node-waf
-  var builder = build_(global, didPre, didRB)
-  chain(args.map(function (arg) { return function (cb) {
-    builder(arg, cb)
-  }}), cb)
-}
-
-function build_ (global, didPre, didRB) { return function (folder, cb) {
-  folder = path.resolve(folder)
-  build._didBuild[folder] = true
-  log.info("build", folder)
-  readJson(path.resolve(folder, "package.json"), function (er, pkg) {
-    if (er) return cb(er)
-    chain
-      ( [ !didPre && [lifecycle, pkg, "preinstall", folder]
-        , [linkStuff, pkg, folder, global, didRB]
-        , pkg.name === "npm" && [writeBuiltinConf, folder]
-        , didPre !== build._noLC && [lifecycle, pkg, "install", folder]
-        , didPre !== build._noLC && [lifecycle, pkg, "postinstall", folder]
-        , didPre !== build._noLC
-          && npm.config.get("npat")
-          && [lifecycle, pkg, "test", folder] ]
-      , cb )
-  })
-}}
-
-function writeBuiltinConf (folder, cb) {
-  // the builtin config is "sticky". Any time npm installs itself,
-  // it puts its builtin config file there, as well.
-  if (!npm.config.usingBuiltin
-      || folder !== path.dirname(__dirname)) {
-    return cb()
-  }
-  npm.config.save("builtin", cb)
-}
-
-function linkStuff (pkg, folder, global, didRB, cb) {
-  // allow to opt out of linking binaries.
-  if (npm.config.get("bin-links") === false) return cb()
-
-  // if it's global, and folder is in {prefix}/node_modules,
-  // then bins are in {prefix}/bin
-  // otherwise, then bins are in folder/../.bin
-  var parent = path.dirname(folder)
-    , gnm = global && npm.globalDir
-    , top = parent === npm.dir
-    , gtop = parent === gnm
-
-  log.verbose("linkStuff", [global, gnm, gtop, parent])
-  log.info("linkStuff", pkg._id)
-
-  shouldWarn(pkg, folder, global, function() {
-    asyncMap( [linkBins, linkMans, !didRB && rebuildBundles]
-            , function (fn, cb) {
-      if (!fn) return cb()
-      log.verbose(fn.name, pkg._id)
-      fn(pkg, folder, parent, gtop, cb)
-    }, cb)
-  })
-}
-
-function shouldWarn(pkg, folder, global, cb) {
-  var parent = path.dirname(folder)
-    , top = parent === npm.dir
-    , cwd = process.cwd()
-
-  readJson(path.resolve(cwd, "package.json"), function(er, topPkg) {
-    if (er) return cb(er)
-
-    var linkedPkg = path.basename(cwd)
-      , currentPkg = path.basename(folder)
-
-    // current searched package is the linked package on first call
-    if (linkedPkg !== currentPkg) {
-
-      if (!topPkg.dependencies) return cb()
-
-      // don't generate a warning if it's listed in dependencies
-      if (Object.keys(topPkg.dependencies).indexOf(currentPkg) === -1) {
-
-        if (top && pkg.preferGlobal && !global) {
-          log.warn("prefer global", pkg._id + " should be installed with -g")
-        }
-      }
-    }
-
-    cb()
-  })
-}
-
-function rebuildBundles (pkg, folder, parent, gtop, cb) {
-  if (!npm.config.get("rebuild-bundle")) return cb()
-
-  var deps = Object.keys(pkg.dependencies || {})
-             .concat(Object.keys(pkg.devDependencies || {}))
-    , bundles = pkg.bundleDependencies || pkg.bundledDependencies || []
-
-  fs.readdir(path.resolve(folder, "node_modules"), function (er, files) {
-    // error means no bundles
-    if (er) return cb()
-
-    log.verbose("rebuildBundles", files)
-    // don't asyncMap these, because otherwise build script output
-    // gets interleaved and is impossible to read
-    chain(files.filter(function (file) {
-      // rebuild if:
-      // not a .folder, like .bin or .hooks
-      return !file.match(/^[\._-]/)
-          // not some old 0.x style bundle
-          && file.indexOf("@") === -1
-          // either not a dep, or explicitly bundled
-          && (deps.indexOf(file) === -1 || bundles.indexOf(file) !== -1)
-    }).map(function (file) {
-      file = path.resolve(folder, "node_modules", file)
-      return function (cb) {
-        if (build._didBuild[file]) return cb()
-        log.verbose("rebuild bundle", file)
-        // if file is not a package dir, then don't do it.
-        fs.lstat(path.resolve(file, "package.json"), function (er, st) {
-          if (er) return cb()
-          build_(false)(file, cb)
-        })
-    }}), cb)
-  })
-}
-
-function linkBins (pkg, folder, parent, gtop, cb) {
-  if (!pkg.bin || !gtop && path.basename(parent) !== "node_modules") {
-    return cb()
-  }
-  var binRoot = gtop ? npm.globalBin
-                     : path.resolve(parent, ".bin")
-  log.verbose("link bins", [pkg.bin, binRoot, gtop])
-
-  asyncMap(Object.keys(pkg.bin), function (b, cb) {
-    linkBin( path.resolve(folder, pkg.bin[b])
-           , path.resolve(binRoot, b)
-           , gtop && folder
-           , function (er) {
-      if (er) return cb(er)
-      // bins should always be executable.
-      // XXX skip chmod on windows?
-      fs.chmod(path.resolve(folder, pkg.bin[b]), npm.modes.exec, function (er) {
-        if (er || !gtop) return cb(er)
-        var dest = path.resolve(binRoot, b)
-          , src = path.resolve(folder, pkg.bin[b])
-          , out = npm.config.get("parseable")
-                ? dest + "::" + src + ":BINFILE"
-                : dest + " -> " + src
-        console.log(out)
-        cb()
-      })
-    })
-  }, cb)
-}
-
-function linkBin (from, to, gently, cb) {
-  if (process.platform !== "win32") {
-    return linkIfExists(from, to, gently, cb)
-  } else {
-    return cmdShimIfExists(from, to, cb)
-  }
-}
-
-function linkMans (pkg, folder, parent, gtop, cb) {
-  if (!pkg.man || !gtop || process.platform === "win32") return cb()
-
-  var manRoot = path.resolve(npm.config.get("prefix"), "share", "man")
-
-  // make sure that the mans are unique.
-  // otherwise, if there are dupes, it'll fail with EEXIST
-  var set = pkg.man.reduce(function (acc, man) {
-    acc[path.basename(man)] = man
-    return acc
-  }, {})
-  pkg.man = pkg.man.filter(function (man) {
-    return set[path.basename(man)] === man
-  })
-
-  asyncMap(pkg.man, function (man, cb) {
-    if (typeof man !== "string") return cb()
-    var parseMan = man.match(/(.*\.([0-9]+)(\.gz)?)$/)
-      , stem = parseMan[1]
-      , sxn = parseMan[2]
-      , gz = parseMan[3] || ""
-      , bn = path.basename(stem)
-      , manDest = path.join(manRoot, "man" + sxn, bn)
-
-    linkIfExists(man, manDest, gtop && folder, cb)
-  }, cb)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/cache.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1255 +0,0 @@
-// XXX lib/utils/tar.js and this file need to be rewritten.
-
-// URL-to-cache folder mapping:
-// : -> !
-// @ -> _
-// http://registry.npmjs.org/foo/version -> cache/http!/...
-//
-
-/*
-fetching a url:
-1. Check for url in inFlightUrls.  If present, add cb, and return.
-2. create inFlightURL list
-3. Acquire lock at {cache}/{sha(url)}.lock
-   retries = {cache-lock-retries, def=3}
-   stale = {cache-lock-stale, def=30000}
-   wait = {cache-lock-wait, def=100}
-4. if lock can't be acquired, then fail
-5. fetch url, clear lock, call cbs
-
-cache folders:
-1. urls: http!/server.com/path/to/thing
-2. c:\path\to\thing: file!/c!/path/to/thing
-3. /path/to/thing: file!/path/to/thing
-4. git@ private: git_github.com!isaacs/npm
-5. git://public: git!/github.com/isaacs/npm
-6. git+blah:// git-blah!/server.com/foo/bar
-
-adding a folder:
-1. tar into tmp/random/package.tgz
-2. untar into tmp/random/contents/package, stripping one dir piece
-3. tar tmp/random/contents/package to cache/n/v/package.tgz
-4. untar cache/n/v/package.tgz into cache/n/v/package
-5. rm tmp/random
-
-Adding a url:
-1. fetch to tmp/random/package.tgz
-2. goto folder(2)
-
-adding a name@version:
-1. registry.get(name/version)
-2. if response isn't 304, add url(dist.tarball)
-
-adding a name@range:
-1. registry.get(name)
-2. Find a version that satisfies
-3. add name@version
-
-adding a local tarball:
-1. untar to tmp/random/{blah}
-2. goto folder(2)
-*/
-
-exports = module.exports = cache
-cache.read = read
-cache.clean = clean
-cache.unpack = unpack
-cache.lock = lock
-cache.unlock = unlock
-
-var mkdir = require("mkdirp")
-  , spawn = require("child_process").spawn
-  , exec = require("child_process").execFile
-  , once = require("once")
-  , fetch = require("./utils/fetch.js")
-  , npm = require("./npm.js")
-  , fs = require("graceful-fs")
-  , rm = require("rimraf")
-  , readJson = require("read-package-json")
-  , registry = npm.registry
-  , log = require("npmlog")
-  , path = require("path")
-  , sha = require("sha")
-  , asyncMap = require("slide").asyncMap
-  , semver = require("semver")
-  , tar = require("./utils/tar.js")
-  , fileCompletion = require("./utils/completion/file-completion.js")
-  , url = require("url")
-  , chownr = require("chownr")
-  , lockFile = require("lockfile")
-  , crypto = require("crypto")
-  , retry = require("retry")
-  , zlib = require("zlib")
-  , chmodr = require("chmodr")
-  , which = require("which")
-  , isGitUrl = require("./utils/is-git-url.js")
-
-cache.usage = "npm cache add <tarball file>"
-            + "\nnpm cache add <folder>"
-            + "\nnpm cache add <tarball url>"
-            + "\nnpm cache add <git url>"
-            + "\nnpm cache add <name>@<version>"
-            + "\nnpm cache ls [<path>]"
-            + "\nnpm cache clean [<pkg>[@<version>]]"
-
-cache.completion = function (opts, cb) {
-
-  var argv = opts.conf.argv.remain
-  if (argv.length === 2) {
-    return cb(null, ["add", "ls", "clean"])
-  }
-
-  switch (argv[2]) {
-    case "clean":
-    case "ls":
-      // cache and ls are easy, because the completion is
-      // what ls_ returns anyway.
-      // just get the partial words, minus the last path part
-      var p = path.dirname(opts.partialWords.slice(3).join("/"))
-      if (p === ".") p = ""
-      return ls_(p, 2, cb)
-    case "add":
-      // Same semantics as install and publish.
-      return npm.commands.install.completion(opts, cb)
-  }
-}
-
-function cache (args, cb) {
-  var cmd = args.shift()
-  switch (cmd) {
-    case "rm": case "clear": case "clean": return clean(args, cb)
-    case "list": case "sl": case "ls": return ls(args, cb)
-    case "add": return add(args, cb)
-    default: return cb(new Error(
-      "Invalid cache action: "+cmd))
-  }
-}
-
-// if the pkg and ver are in the cache, then
-// just do a readJson and return.
-// if they're not, then fetch them from the registry.
-function read (name, ver, forceBypass, cb) {
-  if (typeof cb !== "function") cb = forceBypass, forceBypass = true
-  var jsonFile = path.join(npm.cache, name, ver, "package", "package.json")
-  function c (er, data) {
-    if (data) deprCheck(data)
-    return cb(er, data)
-  }
-
-  if (forceBypass && npm.config.get("force")) {
-    log.verbose("using force", "skipping cache")
-    return addNamed(name, ver, c)
-  }
-
-  readJson(jsonFile, function (er, data) {
-    er = needName(er, data)
-    er = needVersion(er, data)
-    if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er)
-    if (er) return addNamed(name, ver, c)
-    deprCheck(data)
-    c(er, data)
-  })
-}
-
-// npm cache ls [<path>]
-function ls (args, cb) {
-  args = args.join("/").split("@").join("/")
-  if (args.substr(-1) === "/") args = args.substr(0, args.length - 1)
-  var prefix = npm.config.get("cache")
-  if (0 === prefix.indexOf(process.env.HOME)) {
-    prefix = "~" + prefix.substr(process.env.HOME.length)
-  }
-  ls_(args, npm.config.get("depth"), function (er, files) {
-    console.log(files.map(function (f) {
-      return path.join(prefix, f)
-    }).join("\n").trim())
-    cb(er, files)
-  })
-}
-
-// Calls cb with list of cached pkgs matching show.
-function ls_ (req, depth, cb) {
-  return fileCompletion(npm.cache, req, depth, cb)
-}
-
-// npm cache clean [<path>]
-function clean (args, cb) {
-  if (!cb) cb = args, args = []
-  if (!args) args = []
-  args = args.join("/").split("@").join("/")
-  if (args.substr(-1) === "/") args = args.substr(0, args.length - 1)
-  var f = path.join(npm.cache, path.normalize(args))
-  if (f === npm.cache) {
-    fs.readdir(npm.cache, function (er, files) {
-      if (er) return cb()
-      asyncMap( files.filter(function (f) {
-                  return npm.config.get("force") || f !== "-"
-                }).map(function (f) {
-                  return path.join(npm.cache, f)
-                })
-              , rm, cb )
-    })
-  } else rm(path.join(npm.cache, path.normalize(args)), cb)
-}
-
-// npm cache add <tarball-url>
-// npm cache add <pkg> <ver>
-// npm cache add <tarball>
-// npm cache add <folder>
-cache.add = function (pkg, ver, scrub, cb) {
-  if (typeof cb !== "function") cb = scrub, scrub = false
-  if (typeof cb !== "function") cb = ver, ver = null
-  if (scrub) {
-    return clean([], function (er) {
-      if (er) return cb(er)
-      add([pkg, ver], cb)
-    })
-  }
-  log.verbose("cache add", [pkg, ver])
-  return add([pkg, ver], cb)
-}
-
-function add (args, cb) {
-  // this is hot code.  almost everything passes through here.
-  // the args can be any of:
-  // ["url"]
-  // ["pkg", "version"]
-  // ["pkg@version"]
-  // ["pkg", "url"]
-  // This is tricky, because urls can contain @
-  // Also, in some cases we get [name, null] rather
-  // that just a single argument.
-
-  var usage = "Usage:\n"
-            + "    npm cache add <tarball-url>\n"
-            + "    npm cache add <pkg>@<ver>\n"
-            + "    npm cache add <tarball>\n"
-            + "    npm cache add <folder>\n"
-    , name
-    , spec
-
-  if (args[1] === undefined) args[1] = null
-
-  // at this point the args length must ==2
-  if (args[1] !== null) {
-    name = args[0]
-    spec = args[1]
-  } else if (args.length === 2) {
-    spec = args[0]
-  }
-
-  log.verbose("cache add", "name=%j spec=%j args=%j", name, spec, args)
-
-
-  if (!name && !spec) return cb(usage)
-
-  // see if the spec is a url
-  // otherwise, treat as name@version
-  var p = url.parse(spec) || {}
-  log.verbose("parsed url", p)
-
-  // it could be that we got name@http://blah
-  // in that case, we will not have a protocol now, but if we
-  // split and check, we will.
-  if (!name && !p.protocol && spec.indexOf("@") !== -1) {
-    spec = spec.split("@")
-    name = spec.shift()
-    spec = spec.join("@")
-    return add([name, spec], cb)
-  }
-
-  switch (p.protocol) {
-    case "http:":
-    case "https:":
-      return addRemoteTarball(spec, null, name, cb)
-
-    default:
-      if (isGitUrl(p))
-        return addRemoteGit(spec, p, name, false, cb)
-
-      // if we have a name and a spec, then try name@spec
-      // if not, then try just spec (which may try name@"" if not found)
-      if (name) {
-        addNamed(name, spec, cb)
-      } else {
-        addLocal(spec, cb)
-      }
-  }
-}
-
-function fetchAndShaCheck (u, tmp, shasum, cb) {
-  fetch(u, tmp, function (er, response) {
-    if (er) {
-      log.error("fetch failed", u)
-      return cb(er, response)
-    }
-    if (!shasum) return cb(null, response)
-    // validate that the url we just downloaded matches the expected shasum.
-    sha.check(tmp, shasum, function (er) {
-      return cb(er, response, shasum)
-    })
-  })
-}
-
-// Only have a single download action at once for a given url
-// additional calls stack the callbacks.
-var inFlightURLs = {}
-function addRemoteTarball (u, shasum, name, cb_) {
-  if (typeof cb_ !== "function") cb_ = name, name = ""
-  if (typeof cb_ !== "function") cb_ = shasum, shasum = null
-
-  if (!inFlightURLs[u]) inFlightURLs[u] = []
-  var iF = inFlightURLs[u]
-  iF.push(cb_)
-  if (iF.length > 1) return
-
-  function cb (er, data) {
-    if (data) {
-      data._from = u
-      data._resolved = u
-    }
-    unlock(u, function () {
-      var c
-      while (c = iF.shift()) c(er, data)
-      delete inFlightURLs[u]
-    })
-  }
-
-  var tmp = path.join(npm.tmp, Date.now()+"-"+Math.random(), "tmp.tgz")
-
-  lock(u, function (er) {
-    if (er) return cb(er)
-
-    log.verbose("addRemoteTarball", [u, shasum])
-    mkdir(path.dirname(tmp), function (er) {
-      if (er) return cb(er)
-      addRemoteTarball_(u, tmp, shasum, done)
-    })
-  })
-
-  function done (er, resp, shasum) {
-    if (er) return cb(er)
-    addLocalTarball(tmp, name, shasum, cb)
-  }
-}
-
-function addRemoteTarball_(u, tmp, shasum, cb) {
-  // Tuned to spread 3 attempts over about a minute.
-  // See formula at <https://github.com/tim-kos/node-retry>.
-  var operation = retry.operation
-    ( { retries: npm.config.get("fetch-retries")
-      , factor: npm.config.get("fetch-retry-factor")
-      , minTimeout: npm.config.get("fetch-retry-mintimeout")
-      , maxTimeout: npm.config.get("fetch-retry-maxtimeout") })
-
-  operation.attempt(function (currentAttempt) {
-    log.info("retry", "fetch attempt " + currentAttempt
-      + " at " + (new Date()).toLocaleTimeString())
-    fetchAndShaCheck(u, tmp, shasum, function (er, response, shasum) {
-      // Only retry on 408, 5xx or no `response`.
-      var sc = response && response.statusCode
-      var statusRetry = !sc || (sc === 408 || sc >= 500)
-      if (er && statusRetry && operation.retry(er)) {
-        log.info("retry", "will retry, error on last attempt: " + er)
-        return
-      }
-      cb(er, response, shasum)
-    })
-  })
-}
-
-// 1. cacheDir = path.join(cache,'_git-remotes',sha1(u))
-// 2. checkGitDir(cacheDir) ? 4. : 3. (rm cacheDir if necessary)
-// 3. git clone --mirror u cacheDir
-// 4. cd cacheDir && git fetch -a origin
-// 5. git archive /tmp/random.tgz
-// 6. addLocalTarball(/tmp/random.tgz) <gitref> --format=tar --prefix=package/
-// silent flag is used if this should error quietly
-function addRemoteGit (u, parsed, name, silent, cb_) {
-  if (typeof cb_ !== "function") cb_ = name, name = null
-
-  if (!inFlightURLs[u]) inFlightURLs[u] = []
-  var iF = inFlightURLs[u]
-  iF.push(cb_)
-  if (iF.length > 1) return
-
-  function cb (er, data) {
-    unlock(u, function () {
-      var c
-      while (c = iF.shift()) c(er, data)
-      delete inFlightURLs[u]
-    })
-  }
-
-  var p, co // cachePath, git-ref we want to check out
-
-  lock(u, function (er) {
-    if (er) return cb(er)
-
-    // figure out what we should check out.
-    var co = parsed.hash && parsed.hash.substr(1) || "master"
-    // git is so tricky!
-    // if the path is like ssh://foo:22/some/path then it works, but
-    // it needs the ssh://
-    // If the path is like ssh://foo:some/path then it works, but
-    // only if you remove the ssh://
-    var origUrl = u
-    u = u.replace(/^git\+/, "")
-         .replace(/#.*$/, "")
-
-    // ssh paths that are scp-style urls don't need the ssh://
-    if (parsed.pathname.match(/^\/?:/)) {
-      u = u.replace(/^ssh:\/\//, "")
-    }
-
-    var v = crypto.createHash("sha1").update(u).digest("hex").slice(0, 8)
-    v = u.replace(/[^a-zA-Z0-9]+/g, '-') + '-' + v
-
-    log.verbose("addRemoteGit", [u, co])
-
-    p = path.join(npm.config.get("cache"), "_git-remotes", v)
-
-    checkGitDir(p, u, co, origUrl, silent, function(er, data) {
-      chmodr(p, npm.modes.file, function(erChmod) {
-        if (er) return cb(er, data)
-        return cb(erChmod, data)
-      })
-    })
-  })
-}
-
-function checkGitDir (p, u, co, origUrl, silent, cb) {
-  fs.stat(p, function (er, s) {
-    if (er) return cloneGitRemote(p, u, co, origUrl, silent, cb)
-    if (!s.isDirectory()) return rm(p, function (er){
-      if (er) return cb(er)
-      cloneGitRemote(p, u, co, origUrl, silent, cb)
-    })
-
-    var git = npm.config.get("git")
-    var args = [ "config", "--get", "remote.origin.url" ]
-    var env = gitEnv()
-
-    // check for git
-    which(git, function (err) {
-      if (err) {
-        err.code = "ENOGIT"
-        return cb(err)
-      }
-      exec(git, args, {cwd: p, env: env}, function (er, stdout, stderr) {
-        stdoutTrimmed = (stdout + "\n" + stderr).trim()
-        if (er || u !== stdout.trim()) {
-          log.warn( "`git config --get remote.origin.url` returned "
-                  + "wrong result ("+u+")", stdoutTrimmed )
-          return rm(p, function (er){
-            if (er) return cb(er)
-            cloneGitRemote(p, u, co, origUrl, silent, cb)
-          })
-        }
-        log.verbose("git remote.origin.url", stdoutTrimmed)
-        archiveGitRemote(p, u, co, origUrl, cb)
-      })
-    })
-  })
-}
-
-function cloneGitRemote (p, u, co, origUrl, silent, cb) {
-  mkdir(p, function (er) {
-    if (er) return cb(er)
-
-    var git = npm.config.get("git")
-    var args = [ "clone", "--mirror", u, p ]
-    var env = gitEnv()
-
-    // check for git
-    which(git, function (err) {
-      if (err) {
-        err.code = "ENOGIT"
-        return cb(err)
-      }
-      exec(git, args, {cwd: p, env: env}, function (er, stdout, stderr) {
-        stdout = (stdout + "\n" + stderr).trim()
-        if (er) {
-          if (silent) {
-            log.verbose("git clone " + u, stdout)
-          } else {
-            log.error("git clone " + u, stdout)
-          }
-          return cb(er)
-        }
-        log.verbose("git clone " + u, stdout)
-        archiveGitRemote(p, u, co, origUrl, cb)
-      })
-    })
-  })
-}
-
-function archiveGitRemote (p, u, co, origUrl, cb) {
-  var git = npm.config.get("git")
-  var archive = [ "fetch", "-a", "origin" ]
-  var resolve = [ "rev-list", "-n1", co ]
-  var env = gitEnv()
-
-  var errState = null
-  var n = 0
-  var resolved = null
-  var tmp
-
-  exec(git, archive, {cwd: p, env: env}, function (er, stdout, stderr) {
-    stdout = (stdout + "\n" + stderr).trim()
-    if (er) {
-      log.error("git fetch -a origin ("+u+")", stdout)
-      return cb(er)
-    }
-    log.verbose("git fetch -a origin ("+u+")", stdout)
-    tmp = path.join(npm.tmp, Date.now()+"-"+Math.random(), "tmp.tgz")
-    resolveHead()
-  })
-
-  function resolveHead () {
-    exec(git, resolve, {cwd: p, env: env}, function (er, stdout, stderr) {
-      stdout = (stdout + "\n" + stderr).trim()
-      if (er) {
-        log.error("Failed resolving git HEAD (" + u + ")", stderr)
-        return cb(er)
-      }
-      log.verbose("git rev-list -n1 " + co, stdout)
-      var parsed = url.parse(origUrl)
-      parsed.hash = stdout
-      resolved = url.format(parsed)
-
-      // https://github.com/isaacs/npm/issues/3224
-      // node incorrectly sticks a / at the start of the path
-      // We know that the host won't change, so split and detect this
-      var spo = origUrl.split(parsed.host)
-      var spr = resolved.split(parsed.host)
-      if (spo[1].charAt(0) === ':' && spr[1].charAt(0) === '/')
-        spr[1] = spr[1].slice(1)
-      resolved = spr.join(parsed.host)
-
-      log.verbose('resolved git url', resolved)
-      next()
-    })
-  }
-
-  function next () {
-    mkdir(path.dirname(tmp), function (er) {
-      if (er) return cb(er)
-      var gzip = zlib.createGzip({ level: 9 })
-      var git = npm.config.get("git")
-      var args = ["archive", co, "--format=tar", "--prefix=package/"]
-      var out = fs.createWriteStream(tmp)
-      var env = gitEnv()
-      cb = once(cb)
-      var cp = spawn(git, args, { env: env, cwd: p })
-      cp.on("error", cb)
-      cp.stderr.on("data", function(chunk) {
-        log.silly(chunk.toString(), "git archive")
-      })
-
-      cp.stdout.pipe(gzip).pipe(out).on("close", function() {
-        addLocalTarball(tmp, function(er, data) {
-          if (data) data._resolved = resolved
-          cb(er, data)
-        })
-      })
-    })
-  }
-}
-
-var gitEnv_
-function gitEnv () {
-  // git responds to env vars in some weird ways in post-receive hooks
-  // so don't carry those along.
-  if (gitEnv_) return gitEnv_
-  gitEnv_ = {}
-  for (var k in process.env) {
-    if (!~['GIT_PROXY_COMMAND','GIT_SSH'].indexOf(k) && k.match(/^GIT/)) continue
-    gitEnv_[k] = process.env[k]
-  }
-  return gitEnv_
-}
-
-
-// only have one request in flight for a given
-// name@blah thing.
-var inFlightNames = {}
-function addNamed (name, x, data, cb_) {
-  if (typeof cb_ !== "function") cb_ = data, data = null
-  log.verbose("addNamed", [name, x])
-
-  var k = name + "@" + x
-  if (!inFlightNames[k]) inFlightNames[k] = []
-  var iF = inFlightNames[k]
-  iF.push(cb_)
-  if (iF.length > 1) return
-
-  function cb (er, data) {
-    if (data && !data._fromGithub) data._from = k
-    unlock(k, function () {
-      var c
-      while (c = iF.shift()) c(er, data)
-      delete inFlightNames[k]
-    })
-  }
-
-  log.verbose("addNamed", [semver.valid(x), semver.validRange(x)])
-  lock(k, function (er, fd) {
-    if (er) return cb(er)
-
-    var fn = ( semver.valid(x, true) ? addNameVersion
-             : semver.validRange(x, true) ? addNameRange
-             : addNameTag
-             )
-    fn(name, x, data, cb)
-  })
-}
-
-function addNameTag (name, tag, data, cb_) {
-  if (typeof cb_ !== "function") cb_ = data, data = null
-  log.info("addNameTag", [name, tag])
-  var explicit = true
-  if (!tag) {
-    explicit = false
-    tag = npm.config.get("tag")
-  }
-
-  function cb(er, data) {
-    // might be username/project
-    // in that case, try it as a github url.
-    if (er && tag.split("/").length === 2) {
-      return maybeGithub(tag, name, er, cb_)
-    }
-    return cb_(er, data)
-  }
-
-  registry.get(name, function (er, data, json, response) {
-    if (er) return cb(er)
-    engineFilter(data)
-    if (data["dist-tags"] && data["dist-tags"][tag]
-        && data.versions[data["dist-tags"][tag]]) {
-      var ver = data["dist-tags"][tag]
-      return addNamed(name, ver, data.versions[ver], cb)
-    }
-    if (!explicit && Object.keys(data.versions).length) {
-      return addNamed(name, "*", data, cb)
-    }
-
-    er = installTargetsError(tag, data)
-    return cb(er)
-  })
-}
-
-
-function engineFilter (data) {
-  var npmv = npm.version
-    , nodev = npm.config.get("node-version")
-    , strict = npm.config.get("engine-strict")
-
-  if (!nodev || npm.config.get("force")) return data
-
-  Object.keys(data.versions || {}).forEach(function (v) {
-    var eng = data.versions[v].engines
-    if (!eng) return
-    if (!strict && !data.versions[v].engineStrict) return
-    if (eng.node && !semver.satisfies(nodev, eng.node, true)
-        || eng.npm && !semver.satisfies(npmv, eng.npm, true)) {
-      delete data.versions[v]
-    }
-  })
-}
-
-function addNameRange (name, range, data, cb) {
-  if (typeof cb !== "function") cb = data, data = null
-
-  range = semver.validRange(range, true)
-  if (range === null) return cb(new Error(
-    "Invalid version range: "+range))
-
-  log.silly("addNameRange", {name:name, range:range, hasData:!!data})
-
-  if (data) return next()
-  registry.get(name, function (er, d, json, response) {
-    if (er) return cb(er)
-    data = d
-    next()
-  })
-
-  function next () {
-    log.silly( "addNameRange", "number 2"
-             , {name:name, range:range, hasData:!!data})
-    engineFilter(data)
-
-    log.silly("addNameRange", "versions"
-             , [data.name, Object.keys(data.versions || {})])
-
-    // if the tagged version satisfies, then use that.
-    var tagged = data["dist-tags"][npm.config.get("tag")]
-    if (tagged
-        && data.versions[tagged]
-        && semver.satisfies(tagged, range, true)) {
-      return addNamed(name, tagged, data.versions[tagged], cb)
-    }
-
-    // find the max satisfying version.
-    var versions = Object.keys(data.versions || {})
-    var ms = semver.maxSatisfying(versions, range, true)
-    if (!ms) {
-      return cb(installTargetsError(range, data))
-    }
-
-    // if we don't have a registry connection, try to see if
-    // there's a cached copy that will be ok.
-    addNamed(name, ms, data.versions[ms], cb)
-  }
-}
-
-function installTargetsError (requested, data) {
-  var targets = Object.keys(data["dist-tags"]).filter(function (f) {
-    return (data.versions || {}).hasOwnProperty(f)
-  }).concat(Object.keys(data.versions || {}))
-
-  requested = data.name + (requested ? "@'" + requested + "'" : "")
-
-  targets = targets.length
-          ? "Valid install targets:\n" + JSON.stringify(targets) + "\n"
-          : "No valid targets found.\n"
-          + "Perhaps not compatible with your version of node?"
-
-  var er = new Error( "No compatible version found: "
-                  + requested + "\n" + targets)
-  er.code = "ETARGET"
-  return er
-}
-
-function addNameVersion (name, v, data, cb) {
-  if (typeof cb !== "function") cb = data, data = null
-
-  var ver = semver.valid(v, true)
-  if (!ver) return cb(new Error("Invalid version: "+v))
-
-  var response
-
-  if (data) {
-    response = null
-    return next()
-  }
-  registry.get(name + "/" + ver, function (er, d, json, resp) {
-    if (er) return cb(er)
-    data = d
-    response = resp
-    next()
-  })
-
-  function next () {
-    deprCheck(data)
-    var dist = data.dist
-
-    if (!dist) return cb(new Error("No dist in "+data._id+" package"))
-
-    if (!dist.tarball) return cb(new Error(
-      "No dist.tarball in " + data._id + " package"))
-
-    if ((response && response.statusCode !== 304) || npm.config.get("force")) {
-      return fetchit()
-    }
-
-    // we got cached data, so let's see if we have a tarball.
-    var pkgroot = path.join(npm.cache, name, ver)
-    var pkgtgz = path.join(pkgroot, "package.tgz")
-    var pkgjson = path.join(pkgroot, "package", "package.json")
-    fs.stat(pkgtgz, function (er, s) {
-      if (!er) {
-        readJson(pkgjson, function (er, data) {
-          er = needName(er, data)
-          er = needVersion(er, data)
-          if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR")
-            return cb(er)
-          if (er) return fetchit()
-          return cb(null, data)
-        })
-      } else return fetchit()
-    })
-
-    function fetchit () {
-      if (!npm.config.get("registry")) {
-        return cb(new Error("Cannot fetch: "+dist.tarball))
-      }
-
-      // use the same protocol as the registry.
-      // https registry --> https tarballs, but
-      // only if they're the same hostname, or else
-      // detached tarballs may not work.
-      var tb = url.parse(dist.tarball)
-      var rp = url.parse(npm.config.get("registry"))
-      if (tb.hostname === rp.hostname
-          && tb.protocol !== rp.protocol) {
-        tb.protocol = url.parse(npm.config.get("registry")).protocol
-        delete tb.href
-      }
-      tb = url.format(tb)
-
-      // only add non-shasum'ed packages if --forced.
-      // only ancient things would lack this for good reasons nowadays.
-      if (!dist.shasum && !npm.config.get("force")) {
-        return cb(new Error("package lacks shasum: " + data._id))
-      }
-      return addRemoteTarball( tb
-                             , dist.shasum
-                             , name+"-"+ver
-                             , cb )
-    }
-  }
-}
-
-function addLocal (p, name, cb_) {
-  if (typeof cb_ !== "function") cb_ = name, name = ""
-
-  function cb (er, data) {
-    unlock(p, function () {
-      if (er) {
-        // if it doesn't have a / in it, it might be a
-        // remote thing.
-        if (p.indexOf("/") === -1 && p.charAt(0) !== "."
-           && (process.platform !== "win32" || p.indexOf("\\") === -1)) {
-          return addNamed(p, "", cb_)
-        }
-        log.error("addLocal", "Could not install %s", p)
-        return cb_(er)
-      }
-      if (data && !data._fromGithub) data._from = p
-      return cb_(er, data)
-    })
-  }
-
-  lock(p, function (er) {
-    if (er) return cb(er)
-    // figure out if this is a folder or file.
-    fs.stat(p, function (er, s) {
-      if (er) {
-        // might be username/project
-        // in that case, try it as a github url.
-        if (p.split("/").length === 2) {
-          return maybeGithub(p, name, er, cb)
-        }
-        return cb(er)
-      }
-      if (s.isDirectory()) addLocalDirectory(p, name, cb)
-      else addLocalTarball(p, name, cb)
-    })
-  })
-}
-
-function maybeGithub (p, name, er, cb) {
-  var u = "git://github.com/" + p
-    , up = url.parse(u)
-  log.info("maybeGithub", "Attempting %s from %s", p, u)
-
-  return addRemoteGit(u, up, name, true, function (er2, data) {
-    if (er2) {
-      var upriv = "git+ssh://git@github.com:" + p
-        , uppriv = url.parse(upriv)
-
-      log.info("maybeGithub", "Attempting %s from %s", p, upriv)
-
-      return addRemoteGit(upriv, uppriv, false, name, function (er3, data) {
-        if (er3) return cb(er)
-        success(upriv, data)
-      })
-    }
-    success(u, data)
-  })
-
-  function success (u, data) {
-    data._from = u
-    data._fromGithub = true
-    return cb(null, data)
-  }
-}
-
-function addLocalTarball (p, name, shasum, cb_) {
-  if (typeof cb_ !== "function") cb_ = shasum, shasum = null
-  if (typeof cb_ !== "function") cb_ = name, name = ""
-  // if it's a tar, and not in place,
-  // then unzip to .tmp, add the tmp folder, and clean up tmp
-  if (p.indexOf(npm.tmp) === 0)
-    return addTmpTarball(p, name, shasum, cb_)
-
-  if (p.indexOf(npm.cache) === 0) {
-    if (path.basename(p) !== "package.tgz") return cb_(new Error(
-      "Not a valid cache tarball name: "+p))
-    return addPlacedTarball(p, name, shasum, cb_)
-  }
-
-  function cb (er, data) {
-    if (data) data._resolved = p
-    return cb_(er, data)
-  }
-
-  // just copy it over and then add the temp tarball file.
-  var tmp = path.join(npm.tmp, name + Date.now()
-                             + "-" + Math.random(), "tmp.tgz")
-  mkdir(path.dirname(tmp), function (er) {
-    if (er) return cb(er)
-    var from = fs.createReadStream(p)
-      , to = fs.createWriteStream(tmp)
-      , errState = null
-    function errHandler (er) {
-      if (errState) return
-      return cb(errState = er)
-    }
-    from.on("error", errHandler)
-    to.on("error", errHandler)
-    to.on("close", function () {
-      if (errState) return
-      log.verbose("chmod", tmp, npm.modes.file.toString(8))
-      fs.chmod(tmp, npm.modes.file, function (er) {
-        if (er) return cb(er)
-        addTmpTarball(tmp, name, shasum, cb)
-      })
-    })
-    from.pipe(to)
-  })
-}
-
-// to maintain the cache dir's permissions consistently.
-var cacheStat = null
-function getCacheStat (cb) {
-  if (cacheStat) return cb(null, cacheStat)
-  fs.stat(npm.cache, function (er, st) {
-    if (er) return makeCacheDir(cb)
-    if (!st.isDirectory()) {
-      log.error("getCacheStat", "invalid cache dir %j", npm.cache)
-      return cb(er)
-    }
-    return cb(null, cacheStat = st)
-  })
-}
-
-function makeCacheDir (cb) {
-  if (!process.getuid) return mkdir(npm.cache, cb)
-
-  var uid = +process.getuid()
-    , gid = +process.getgid()
-
-  if (uid === 0) {
-    if (process.env.SUDO_UID) uid = +process.env.SUDO_UID
-    if (process.env.SUDO_GID) gid = +process.env.SUDO_GID
-  }
-  if (uid !== 0 || !process.env.HOME) {
-    cacheStat = {uid: uid, gid: gid}
-    return mkdir(npm.cache, afterMkdir)
-  }
-
-  fs.stat(process.env.HOME, function (er, st) {
-    if (er) {
-      log.error("makeCacheDir", "homeless?")
-      return cb(er)
-    }
-    cacheStat = st
-    log.silly("makeCacheDir", "cache dir uid, gid", [st.uid, st.gid])
-    return mkdir(npm.cache, afterMkdir)
-  })
-
-  function afterMkdir (er, made) {
-    if (er || !cacheStat || isNaN(cacheStat.uid) || isNaN(cacheStat.gid)) {
-      return cb(er, cacheStat)
-    }
-
-    if (!made) return cb(er, cacheStat)
-
-    // ensure that the ownership is correct.
-    chownr(made, cacheStat.uid, cacheStat.gid, function (er) {
-      return cb(er, cacheStat)
-    })
-  }
-}
-
-
-
-
-function addPlacedTarball (p, name, shasum, cb) {
-  if (!cb) cb = name, name = ""
-  getCacheStat(function (er, cs) {
-    if (er) return cb(er)
-    return addPlacedTarball_(p, name, cs.uid, cs.gid, shasum, cb)
-  })
-}
-
-// Resolved sum is the shasum from the registry dist object, but
-// *not* necessarily the shasum of this tarball, because for stupid
-// historical reasons, npm re-packs each package an extra time through
-// a temp directory, so all installed packages are actually built with
-// *this* version of npm, on this machine.
-//
-// Once upon a time, this meant that we could change package formats
-// around and fix junk that might be added by incompatible tar
-// implementations.  Then, for a while, it was a way to correct bs
-// added by bugs in our own tar implementation.  Now, it's just
-// garbage, but cleaning it up is a pain, and likely to cause issues
-// if anything is overlooked, so it's not high priority.
-//
-// If you're bored, and looking to make npm go faster, and you've
-// already made it this far in this file, here's a better methodology:
-//
-// cache.add should really be cache.place.  That is, it should take
-// a set of arguments like it does now, but then also a destination
-// folder.
-//
-// cache.add('foo@bar', '/path/node_modules/foo', cb)
-//
-// 1. Resolve 'foo@bar' to some specific:
-//   - git url
-//   - local folder
-//   - local tarball
-//   - tarball url
-// 2. If resolved through the registry, then pick up the dist.shasum
-// along the way.
-// 3. Acquire request() stream fetching bytes: FETCH
-// 4. FETCH.pipe(tar unpack stream to dest)
-// 5. FETCH.pipe(shasum generator)
-// When the tar and shasum streams both finish, make sure that the
-// shasum matches dist.shasum, and if not, clean up and bail.
-//
-// publish(cb)
-//
-// 1. read package.json
-// 2. get root package object (for rev, and versions)
-// 3. update root package doc with version info
-// 4. remove _attachments object
-// 5. remove versions object
-// 5. jsonify, remove last }
-// 6. get stream: registry.put(/package)
-// 7. write trailing-}-less JSON
-// 8. write "_attachments":
-// 9. JSON.stringify(attachments), remove trailing }
-// 10. Write start of attachments (stubs)
-// 11. JSON(filename)+':{"type":"application/octet-stream","data":"'
-// 12. acquire tar packing stream, PACK
-// 13. PACK.pipe(PUT)
-// 14. PACK.pipe(shasum generator)
-// 15. when PACK finishes, get shasum
-// 16. PUT.write('"}},') (finish _attachments
-// 17. update "versions" object with current package version
-// (including dist.shasum and dist.tarball)
-// 18. write '"versions":' + JSON(versions)
-// 19. write '}}' (versions, close main doc)
-
-function addPlacedTarball_ (p, name, uid, gid, resolvedSum, cb) {
-  // now we know it's in place already as .cache/name/ver/package.tgz
-  // unpack to .cache/name/ver/package/, read the package.json,
-  // and fire cb with the json data.
-  var target = path.dirname(p)
-    , folder = path.join(target, "package")
-
-  lock(folder, function (er) {
-    if (er) return cb(er)
-    rmUnpack()
-  })
-
-  function rmUnpack () {
-    rm(folder, function (er) {
-      unlock(folder, function () {
-        if (er) {
-          log.error("addPlacedTarball", "Could not remove %j", folder)
-          return cb(er)
-        }
-        thenUnpack()
-      })
-    })
-  }
-
-  function thenUnpack () {
-    tar.unpack(p, folder, null, null, uid, gid, function (er) {
-      if (er) {
-        log.error("addPlacedTarball", "Could not unpack %j to %j", p, target)
-        return cb(er)
-      }
-      // calculate the sha of the file that we just unpacked.
-      // this is so that the data is available when publishing.
-      sha.get(p, function (er, shasum) {
-        if (er) {
-          log.error("addPlacedTarball", "shasum fail", p)
-          return cb(er)
-        }
-        readJson(path.join(folder, "package.json"), function (er, data) {
-          er = needName(er, data)
-          er = needVersion(er, data)
-          if (er) {
-            log.error("addPlacedTarball", "Couldn't read json in %j"
-                     , folder)
-            return cb(er)
-          }
-
-          data.dist = data.dist || {}
-          data.dist.shasum = shasum
-          deprCheck(data)
-          asyncMap([p], function (f, cb) {
-            log.verbose("chmod", f, npm.modes.file.toString(8))
-            fs.chmod(f, npm.modes.file, cb)
-          }, function (f, cb) {
-            if (process.platform === "win32") {
-              log.silly("chown", "skipping for windows", f)
-              cb()
-            } else if (typeof uid === "number"
-                && typeof gid === "number"
-                && parseInt(uid, 10) === uid
-                && parseInt(gid, 10) === gid) {
-              log.verbose("chown", f, [uid, gid])
-              fs.chown(f, uid, gid, cb)
-            } else {
-              log.verbose("chown", "skip for invalid uid/gid", [f, uid, gid])
-              cb()
-            }
-          }, function (er) {
-            cb(er, data)
-          })
-        })
-      })
-    })
-  }
-}
-
-// At this point, if shasum is set, it's something that we've already
-// read and checked.  Just stashing it in the data at this point.
-function addLocalDirectory (p, name, shasum, cb) {
-  if (typeof cb !== "function") cb = shasum, shasum = ""
-  if (typeof cb !== "function") cb = name, name = ""
-  // if it's a folder, then read the package.json,
-  // tar it to the proper place, and add the cache tar
-  if (p.indexOf(npm.cache) === 0) return cb(new Error(
-    "Adding a cache directory to the cache will make the world implode."))
-  readJson(path.join(p, "package.json"), false, function (er, data) {
-    er = needName(er, data)
-    er = needVersion(er, data)
-    if (er) return cb(er)
-    deprCheck(data)
-    var random = Date.now() + "-" + Math.random()
-      , tmp = path.join(npm.tmp, random)
-      , tmptgz = path.resolve(tmp, "tmp.tgz")
-      , placed = path.resolve( npm.cache, data.name
-                             , data.version, "package.tgz" )
-      , placeDirect = path.basename(p) === "package"
-      , tgz = placeDirect ? placed : tmptgz
-    getCacheStat(function (er, cs) {
-      mkdir(path.dirname(tgz), function (er, made) {
-        if (er) return cb(er)
-
-        var fancy = p.indexOf(npm.tmp) !== 0
-                    && p.indexOf(npm.cache) !== 0
-        tar.pack(tgz, p, data, fancy, function (er) {
-          if (er) {
-            log.error( "addLocalDirectory", "Could not pack %j to %j"
-                     , p, tgz )
-            return cb(er)
-          }
-
-          // if we don't get a cache stat, or if the gid/uid is not
-          // a number, then just move on.  chown would fail anyway.
-          if (!cs || isNaN(cs.uid) || isNaN(cs.gid)) return cb()
-
-          chownr(made || tgz, cs.uid, cs.gid, function (er) {
-            if (er) return cb(er)
-            addLocalTarball(tgz, name, shasum, cb)
-          })
-        })
-      })
-    })
-  })
-}
-
-function addTmpTarball (tgz, name, shasum, cb) {
-  if (!cb) cb = name, name = ""
-  getCacheStat(function (er, cs) {
-    if (er) return cb(er)
-    var contents = path.dirname(tgz)
-    tar.unpack( tgz, path.resolve(contents, "package")
-              , null, null
-              , cs.uid, cs.gid
-              , function (er) {
-      if (er) {
-        return cb(er)
-      }
-      addLocalDirectory(path.resolve(contents, "package"), name, shasum, cb)
-    })
-  })
-}
-
-function unpack (pkg, ver, unpackTarget, dMode, fMode, uid, gid, cb) {
-  if (typeof cb !== "function") cb = gid, gid = null
-  if (typeof cb !== "function") cb = uid, uid = null
-  if (typeof cb !== "function") cb = fMode, fMode = null
-  if (typeof cb !== "function") cb = dMode, dMode = null
-
-  read(pkg, ver, false, function (er, data) {
-    if (er) {
-      log.error("unpack", "Could not read data for %s", pkg + "@" + ver)
-      return cb(er)
-    }
-    npm.commands.unbuild([unpackTarget], true, function (er) {
-      if (er) return cb(er)
-      tar.unpack( path.join(npm.cache, pkg, ver, "package.tgz")
-                , unpackTarget
-                , dMode, fMode
-                , uid, gid
-                , cb )
-    })
-  })
-}
-
-var deprecated = {}
-  , deprWarned = {}
-function deprCheck (data) {
-  if (deprecated[data._id]) data.deprecated = deprecated[data._id]
-  if (data.deprecated) deprecated[data._id] = data.deprecated
-  else return
-  if (!deprWarned[data._id]) {
-    deprWarned[data._id] = true
-    log.warn("deprecated", "%s: %s", data._id, data.deprecated)
-  }
-}
-
-function lockFileName (u) {
-  var c = u.replace(/[^a-zA-Z0-9]+/g, "-").replace(/^-+|-+$/g, "")
-    , h = crypto.createHash("sha1").update(u).digest("hex")
-  h = h.substr(0, 8)
-  c = c.substr(-32)
-  log.silly("lockFile", h + "-" + c, u)
-  return path.resolve(npm.config.get("cache"), h + "-" + c + ".lock")
-}
-
-var myLocks = {}
-function lock (u, cb) {
-  // the cache dir needs to exist already for this.
-  getCacheStat(function (er, cs) {
-    if (er) return cb(er)
-    var opts = { stale: npm.config.get("cache-lock-stale")
-               , retries: npm.config.get("cache-lock-retries")
-               , wait: npm.config.get("cache-lock-wait") }
-    var lf = lockFileName(u)
-    log.verbose("lock", u, lf)
-    lockFile.lock(lf, opts, function(er) {
-      if (!er) myLocks[lf] = true
-      cb(er)
-    })
-  })
-}
-
-function unlock (u, cb) {
-  var lf = lockFileName(u)
-  if (!myLocks[lf]) return process.nextTick(cb)
-  myLocks[lf] = false
-  lockFile.unlock(lockFileName(u), cb)
-}
-
-function needName(er, data) {
-  return er ? er
-       : (data && !data.name) ? new Error("No name provided")
-       : null
-}
-
-function needVersion(er, data) {
-  return er ? er
-       : (data && !data.version) ? new Error("No version provided")
-       : null
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/completion.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,253 +0,0 @@
-
-module.exports = completion
-
-completion.usage = "npm completion >> ~/.bashrc\n"
-                 + "npm completion >> ~/.zshrc\n"
-                 + "source <(npm completion)"
-
-var npm = require("./npm.js")
-  , npmconf = require("npmconf")
-  , configDefs = npmconf.defs
-  , configTypes = configDefs.types
-  , shorthands = configDefs.shorthands
-  , nopt = require("nopt")
-  , configNames = Object.keys(configTypes).filter(function (e) {
-      return e.charAt(0) !== "_"
-    })
-  , shorthandNames = Object.keys(shorthands)
-  , allConfs = configNames.concat(shorthandNames)
-  , once = require("once")
-
-
-completion.completion = function (opts, cb) {
-  if (opts.w > 3) return cb()
-
-  var fs = require("graceful-fs")
-    , path = require("path")
-    , bashExists = null
-    , zshExists = null
-    , bashProfExists = null
-  fs.stat(path.resolve(process.env.HOME, ".bashrc"), function (er, b) {
-    bashExists = !er
-    next()
-  })
-  fs.stat(path.resolve(process.env.HOME, ".zshrc"), function (er, b) {
-    zshExists = !er
-    next()
-  })
-  function next () {
-    if (zshExists === null || bashExists === null) return
-    var out = []
-    if (zshExists) out.push("~/.zshrc")
-    if (bashExists) out.push("~/.bashrc")
-    if (opts.w === 2) out = out.map(function (m) {
-      return [">>", m]
-    })
-    cb(null, out)
-  }
-}
-
-function completion (args, cb) {
-  if (process.platform === "win32") {
-    var e = new Error("npm completion not supported on windows")
-    e.code = "ENOTSUP"
-    e.errno = require("constants").ENOTSUP
-    return cb(e)
-  }
-
-  // if the COMP_* isn't in the env, then just dump the script.
-  if (process.env.COMP_CWORD === undefined
-    ||process.env.COMP_LINE === undefined
-    ||process.env.COMP_POINT === undefined
-    ) return dumpScript(cb)
-
-  console.error(process.env.COMP_CWORD)
-  console.error(process.env.COMP_LINE)
-  console.error(process.env.COMP_POINT)
-
-  //console.log("abracadabrasauce\nabracad cat monger")
-  //if (Math.random() * 3 < 1) console.log("man\\ bear\\ pig")
-  //else if (Math.random() * 3 < 1)
-  //  console.log("porkchop\\ sandwiches\nporkman")
-  //else console.log("encephylophagy")
-
-  // get the partial line and partial word,
-  // if the point isn't at the end.
-  // ie, tabbing at: npm foo b|ar
-  var w = +process.env.COMP_CWORD
-    , words = args.map(unescape)
-    , word = words[w]
-    , line = process.env.COMP_LINE
-    , point = +process.env.COMP_POINT
-    , lineLength = line.length
-    , partialLine = line.substr(0, point)
-    , partialWords = words.slice(0, w)
-
-  // figure out where in that last word the point is.
-  var partialWord = args[w]
-    , i = partialWord.length
-  while (partialWord.substr(0, i) !== partialLine.substr(-1*i) && i > 0) {
-    i --
-  }
-  partialWord = unescape(partialWord.substr(0, i))
-  partialWords.push(partialWord)
-
-  var opts = { words : words
-             , w : w
-             , word : word
-             , line : line
-             , lineLength : line.length
-             , point : point
-             , partialLine : partialLine
-             , partialWords : partialWords
-             , partialWord : partialWord
-             , raw: args
-             }
-
-  cb = wrapCb(cb, opts)
-
-  console.error(opts)
-
-  if (partialWords.slice(0, -1).indexOf("--") === -1) {
-    if (word.charAt(0) === "-") return configCompl(opts, cb)
-    if (words[w - 1]
-        && words[w - 1].charAt(0) === "-"
-        && !isFlag(words[w - 1])) {
-      // awaiting a value for a non-bool config.
-      // don't even try to do this for now
-      console.error("configValueCompl")
-      return configValueCompl(opts, cb)
-    }
-  }
-
-  // try to find the npm command.
-  // it's the first thing after all the configs.
-  // take a little shortcut and use npm's arg parsing logic.
-  // don't have to worry about the last arg being implicitly
-  // boolean'ed, since the last block will catch that.
-  var parsed = opts.conf =
-    nopt(configTypes, shorthands, partialWords.slice(0, -1), 0)
-  // check if there's a command already.
-  console.error(parsed)
-  var cmd = parsed.argv.remain[1]
-  if (!cmd) return cmdCompl(opts, cb)
-
-  Object.keys(parsed).forEach(function (k) {
-    npm.config.set(k, parsed[k])
-  })
-
-  // at this point, if words[1] is some kind of npm command,
-  // then complete on it.
-  // otherwise, do nothing
-  cmd = npm.commands[cmd]
-  if (cmd && cmd.completion) return cmd.completion(opts, cb)
-
-  // nothing to do.
-  cb()
-}
-
-function dumpScript (cb) {
-  var fs = require("graceful-fs")
-    , path = require("path")
-    , p = path.resolve(__dirname, "utils/completion.sh")
-
-  // The Darwin patch below results in callbacks first for the write and then
-  // for the error handler, so make sure we only call our callback once.
-  cb = once(cb)
-
-  fs.readFile(p, "utf8", function (er, d) {
-    if (er) return cb(er)
-    d = d.replace(/^\#\!.*?\n/, "")
-
-    process.stdout.write(d, function (n) { cb() })
-    process.stdout.on("error", function (er) {
-      // Darwin is a real dick sometimes.
-      //
-      // This is necessary because the "source" or "." program in
-      // bash on OS X closes its file argument before reading
-      // from it, meaning that you get exactly 1 write, which will
-      // work most of the time, and will always raise an EPIPE.
-      //
-      // Really, one should not be tossing away EPIPE errors, or any
-      // errors, so casually.  But, without this, `. <(npm completion)`
-      // can never ever work on OS X.
-      if (er.errno === "EPIPE") er = null
-      cb(er)
-    })
-
-  })
-}
-
-function unescape (w) {
-  if (w.charAt(0) === "\"") return w.replace(/^"|"$/g, "")
-  else return w.replace(/\\ /g, " ")
-}
-
-function escape (w) {
-  if (!w.match(/\s+/)) return w
-  return "\"" + w + "\""
-}
-
-// The command should respond with an array.  Loop over that,
-// wrapping quotes around any that have spaces, and writing
-// them to stdout.  Use console.log, not the outfd config.
-// If any of the items are arrays, then join them with a space.
-// Ie, returning ["a", "b c", ["d", "e"]] would allow it to expand
-// to: "a", "b c", or "d" "e"
-function wrapCb (cb, opts) { return function (er, compls) {
-  if (!Array.isArray(compls)) compls = compls ? [compls] : []
-  compls = compls.map(function (c) {
-    if (Array.isArray(c)) c = c.map(escape).join(" ")
-    else c = escape(c)
-    return c
-  })
-  if (opts.partialWord) compls = compls.filter(function (c) {
-    return c.indexOf(opts.partialWord) === 0
-  })
-  console.error([er && er.stack, compls, opts.partialWord])
-  if (er || compls.length === 0) return cb(er)
-
-  console.log(compls.join("\n"))
-  cb()
-}}
-
-// the current word has a dash.  Return the config names,
-// with the same number of dashes as the current word has.
-function configCompl (opts, cb) {
-  var word = opts.word
-    , split = word.match(/^(-+)((?:no-)*)(.*)$/)
-    , dashes = split[1]
-    , no = split[2]
-    , conf = split[3]
-    , flags = configNames.filter(isFlag)
-  console.error(flags)
-
-  return cb(null, allConfs.map(function (c) {
-    return dashes + c
-  }).concat(flags.map(function (f) {
-    return dashes + (no || "no-") + f
-  })))
-}
-
-// expand with the valid values of various config values.
-// not yet implemented.
-function configValueCompl (opts, cb) {
-  console.error('configValue', opts)
-  return cb(null, [])
-}
-
-// check if the thing is a flag or not.
-function isFlag (word) {
-  // shorthands never take args.
-  var split = word.match(/^(-*)((?:no-)+)?(.*)$/)
-    , dashes = split[1]
-    , no = split[2]
-    , conf = split[3]
-  return no || configTypes[conf] === Boolean || shorthands[conf]
-}
-
-// complete against the npm commands
-function cmdCompl (opts, cb) {
-  return cb(null, npm.fullList)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/config.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,285 +0,0 @@
-
-module.exports = config
-
-config.usage = "npm config set <key> <value>"
-             + "\nnpm config get [<key>]"
-             + "\nnpm config delete <key>"
-             + "\nnpm config list"
-             + "\nnpm config edit"
-             + "\nnpm set <key> <value>"
-             + "\nnpm get [<key>]"
-
-var log = require("npmlog")
-  , npm = require("./npm.js")
-  , spawn = require("child_process").spawn
-  , fs = require("graceful-fs")
-  , npmconf = require("npmconf")
-  , types = npmconf.defs.types
-  , ini = require("ini")
-  , editor = require("editor")
-  , os = require("os")
-
-config.completion = function (opts, cb) {
-  var argv = opts.conf.argv.remain
-  if (argv[1] !== "config") argv.unshift("config")
-  if (argv.length === 2) {
-    var cmds = ["get", "set", "delete", "ls", "rm", "edit"]
-    if (opts.partialWord !== "l") cmds.push("list")
-    return cb(null, cmds)
-  }
-
-  var action = argv[2]
-  switch (action) {
-    case "set":
-      // todo: complete with valid values, if possible.
-      if (argv.length > 3) return cb(null, [])
-      // fallthrough
-    case "get":
-    case "delete":
-    case "rm":
-      return cb(null, Object.keys(types))
-    case "edit":
-    case "list": case "ls":
-      return cb(null, [])
-    default: return cb(null, [])
-  }
-}
-
-// npm config set key value
-// npm config get key
-// npm config list
-function config (args, cb) {
-  var action = args.shift()
-  switch (action) {
-    case "set": return set(args[0], args[1], cb)
-    case "get": return get(args[0], cb)
-    case "delete": case "rm": case "del": return del(args[0], cb)
-    case "list": case "ls": return list(cb)
-    case "edit": return edit(cb)
-    default: return unknown(action, cb)
-  }
-}
-
-function edit (cb) {
-  var e = npm.config.get("editor")
-    , which = npm.config.get("global") ? "global" : "user"
-    , f = npm.config.get(which + "config")
-  if (!e) return cb(new Error("No EDITOR config or environ set."))
-  npm.config.save(which, function (er) {
-    if (er) return cb(er)
-    fs.readFile(f, "utf8", function (er, data) {
-      if (er) data = ""
-      data = [ ";;;;"
-             , "; npm "+(npm.config.get("global") ?
-                         "globalconfig" : "userconfig")+" file"
-             , "; this is a simple ini-formatted file"
-             , "; lines that start with semi-colons are comments."
-             , "; read `npm help config` for help on the various options"
-             , ";;;;"
-             , ""
-             , data
-             ].concat( [ ";;;;"
-                       , "; all options with default values"
-                       , ";;;;"
-                       ]
-                     )
-              .concat(Object.keys(npmconf.defaults).reduce(function (arr, key) {
-                var obj = {};
-                obj[key] = npmconf.defaults[key]
-                if (key === "logstream") return arr
-                return arr.concat(
-                  ini.stringify(obj)
-                    .replace(/\n$/m, '')
-                    .replace(/^/g, '; ')
-                    .replace(/\n/g, '\n; ')
-                    .split('\n'))
-              }, []))
-              .concat([""])
-              .join(os.EOL)
-      fs.writeFile
-        ( f
-        , data
-        , "utf8"
-        , function (er) {
-            if (er) return cb(er)
-            editor(f, { editor: e }, cb)
-          }
-        )
-    })
-  })
-}
-
-function del (key, cb) {
-  if (!key) return cb(new Error("no key provided"))
-  var where = npm.config.get("global") ? "global" : "user"
-  npm.config.del(key, where)
-  npm.config.save(where, cb)
-}
-
-function set (key, val, cb) {
-  if (key === undefined) {
-    return unknown("", cb)
-  }
-  if (val === undefined) {
-    if (key.indexOf("=") !== -1) {
-      var k = key.split("=")
-      key = k.shift()
-      val = k.join("=")
-    } else {
-      val = ""
-    }
-  }
-  key = key.trim()
-  val = val.trim()
-  log.info("config", "set %j %j", key, val)
-  var where = npm.config.get("global") ? "global" : "user"
-  npm.config.set(key, val, where)
-  npm.config.save(where, cb)
-}
-
-function get (key, cb) {
-  if (!key) return list(cb)
-  if (key.charAt(0) === "_") {
-    return cb(new Error("---sekretz---"))
-  }
-  console.log(npm.config.get(key))
-  cb()
-}
-
-function sort (a, b) {
-  return a > b ? 1 : -1
-}
-
-function reverse (a, b) {
-  return a > b ? -1 : 1
-}
-
-function public (k) {
-  return !(k.charAt(0) === "_" || types[k] !== types[k])
-}
-
-function getKeys (data) {
-  return Object.keys(data).filter(public).sort(sort)
-}
-
-function list (cb) {
-  var msg = ""
-    , long = npm.config.get("long")
-
-  var cli = npm.config.sources.cli.data
-    , cliKeys = getKeys(cli)
-  if (cliKeys.length) {
-    msg += "; cli configs\n"
-    cliKeys.forEach(function (k) {
-      if (cli[k] && typeof cli[k] === "object") return
-      if (k === "argv") return
-      msg += k + " = " + JSON.stringify(cli[k]) + "\n"
-    })
-    msg += "\n"
-  }
-
-  // env configs
-  var env = npm.config.sources.env.data
-    , envKeys = getKeys(env)
-  if (envKeys.length) {
-    msg += "; environment configs\n"
-    envKeys.forEach(function (k) {
-      if (env[k] !== npm.config.get(k)) {
-        if (!long) return
-        msg += "; " + k + " = " + JSON.stringify(env[k])
-            + " (overridden)\n"
-      } else msg += k + " = " + JSON.stringify(env[k]) + "\n"
-    })
-    msg += "\n"
-  }
-
-  // user config file
-  var uconf = npm.config.sources.user.data
-    , uconfKeys = getKeys(uconf)
-  if (uconfKeys.length) {
-    msg += "; userconfig " + npm.config.get("userconfig") + "\n"
-    uconfKeys.forEach(function (k) {
-      var val = (k.charAt(0) === "_")
-              ? "---sekretz---"
-              : JSON.stringify(uconf[k])
-      if (uconf[k] !== npm.config.get(k)) {
-        if (!long) return
-        msg += "; " + k + " = " + val
-            + " (overridden)\n"
-      } else msg += k + " = " + val + "\n"
-    })
-    msg += "\n"
-  }
-
-  // global config file
-  var gconf = npm.config.sources.global.data
-    , gconfKeys = getKeys(gconf)
-  if (gconfKeys.length) {
-    msg += "; globalconfig " + npm.config.get("globalconfig") + "\n"
-    gconfKeys.forEach(function (k) {
-      var val = (k.charAt(0) === "_")
-              ? "---sekretz---"
-              : JSON.stringify(gconf[k])
-      if (gconf[k] !== npm.config.get(k)) {
-        if (!long) return
-        msg += "; " + k + " = " + val
-            + " (overridden)\n"
-      } else msg += k + " = " + val + "\n"
-    })
-    msg += "\n"
-  }
-
-  // builtin config file
-  var builtin = npm.config.sources.builtin || {}
-  if (builtin && builtin.data) {
-    var bconf = builtin.data
-      , bpath = builtin.path
-      , bconfKeys = getKeys(bconf)
-    if (bconfKeys.length) {
-      var path = require("path")
-      msg += "; builtin config " + bpath + "\n"
-      bconfKeys.forEach(function (k) {
-        var val = (k.charAt(0) === "_")
-                ? "---sekretz---"
-                : JSON.stringify(bconf[k])
-        if (bconf[k] !== npm.config.get(k)) {
-          if (!long) return
-          msg += "; " + k + " = " + val
-              + " (overridden)\n"
-        } else msg += k + " = " + val + "\n"
-      })
-      msg += "\n"
-    }
-  }
-
-  // only show defaults if --long
-  if (!long) {
-    msg += "; node bin location = " + process.execPath + "\n"
-         + "; cwd = " + process.cwd() + "\n"
-         + "; HOME = " + process.env.HOME + "\n"
-         + "; 'npm config ls -l' to show all defaults.\n"
-
-    console.log(msg)
-    return cb()
-  }
-
-  var defaults = npmconf.defaults
-    , defKeys = getKeys(defaults)
-  msg += "; default values\n"
-  defKeys.forEach(function (k) {
-    if (defaults[k] && typeof defaults[k] === "object") return
-    var val = JSON.stringify(defaults[k])
-    if (defaults[k] !== npm.config.get(k)) {
-      msg += "; " + k + " = " + val
-          + " (overridden)\n"
-    } else msg += k + " = " + val + "\n"
-  })
-  msg += "\n"
-
-  console.log(msg)
-  return cb()
-}
-
-function unknown (action, cb) {
-  cb("Usage:\n" + config.usage)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/dedupe.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,353 +0,0 @@
-// traverse the node_modules/package.json tree
-// looking for duplicates.  If any duplicates are found,
-// then move them up to the highest level necessary
-// in order to make them no longer duplicated.
-//
-// This is kind of ugly, and really highlights the need for
-// much better "put pkg X at folder Y" abstraction.  Oh well,
-// whatever.  Perfect enemy of the good, and all that.
-
-var fs = require("fs")
-var asyncMap = require("slide").asyncMap
-var path = require("path")
-var readJson = require("read-package-json")
-var archy = require("archy")
-var util = require("util")
-var RegClient = require("npm-registry-client")
-var npmconf = require("npmconf")
-var semver = require("semver")
-var rimraf = require("rimraf")
-var log = require("npmlog")
-var npm = require("./npm.js")
-
-module.exports = dedupe
-
-dedupe.usage = "npm dedupe [pkg pkg...]"
-
-function dedupe (args, silent, cb) {
-  if (typeof silent === "function") cb = silent, silent = false
-  var dryrun = false
-  if (npm.command.match(/^find/)) dryrun = true
-  return dedupe_(npm.prefix, args, {}, dryrun, silent, cb)
-}
-
-function dedupe_ (dir, filter, unavoidable, dryrun, silent, cb) {
-  readInstalled(path.resolve(dir), {}, null, function (er, data, counter) {
-    if (er) {
-      return cb(er)
-    }
-
-    if (!data) {
-      return cb()
-    }
-
-    // find out which things are dupes
-    var dupes = Object.keys(counter || {}).filter(function (k) {
-      if (filter.length && -1 === filter.indexOf(k)) return false
-      return counter[k] > 1 && !unavoidable[k]
-    }).reduce(function (s, k) {
-      s[k] = []
-      return s
-    }, {})
-
-    // any that are unavoidable need to remain as they are.  don't even
-    // try to touch them or figure it out.  Maybe some day, we can do
-    // something a bit more clever here, but for now, just skip over it,
-    // and all its children.
-    ;(function U (obj) {
-      if (unavoidable[obj.name]) {
-        obj.unavoidable = true
-      }
-      if (obj.parent && obj.parent.unavoidable) {
-        obj.unavoidable = true
-      }
-      Object.keys(obj.children).forEach(function (k) {
-        U(obj.children[k])
-      })
-    })(data)
-
-    // then collect them up and figure out who needs them
-    ;(function C (obj) {
-      if (dupes[obj.name] && !obj.unavoidable) {
-        dupes[obj.name].push(obj)
-        obj.duplicate = true
-      }
-      obj.dependents = whoDepends(obj)
-      Object.keys(obj.children).forEach(function (k) {
-        C(obj.children[k])
-      })
-    })(data)
-
-    if (dryrun) {
-      var k = Object.keys(dupes)
-      if (!k.length) return cb()
-      return npm.commands.ls(k, silent, cb)
-    }
-
-    var summary = Object.keys(dupes).map(function (n) {
-      return [n, dupes[n].filter(function (d) {
-        return d && d.parent && !d.parent.duplicate && !d.unavoidable
-      }).map(function M (d) {
-        return [d.path, d.version, d.dependents.map(function (k) {
-          return [k.path, k.version, k.dependencies[d.name] || ""]
-        })]
-      })]
-    }).map(function (item) {
-      var name = item[0]
-      var set = item[1]
-
-      var ranges = set.map(function (i) {
-        return i[2].map(function (d) {
-          return d[2]
-        })
-      }).reduce(function (l, r) {
-        return l.concat(r)
-      }, []).map(function (v, i, set) {
-        if (set.indexOf(v) !== i) return false
-        return v
-      }).filter(function (v) {
-        return v !== false
-      })
-
-      var locs = set.map(function (i) {
-        return i[0]
-      })
-
-      var versions = set.map(function (i) {
-        return i[1]
-      }).filter(function (v, i, set) {
-        return set.indexOf(v) === i
-      })
-
-      var has = set.map(function (i) {
-        return [i[0], i[1]]
-      }).reduce(function (set, kv) {
-        set[kv[0]] = kv[1]
-        return set
-      }, {})
-
-      var loc = locs.length ? locs.reduce(function (a, b) {
-        // a=/path/to/node_modules/foo/node_modules/bar
-        // b=/path/to/node_modules/elk/node_modules/bar
-        // ==/path/to/node_modules/bar
-        a = a.split(/\/node_modules\//)
-        b = b.split(/\/node_modules\//)
-        var name = a.pop()
-        b.pop()
-        // find the longest chain that both A and B share.
-        // then push the name back on it, and join by /node_modules/
-        var res = []
-        for (var i = 0, al = a.length, bl = b.length; i < al && i < bl && a[i] === b[i]; i++);
-        return a.slice(0, i).concat(name).join("/node_modules/")
-      }) : undefined
-
-      return [item[0], { item: item
-                       , ranges: ranges
-                       , locs: locs
-                       , loc: loc
-                       , has: has
-                       , versions: versions
-                       }]
-    }).filter(function (i) {
-      return i[1].loc
-    })
-
-    findVersions(npm, summary, function (er, set) {
-      if (er) return cb(er)
-      if (!set.length) return cb()
-      installAndRetest(set, filter, dir, unavoidable, silent, cb)
-    })
-  })
-}
-
-function installAndRetest (set, filter, dir, unavoidable, silent, cb) {
-  //return cb(null, set)
-  var remove = []
-
-  asyncMap(set, function (item, cb) {
-    // [name, has, loc, locMatch, regMatch, others]
-    var name = item[0]
-    var has = item[1]
-    var where = item[2]
-    var locMatch = item[3]
-    var regMatch = item[4]
-    var others = item[5]
-
-    // nothing to be done here.  oh well.  just a conflict.
-    if (!locMatch && !regMatch) {
-      log.warn("unavoidable conflict", item[0], item[1])
-      log.warn("unavoidable conflict", "Not de-duplicating")
-      unavoidable[item[0]] = true
-      return cb()
-    }
-
-    // nothing to do except to clean up the extraneous deps
-    if (locMatch && has[where] === locMatch) {
-      remove.push.apply(remove, others)
-      return cb()
-    }
-
-    if (regMatch) {
-      var what = name + "@" + regMatch
-      // where is /path/to/node_modules/foo/node_modules/bar
-      // for package "bar", but we need it to be just
-      // /path/to/node_modules/foo
-      where = where.split(/\/node_modules\//)
-      where.pop()
-      where = where.join("/node_modules/")
-      remove.push.apply(remove, others)
-
-      return npm.commands.install(where, what, cb)
-    }
-
-    // hrm?
-    return cb(new Error("danger zone\n" + name + " " +
-                        regMatch + " " + locMatch))
-
-  }, function (er, installed) {
-    if (er) return cb(er)
-    asyncMap(remove, rimraf, function (er) {
-      if (er) return cb(er)
-      remove.forEach(function (r) {
-        log.info("rm", r)
-      })
-      dedupe_(dir, filter, unavoidable, false, silent, cb)
-    })
-  })
-}
-
-function findVersions (npm, summary, cb) {
-  // now, for each item in the summary, try to find the maximum version
-  // that will satisfy all the ranges.  next step is to install it at
-  // the specified location.
-  asyncMap(summary, function (item, cb) {
-    var name = item[0]
-    var data = item[1]
-    var loc = data.loc
-    var locs = data.locs.filter(function (l) {
-      return l !== loc
-    })
-
-    // not actually a dupe, or perhaps all the other copies were
-    // children of a dupe, so this'll maybe be picked up later.
-    if (locs.length === 0) {
-      return cb(null, [])
-    }
-
-    // { <folder>: <version> }
-    var has = data.has
-
-    // the versions that we already have.
-    // if one of these is ok, then prefer to use that.
-    // otherwise, try fetching from the registry.
-    var versions = data.versions
-
-    var ranges = data.ranges
-    npm.registry.get(name, function (er, data) {
-      var regVersions = er ? [] : Object.keys(data.versions)
-      var locMatch = bestMatch(versions, ranges)
-      var regMatch;
-      var tag = npm.config.get("tag");
-      var distTags = data["dist-tags"];
-      if (distTags && distTags[tag] && data.versions[distTags[tag]]) {
-        regMatch = distTags[tag]
-      } else {
-        regMatch = bestMatch(regVersions, ranges)
-      }
-
-      cb(null, [[name, has, loc, locMatch, regMatch, locs]])
-    })
-  }, cb)
-}
-
-function bestMatch (versions, ranges) {
-  return versions.filter(function (v) {
-    return !ranges.some(function (r) {
-      return !semver.satisfies(v, r, true)
-    })
-  }).sort(semver.compareLoose).pop()
-}
-
-
-function readInstalled (dir, counter, parent, cb) {
-  var pkg, children, realpath
-
-  fs.realpath(dir, function (er, rp) {
-    realpath = rp
-    next()
-  })
-
-  readJson(path.resolve(dir, "package.json"), function (er, data) {
-    if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er)
-    if (er) return cb() // not a package, probably.
-    counter[data.name] = counter[data.name] || 0
-    counter[data.name]++
-    pkg =
-      { _id: data._id
-      , name: data.name
-      , version: data.version
-      , dependencies: data.dependencies || {}
-      , optionalDependencies: data.optionalDependencies || {}
-      , devDependencies: data.devDependencies || {}
-      , bundledDependencies: data.bundledDependencies || []
-      , path: dir
-      , realPath: dir
-      , children: {}
-      , parent: parent
-      , family: Object.create(parent ? parent.family : null)
-      , unavoidable: false
-      , duplicate: false
-      }
-    if (parent) {
-      parent.children[data.name] = pkg
-      parent.family[data.name] = pkg
-    }
-    next()
-  })
-
-  fs.readdir(path.resolve(dir, "node_modules"), function (er, c) {
-    children = c || [] // error is ok, just means no children.
-    children = children.filter(function (p) {
-      return !p.match(/^[\._-]/)
-    })
-    next()
-  })
-
-  function next () {
-    if (!children || !pkg || !realpath) return
-
-    // ignore devDependencies.  Just leave them where they are.
-    children = children.filter(function (c) {
-      return !pkg.devDependencies.hasOwnProperty(c)
-    })
-
-    pkg.realPath = realpath
-    if (pkg.realPath !== pkg.path) children = []
-    var d = path.resolve(dir, "node_modules")
-    asyncMap(children, function (child, cb) {
-      readInstalled(path.resolve(d, child), counter, pkg, cb)
-    }, function (er) {
-      cb(er, pkg, counter)
-    })
-  }
-}
-
-function whoDepends (pkg) {
-  var start = pkg.parent || pkg
-  return whoDepends_(pkg, [], start)
-}
-
-function whoDepends_ (pkg, who, test) {
-  if (test !== pkg &&
-      test.dependencies[pkg.name] &&
-      test.family[pkg.name] === pkg) {
-    who.push(test)
-  }
-  Object.keys(test.children).forEach(function (n) {
-    whoDepends_(pkg, who, test.children[n])
-  })
-  return who
-}
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/deprecate.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,47 +0,0 @@
-
-module.exports = deprecate
-
-deprecate.usage = "npm deprecate <pkg>[@<version>] <message>"
-
-deprecate.completion = function (opts, cb) {
-  // first, get a list of remote packages this user owns.
-  // if we don't have a username, don't complete anything.
-  var un = npm.config.get("username")
-  if (!un) return cb()
-  if (opts.conf.argv.remain.length > 2) return cb()
-  // get the list of packages by user
-  var uri = "/-/by-user/"+encodeURIComponent(un)
-  registry.get(uri, null, 60000, function (er, list) {
-    if (er) return cb()
-    console.error(list)
-    return cb(null, list[un])
-  })
-}
-
-var semver = require("semver")
-  , npm = require("./npm.js")
-  , registry = npm.registry
-
-function deprecate (args, cb) {
-  var pkg = args[0]
-    , msg = args[1]
-  if (msg === undefined) return cb("Usage: " + deprecate.usage)
-  // fetch the data and make sure it exists.
-  pkg = pkg.split(/@/)
-  var name = pkg.shift()
-    , ver = pkg.join("@")
-  if (semver.validRange(ver) === null) {
-    return cb(new Error("invalid version range: "+ver))
-  }
-  registry.get(name, function (er, data) {
-    if (er) return cb(er)
-    // filter all the versions that match
-    Object.keys(data.versions).filter(function (v) {
-      return semver.satisfies(v, ver, true)
-    }).forEach(function (v) {
-      data.versions[v].deprecated = msg
-    })
-    // now update the doc on the registry
-    registry.request('PUT', data._id, data, cb)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/docs.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-module.exports = docs
-
-docs.usage = "npm docs <pkgname>"
-
-docs.completion = function (opts, cb) {
-  if (opts.conf.argv.remain.length > 2) return cb()
-  registry.get("/-/short", 60000, function (er, list) {
-    return cb(null, list || [])
-  })
-}
-
-var npm = require("./npm.js")
-  , registry = npm.registry
-  , log = require("npmlog")
-  , opener = require("opener")
-
-function docs (args, cb) {
-  if (!args.length) return cb(docs.usage)
-  var project = args[0]
-  var npmName = project.split("@").shift()
-  registry.get(project + "/latest", 3600, function (er, d) {
-    if (er) {
-      if (project.split("/").length !== 2) return cb(er)
-
-      var url = "https://github.com/" + project + "#readme"
-      return opener(url, { command: npm.config.get("browser") }, cb)
-    }
-
-    var homepage = d.homepage
-      , repo = d.repository || d.repositories
-      , url = homepage ? homepage
-            : "https://npmjs.org/package/" + d.name
-    opener(url, { command: npm.config.get("browser") }, cb)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/edit.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,32 +0,0 @@
-// npm edit <pkg>[@<version>]
-// open the package folder in the $EDITOR
-
-module.exports = edit
-edit.usage = "npm edit <pkg>"
-
-edit.completion = require("./utils/completion/installed-shallow.js")
-
-var npm = require("./npm.js")
-  , spawn = require("child_process").spawn
-  , path = require("path")
-  , fs = require("graceful-fs")
-  , editor = require("editor")
-
-function edit (args, cb) {
-  var p = args[0]
-  if (args.length !== 1 || !p) return cb(edit.usage)
-  var e = npm.config.get("editor")
-  if (!e) return cb(new Error(
-    "No editor set.  Set the 'editor' config, or $EDITOR environ."))
-  p = p.split("/")
-       .join("/node_modules/")
-       .replace(/(\/node_modules)+/, "/node_modules")
-  var f = path.resolve(npm.dir, p)
-  fs.lstat(f, function (er) {
-    if (er) return cb(er)
-    editor(f, { editor: e }, function (er) {
-      if (er) return cb(er)
-      npm.commands.rebuild(args, cb)
-    })
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/explore.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,36 +0,0 @@
-// npm explore <pkg>[@<version>]
-// open a subshell to the package folder.
-
-module.exports = explore
-explore.usage = "npm explore <pkg> [ -- <cmd>]"
-explore.completion = require("./utils/completion/installed-shallow.js")
-
-var npm = require("./npm.js")
-  , spawn = require("child_process").spawn
-  , path = require("path")
-  , fs = require("graceful-fs")
-
-function explore (args, cb) {
-  if (args.length < 1 || !args[0]) return cb(explore.usage)
-  var p = args.shift()
-  args = args.join(" ").trim()
-  if (args) args = ["-c", args]
-  else args = []
-
-  var cwd = path.resolve(npm.dir, p)
-  var sh = npm.config.get("shell")
-  fs.stat(cwd, function (er, s) {
-    if (er || !s.isDirectory()) return cb(new Error(
-      "It doesn't look like "+p+" is installed."))
-    if (!args.length) console.log(
-      "\nExploring "+cwd+"\n"+
-      "Type 'exit' or ^D when finished\n")
-
-    var shell = spawn(sh, args, { cwd: cwd, customFds: [0, 1, 2] })
-    shell.on("close", function (er) {
-      // only fail if non-interactive.
-      if (!args.length) return cb()
-      cb(er)
-    })
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/faq.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,8 +0,0 @@
-
-module.exports = faq
-
-faq.usage = "npm faq"
-
-var npm = require("./npm.js")
-
-function faq (args, cb) { npm.commands.help(["faq"], cb) }
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/get.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,12 +0,0 @@
-
-module.exports = get
-
-get.usage = "npm get <key> <value> (See `npm config`)"
-
-var npm = require("./npm.js")
-
-get.completion = npm.commands.config.completion
-
-function get (args, cb) {
-  npm.commands.config(["get"].concat(args), cb)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/help-search.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,216 +0,0 @@
-
-module.exports = helpSearch
-
-var fs = require("graceful-fs")
-  , path = require("path")
-  , asyncMap = require("slide").asyncMap
-  , cliDocsPath = path.join(__dirname, "..", "doc", "cli")
-  , apiDocsPath = path.join(__dirname, "..", "doc", "api")
-  , log = require("npmlog")
-  , npm = require("./npm.js")
-  , glob = require("glob")
-
-helpSearch.usage = "npm help-search <text>"
-
-function helpSearch (args, silent, cb) {
-  if (typeof cb !== "function") cb = silent, silent = false
-  if (!args.length) return cb(helpSearch.usage)
-
-  // see if we're actually searching the api docs.
-  var argv = npm.config.get("argv").cooked
-
-  var docPath = path.resolve(__dirname, "..", "doc")
-  return glob(docPath + "/*/*.md", function (er, files) {
-    if (er)
-      return cb(er)
-    readFiles(files, function (er, data) {
-      if (er)
-        return cb(er)
-      searchFiles(args, data, function (er, results) {
-        if (er)
-          return cb(er)
-        formatResults(args, results, cb)
-      })
-    })
-  })
-}
-
-function readFiles (files, cb) {
-  var res = {}
-  asyncMap(files, function (file, cb) {
-    fs.readFile(file, 'utf8', function (er, data) {
-      res[file] = data
-      return cb(er)
-    })
-  }, function (er) {
-    return cb(er, res)
-  })
-}
-
-function searchFiles (args, files, cb) {
-  var results = []
-  Object.keys(files).forEach(function (file) {
-    var data = files[file]
-
-    // skip if no matches at all
-    for (var a = 0, l = args.length; a < l && !match; a++) {
-      var match = data.toLowerCase().indexOf(args[a].toLowerCase()) !== -1
-    }
-    if (!match)
-      return
-
-    var lines = data.split(/\n+/)
-    var context = []
-
-    // if a line has a search term, then skip it and the next line.
-    // if the next line has a search term, then skip all 3
-    // otherwise, set the line to null.  then remove the nulls.
-    for (var i = 0, l = lines.length; i < l; i ++) {
-      var line = lines[i]
-        , nextLine = lines[i + 1]
-        , match = false
-      if (nextLine) {
-        for (var a = 0, ll = args.length; a < ll && !match; a ++) {
-          match = nextLine.toLowerCase()
-                  .indexOf(args[a].toLowerCase()) !== -1
-        }
-        if (match) {
-          // skip over the next line, and the line after it.
-          i += 2
-          continue
-        }
-      }
-
-      match = false
-      for (var a = 0, ll = args.length; a < ll && !match; a ++) {
-        match = line.toLowerCase().indexOf(args[a].toLowerCase()) !== -1
-      }
-      if (match) {
-        // skip over the next line
-        i ++
-        continue
-      }
-
-      lines[i] = null
-    }
-
-    // now squish any string of nulls into a single null
-    lines = lines.reduce(function (l, r) {
-      if (!(r === null && l[l.length-1] === null)) l.push(r)
-      return l
-    }, [])
-
-    if (lines[lines.length - 1] === null) lines.pop()
-    if (lines[0] === null) lines.shift()
-
-    // now see how many args were found at all.
-    var found = {}
-      , totalHits = 0
-    lines.forEach(function (line) {
-      args.forEach(function (arg) {
-        var hit = (line || "").toLowerCase()
-                  .split(arg.toLowerCase()).length - 1
-        if (hit > 0) {
-          found[arg] = (found[arg] || 0) + hit
-          totalHits += hit
-        }
-      })
-    })
-
-    var cmd = "npm help "
-    if (path.basename(path.dirname(file)) === "api") {
-      cmd = "npm apihelp "
-    }
-    cmd += path.basename(file, ".md").replace(/^npm-/, "")
-    results.push({ file: file
-                 , cmd: cmd
-                 , lines: lines
-                 , found: Object.keys(found)
-                 , hits: found
-                 , totalHits: totalHits
-                 })
-  })
-
-  // if only one result, then just show that help section.
-  if (results.length === 1) {
-    return npm.commands.help([results[0].file.replace(/\.md$/, "")], cb)
-  }
-
-  if (results.length === 0) {
-    console.log("No results for " + args.map(JSON.stringify).join(" "))
-    return cb()
-  }
-
-  // sort results by number of results found, then by number of hits
-  // then by number of matching lines
-  results = results.sort(function (a, b) {
-    return a.found.length > b.found.length ? -1
-         : a.found.length < b.found.length ? 1
-         : a.totalHits > b.totalHits ? -1
-         : a.totalHits < b.totalHits ? 1
-         : a.lines.length > b.lines.length ? -1
-         : a.lines.length < b.lines.length ? 1
-         : 0
-  })
-
-  cb(null, results)
-}
-
-function formatResults (args, results, cb) {
-  var cols = Math.min(process.stdout.columns || Infinity, 80) + 1
-
-  var out = results.map(function (res, i, results) {
-    var out = res.cmd
-      , r = Object.keys(res.hits).map(function (k) {
-          return k + ":" + res.hits[k]
-        }).sort(function (a, b) {
-          return a > b ? 1 : -1
-        }).join(" ")
-
-    out += ((new Array(Math.max(1, cols - out.length - r.length)))
-             .join (" ")) + r
-
-    if (!npm.config.get("long")) return out
-
-    var out = "\n\n" + out
-         + "\n" + (new Array(cols)).join("—") + "\n"
-         + res.lines.map(function (line, i) {
-      if (line === null || i > 3) return ""
-      for (var out = line, a = 0, l = args.length; a < l; a ++) {
-        var finder = out.toLowerCase().split(args[a].toLowerCase())
-          , newOut = []
-          , p = 0
-        finder.forEach(function (f) {
-          newOut.push( out.substr(p, f.length)
-                     , "\1"
-                     , out.substr(p + f.length, args[a].length)
-                     , "\2" )
-          p += f.length + args[a].length
-        })
-        out = newOut.join("")
-      }
-      if (npm.color) {
-        var color = "\033[31;40m"
-          , reset = "\033[0m"
-      } else {
-        var color = ""
-          , reset = ""
-      }
-      out = out.split("\1").join(color)
-               .split("\2").join(reset)
-      return out
-    }).join("\n").trim()
-    return out
-  }).join("\n")
-
-  if (results.length && !npm.config.get("long")) {
-    out = "Top hits for "+(args.map(JSON.stringify).join(" "))
-        + "\n" + (new Array(cols)).join("—") + "\n"
-        + out
-        + "\n" + (new Array(cols)).join("—") + "\n"
-        + "(run with -l or --long to see more context)"
-  }
-
-  console.log(out.trim())
-  cb(null, results)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/help.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,231 +0,0 @@
-
-module.exports = help
-
-help.completion = function (opts, cb) {
-  if (opts.conf.argv.remain.length > 2) return cb(null, [])
-  getSections(cb)
-}
-
-var fs = require("graceful-fs")
-  , path = require("path")
-  , spawn = require("child_process").spawn
-  , npm = require("./npm.js")
-  , log = require("npmlog")
-  , opener = require("opener")
-  , glob = require("glob")
-
-function help (args, cb) {
-  var argv = npm.config.get("argv").cooked
-
-  var argnum = 0
-  if (args.length === 2 && ~~args[0]) {
-    argnum = ~~args.shift()
-  }
-
-  // npm help foo bar baz: search topics
-  if (args.length > 1 && args[0]) {
-    return npm.commands["help-search"](args, cb)
-  }
-
-  var section = npm.deref(args[0]) || args[0]
-
-  // npm help <noargs>:  show basic usage
-  if (!section)
-    return npmUsage(cb)
-
-  // npm <cmd> -h: show command usage
-  if ( npm.config.get("usage")
-    && npm.commands[section]
-    && npm.commands[section].usage
-  ) {
-    npm.config.set("loglevel", "silent")
-    log.level = "silent"
-    console.log(npm.commands[section].usage)
-    return cb()
-  }
-
-  // npm apihelp <section>: Prefer section 3 over section 1
-  var apihelp = argv.length && -1 !== argv[0].indexOf("api")
-  var pref = apihelp ? [3, 1, 5, 7] : [1, 3, 5, 7]
-  if (argnum)
-    pref = [ argnum ].concat(pref.filter(function (n) {
-      return n !== argnum
-    }))
-
-  // npm help <section>: Try to find the path
-  var manroot = path.resolve(__dirname, "..", "man")
-  var htmlroot = path.resolve(__dirname, "..", "html", "doc")
-
-  // legacy
-  if (section === "global")
-    section = "folders"
-  else if (section === "json")
-    section = "package.json"
-
-  // find either /section.n or /npm-section.n
-  var f = "+(npm-" + section + "|" + section + ").[0-9]"
-  return glob(manroot + "/*/" + f, function (er, mans) {
-    if (er)
-      return cb(er)
-
-    if (!mans.length)
-      return npm.commands["help-search"](args, cb)
-
-    viewMan(pickMan(mans, pref), cb)
-  })
-}
-
-function pickMan (mans, pref_) {
-  var nre = /([0-9]+)$/
-  var pref = {}
-  pref_.forEach(function (sect, i) {
-    pref[sect] = i
-  })
-  mans = mans.sort(function (a, b) {
-    var an = a.match(nre)[1]
-    var bn = b.match(nre)[1]
-    return an === bn ? (a > b ? -1 : 1)
-         : pref[an] < pref[bn] ? -1
-         : 1
-  })
-  return mans[0]
-}
-
-function viewMan (man, cb) {
-  var nre = /([0-9]+)$/
-  var num = man.match(nre)[1]
-  var section = path.basename(man, "." + num)
-
-  // at this point, we know that the specified man page exists
-  var manpath = path.join(__dirname, "..", "man")
-    , env = {}
-  Object.keys(process.env).forEach(function (i) {
-    env[i] = process.env[i]
-  })
-  env.MANPATH = manpath
-  var viewer = npm.config.get("viewer")
-
-  switch (viewer) {
-    case "woman":
-      var a = ["-e", "(woman-find-file \"" + man + "\")"]
-      var conf = { env: env, customFds: [ 0, 1, 2] }
-      var woman = spawn("emacsclient", a, conf)
-      woman.on("close", cb)
-      break
-
-    case "browser":
-      opener(htmlMan(man), { command: npm.config.get("browser") }, cb)
-      break
-
-    default:
-      var conf = { env: env, customFds: [ 0, 1, 2] }
-      var man = spawn("man", [num, section], conf)
-      man.on("close", cb)
-      break
-  }
-}
-
-function htmlMan (man) {
-  var sect = +man.match(/([0-9]+)$/)[1]
-  var f = path.basename(man).replace(/([0-9]+)$/, "html")
-  switch (sect) {
-    case 1:
-      sect = "cli"
-      break
-    case 3:
-      sect = "api"
-      break
-    case 5:
-      sect = "files"
-      break
-    case 7:
-      sect = "misc"
-      break
-    default:
-      throw new Error("invalid man section: " + sect)
-  }
-  return path.resolve(__dirname, "..", "html", "doc", sect, f)
-}
-
-function npmUsage (cb) {
-  npm.config.set("loglevel", "silent")
-  log.level = "silent"
-  console.log
-    ( ["\nUsage: npm <command>"
-      , ""
-      , "where <command> is one of:"
-      , npm.config.get("long") ? usages()
-        : "    " + wrap(Object.keys(npm.commands))
-      , ""
-      , "npm <cmd> -h     quick help on <cmd>"
-      , "npm -l           display full usage info"
-      , "npm faq          commonly asked questions"
-      , "npm help <term>  search for help on <term>"
-      , "npm help npm     involved overview"
-      , ""
-      , "Specify configs in the ini-formatted file:"
-      , "    " + npm.config.get("userconfig")
-      , "or on the command line via: npm <command> --key value"
-      , "Config info can be viewed via: npm help config"
-      , ""
-      , "npm@" + npm.version + " " + path.dirname(__dirname)
-      ].join("\n"))
-  cb()
-}
-
-function usages () {
-  // return a string of <cmd>: <usage>
-  var maxLen = 0
-  return Object.keys(npm.commands).filter(function (c) {
-    return c === npm.deref(c)
-  }).reduce(function (set, c) {
-    set.push([c, npm.commands[c].usage || ""])
-    maxLen = Math.max(maxLen, c.length)
-    return set
-  }, []).map(function (item) {
-    var c = item[0]
-      , usage = item[1]
-    return "\n    " + c + (new Array(maxLen - c.length + 2).join(" "))
-         + (usage.split("\n")
-            .join("\n" + (new Array(maxLen + 6).join(" "))))
-  }).join("\n")
-  return out
-}
-
-
-function wrap (arr) {
-  var out = ['']
-    , l = 0
-    , line
-
-  line = process.stdout.columns
-  if (!line)
-    line = 60
-  else
-    line = Math.min(60, Math.max(line - 16, 24))
-
-  arr.sort(function (a,b) { return a<b?-1:1 })
-    .forEach(function (c) {
-      if (out[l].length + c.length + 2 < line) {
-        out[l] += ', '+c
-      } else {
-        out[l++] += ','
-        out[l] = c
-      }
-    })
-  return out.join("\n    ").substr(2)
-}
-
-function getSections (cb) {
-  var g = path.resolve(__dirname, "../man/man[0-9]/*.[0-9]")
-  glob(g, function (er, files) {
-    if (er)
-      return cb(er)
-    cb(null, Object.keys(files.reduce(function (acc, file) {
-      file = path.basename(file).replace(/\.[0-9]+$/, "")
-      file = file.replace(/^npm-/, "")
-      acc[file] = true
-      return acc
-    }, { help: true })))
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/init.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,36 +0,0 @@
-
-// initialize a package.json file
-
-module.exports = init
-
-var log = require("npmlog")
-  , npm = require("./npm.js")
-  , initJson = require("init-package-json")
-
-init.usage = "npm init"
-
-function init (args, cb) {
-  var dir = process.cwd()
-  log.pause()
-  var initFile = npm.config.get('init-module')
-
-  console.log(
-    ["This utility will walk you through creating a package.json file."
-    ,"It only covers the most common items, and tries to guess sane defaults."
-    ,""
-    ,"See `npm help json` for definitive documentation on these fields"
-    ,"and exactly what they do."
-    ,""
-    ,"Use `npm install <pkg> --save` afterwards to install a package and"
-    ,"save it as a dependency in the package.json file."
-    ,""
-    ,"Press ^C at any time to quit."
-    ].join("\n"))
-
-  initJson(dir, initFile, npm.config, function (er, data) {
-    log.resume()
-    log.silly('package data', data)
-    log.info('init', 'written successfully')
-    cb(er, data)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/install.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1086 +0,0 @@
-// npm install <pkg> <pkg> <pkg>
-//
-// See doc/install.md for more description
-
-// Managing contexts...
-// there's a lot of state associated with an "install" operation, including
-// packages that are already installed, parent packages, current shrinkwrap, and
-// so on. We maintain this state in a "context" object that gets passed around.
-// every time we dive into a deeper node_modules folder, the "family" list that
-// gets passed along uses the previous "family" list as its __proto__.  Any
-// "resolved precise dependency" things that aren't already on this object get
-// added, and then that's passed to the next generation of installation.
-
-module.exports = install
-
-install.usage = "npm install"
-              + "\nnpm install <pkg>"
-              + "\nnpm install <pkg>@<tag>"
-              + "\nnpm install <pkg>@<version>"
-              + "\nnpm install <pkg>@<version range>"
-              + "\nnpm install <folder>"
-              + "\nnpm install <tarball file>"
-              + "\nnpm install <tarball url>"
-              + "\nnpm install <git:// url>"
-              + "\nnpm install <github username>/<github project>"
-              + "\n\nCan specify one or more: npm install ./foo.tgz bar@stable /some/folder"
-              + "\nIf no argument is supplied and ./npm-shrinkwrap.json is "
-              + "\npresent, installs dependencies specified in the shrinkwrap."
-              + "\nOtherwise, installs dependencies from ./package.json."
-
-install.completion = function (opts, cb) {
-  // install can complete to a folder with a package.json, or any package.
-  // if it has a slash, then it's gotta be a folder
-  // if it starts with https?://, then just give up, because it's a url
-  // for now, not yet implemented.
-  var registry = npm.registry
-  registry.get("/-/short", function (er, pkgs) {
-    if (er) return cb()
-    if (!opts.partialWord) return cb(null, pkgs)
-
-    var name = opts.partialWord.split("@").shift()
-    pkgs = pkgs.filter(function (p) {
-      return p.indexOf(name) === 0
-    })
-
-    if (pkgs.length !== 1 && opts.partialWord === name) {
-      return cb(null, pkgs)
-    }
-
-    registry.get(pkgs[0], function (er, d) {
-      if (er) return cb()
-      return cb(null, Object.keys(d["dist-tags"] || {})
-                .concat(Object.keys(d.versions || {}))
-                .map(function (t) {
-                  return pkgs[0] + "@" + t
-                }))
-    })
-  })
-}
-
-var npm = require("./npm.js")
-  , semver = require("semver")
-  , readJson = require("read-package-json")
-  , readInstalled = require("read-installed")
-  , log = require("npmlog")
-  , path = require("path")
-  , fs = require("graceful-fs")
-  , cache = require("./cache.js")
-  , asyncMap = require("slide").asyncMap
-  , chain = require("slide").chain
-  , url = require("url")
-  , mkdir = require("mkdirp")
-  , lifecycle = require("./utils/lifecycle.js")
-  , archy = require("archy")
-
-function install (args, cb_) {
-  var hasArguments = !!args.length
-
-  function cb (er, installed) {
-    if (er) return cb_(er)
-
-    findPeerInvalid(where, function (er, problem) {
-      if (er) return cb_(er)
-
-      if (problem) {
-        var peerInvalidError = new Error("The package " + problem.name +
-          " does not satisfy its siblings' peerDependencies requirements!")
-        peerInvalidError.code = "EPEERINVALID"
-        peerInvalidError.packageName = problem.name
-        peerInvalidError.peersDepending = problem.peersDepending
-        return cb(peerInvalidError)
-      }
-
-      var tree = treeify(installed || [])
-        , pretty = prettify(tree, installed).trim()
-
-      if (pretty) console.log(pretty)
-      save(where, installed, tree, pretty, hasArguments, cb_)
-    })
-  }
-
-  // the /path/to/node_modules/..
-  var where = path.resolve(npm.dir, "..")
-
-  // internal api: install(where, what, cb)
-  if (arguments.length === 3) {
-    where = args
-    args = [].concat(cb_) // pass in [] to do default dep-install
-    cb_ = arguments[2]
-    log.verbose("install", "where,what", [where, args])
-  }
-
-  if (!npm.config.get("global")) {
-    args = args.filter(function (a) {
-      return path.resolve(a) !== where
-    })
-  }
-
-  mkdir(where, function (er, made) {
-    if (er) return cb(er)
-    // install dependencies locally by default,
-    // or install current folder globally
-    if (!args.length) {
-      var opt = { dev: npm.config.get("dev") || !npm.config.get("production") }
-
-      if (npm.config.get("global")) args = ["."]
-      else return readDependencies(null, where, opt, function (er, data) {
-        if (er) {
-          log.error("install", "Couldn't read dependencies")
-          return cb(er)
-        }
-        var deps = Object.keys(data.dependencies || {})
-        log.verbose("install", "where, deps", [where, deps])
-        var context = { family: {}
-                      , ancestors: {}
-                      , explicit: false
-                      , parent: data
-                      , wrap: null }
-
-        if (data.name === path.basename(where) &&
-            path.basename(path.dirname(where)) === "node_modules") {
-          // Only include in ancestry if it can actually be required.
-          // Otherwise, it does not count.
-          context.family[data.name] =
-            context.ancestors[data.name] = data.version
-        }
-
-        installManyTop(deps.map(function (dep) {
-          var target = data.dependencies[dep]
-            , parsed = url.parse(target.replace(/^git\+/, "git"))
-          target = dep + "@" + target
-          return target
-        }), where, context, function(er, results) {
-          if (er) return cb(er, results)
-          lifecycle(data, "prepublish", where, function(er) {
-            return cb(er, results)
-          })
-        })
-      })
-    }
-
-    // initial "family" is the name:version of the root, if it's got
-    // a package.json file.
-    var jsonFile = path.resolve(where, "package.json")
-    readJson(jsonFile, log.warn, function (er, data) {
-      if (er
-          && er.code !== "ENOENT"
-          && er.code !== "ENOTDIR") return cb(er)
-      if (er) data = null
-      var context = { family: {}
-                    , ancestors: {}
-                    , explicit: true
-                    , parent: data
-                    , wrap: null }
-      if (data) {
-        context.family[data.name] = context.ancestors[data.name] = data.version
-      }
-      var fn = npm.config.get("global") ? installMany : installManyTop
-      fn(args, where, context, cb)
-    })
-  })
-}
-
-function findPeerInvalid (where, cb) {
-  readInstalled(where, log.warn, function (er, data) {
-    if (er) return cb(er)
-
-    cb(null, findPeerInvalid_(data.dependencies, []))
-  })
-}
-
-function findPeerInvalid_ (packageMap, fpiList) {
-  if (fpiList.indexOf(packageMap) !== -1)
-    return
-
-  fpiList.push(packageMap)
-
-  for (var packageName in packageMap) {
-    var pkg = packageMap[packageName]
-
-    if (pkg.peerInvalid) {
-      var peersDepending = {};
-      for (var peerName in packageMap) {
-        var peer = packageMap[peerName]
-        if (peer.peerDependencies && peer.peerDependencies[packageName]) {
-          peersDepending[peer.name + "@" + peer.version] =
-            peer.peerDependencies[packageName]
-        }
-      }
-      return { name: pkg.name, peersDepending: peersDepending }
-    }
-
-    if (pkg.dependencies) {
-      var invalid = findPeerInvalid_(pkg.dependencies, fpiList)
-      if (invalid)
-        return invalid
-    }
-  }
-
-  return null
-}
-
-// reads dependencies for the package at "where". There are several cases,
-// depending on our current state and the package's configuration:
-//
-// 1. If "context" is specified, then we examine the context to see if there's a
-//    shrinkwrap there. In that case, dependencies are read from the shrinkwrap.
-// 2. Otherwise, if an npm-shrinkwrap.json file is present, dependencies are
-//    read from there.
-// 3. Otherwise, dependencies come from package.json.
-//
-// Regardless of which case we fall into, "cb" is invoked with a first argument
-// describing the full package (as though readJson had been used) but with
-// "dependencies" read as described above. The second argument to "cb" is the
-// shrinkwrap to use in processing this package's dependencies, which may be
-// "wrap" (in case 1) or a new shrinkwrap (in case 2).
-function readDependencies (context, where, opts, cb) {
-  var wrap = context ? context.wrap : null
-
-  readJson( path.resolve(where, "package.json")
-          , log.warn
-          , function (er, data) {
-    if (er && er.code === "ENOENT") er.code = "ENOPACKAGEJSON"
-    if (er)  return cb(er)
-
-    if (opts && opts.dev) {
-      if (!data.dependencies) data.dependencies = {}
-      Object.keys(data.devDependencies || {}).forEach(function (k) {
-        data.dependencies[k] = data.devDependencies[k]
-      })
-    }
-
-    if (!npm.config.get("optional") && data.optionalDependencies) {
-      Object.keys(data.optionalDependencies).forEach(function (d) {
-        delete data.dependencies[d]
-      })
-    }
-
-    // User has opted out of shrinkwraps entirely
-    if (npm.config.get("shrinkwrap") === false)
-      return cb(null, data, null)
-
-    if (wrap) {
-      log.verbose("readDependencies: using existing wrap", [where, wrap])
-      var rv = {}
-      Object.keys(data).forEach(function (key) {
-        rv[key] = data[key]
-      })
-      rv.dependencies = {}
-      Object.keys(wrap).forEach(function (key) {
-        log.verbose("from wrap", [key, wrap[key]])
-        rv.dependencies[key] = readWrap(wrap[key])
-      })
-      log.verbose("readDependencies returned deps", rv.dependencies)
-      return cb(null, rv, wrap)
-    }
-
-    var wrapfile = path.resolve(where, "npm-shrinkwrap.json")
-
-    fs.readFile(wrapfile, "utf8", function (er, wrapjson) {
-      if (er) {
-        log.verbose("readDependencies", "using package.json deps")
-        return cb(null, data, null)
-      }
-
-      try {
-        var newwrap = JSON.parse(wrapjson)
-      } catch (ex) {
-        return cb(ex)
-      }
-
-      log.info("shrinkwrap", "file %j", wrapfile)
-      var rv = {}
-      Object.keys(data).forEach(function (key) {
-        rv[key] = data[key]
-      })
-      rv.dependencies = {}
-      Object.keys(newwrap.dependencies || {}).forEach(function (key) {
-        rv.dependencies[key] = readWrap(newwrap.dependencies[key])
-      })
-
-      // fold in devDependencies if not already present, at top level
-      if (opts && opts.dev) {
-        Object.keys(data.devDependencies || {}).forEach(function (k) {
-          rv.dependencies[k] = rv.dependencies[k] || data.devDependencies[k]
-        })
-      }
-
-      log.verbose("readDependencies returned deps", rv.dependencies)
-      return cb(null, rv, newwrap.dependencies)
-    })
-  })
-}
-
-function readWrap (w) {
-  return (w.resolved) ? w.resolved
-       : (w.from && url.parse(w.from).protocol) ? w.from
-       : w.version
-}
-
-// if the -S|--save option is specified, then write installed packages
-// as dependencies to a package.json file.
-// This is experimental.
-function save (where, installed, tree, pretty, hasArguments, cb) {
-  if (!hasArguments ||
-      !npm.config.get("save") &&
-      !npm.config.get("save-dev") &&
-      !npm.config.get("save-optional") ||
-      npm.config.get("global")) {
-    return cb(null, installed, tree, pretty)
-  }
-
-  var saveBundle = npm.config.get('save-bundle')
-
-  // each item in the tree is a top-level thing that should be saved
-  // to the package.json file.
-  // The relevant tree shape is { <folder>: {what:<pkg>} }
-  var saveTarget = path.resolve(where, "package.json")
-    , things = Object.keys(tree).map(function (k) {
-        // if "what" was a url, then save that instead.
-        var t = tree[k]
-          , u = url.parse(t.from)
-          , w = t.what.split("@")
-        if (u && u.protocol) w[1] = t.from
-        return w
-      }).reduce(function (set, k) {
-        var rangeDescriptor = semver.valid(k[1], true) &&
-                              semver.gte(k[1], "0.1.0", true)
-                            ? "~" : ""
-        set[k[0]] = rangeDescriptor + k[1]
-        return set
-      }, {})
-
-  // don't use readJson, because we don't want to do all the other
-  // tricky npm-specific stuff that's in there.
-  fs.readFile(saveTarget, function (er, data) {
-    // ignore errors here, just don't save it.
-    try {
-      data = JSON.parse(data.toString("utf8"))
-    } catch (ex) {
-      er = ex
-    }
-
-    if (er) {
-      return cb(null, installed, tree, pretty)
-    }
-
-    var deps = npm.config.get("save-optional") ? "optionalDependencies"
-             : npm.config.get("save-dev") ? "devDependencies"
-             : "dependencies"
-
-    if (saveBundle) {
-      var bundle = data.bundleDependencies || data.bundledDependencies
-      delete data.bundledDependencies
-      if (!Array.isArray(bundle)) bundle = []
-      data.bundleDependencies = bundle
-    }
-
-    log.verbose('saving', things)
-    data[deps] = data[deps] || {}
-    Object.keys(things).forEach(function (t) {
-      data[deps][t] = things[t]
-      if (saveBundle) {
-        var i = bundle.indexOf(t)
-        if (i === -1) bundle.push(t)
-      }
-    })
-
-    data = JSON.stringify(data, null, 2) + "\n"
-    fs.writeFile(saveTarget, data, function (er) {
-      cb(er, installed, tree, pretty)
-    })
-  })
-}
-
-
-// Outputting *all* the installed modules is a bit confusing,
-// because the length of the path does not make it clear
-// that the submodules are not immediately require()able.
-// TODO: Show the complete tree, ls-style, but only if --long is provided
-function prettify (tree, installed) {
-  if (npm.config.get("json")) {
-    function red (set, kv) {
-      set[kv[0]] = kv[1]
-      return set
-    }
-
-    tree = Object.keys(tree).map(function (p) {
-      if (!tree[p]) return null
-      var what = tree[p].what.split("@")
-        , name = what.shift()
-        , version = what.join("@")
-        , o = { name: name, version: version, from: tree[p].from }
-      o.dependencies = tree[p].children.map(function P (dep) {
-         var what = dep.what.split("@")
-           , name = what.shift()
-           , version = what.join("@")
-           , o = { version: version, from: dep.from }
-         o.dependencies = dep.children.map(P).reduce(red, {})
-         return [name, o]
-       }).reduce(red, {})
-       return o
-    })
-
-    return JSON.stringify(tree, null, 2)
-  }
-  if (npm.config.get("parseable")) return parseable(installed)
-
-  return Object.keys(tree).map(function (p) {
-    return archy({ label: tree[p].what + " " + p
-                 , nodes: (tree[p].children || []).map(function P (c) {
-                     if (npm.config.get("long")) {
-                       return { label: c.what, nodes: c.children.map(P) }
-                     }
-                     var g = c.children.map(function (g) {
-                       return g.what
-                     }).join(", ")
-                     if (g) g = " (" + g + ")"
-                     return c.what + g
-                   })
-                 })
-  }).join("\n")
-}
-
-function parseable (installed) {
-  var long = npm.config.get("long")
-    , cwd = process.cwd()
-  return installed.map(function (item) {
-    return path.resolve(cwd, item[1]) +
-         ( long ?  ":" + item[0] : "" )
-  }).join("\n")
-}
-
-function treeify (installed) {
-  // each item is [what, where, parent, parentDir]
-  // If no parent, then report it.
-  // otherwise, tack it into the parent's children list.
-  // If the parent isn't a top-level then ignore it.
-  var whatWhere = installed.reduce(function (l, r) {
-    var parentDir = r[3]
-      , parent = r[2]
-      , where = r[1]
-      , what = r[0]
-      , from = r[4]
-    l[where] = { parentDir: parentDir
-               , parent: parent
-               , children: []
-               , where: where
-               , what: what
-               , from: from }
-    return l
-  }, {})
-
-  // log.warn("install", whatWhere, "whatWhere")
-  return Object.keys(whatWhere).reduce(function (l, r) {
-    var ww = whatWhere[r]
-    //log.warn("r, ww", [r, ww])
-    if (!ww.parent) {
-      l[r] = ww
-    } else {
-      var p = whatWhere[ww.parentDir]
-      if (p) p.children.push(ww)
-      else l[r] = ww
-    }
-    return l
-  }, {})
-}
-
-
-// just like installMany, but also add the existing packages in
-// where/node_modules to the family object.
-function installManyTop (what, where, context, cb_) {
-  function cb (er, d) {
-    if (context.explicit || er) return cb_(er, d)
-    // since this wasn't an explicit install, let's build the top
-    // folder, so that `npm install` also runs the lifecycle scripts.
-    npm.commands.build([where], false, true, function (er) {
-      return cb_(er, d)
-    })
-  }
-
-  if (context.explicit) return next()
-
-  readJson(path.join(where, "package.json"), log.warn, function (er, data) {
-    if (er) return next(er)
-    lifecycle(data, "preinstall", where, next)
-  })
-
-  function next (er) {
-    if (er) return cb(er)
-    installManyTop_(what, where, context, cb)
-  }
-}
-
-function installManyTop_ (what, where, context, cb) {
-  var nm = path.resolve(where, "node_modules")
-    , names = context.explicit
-            ? what.map(function (w) { return w.split(/@/).shift() })
-            : []
-
-  fs.readdir(nm, function (er, pkgs) {
-    if (er) return installMany(what, where, context, cb)
-    pkgs = pkgs.filter(function (p) {
-      return !p.match(/^[\._-]/)
-    })
-    asyncMap(pkgs.map(function (p) {
-      return path.resolve(nm, p, "package.json")
-    }), function (jsonfile, cb) {
-      readJson(jsonfile, log.warn, function (er, data) {
-        if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er)
-        if (er) return cb(null, [])
-        return cb(null, [[data.name, data.version]])
-      })
-    }, function (er, packages) {
-      // if there's nothing in node_modules, then don't freak out.
-      if (er) packages = []
-      // add all the existing packages to the family list.
-      // however, do not add to the ancestors list.
-      packages.forEach(function (p) {
-        context.family[p[0]] = p[1]
-      })
-      return installMany(what, where, context, cb)
-    })
-  })
-}
-
-function installMany (what, where, context, cb) {
-  // readDependencies takes care of figuring out whether the list of
-  // dependencies we'll iterate below comes from an existing shrinkwrap from a
-  // parent level, a new shrinkwrap at this level, or package.json at this
-  // level, as well as which shrinkwrap (if any) our dependencies should use.
-  var opt = { dev: npm.config.get("dev") }
-  readDependencies(context, where, opt, function (er, data, wrap) {
-    if (er) data = {}
-
-    var parent = data
-
-    var d = data.dependencies || {}
-
-    // if we're explicitly installing "what" into "where", then the shrinkwrap
-    // for "where" doesn't apply. This would be the case if someone were adding
-    // a new package to a shrinkwrapped package. (data.dependencies will not be
-    // used here except to indicate what packages are already present, so
-    // there's no harm in using that.)
-    if (context.explicit) wrap = null
-
-    // what is a list of things.
-    // resolve each one.
-    asyncMap( what
-            , targetResolver(where, context, d)
-            , function (er, targets) {
-
-      if (er) return cb(er)
-
-      // each target will be a data object corresponding
-      // to a package, folder, or whatever that is in the cache now.
-      var newPrev = Object.create(context.family)
-        , newAnc = Object.create(context.ancestors)
-
-      newAnc[data.name] = data.version
-      targets.forEach(function (t) {
-        newPrev[t.name] = t.version
-      })
-      log.silly("resolved", targets)
-      targets.filter(function (t) { return t }).forEach(function (t) {
-        log.info("install", "%s into %s", t._id, where)
-      })
-      asyncMap(targets, function (target, cb) {
-        log.info("installOne", target._id)
-        var wrapData = wrap ? wrap[target.name] : null
-        var newWrap = wrapData && wrapData.dependencies
-                    ? wrap[target.name].dependencies || {}
-                    : null
-        var newContext = { family: newPrev
-                         , ancestors: newAnc
-                         , parent: parent
-                         , explicit: false
-                         , wrap: newWrap }
-        installOne(target, where, newContext, cb)
-      }, cb)
-    })
-  })
-}
-
-function targetResolver (where, context, deps) {
-  var alreadyInstalledManually = context.explicit ? [] : null
-    , nm = path.resolve(where, "node_modules")
-    , parent = context.parent
-    , wrap = context.wrap
-
-  if (!context.explicit) fs.readdir(nm, function (er, inst) {
-    if (er) return alreadyInstalledManually = []
-
-    // don't even mess with non-package looking things
-    inst = inst.filter(function (p) {
-      return !p.match(/^[\._-]/)
-    })
-
-    asyncMap(inst, function (pkg, cb) {
-      readJson(path.resolve(nm, pkg, "package.json"), log.warn, function (er, d) {
-        if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er)
-        // error means it's not a package, most likely.
-        if (er) return cb(null, [])
-
-        // if it's a bundled dep, then assume that anything there is valid.
-        // otherwise, make sure that it's a semver match with what we want.
-        var bd = parent.bundleDependencies
-        if (bd && bd.indexOf(d.name) !== -1 ||
-            semver.satisfies(d.version, deps[d.name] || "*", true)) {
-          return cb(null, d.name)
-        }
-
-        // something is there, but it's not satisfactory.  Clobber it.
-        return cb(null, [])
-      })
-    }, function (er, inst) {
-      // this is the list of things that are valid and should be ignored.
-      alreadyInstalledManually = inst
-    })
-  })
-
-  var to = 0
-  return function resolver (what, cb) {
-    if (!alreadyInstalledManually) return setTimeout(function () {
-      resolver(what, cb)
-    }, to++)
-
-    // now we know what's been installed here manually,
-    // or tampered with in some way that npm doesn't want to overwrite.
-    if (alreadyInstalledManually.indexOf(what.split("@").shift()) !== -1) {
-      log.verbose("already installed", "skipping %s %s", what, where)
-      return cb(null, [])
-    }
-
-    // check for a version installed higher in the tree.
-    // If installing from a shrinkwrap, it must match exactly.
-    if (context.family[what]) {
-      if (wrap && wrap[what] && wrap[what].version === context.family[what]) {
-        log.verbose("shrinkwrap", "use existing", what)
-        return cb(null, [])
-      }
-    }
-
-    // if it's identical to its parent, then it's probably someone
-    // doing `npm install foo` inside of the foo project.  Print
-    // a warning, and skip it.
-    if (parent && parent.name === what && !npm.config.get("force")) {
-      log.warn("install", "Refusing to install %s as a dependency of itself"
-              , what)
-      return cb(null, [])
-    }
-
-    if (wrap) {
-      var name = what.split(/@/).shift()
-      if (wrap[name]) {
-        var wrapTarget = readWrap(wrap[name])
-        what = name + "@" + wrapTarget
-      } else {
-        log.verbose("shrinkwrap", "skipping %s (not in shrinkwrap)", what)
-      }
-    } else if (deps[what]) {
-      what = what + "@" + deps[what]
-    }
-
-    cache.add(what, function (er, data) {
-      if (er && parent && parent.optionalDependencies &&
-          parent.optionalDependencies.hasOwnProperty(what.split("@")[0])) {
-        log.warn("optional dep failed, continuing", what)
-        log.verbose("optional dep failed, continuing", [what, er])
-        return cb(null, [])
-      }
-
-      if (!er &&
-          data &&
-          !context.explicit &&
-          context.family[data.name] === data.version &&
-          !npm.config.get("force")) {
-        log.info("already installed", data.name + "@" + data.version)
-        return cb(null, [])
-      }
-
-      if (data && !data._from) data._from = what
-
-      return cb(er, data || [])
-    })
-  }
-}
-
-// we've already decided to install this.  if anything's in the way,
-// then uninstall it first.
-function installOne (target, where, context, cb) {
-  // the --link flag makes this a "link" command if it's at the
-  // top level.
-  if (where === npm.prefix && npm.config.get("link")
-      && !npm.config.get("global")) {
-    return localLink(target, where, context, cb)
-  }
-  installOne_(target, where, context, function (er, installedWhat) {
-
-    // check if this one is optional to its parent.
-    if (er && context.parent && context.parent.optionalDependencies &&
-        context.parent.optionalDependencies.hasOwnProperty(target.name)) {
-      log.warn("optional dep failed, continuing", target._id)
-      log.verbose("optional dep failed, continuing", [target._id, er])
-      er = null
-    }
-
-    cb(er, installedWhat)
-  })
-
-}
-
-function localLink (target, where, context, cb) {
-  log.verbose("localLink", target._id)
-  var jsonFile = path.resolve( npm.globalDir, target.name
-                             , "package.json" )
-    , parent = context.parent
-
-  readJson(jsonFile, log.warn, function (er, data) {
-    if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er)
-    if (er || data._id === target._id) {
-      if (er) {
-        install( path.resolve(npm.globalDir, "..")
-               , target._id
-               , function (er) {
-          if (er) return cb(er, [])
-          thenLink()
-        })
-      } else thenLink()
-
-      function thenLink () {
-        npm.commands.link([target.name], function (er, d) {
-          log.silly("localLink", "back from link", [er, d])
-          cb(er, [resultList(target, where, parent && parent._id)])
-        })
-      }
-
-    } else {
-      log.verbose("localLink", "install locally (no link)", target._id)
-      installOne_(target, where, context, cb)
-    }
-  })
-}
-
-function resultList (target, where, parentId) {
-  var nm = path.resolve(where, "node_modules")
-    , targetFolder = path.resolve(nm, target.name)
-    , prettyWhere = where
-
-  if (!npm.config.get("global")) {
-    prettyWhere = path.relative(process.cwd(), where)
-  }
-
-  if (prettyWhere === ".") prettyWhere = null
-
-  if (!npm.config.get("global")) {
-    // print out the folder relative to where we are right now.
-    targetFolder = path.relative(process.cwd(), targetFolder)
-  }
-
-  return [ target._id
-         , targetFolder
-         , prettyWhere && parentId
-         , parentId && prettyWhere
-         , target._from ]
-}
-
-// name => install locations
-var installOnesInProgress = Object.create(null)
-
-function isIncompatibleInstallOneInProgress(target, where) {
-  return target.name in installOnesInProgress &&
-         installOnesInProgress[target.name].indexOf(where) !== -1
-}
-
-function installOne_ (target, where, context, cb) {
-  var nm = path.resolve(where, "node_modules")
-    , targetFolder = path.resolve(nm, target.name)
-    , prettyWhere = path.relative(process.cwd(), where)
-    , parent = context.parent
-
-  if (prettyWhere === ".") prettyWhere = null
-
-  if (isIncompatibleInstallOneInProgress(target, where)) {
-    var prettyTarget = path.relative(process.cwd(), targetFolder)
-
-    // just call back, with no error.  the error will be detected in the
-    // final check for peer-invalid dependencies
-    return cb()
-  }
-
-  if (!(target.name in installOnesInProgress)) {
-    installOnesInProgress[target.name] = []
-  }
-  installOnesInProgress[target.name].push(where)
-  var indexOfIOIP = installOnesInProgress[target.name].length - 1
-
-  chain
-    ( [ [checkEngine, target]
-      , [checkPlatform, target]
-      , [checkCycle, target, context.ancestors]
-      , [checkGit, targetFolder]
-      , [write, target, targetFolder, context] ]
-    , function (er, d) {
-        installOnesInProgress[target.name].splice(indexOfIOIP, 1)
-
-        if (er) return cb(er)
-
-        d.push(resultList(target, where, parent && parent._id))
-        cb(er, d)
-      }
-    )
-}
-
-function checkEngine (target, cb) {
-  var npmv = npm.version
-    , force = npm.config.get("force")
-    , nodev = force ? null : npm.config.get("node-version")
-    , strict = npm.config.get("engine-strict") || target.engineStrict
-    , eng = target.engines
-  if (!eng) return cb()
-  if (nodev && eng.node && !semver.satisfies(nodev, eng.node)
-      || eng.npm && !semver.satisfies(npmv, eng.npm)) {
-    if (strict) {
-      var er = new Error("Unsupported")
-      er.code = "ENOTSUP"
-      er.required = eng
-      er.pkgid = target._id
-      return cb(er)
-    } else {
-      log.warn( "engine", "%s: wanted: %j (current: %j)"
-              , target._id, eng, {node: nodev, npm: npm.version} )
-    }
-  }
-  return cb()
-}
-
-function checkPlatform (target, cb) {
-  var platform = process.platform
-    , arch = process.arch
-    , osOk = true
-    , cpuOk = true
-    , force = npm.config.get("force")
-
-  if (force) {
-    return cb()
-  }
-
-  if (target.os) {
-    osOk = checkList(platform, target.os)
-  }
-  if (target.cpu) {
-    cpuOk = checkList(arch, target.cpu)
-  }
-  if (!osOk || !cpuOk) {
-    var er = new Error("Unsupported")
-    er.code = "EBADPLATFORM"
-    er.os = target.os || ['any']
-    er.cpu = target.cpu || ['any']
-    er.pkgid = target._id
-    return cb(er)
-  }
-  return cb()
-}
-
-function checkList (value, list) {
-  var tmp
-    , match = false
-    , blc = 0
-  if (typeof list === "string") {
-    list = [list]
-  }
-  if (list.length === 1 && list[0] === "any") {
-    return true
-  }
-  for (var i = 0; i < list.length; ++i) {
-    tmp = list[i]
-    if (tmp[0] === '!') {
-      tmp = tmp.slice(1)
-      if (tmp === value) {
-        return false
-      }
-      ++blc
-    } else {
-      match = match || tmp === value
-    }
-  }
-  return match || blc === list.length
-}
-
-function checkCycle (target, ancestors, cb) {
-  // there are some very rare and pathological edge-cases where
-  // a cycle can cause npm to try to install a never-ending tree
-  // of stuff.
-  // Simplest:
-  //
-  // A -> B -> A' -> B' -> A -> B -> A' -> B' -> A -> ...
-  //
-  // Solution: Simply flat-out refuse to install any name@version
-  // that is already in the prototype tree of the ancestors object.
-  // A more correct, but more complex, solution would be to symlink
-  // the deeper thing into the new location.
-  // Will do that if anyone whines about this irl.
-  //
-  // Note: `npm install foo` inside of the `foo` package will abort
-  // earlier if `--force` is not set.  However, if it IS set, then
-  // we need to still fail here, but just skip the first level. Of
-  // course, it'll still fail eventually if it's a true cycle, and
-  // leave things in an undefined state, but that's what is to be
-  // expected when `--force` is used.  That is why getPrototypeOf
-  // is used *twice* here: to skip the first level of repetition.
-
-  var p = Object.getPrototypeOf(Object.getPrototypeOf(ancestors))
-    , name = target.name
-    , version = target.version
-  while (p && p !== Object.prototype && p[name] !== version) {
-    p = Object.getPrototypeOf(p)
-  }
-  if (!p || p[name] !== version) return cb()
-
-  var er = new Error("Unresolvable cycle detected")
-  var tree = [target._id, JSON.parse(JSON.stringify(ancestors))]
-    , t = Object.getPrototypeOf(ancestors)
-  while (t && t !== Object.prototype) {
-    if (t === p) t.THIS_IS_P = true
-    tree.push(JSON.parse(JSON.stringify(t)))
-    t = Object.getPrototypeOf(t)
-  }
-  log.verbose("unresolvable dependency tree", tree)
-  er.pkgid = target._id
-  er.code = "ECYCLE"
-  return cb(er)
-}
-
-function checkGit (folder, cb) {
-  // if it's a git repo then don't touch it!
-  fs.lstat(folder, function (er, s) {
-    if (er || !s.isDirectory()) return cb()
-    else checkGit_(folder, cb)
-  })
-}
-
-function checkGit_ (folder, cb) {
-  fs.stat(path.resolve(folder, ".git"), function (er, s) {
-    if (!er && s.isDirectory()) {
-      var e = new Error("Appears to be a git repo or submodule.")
-      e.path = folder
-      e.code = "EISGIT"
-      return cb(e)
-    }
-    cb()
-  })
-}
-
-function write (target, targetFolder, context, cb_) {
-  var up = npm.config.get("unsafe-perm")
-    , user = up ? null : npm.config.get("user")
-    , group = up ? null : npm.config.get("group")
-    , family = context.family
-
-  function cb (er, data) {
-    // cache.unpack returns the data object, and all we care about
-    // is the list of installed packages from that last thing.
-    if (!er) return cb_(er, data)
-
-    if (false === npm.config.get("rollback")) return cb_(er)
-    npm.commands.unbuild([targetFolder], true, function (er2) {
-      if (er2) log.error("error rolling back", target._id, er2)
-      return cb_(er, data)
-    })
-  }
-
-  var bundled = []
-
-  chain
-    ( [ [ cache.unpack, target.name, target.version, targetFolder
-        , null, null, user, group ]
-      , [ fs, "writeFile"
-        , path.resolve(targetFolder, "package.json")
-        , JSON.stringify(target, null, 2) + "\n" ]
-      , [ lifecycle, target, "preinstall", targetFolder ]
-      , function (cb) {
-          if (!target.bundleDependencies) return cb()
-
-          var bd = path.resolve(targetFolder, "node_modules")
-          fs.readdir(bd, function (er, b) {
-            // nothing bundled, maybe
-            if (er) return cb()
-            bundled = b || []
-            cb()
-          })
-        } ]
-
-    // nest the chain so that we can throw away the results returned
-    // up until this point, since we really don't care about it.
-    , function X (er) {
-      if (er) return cb(er)
-
-      // before continuing to installing dependencies, check for a shrinkwrap.
-      var opt = { dev: npm.config.get("dev") }
-      readDependencies(context, targetFolder, opt, function (er, data, wrap) {
-        var deps = prepareForInstallMany(data, "dependencies", bundled, wrap,
-            family)
-        var depsTargetFolder = targetFolder
-        var depsContext = { family: family
-                         , ancestors: context.ancestors
-                         , parent: target
-                         , explicit: false
-                         , wrap: wrap }
-
-        var peerDeps = prepareForInstallMany(data, "peerDependencies", bundled,
-            wrap, family)
-        var pdTargetFolder = path.resolve(targetFolder, "..", "..")
-        var pdContext = context
-
-        var actions =
-          [ [ installManyAndBuild, deps, depsTargetFolder, depsContext ] ]
-
-        if (peerDeps.length > 0) {
-          actions.push(
-            [ installMany, peerDeps, pdTargetFolder, pdContext ]
-          )
-        }
-
-        chain(actions, cb)
-      })
-    })
-}
-
-function installManyAndBuild (deps, targetFolder, context, cb) {
-  installMany(deps, targetFolder, context, function (er, d) {
-    log.verbose("about to build", targetFolder)
-    if (er) return cb(er)
-    npm.commands.build( [targetFolder]
-                      , npm.config.get("global")
-                      , true
-                      , function (er) { return cb(er, d) })
-  })
-}
-
-function prepareForInstallMany (packageData, depsKey, bundled, wrap, family) {
-  var deps = Object.keys(packageData[depsKey] || {})
-
-  // don't install bundleDependencies, unless they're missing.
-  if (packageData.bundleDependencies) {
-    deps = deps.filter(function (d) {
-      return packageData.bundleDependencies.indexOf(d) === -1 ||
-             bundled.indexOf(d) === -1
-    })
-  }
-
-  return deps.filter(function (d) {
-    // prefer to not install things that are satisfied by
-    // something in the "family" list, unless we're installing
-    // from a shrinkwrap.
-    if (wrap) return wrap
-    if (semver.validRange(family[d], true))
-      return !semver.satisfies(family[d], packageData[depsKey][d], true)
-    return true
-  }).map(function (d) {
-    var t = packageData[depsKey][d]
-      , parsed = url.parse(t.replace(/^git\+/, "git"))
-    t = d + "@" + t
-    return t
-  })
-}
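The `checkList` helper in the hunk above encodes npm's `os`/`cpu` field matching: entries prefixed with `!` are blocklist entries, a lone `"any"` matches everything, and a list made up entirely of blocklist entries passes unless the value is explicitly blocked. A standalone, runnable sketch of the same rules (extracted from the code above; only the test values are illustrative):

```javascript
// Re-implementation of the os/cpu matching rules from checkList above.
function checkList (value, list) {
  if (typeof list === "string") list = [list]
  if (list.length === 1 && list[0] === "any") return true
  var match = false
    , blc = 0 // count of blocklist ("!"-prefixed) entries seen
  for (var i = 0; i < list.length; ++i) {
    var tmp = list[i]
    if (tmp[0] === "!") {
      // an explicit block always wins
      if (tmp.slice(1) === value) return false
      ++blc
    } else {
      match = match || tmp === value
    }
  }
  // a pure-blocklist list allows anything that wasn't blocked
  return match || blc === list.length
}
```

Note the final line: `["!win32"]` accepts any platform except `win32`, while a mixed list like `["linux", "!win32"]` only accepts the allowlisted values.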
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/link.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,169 +0,0 @@
-// link with no args: symlink the folder to the global location
-// link with package arg: symlink the global to the local
-
-var npm = require("./npm.js")
-  , symlink = require("./utils/link.js")
-  , fs = require("graceful-fs")
-  , log = require("npmlog")
-  , asyncMap = require("slide").asyncMap
-  , chain = require("slide").chain
-  , path = require("path")
-  , rm = require("rimraf")
-  , build = require("./build.js")
-
-module.exports = link
-
-link.usage = "npm link (in package dir)"
-           + "\nnpm link <pkg> (link global into local)"
-
-link.completion = function (opts, cb) {
-  var dir = npm.globalDir
-  fs.readdir(dir, function (er, files) {
-    cb(er, files.filter(function (f) {
-      return !f.match(/^[\._-]/)
-    }))
-  })
-}
-
-function link (args, cb) {
-  if (process.platform === "win32") {
-    var semver = require("semver")
-    if (!semver.satisfies(process.version, ">=0.7.9")) {
-      var msg = "npm link not supported on windows prior to node 0.7.9"
-        , e = new Error(msg)
-      e.code = "ENOTSUP"
-      e.errno = require("constants").ENOTSUP
-      return cb(e)
-    }
-  }
-
-  if (npm.config.get("global")) {
-    return cb(new Error("link should never be --global.\n"
-                       +"Please re-run this command with --local"))
-  }
-
-  if (args.length === 1 && args[0] === ".") args = []
-  if (args.length) return linkInstall(args, cb)
-  linkPkg(npm.prefix, cb)
-}
-
-function linkInstall (pkgs, cb) {
-  asyncMap(pkgs, function (pkg, cb) {
-    function n (er, data) {
-      if (er) return cb(er, data)
-      // install returns [ [folder, pkgId], ... ]
-      // but we definitely installed just one thing.
-      var d = data.filter(function (d) { return !d[3] })
-      pp = d[0][1]
-      pkg = path.basename(pp)
-      target = path.resolve(npm.dir, pkg)
-      next()
-    }
-
-    var t = path.resolve(npm.globalDir, "..")
-      , pp = path.resolve(npm.globalDir, pkg)
-      , rp = null
-      , target = path.resolve(npm.dir, pkg)
-
-    // if it's a folder or a random not-installed thing, then
-    // link or install it first
-    if (pkg.indexOf("/") !== -1 || pkg.indexOf("\\") !== -1) {
-      return fs.lstat(path.resolve(pkg), function (er, st) {
-        if (er || !st.isDirectory()) {
-          npm.commands.install(t, pkg, n)
-        } else {
-          rp = path.resolve(pkg)
-          linkPkg(rp, n)
-        }
-      })
-    }
-
-    fs.lstat(pp, function (er, st) {
-      if (er) {
-        rp = pp
-        return npm.commands.install(t, pkg, n)
-      } else if (!st.isSymbolicLink()) {
-        rp = pp
-        next()
-      } else {
-        return fs.realpath(pp, function (er, real) {
-          if (er) log.warn("invalid symbolic link", pkg)
-          else rp = real
-          next()
-        })
-      }
-    })
-
-    function next () {
-      chain
-        ( [ [npm.commands, "unbuild", [target]]
-          , [function (cb) {
-              log.verbose("link", "symlinking %s to %s",  pp, target)
-              cb()
-            }]
-          , [symlink, pp, target]
-          // do run lifecycle scripts - full build here.
-          , rp && [build, [target]]
-          , [ resultPrinter, pkg, pp, target, rp ] ]
-        , cb )
-    }
-  }, cb)
-}
-
-function linkPkg (folder, cb_) {
-  var me = folder || npm.prefix
-    , readJson = require("read-package-json")
-
-  log.verbose("linkPkg", folder)
-
-  readJson(path.resolve(me, "package.json"), function (er, d) {
-    function cb (er) {
-      return cb_(er, [[d && d._id, target, null, null]])
-    }
-    if (er) return cb(er)
-    var target = path.resolve(npm.globalDir, d.name)
-    rm(target, function (er) {
-      if (er) return cb(er)
-      symlink(me, target, function (er) {
-        if (er) return cb(er)
-        log.verbose("link", "build target", target)
-        // also install missing dependencies.
-        npm.commands.install(me, [], function (er, installed) {
-          if (er) return cb(er)
-          // build the global stuff.  Don't run *any* scripts, because
-          // install command already will have done that.
-          build([target], true, build._noLC, true, function (er) {
-            if (er) return cb(er)
-            resultPrinter(path.basename(me), me, target, cb)
-          })
-        })
-      })
-    })
-  })
-}
-
-function resultPrinter (pkg, src, dest, rp, cb) {
-  if (typeof cb !== "function") cb = rp, rp = null
-  var where = dest
-  rp = (rp || "").trim()
-  src = (src || "").trim()
-  // XXX If --json is set, then look up the data from the package.json
-  if (npm.config.get("parseable")) {
-    return parseableOutput(dest, rp || src, cb)
-  }
-  if (rp === src) rp = null
-  console.log(where + " -> " + src + (rp ? " -> " + rp: ""))
-  cb()
-}
-
-function parseableOutput (dest, rp, cb) {
-  // XXX this should match ls --parseable and install --parseable
-  // look up the data from package.json, format it the same way.
-  //
-  // link is always effectively "long", since it doesn't help much to
-  // *just* print the target folder.
-  // However, we don't actually ever read the version number, so
-  // the second field is always blank.
-  console.log(dest + "::" + rp)
-  cb()
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/ls.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,356 +0,0 @@
-
-// show the installed versions of packages
-//
-// --parseable creates output like this:
-// <fullpath>:<name@ver>:<realpath>:<flags>
-// Flags are a :-separated list of zero or more indicators
-
-module.exports = exports = ls
-
-var npm = require("./npm.js")
-  , readInstalled = require("read-installed")
-  , log = require("npmlog")
-  , path = require("path")
-  , archy = require("archy")
-  , semver = require("semver")
-  , url = require("url")
-  , isGitUrl = require("./utils/is-git-url.js")
-
-ls.usage = "npm ls"
-
-ls.completion = require("./utils/completion/installed-deep.js")
-
-function ls (args, silent, cb) {
-  if (typeof cb !== "function") cb = silent, silent = false
-
-  var dir = path.resolve(npm.dir, "..")
-
-  // npm ls 'foo@~1.3' bar 'baz@<2'
-  if (!args) args = []
-  else args = args.map(function (a) {
-    var nv = a.split("@")
-      , name = nv.shift()
-      , ver = semver.validRange(nv.join("@")) || ""
-
-    return [ name, ver ]
-  })
-
-  var depth = npm.config.get("depth")
-  readInstalled(dir, depth, log.warn, function (er, data) {
-    var bfs = bfsify(data, args)
-      , lite = getLite(bfs)
-
-    if (er || silent) return cb(er, data, lite)
-
-    var long = npm.config.get("long")
-      , json = npm.config.get("json")
-      , out
-    if (json) {
-      var seen = []
-      var d = long ? bfs : lite
-      // the raw data can be circular
-      out = JSON.stringify(d, function (k, o) {
-        if (typeof o === "object") {
-          if (-1 !== seen.indexOf(o)) return "[Circular]"
-          seen.push(o)
-        }
-        return o
-      }, 2)
-    } else if (npm.config.get("parseable")) {
-      out = makeParseable(bfs, long, dir)
-    } else if (data) {
-      out = makeArchy(bfs, long, dir)
-    }
-    console.log(out)
-
-    if (args.length && !data._found) process.exitCode = 1
-
-    // if any errors were found, then complain and exit status 1
-    if (lite.problems && lite.problems.length) {
-      er = lite.problems.join('\n')
-    }
-    cb(er, data, lite)
-  })
-}
-
-// only include
-function filter (data, args) {
-
-}
-
-function alphasort (a, b) {
-  a = a.toLowerCase()
-  b = b.toLowerCase()
-  return a > b ? 1
-       : a < b ? -1 : 0
-}
-
-function getLite (data, noname) {
-  var lite = {}
-    , maxDepth = npm.config.get("depth")
-
-  if (!noname && data.name) lite.name = data.name
-  if (data.version) lite.version = data.version
-  if (data.extraneous) {
-    lite.extraneous = true
-    lite.problems = lite.problems || []
-    lite.problems.push( "extraneous: "
-                      + data.name + "@" + data.version
-                      + " " + (data.path || "") )
-  }
-
-  if (data._from)
-    lite.from = data._from
-
-  if (data._resolved)
-    lite.resolved = data._resolved
-
-  if (data.invalid) {
-    lite.invalid = true
-    lite.problems = lite.problems || []
-    lite.problems.push( "invalid: "
-                      + data.name + "@" + data.version
-                      + " " + (data.path || "") )
-  }
-
-  if (data.peerInvalid) {
-    lite.peerInvalid = true
-    lite.problems = lite.problems || []
-    lite.problems.push( "peer invalid: "
-                      + data.name + "@" + data.version
-                      + " " + (data.path || "") )
-  }
-
-  if (data.dependencies) {
-    var deps = Object.keys(data.dependencies)
-    if (deps.length) lite.dependencies = deps.map(function (d) {
-      var dep = data.dependencies[d]
-      if (typeof dep === "string") {
-        lite.problems = lite.problems || []
-        var p
-        if (data.depth >= maxDepth) {
-          p = "max depth reached: "
-        } else {
-          p = "missing: "
-        }
-        p += d + "@" + dep
-          + ", required by "
-          + data.name + "@" + data.version
-        lite.problems.push(p)
-        return [d, { required: dep, missing: true }]
-      }
-      return [d, getLite(dep, true)]
-    }).reduce(function (deps, d) {
-      if (d[1].problems) {
-        lite.problems = lite.problems || []
-        lite.problems.push.apply(lite.problems, d[1].problems)
-      }
-      deps[d[0]] = d[1]
-      return deps
-    }, {})
-  }
-  return lite
-}
-
-function bfsify (root, args, current, queue, seen) {
-  // walk over the data, and turn it from this:
-  // +-- a
-  // |   `-- b
-  // |       `-- a (truncated)
-  // `--b (truncated)
-  // into this:
-  // +-- a
-  // `-- b
-  // which looks nicer
-  args = args || []
-  current = current || root
-  queue = queue || []
-  seen = seen || [root]
-  var deps = current.dependencies = current.dependencies || {}
-  Object.keys(deps).forEach(function (d) {
-    var dep = deps[d]
-    if (typeof dep !== "object") return
-    if (seen.indexOf(dep) !== -1) {
-      if (npm.config.get("parseable") || !npm.config.get("long")) {
-        delete deps[d]
-        return
-      } else {
-        dep = deps[d] = Object.create(dep)
-        dep.dependencies = {}
-      }
-    }
-    queue.push(dep)
-    seen.push(dep)
-  })
-
-  if (!queue.length) {
-    // if there were args, then only show the paths to found nodes.
-    return filterFound(root, args)
-  }
-  return bfsify(root, args, queue.shift(), queue, seen)
-}
-
-function filterFound (root, args) {
-  if (!args.length) return root
-  var deps = root.dependencies
-  if (deps) Object.keys(deps).forEach(function (d) {
-    var dep = filterFound(deps[d], args)
-
-    // see if this one itself matches
-    var found = false
-    for (var i = 0; !found && i < args.length; i ++) {
-      if (d === args[i][0]) {
-        found = semver.satisfies(dep.version, args[i][1], true)
-      }
-    }
-    // included explicitly
-    if (found) dep._found = true
-    // included because a child was included
-    if (dep._found && !root._found) root._found = 1
-    // not included
-    if (!dep._found) delete deps[d]
-  })
-  if (!root._found) root._found = false
-  return root
-}
-
-function makeArchy (data, long, dir) {
-  var out = makeArchy_(data, long, dir, 0)
-  return archy(out, "", { unicode: npm.config.get("unicode") })
-}
-
-function makeArchy_ (data, long, dir, depth, parent, d) {
-  var color = npm.color
-  if (typeof data === "string") {
-    if (depth < npm.config.get("depth")) {
-      // just missing
-      var p = parent.link || parent.path
-      var unmet = "UNMET DEPENDENCY"
-      if (color) {
-        unmet = "\033[31;40m" + unmet + "\033[0m"
-      }
-      data = unmet + " " + d + " " + data
-    } else {
-      data = d+"@"+ data
-    }
-    return data
-  }
-
-  var out = {}
-  // the top level is a bit special.
-  out.label = data._id || ""
-  if (data._found === true && data._id) {
-    var pre = color ? "\033[33;40m" : ""
-      , post = color ? "\033[m" : ""
-    out.label = pre + out.label.trim() + post + " "
-  }
-  if (data.link) out.label += " -> " + data.link
-
-  if (data.invalid) {
-    if (data.realName !== data.name) out.label += " ("+data.realName+")"
-    out.label += " " + (color ? "\033[31;40m" : "")
-              + "invalid"
-              + (color ? "\033[0m" : "")
-  }
-
-  if (data.peerInvalid) {
-    out.label += " " + (color ? "\033[31;40m" : "")
-              + "peer invalid"
-              + (color ? "\033[0m" : "")
-  }
-
-  if (data.extraneous && data.path !== dir) {
-    out.label += " " + (color ? "\033[32;40m" : "")
-              + "extraneous"
-              + (color ? "\033[0m" : "")
-  }
-
-  // add giturl to name@version
-  if (data._resolved) {
-    var p = url.parse(data._resolved)
-    if (isGitUrl(p))
-      out.label += " (" + data._resolved + ")"
-  }
-
-  if (long) {
-    if (dir === data.path) out.label += "\n" + dir
-    out.label += "\n" + getExtras(data, dir)
-  } else if (dir === data.path) {
-    if (out.label) out.label += " "
-    out.label += dir
-  }
-
-  // now all the children.
-  out.nodes = Object.keys(data.dependencies || {})
-    .sort(alphasort).map(function (d) {
-      return makeArchy_(data.dependencies[d], long, dir, depth + 1, data, d)
-    })
-
-  if (out.nodes.length === 0 && data.path === dir) {
-    out.nodes = ["(empty)"]
-  }
-
-  return out
-}
-
-function getExtras (data, dir) {
-  var extras = []
-
-  if (data.description) extras.push(data.description)
-  if (data.repository) extras.push(data.repository.url)
-  if (data.homepage) extras.push(data.homepage)
-  if (data._from) {
-    var from = data._from
-    if (from.indexOf(data.name + "@") === 0) {
-      from = from.substr(data.name.length + 1)
-    }
-    var u = url.parse(from)
-    if (u.protocol) extras.push(from)
-  }
-  return extras.join("\n")
-}
-
-
-function makeParseable (data, long, dir, depth, parent, d) {
-  depth = depth || 0
-
-  return [ makeParseable_(data, long, dir, depth, parent, d) ]
-  .concat(Object.keys(data.dependencies || {})
-    .sort(alphasort).map(function (d) {
-      return makeParseable(data.dependencies[d], long, dir, depth + 1, data, d)
-    }))
-  .filter(function (x) { return x })
-  .join("\n")
-}
-
-function makeParseable_ (data, long, dir, depth, parent, d) {
-  if (data.hasOwnProperty("_found") && data._found !== true) return ""
-
-  if (typeof data === "string") {
-    if (data.depth < npm.config.get("depth")) {
-      var p = parent.link || parent.path
-      data = npm.config.get("long")
-           ? path.resolve(parent.path, "node_modules", d)
-           + ":"+d+"@"+JSON.stringify(data)+":INVALID:MISSING"
-           : ""
-    } else {
-      data = path.resolve(data.path || "", "node_modules", d || "")
-           + (npm.config.get("long")
-             ? ":" + d + "@" + JSON.stringify(data)
-             + ":" // no realpath resolved
-             + ":MAXDEPTH"
-             : "")
-    }
-
-    return data
-  }
-
-  if (!npm.config.get("long")) return data.path
-
-  return data.path
-       + ":" + (data._id || "")
-       + ":" + (data.realPath !== data.path ? data.realPath : "")
-       + (data.extraneous ? ":EXTRANEOUS" : "")
-       + (data.invalid ? ":INVALID" : "")
-       + (data.peerInvalid ? ":PEERINVALID" : "")
-}
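The colon-delimited long-form line assembled by makeParseable_ above can be seen end-to-end in a small standalone sketch (the sample path and package id below are invented for illustration):

```javascript
// Rebuilds the long-form parseable line: path:id:realpath:FLAGS.
// The realpath field is emitted only when it differs from path.
function parseableLine (data) {
  return data.path
       + ":" + (data._id || "")
       + ":" + (data.realPath !== data.path ? data.realPath : "")
       + (data.extraneous ? ":EXTRANEOUS" : "")
       + (data.invalid ? ":INVALID" : "")
       + (data.peerInvalid ? ":PEERINVALID" : "")
}
```

For a package whose real path equals its install path, the third field stays empty, which is why extraneous entries show a double colon before the flag.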
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/npm.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,520 +0,0 @@
-;(function(){
-// windows: running "npm blah" in this folder will invoke WSH, not node.
-if (typeof WScript !== "undefined") {
-  WScript.echo("npm does not work when run\n"
-              +"with the Windows Scripting Host\n\n"
-              +"'cd' to a different directory,\n"
-              +"or type 'npm.cmd <args>',\n"
-              +"or type 'node npm <args>'.")
-  WScript.quit(1)
-  return
-}
-
-
-// monkey-patch support for 0.6 child processes
-require('child-process-close')
-
-var EventEmitter = require("events").EventEmitter
-  , npm = module.exports = new EventEmitter
-  , config = require("./config.js")
-  , npmconf = require("npmconf")
-  , log = require("npmlog")
-  , fs = require("graceful-fs")
-  , path = require("path")
-  , abbrev = require("abbrev")
-  , which = require("which")
-  , semver = require("semver")
-  , findPrefix = require("./utils/find-prefix.js")
-  , getUid = require("uid-number")
-  , mkdirp = require("mkdirp")
-  , slide = require("slide")
-  , chain = slide.chain
-  , RegClient = require("npm-registry-client")
-
-npm.config = {loaded: false}
-
-// /usr/local is often a read-only fs, which is not
-// well handled by node or mkdirp.  Just double-check
-// in the case of errors when making the prefix dirs.
-function mkdir (p, cb) {
-  mkdirp(p, function (er, made) {
-    // it could be that we couldn't create it, because it
-    // already exists, and is on a read-only fs.
-    if (er) {
-      return fs.stat(p, function (er2, st) {
-        if (er2 || !st.isDirectory()) return cb(er)
-        return cb(null, made)
-      })
-    }
-    return cb(er, made)
-  })
-}
-
-npm.commands = {}
-
-try {
-  var pv = process.version.replace(/^v/, '')
-  // startup, ok to do this synchronously
-  var j = JSON.parse(fs.readFileSync(
-    path.join(__dirname, "../package.json"))+"")
-  npm.version = j.version
-  npm.nodeVersionRequired = j.engines.node
-  if (!semver.satisfies(pv, j.engines.node)) {
-    log.warn("unsupported version", [""
-            ,"npm requires node version: "+j.engines.node
-            ,"And you have: "+pv
-            ,"which is not satisfactory."
-            ,""
-            ,"Bad things will likely happen.  You have been warned."
-            ,""].join("\n"))
-  }
-} catch (ex) {
-  try {
-    log.info("error reading version", ex)
-  } catch (er) {}
-  npm.version = ex
-}
-
-var commandCache = {}
-  // short names for common things
-  , aliases = { "rm" : "uninstall"
-              , "r" : "uninstall"
-              , "un" : "uninstall"
-              , "unlink" : "uninstall"
-              , "remove" : "uninstall"
-              , "rb" : "rebuild"
-              , "list" : "ls"
-              , "la" : "ls"
-              , "ll" : "ls"
-              , "ln" : "link"
-              , "i" : "install"
-              , "isntall" : "install"
-              , "up" : "update"
-              , "c" : "config"
-              , "info" : "view"
-              , "show" : "view"
-              , "find" : "search"
-              , "s" : "search"
-              , "se" : "search"
-              , "author" : "owner"
-              , "home" : "docs"
-              , "issues": "bugs"
-              , "unstar": "star" // same function
-              , "apihelp" : "help"
-              , "login": "adduser"
-              , "add-user": "adduser"
-              , "tst": "test"
-              , "find-dupes": "dedupe"
-              , "ddp": "dedupe"
-              , "v": "view"
-              }
-
-  , aliasNames = Object.keys(aliases)
-  // these are filenames in .
-  , cmdList = [ "install"
-              , "uninstall"
-              , "cache"
-              , "config"
-              , "set"
-              , "get"
-              , "update"
-              , "outdated"
-              , "prune"
-              , "submodule"
-              , "pack"
-              , "dedupe"
-
-              , "rebuild"
-              , "link"
-
-              , "publish"
-              , "star"
-              , "stars"
-              , "tag"
-              , "adduser"
-              , "unpublish"
-              , "owner"
-              , "deprecate"
-              , "shrinkwrap"
-
-              , "help"
-              , "help-search"
-              , "ls"
-              , "search"
-              , "view"
-              , "init"
-              , "version"
-              , "edit"
-              , "explore"
-              , "docs"
-              , "repo"
-              , "bugs"
-              , "faq"
-              , "root"
-              , "prefix"
-              , "bin"
-              , "whoami"
-
-              , "test"
-              , "stop"
-              , "start"
-              , "restart"
-              , "run-script"
-              , "completion"
-              ]
-  , plumbing = [ "build"
-               , "unbuild"
-               , "xmas"
-               , "substack"
-               , "visnup"
-               ]
-  , fullList = npm.fullList = cmdList.concat(aliasNames).filter(function (c) {
-      return plumbing.indexOf(c) === -1
-    })
-  , abbrevs = abbrev(fullList)
-
-Object.keys(abbrevs).concat(plumbing).forEach(function addCommand (c) {
-  Object.defineProperty(npm.commands, c, { get : function () {
-    if (!loaded) throw new Error(
-      "Call npm.load(conf, cb) before using this command.\n"+
-      "See the README.md or cli.js for example usage.")
-    var a = npm.deref(c)
-    if (c === "la" || c === "ll") {
-      npm.config.set("long", true)
-    }
-    npm.command = c
-    if (commandCache[a]) return commandCache[a]
-    var cmd = require(__dirname+"/"+a+".js")
-    commandCache[a] = function () {
-      var args = Array.prototype.slice.call(arguments, 0)
-      if (typeof args[args.length - 1] !== "function") {
-        args.push(defaultCb)
-      }
-      if (args.length === 1) args.unshift([])
-      cmd.apply(npm, args)
-    }
-    Object.keys(cmd).forEach(function (k) {
-      commandCache[a][k] = cmd[k]
-    })
-    return commandCache[a]
-  }, enumerable: fullList.indexOf(c) !== -1 })
-
-  // make css-case commands callable via camelCase as well
-  if (c.match(/\-([a-z])/)) {
-    addCommand(c.replace(/\-([a-z])/g, function (a, b) {
-      return b.toUpperCase()
-    }))
-  }
-})
-
-function defaultCb (er, data) {
-  if (er) console.error(er.stack || er.message)
-  else console.log(data)
-}
-
-npm.deref = function (c) {
-  if (!c) return ""
-  if (c.match(/[A-Z]/)) c = c.replace(/([A-Z])/g, function (m) {
-    return "-" + m.toLowerCase()
-  })
-  if (plumbing.indexOf(c) !== -1) return c
-  var a = abbrevs[c]
-  if (aliases[a]) a = aliases[a]
-  return a
-}
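A self-contained sketch of the deref expansion above: camelCase is mapped back to css-case, then the abbreviation table is consulted, then the alias table. The tables here are tiny illustrative subsets; npm builds the real `abbrevs` with the abbrev package over its full command list.

```javascript
// Illustrative subset tables, not npm's full lists.
var aliases = { "rm": "uninstall", "i": "install" }
var abbrevs = { "rm": "rm", "i": "i", "unin": "uninstall"
              , "help-search": "help-search" }

function deref (c) {
  if (!c) return ""
  // camelCase commands map back to their css-case filenames
  if (c.match(/[A-Z]/)) c = c.replace(/([A-Z])/g, function (m) {
    return "-" + m.toLowerCase()
  })
  var a = abbrevs[c]
  if (aliases[a]) a = aliases[a]
  return a
}
```

Note the order matters: `rm` abbreviates to itself, and only the alias table turns it into `uninstall`.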
-
-var loaded = false
-  , loading = false
-  , loadErr = null
-  , loadListeners = []
-
-function loadCb (er) {
-  loadListeners.forEach(function (cb) {
-    process.nextTick(cb.bind(npm, er, npm))
-  })
-  loadListeners.length = 0
-}
-
-npm.load = function (cli, cb_) {
-  if (!cb_ && typeof cli === "function") cb_ = cli , cli = {}
-  if (!cb_) cb_ = function () {}
-  if (!cli) cli = {}
-  loadListeners.push(cb_)
-  if (loaded || loadErr) return cb(loadErr)
-  if (loading) return
-  loading = true
-  var onload = true
-
-  function cb (er) {
-    if (loadErr) return
-    if (npm.config.get("force")) {
-      log.warn("using --force", "I sure hope you know what you are doing.")
-    }
-    npm.config.loaded = true
-    loaded = true
-    loadCb(loadErr = er)
-    if (onload = onload && npm.config.get("onload-script")) {
-      require(onload)
-      onload = false
-    }
-  }
-
-  log.pause()
-
-  load(npm, cli, cb)
-}
-
-function load (npm, cli, cb) {
-  which(process.argv[0], function (er, node) {
-    if (!er && node.toUpperCase() !== process.execPath.toUpperCase()) {
-      log.verbose("node symlink", node)
-      process.execPath = node
-      process.installPrefix = path.resolve(node, "..", "..")
-    }
-
-    // look up configs
-    //console.error("about to look up configs")
-
-    var builtin = path.resolve(__dirname, "..", "npmrc")
-    npmconf.load(cli, builtin, function (er, conf) {
-      if (er === conf) er = null
-
-      npm.config = conf
-
-      var color = conf.get("color")
-
-      log.level = conf.get("loglevel")
-      log.heading = "npm"
-      log.stream = conf.get("logstream")
-      switch (color) {
-        case "always": log.enableColor(); break
-        case false: log.disableColor(); break
-      }
-      log.resume()
-
-      if (er) return cb(er)
-
-      // see if we need to color normal output
-      switch (color) {
-        case "always":
-          npm.color = true
-          break
-        case false:
-          npm.color = false
-          break
-        default:
-          var tty = require("tty")
-          if (process.stdout.isTTY) npm.color = true
-          else if (!tty.isatty) npm.color = true
-          else if (tty.isatty(1)) npm.color = true
-          else npm.color = false
-          break
-      }
-
-      // at this point the configs are all set.
-      // go ahead and spin up the registry client.
-      var token = conf.get("_token")
-      if (typeof token === "string") {
-        try {
-          token = JSON.parse(token)
-          conf.set("_token", token, "user")
-          conf.save("user")
-        } catch (e) { token = null }
-      }
-
-      npm.registry = new RegClient(npm.config)
-
-      // save the token cookie in the config file
-      if (npm.registry.couchLogin) {
-        npm.registry.couchLogin.tokenSet = function (tok) {
-          npm.config.set("_token", tok, "user")
-          // ignore save error.  best effort.
-          npm.config.save("user")
-        }
-      }
-
-      var umask = npm.config.get("umask")
-      npm.modes = { exec: 0777 & (~umask)
-                  , file: 0666 & (~umask)
-                  , umask: umask }
-
-      chain([ [ loadPrefix, npm, cli ]
-            , [ setUser, conf, conf.root ]
-            , [ loadUid, npm ]
-            ], cb)
-    })
-  })
-}
-
-function loadPrefix (npm, conf, cb) {
-  // try to guess at a good node_modules location.
-  var p
-    , gp
-  if (!Object.prototype.hasOwnProperty.call(conf, "prefix")) {
-    p = process.cwd()
-  } else {
-    p = npm.config.get("prefix")
-  }
-  gp = npm.config.get("prefix")
-
-  findPrefix(p, function (er, p) {
-    Object.defineProperty(npm, "localPrefix",
-      { get : function () { return p }
-      , set : function (r) { return p = r }
-      , enumerable : true
-      })
-    // the prefix MUST exist, or else nothing works.
-    if (!npm.config.get("global")) {
-      mkdir(p, next)
-    } else {
-      next(er)
-    }
-  })
-
-  gp = path.resolve(gp)
-  Object.defineProperty(npm, "globalPrefix",
-    { get : function () { return gp }
-    , set : function (r) { return gp = r }
-    , enumerable : true
-    })
-  // the prefix MUST exist, or else nothing works.
-  mkdir(gp, next)
-
-
-  var i = 2
-    , errState = null
-  function next (er) {
-    if (errState) return
-    if (er) return cb(errState = er)
-    if (--i === 0) return cb()
-  }
-}
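loadPrefix above joins its two mkdir calls with a counter plus an errState latch. The same idiom, generalized to N parallel tasks (a sketch, not code from npm itself):

```javascript
// Counter/errState join: start every task, report only the first
// error, and invoke the final callback exactly once when all finish.
function parallel (tasks, cb) {
  var pending = tasks.length
    , errState = null
  if (pending === 0) return cb()
  function next (er) {
    if (errState) return
    if (er) return cb(errState = er)
    if (--pending === 0) return cb()
  }
  tasks.forEach(function (t) { t(next) })
}
```

The latch is what makes it safe for several independent callbacks to share one `next`: after the first error, later completions are ignored rather than calling `cb` twice.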
-
-
-function loadUid (npm, cb) {
-  // if we're not in unsafe-perm mode, then figure out who
-  // to run stuff as.  Do this first, to support `npm update npm -g`
-  if (!npm.config.get("unsafe-perm")) {
-    getUid(npm.config.get("user"), npm.config.get("group"), cb)
-  } else {
-    process.nextTick(cb)
-  }
-}
-
-function setUser (cl, dc, cb) {
-  // If global, leave it as-is.
-  // If not global, then set the user to the owner of the prefix folder.
-  // Just set the default, so it can be overridden.
-  if (cl.get("global")) return cb()
-  if (process.env.SUDO_UID) {
-    dc.user = +(process.env.SUDO_UID)
-    return cb()
-  }
-
-  var prefix = path.resolve(cl.get("prefix"))
-  mkdir(prefix, function (er) {
-    if (er) {
-      log.error("could not create prefix dir", prefix)
-      return cb(er)
-    }
-    fs.stat(prefix, function (er, st) {
-      dc.user = st && st.uid
-      return cb(er)
-    })
-  })
-}
-
-
-Object.defineProperty(npm, "prefix",
-  { get : function () {
-      return npm.config.get("global") ? npm.globalPrefix : npm.localPrefix
-    }
-  , set : function (r) {
-      var k = npm.config.get("global") ? "globalPrefix" : "localPrefix"
-      return npm[k] = r
-    }
-  , enumerable : true
-  })
-
-Object.defineProperty(npm, "bin",
-  { get : function () {
-      if (npm.config.get("global")) return npm.globalBin
-      return path.resolve(npm.root, ".bin")
-    }
-  , enumerable : true
-  })
-
-Object.defineProperty(npm, "globalBin",
-  { get : function () {
-      var b = npm.globalPrefix
-      if (process.platform !== "win32") b = path.resolve(b, "bin")
-      return b
-    }
-  })
-
-Object.defineProperty(npm, "dir",
-  { get : function () {
-      if (npm.config.get("global")) return npm.globalDir
-      return path.resolve(npm.prefix, "node_modules")
-    }
-  , enumerable : true
-  })
-
-Object.defineProperty(npm, "globalDir",
-  { get : function () {
-      return (process.platform !== "win32")
-           ? path.resolve(npm.globalPrefix, "lib", "node_modules")
-           : path.resolve(npm.globalPrefix, "node_modules")
-    }
-  , enumerable : true
-  })
-
-Object.defineProperty(npm, "root",
-  { get : function () { return npm.dir } })
-
-Object.defineProperty(npm, "cache",
-  { get : function () { return npm.config.get("cache") }
-  , set : function (r) { return npm.config.set("cache", r) }
-  , enumerable : true
-  })
-
-var tmpFolder
-var crypto = require("crypto")
-var rand = crypto.randomBytes(6)
-                 .toString("base64")
-                 .replace(/\//g, '_')
-                 .replace(/\+/, '-')
-Object.defineProperty(npm, "tmp",
-  { get : function () {
-      if (!tmpFolder) tmpFolder = "npm-" + process.pid + "-" + rand
-      return path.resolve(npm.config.get("tmp"), tmpFolder)
-    }
-  , enumerable : true
-  })
-
-// the better to repl you with
-Object.getOwnPropertyNames(npm.commands).forEach(function (n) {
-  if (npm.hasOwnProperty(n) || n === "config") return
-
-  Object.defineProperty(npm, n, { get: function () {
-    return function () {
-      var args = Array.prototype.slice.call(arguments, 0)
-        , cb = defaultCb
-
-      if (args.length === 1 && Array.isArray(args[0])) {
-        args = args[0]
-      }
-
-      if (typeof args[args.length - 1] === "function") {
-        cb = args.pop()
-      }
-
-      npm.commands[n](args, cb)
-    }
-  }, enumerable: false, configurable: true })
-})
-
-if (require.main === module) {
-  require("../bin/npm-cli.js")
-}
-})()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/outdated.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,191 +0,0 @@
-/*
-
-npm outdated [pkg]
-
-Does the following:
-
-1. check for a new version of pkg
-
-If no packages are specified, then run for all installed
-packages.
-
-*/
-
-module.exports = outdated
-
-outdated.usage = "npm outdated [<pkg> [<pkg> ...]]"
-
-outdated.completion = require("./utils/completion/installed-deep.js")
-
-
-var path = require("path")
-  , fs = require("graceful-fs")
-  , readJson = require("read-package-json")
-  , cache = require("./cache.js")
-  , asyncMap = require("slide").asyncMap
-  , npm = require("./npm.js")
-  , url = require("url")
-
-function outdated (args, silent, cb) {
-  if (typeof cb !== "function") cb = silent, silent = false
-  var dir = path.resolve(npm.dir, "..")
-  outdated_(args, dir, {}, function (er, list) {
-    if (er || silent) return cb(er, list)
-    var outList = list.map(makePretty)
-    console.log(outList.join("\n"))
-    cb(null, list)
-  })
-}
-
-// [[ dir, dep, has, want ]]
-function makePretty (p) {
-  var parseable = npm.config.get("parseable")
-    , long = npm.config.get("long")
-    , dep = p[1]
-    , dir = path.resolve(p[0], "node_modules", dep)
-    , has = p[2]
-    , want = p[3]
-    , latest = p[4]
-
-  // XXX add --json support
-  // Should match (more or less) the output of ls --json
-
-  if (parseable) {
-    var str = dir
-    if (npm.config.get("long")) {
-      str += ":" + dep + "@" + want
-           + ":" + (has ? (dep + "@" + has) : "MISSING")
-    }
-    return str
-  }
-
-  if (!npm.config.get("global")) {
-    dir = path.relative(process.cwd(), dir)
-  }
-  return dep + " " + dir
-       + " current=" + (has || "MISSING")
-       + " wanted=" + want
-       + " latest=" + latest
-}
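The human-readable line from the non-parseable branch of makePretty can likewise be sketched standalone (sample values below are invented):

```javascript
// Mirrors makePretty's default output:
// "<dep> <dir> current=<has|MISSING> wanted=<want> latest=<latest>".
function prettyLine (dep, dir, has, want, latest) {
  return dep + " " + dir
       + " current=" + (has || "MISSING")
       + " wanted=" + want
       + " latest=" + latest
}
```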
-
-function outdated_ (args, dir, parentHas, cb) {
-  // get the deps from package.json, or {<dir/node_modules/*>:"*"}
-  // asyncMap over deps:
-  //   shouldHave = cache.add(dep, req).version
-  //   if has === shouldHave then
-  //     return outdated(args, dir/node_modules/dep, parentHas + has)
-  //   else if dep in args or args is empty
-  //     return [dir, dep, has, shouldHave]
-
-  var deps = null
-  readJson(path.resolve(dir, "package.json"), function (er, d) {
-    if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er)
-    deps = (er) ? true : (d.dependencies || {})
-    var doUpdate = npm.config.get("dev") ||
-                    (!npm.config.get("production") &&
-                    !Object.keys(parentHas).length &&
-                    !npm.config.get("global"))
-    if (!er && d && doUpdate) {
-      Object.keys(d.devDependencies || {}).forEach(function (k) {
-        if (!(k in parentHas)) {
-          deps[k] = d.devDependencies[k]
-        }
-      })
-    }
-    return next()
-  })
-
-  var has = null
-  fs.readdir(path.resolve(dir, "node_modules"), function (er, pkgs) {
-    if (er) {
-      has = Object.create(parentHas)
-      return next()
-    }
-    pkgs = pkgs.filter(function (p) {
-      return !p.match(/^[\._-]/)
-    })
-    asyncMap(pkgs, function (pkg, cb) {
-      var jsonFile = path.resolve(dir, "node_modules", pkg, "package.json")
-      readJson(jsonFile, function (er, d) {
-        if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er)
-        cb(null, er ? [] : [[d.name, d.version, d._from]])
-      })
-    }, function (er, pvs) {
-      if (er) return cb(er)
-      has = Object.create(parentHas)
-      pvs.forEach(function (pv) {
-        has[pv[0]] = {
-          version: pv[1],
-          from: pv[2]
-        }
-      })
-
-      next()
-    })
-  })
-
-  function next () {
-    if (!has || !deps) return
-    if (deps === true) {
-      deps = Object.keys(has).reduce(function (l, r) {
-        l[r] = "*"
-        return l
-      }, {})
-    }
-
-    // now get what we should have, based on the dep.
-    // if has[dep] !== shouldHave[dep], then cb with the data
-    // otherwise dive into the folder
-    asyncMap(Object.keys(deps), function (dep, cb) {
-      shouldUpdate(args, dir, dep, has, deps[dep], cb)
-    }, cb)
-  }
-}
-
-function shouldUpdate (args, dir, dep, has, req, cb) {
-  // look up the most recent version.
-  // if that's what we already have, or if it's not on the args list,
-  // then dive into it.  Otherwise, cb() with the data.
-
-  // { version: , from: }
-  var curr = has[dep]
-
-  function skip () {
-    outdated_( args
-             , path.resolve(dir, "node_modules", dep)
-             , has
-             , cb )
-  }
-
-  function doIt (wanted, latest) {
-    cb(null, [[ dir, dep, curr && curr.version, wanted, latest, req ]])
-  }
-
-  if (args.length && args.indexOf(dep) === -1) {
-    return skip()
-  }
-
-  var registry = npm.registry
-  // search for the latest package
-  registry.get(dep + "/latest", function (er, l) {
-    if (er) return cb()
-    // so, we can conceivably update this.  find out if we need to.
-    cache.add(dep, req, function (er, d) {
-      // if this fails, then it means we can't update this thing.
-      // it's probably a thing that isn't published.
-      if (er) return skip()
-
-      // check that the url origin hasn't changed (#1727) and that
-      // there is no newer version available
-      var dFromUrl = d._from && url.parse(d._from).protocol
-      var cFromUrl = curr && curr.from && url.parse(curr.from).protocol
-
-      if (!curr || dFromUrl && cFromUrl && d._from !== curr.from
-          || d.version !== curr.version
-          || d.version !== l.version)
-        doIt(d.version, l.version)
-      else
-        skip()
-    })
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/owner.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,205 +0,0 @@
-
-module.exports = owner
-
-owner.usage = "npm owner add <username> <pkg>"
-            + "\nnpm owner rm <username> <pkg>"
-            + "\nnpm owner ls <pkg>"
-
-owner.completion = function (opts, cb) {
-  var argv = opts.conf.argv.remain
-  if (argv.length > 4) return cb()
-  if (argv.length <= 2) {
-    var subs = ["add", "rm"]
-    if (opts.partialWord === "l") subs.push("ls")
-    else subs.push("ls", "list")
-    return cb(null, subs)
-  }
-  var un = encodeURIComponent(npm.config.get("username"))
-  switch (argv[2]) {
-    case "ls":
-      if (argv.length > 3) return cb()
-      else return registry.get("/-/short", cb)
-
-    case "rm":
-      if (argv.length > 3) {
-        var theUser = encodeURIComponent(argv[3])
-          , uri = "/-/by-user/"+theUser+"|"+un
-        console.error(uri)
-        return registry.get(uri, function (er, d) {
-          if (er) return cb(er)
-          // return the intersection
-          return cb(null, d[theUser].filter(function (p) {
-            // kludge for server adminery.
-            return un === "isaacs" || d[un].indexOf(p) === -1
-          }))
-        })
-      }
-      // else fallthrough
-    case "add":
-      if (argv.length > 3) {
-        var theUser = encodeURIComponent(argv[3])
-          , uri = "/-/by-user/"+theUser+"|"+un
-        console.error(uri)
-        return registry.get(uri, function (er, d) {
-          console.error(uri, er || d)
-          // return mine that they're not already on.
-          if (er) return cb(er)
-          var mine = d[un] || []
-            , theirs = d[theUser] || []
-          return cb(null, mine.filter(function (p) {
-            return theirs.indexOf(p) === -1
-          }))
-        })
-      }
-      // just list all users who aren't me.
-      return registry.get("/-/users", function (er, list) {
-        if (er) return cb()
-        return cb(null, Object.keys(list).filter(function (n) {
-          return n !== un
-        }))
-      })
-
-    default:
-      return cb()
-  }
-}
-
-var npm = require("./npm.js")
-  , registry = npm.registry
-  , log = require("npmlog")
-  , readJson = require("read-package-json")
-
-function owner (args, cb) {
-  var action = args.shift()
-  switch (action) {
-    case "ls": case "list": return ls(args[0], cb)
-    case "add": return add(args[0], args[1], cb)
-    case "rm": case "remove": return rm(args[0], args[1], cb)
-    default: return unknown(action, cb)
-  }
-}
-
-function ls (pkg, cb) {
-  if (!pkg) return readLocalPkg(function (er, pkg) {
-    if (er) return cb(er)
-    if (!pkg) return cb(owner.usage)
-    ls(pkg, cb)
-  })
-
-  registry.get(pkg, function (er, data) {
-    var msg = ""
-    if (er) {
-      log.error("owner ls", "Couldn't get owner data", pkg)
-      return cb(er)
-    }
-    var owners = data.maintainers
-    if (!owners || !owners.length) msg = "admin party!"
-    else msg = owners.map(function (o) { return o.name +" <"+o.email+">" }).join("\n")
-    console.log(msg)
-    cb(er, owners)
-  })
-}
-
-function add (user, pkg, cb) {
-  if (!user) return cb(owner.usage)
-  if (!pkg) return readLocalPkg(function (er, pkg) {
-    if (er) return cb(er)
-    if (!pkg) return cb(new Error(owner.usage))
-    add(user, pkg, cb)
-  })
-
-  log.verbose("owner add", "%s to %s", user, pkg)
-  mutate(pkg, user, function (u, owners) {
-    if (!owners) owners = []
-    for (var i = 0, l = owners.length; i < l; i ++) {
-      var o = owners[i]
-      if (o.name === u.name) {
-        log.info( "owner add"
-                , "Already a package owner: "+o.name+" <"+o.email+">")
-        return false
-      }
-    }
-    owners.push(u)
-    return owners
-  }, cb)
-}
-
-function rm (user, pkg, cb) {
-  if (!pkg) return readLocalPkg(function (er, pkg) {
-    if (er) return cb(er)
-    if (!pkg) return cb(new Error(owner.usage))
-    rm(user, pkg, cb)
-  })
-
-  log.verbose("owner rm", "%s from %s", user, pkg)
-  mutate(pkg, null, function (u, owners) {
-    var found = false
-      , m = owners.filter(function (o) {
-          var match = (o.name === user)
-          found = found || match
-          return !match
-        })
-    if (!found) {
-      log.info("owner rm", "Not a package owner: "+user)
-      return false
-    }
-    if (!m.length) return new Error(
-      "Cannot remove all owners of a package.  Add someone else first.")
-    return m
-  }, cb)
-}
-
-function mutate (pkg, user, mutation, cb) {
-  if (user) {
-    registry.get("/-/user/org.couchdb.user:"+user, mutate_)
-  } else {
-    mutate_(null, null)
-  }
-
-  function mutate_ (er, u) {
-    if (!er && user && (!u || u.error)) er = new Error(
-      "Couldn't get user data for "+user+": "+JSON.stringify(u))
-
-    if (er) {
-      log.error("owner mutate", "Error getting user data for %s", user)
-      return cb(er)
-    }
-
-    if (u) u = { "name" : u.name, "email" : u.email }
-    registry.get(pkg, function (er, data) {
-      if (er) {
-        log.error("owner mutate", "Error getting package data for %s", pkg)
-        return cb(er)
-      }
-      var m = mutation(u, data.maintainers)
-      if (!m) return cb() // handled
-      if (m instanceof Error) return cb(m) // error
-      data = { _id : data._id
-             , _rev : data._rev
-             , maintainers : m
-             }
-      registry.request("PUT"
-          , pkg+"/-rev/"+data._rev, data
-          , function (er, data) {
-        if (!er && data.error) er = new Error(
-          "Failed to update package metadata: "+JSON.stringify(data))
-        if (er) {
-          log.error("owner mutate", "Failed to update package metadata")
-        }
-        cb(er, data)
-      })
-    })
-  }
-}
-
-function readLocalPkg (cb) {
-  if (npm.config.get("global")) return cb()
-  var path = require("path")
-  readJson(path.resolve(npm.prefix, "package.json"), function (er, d) {
-    return cb(er, d && d.name)
-  })
-}
-
-function unknown (action, cb) {
-  cb("Usage: \n"+owner.usage)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/pack.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,65 +0,0 @@
-// npm pack <pkg>
-// Packs the specified package into a .tgz file, which can then
-// be installed.
-
-module.exports = pack
-
-var npm = require("./npm.js")
-  , install = require("./install.js")
-  , cache = require("./cache.js")
-  , fs = require("graceful-fs")
-  , chain = require("slide").chain
-  , path = require("path")
-  , cwd = process.cwd()
-
-pack.usage = "npm pack <pkg>"
-
-// if it can be installed, it can be packed.
-pack.completion = install.completion
-
-function pack (args, silent, cb) {
-  if (typeof cb !== "function") cb = silent, silent = false
-
-  if (args.length === 0) args = ["."]
-
-  chain(args.map(function (arg) { return function (cb) {
-    pack_(arg, cb)
-  }}), function (er, files) {
-    if (er || silent) return cb(er, files)
-    printFiles(files, cb)
-  })
-}
-
-function printFiles (files, cb) {
-  files = files.map(function (file) {
-    return path.relative(cwd, file)
-  })
-  console.log(files.join("\n"))
-  cb()
-}
-
-// add to cache, then cp to the cwd
-function pack_ (pkg, cb) {
-  cache.add(pkg, function (er, data) {
-    if (er) return cb(er)
-    var fname = path.resolve(data._id.replace(/@/g, "-") + ".tgz")
-      , cached = path.resolve( npm.cache
-                             , data.name
-                             , data.version
-                             , "package.tgz" )
-      , from = fs.createReadStream(cached)
-      , to = fs.createWriteStream(fname)
-      , errState = null
-
-    from.on("error", cb_)
-    to.on("error", cb_)
-    to.on("close", cb_)
-    from.pipe(to)
-
-    function cb_ (er) {
-      if (errState) return
-      if (er) return cb(errState = er)
-      cb(null, fname)
-    }
-  })
-}
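pack_ above wires three stream events (two "error", one "close") into a single guarded callback. That cb_ idiom can be factored out as a once-only wrapper; this sketch adds an explicit `done` flag so it stands alone, whereas pack_ can rely on "close" firing at most once:

```javascript
// Once-only callback guard: several event handlers funnel into one
// callback that fires a single time and prefers the first error.
function once (cb) {
  var errState = null
    , done = false
  return function (er, val) {
    if (errState || done) return
    if (er) return cb(errState = er)
    done = true
    cb(null, val)
  }
}
```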
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/prefix.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,11 +0,0 @@
-module.exports = prefix
-
-var npm = require("./npm.js")
-
-prefix.usage = "npm prefix\nnpm prefix -g\n(just prints the prefix folder)"
-
-function prefix (args, silent, cb) {
-  if (typeof cb !== "function") cb = silent, silent = false
-  if (!silent) console.log(npm.prefix)
-  process.nextTick(cb.bind(this, null, npm.prefix))
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/prune.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,52 +0,0 @@
-// prune extraneous packages.
-
-module.exports = prune
-
-prune.usage = "npm prune"
-
-var readInstalled = require("read-installed")
-  , npm = require("./npm.js")
-  , path = require("path")
-  , readJson = require("read-package-json")
-  , log = require("npmlog")
-
-prune.completion = require("./utils/completion/installed-deep.js")
-
-function prune (args, cb) {
-  var jsonFile = path.resolve(npm.dir, "..", "package.json" )
-  readJson(jsonFile, log.warn, function (er, packageData) {
-    if (er) return cb(er)
-    readInstalled(npm.prefix, npm.config.get("depth"), function (er, data) {
-      if (er) return cb(er)
-      if (npm.config.get("production")) {
-        Object.keys(packageData.devDependencies || {}).forEach(function (k) {
-          if (data.dependencies[k]) data.dependencies[k].extraneous = true
-        })
-      }
-      prune_(args, data, cb)
-    })
-  })
-}
-
-function prune_ (args, data, cb) {
-  npm.commands.unbuild(prunables(args, data, []), cb)
-}
-
-function prunables (args, data, seen) {
-  var deps = data.dependencies || {}
-  return Object.keys(deps).map(function (d) {
-    if (typeof deps[d] !== "object"
-        || seen.indexOf(deps[d]) !== -1) return null
-    seen.push(deps[d])
-    if (deps[d].extraneous
-        && (args.length === 0 || args.indexOf(d) !== -1)) {
-      var extra = deps[d]
-      delete deps[d]
-      return extra.path
-    }
-    return prunables(args, deps[d], seen)
-  }).filter(function (d) { return d !== null })
-  .reduce(function FLAT (l, r) {
-    return l.concat(Array.isArray(r) ? r.reduce(FLAT,[]) : r)
-  }, [])
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/publish.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,117 +0,0 @@
-
-module.exports = publish
-
-var npm = require("./npm.js")
-  , log = require("npmlog")
-  , tar = require("./utils/tar.js")
-  , path = require("path")
-  , readJson = require("read-package-json")
-  , fs = require("graceful-fs")
-  , lifecycle = require("./utils/lifecycle.js")
-  , chain = require("slide").chain
-  , Conf = require("npmconf").Conf
-  , RegClient = require("npm-registry-client")
-
-publish.usage = "npm publish <tarball>"
-              + "\nnpm publish <folder>"
-              + "\n\nPublishes '.' if no argument supplied"
-
-publish.completion = function (opts, cb) {
-  // publish can complete to a folder with a package.json
-  // or a tarball, or a tarball url.
-  // for now, not yet implemented.
-  return cb()
-}
-
-function publish (args, isRetry, cb) {
-  if (typeof cb !== "function") cb = isRetry, isRetry = false
-  if (args.length === 0) args = ["."]
-  if (args.length !== 1) return cb(publish.usage)
-
-  log.verbose("publish", args)
-  var arg = args[0]
-  // if it's a local folder, then run the prepublish there, first.
-  readJson(path.resolve(arg, "package.json"), function (er, data) {
-    er = needVersion(er, data)
-    if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er)
-    // error is ok.  could be publishing a url or tarball
-    // however, that means that we will not have automatically run
-    // the prepublish script, since that gets run when adding a folder
-    // to the cache.
-    if (er) return cacheAddPublish(arg, false, isRetry, cb)
-    cacheAddPublish(arg, true, isRetry, cb)
-  })
-}
-
-// didPre in this case means that we already ran the prepublish script,
-// and that the "dir" is an actual directory, and not something silly
-// like a tarball or name@version thing.
-// That means that we can run publish/postpublish in the dir, rather than
-// in the cache dir.
-function cacheAddPublish (dir, didPre, isRetry, cb) {
-  npm.commands.cache.add(dir, function (er, data) {
-    if (er) return cb(er)
-    log.silly("publish", data)
-    var cachedir = path.resolve( npm.cache
-                               , data.name
-                               , data.version
-                               , "package" )
-    chain
-      ( [ !didPre && [lifecycle, data, "prepublish", cachedir]
-        , [publish_, dir, data, isRetry, cachedir]
-        , [lifecycle, data, "publish", didPre ? dir : cachedir]
-        , [lifecycle, data, "postpublish", didPre ? dir : cachedir] ]
-      , cb )
-  })
-}
-
-function publish_ (arg, data, isRetry, cachedir, cb) {
-  if (!data) return cb(new Error("no package.json file found"))
-
-  // check for publishConfig hash
-  var registry = npm.registry
-  if (data.publishConfig) {
-    var pubConf = new Conf(npm.config)
-
-    // don't modify the actual publishConfig object, in case we have
-    // to set a login token or some other data.
-    pubConf.unshift(Object.keys(data.publishConfig).reduce(function (s, k) {
-      s[k] = data.publishConfig[k]
-      return s
-    }, {}))
-    registry = new RegClient(pubConf)
-  }
-
-  data._npmVersion = npm.version
-  data._npmUser = { name: npm.config.get("username")
-                  , email: npm.config.get("email") }
-
-  delete data.modules
-  if (data.private) return cb(new Error
-    ("This package has been marked as private\n"
-    +"Remove the 'private' field from the package.json to publish it."))
-
-  var tarball = cachedir + ".tgz"
-  registry.publish(data, tarball, function (er) {
-    if (er && er.code === "EPUBLISHCONFLICT"
-        && npm.config.get("force") && !isRetry) {
-      log.warn("publish", "Forced publish over "+data._id)
-      return npm.commands.unpublish([data._id], function (er) {
-        // ignore errors.  Use the force.  Reach out with your feelings.
-        // but if it fails again, then report the first error.
-        publish([arg], er || true, cb)
-      })
-    }
-    // report the unpublish error if this was a retry and unpublish failed
-    if (er && isRetry && isRetry !== true) return cb(isRetry)
-    if (er) return cb(er)
-    console.log("+ " + data._id)
-    cb()
-  })
-}
-
-function needVersion(er, data) {
-  return er ? er
-       : (data && !data.version) ? new Error("No version provided")
-       : null
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/rebuild.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,74 +0,0 @@
-
-module.exports = rebuild
-
-var readInstalled = require("read-installed")
-  , semver = require("semver")
-  , log = require("npmlog")
-  , path = require("path")
-  , npm = require("./npm.js")
-  , asyncMap = require("slide").asyncMap
-  , fs = require("graceful-fs")
-
-rebuild.usage = "npm rebuild [<name>[@<version>] [name[@<version>] ...]]"
-
-rebuild.completion = require("./utils/completion/installed-deep.js")
-
-function rebuild (args, cb) {
-  readInstalled(npm.prefix, npm.config.get("depth"), function (er, data) {
-    log.info("readInstalled", typeof data)
-    if (er) return cb(er)
-    var set = filter(data, args)
-      , folders = Object.keys(set).filter(function (f) {
-          return f !== npm.prefix
-        })
-    if (!folders.length) return cb()
-    log.silly("rebuild set", folders)
-    cleanBuild(folders, set, cb)
-  })
-}
-
-function cleanBuild (folders, set, cb) {
-  npm.commands.build(folders, function (er) {
-    if (er) return cb(er)
-    console.log(folders.map(function (f) {
-      return set[f] + " " + f
-    }).join("\n"))
-    cb()
-  })
-}
-
-function filter (data, args, set, seen) {
-  if (!set) set = {}
-  if (!seen) seen = {}
-  if (set.hasOwnProperty(data.path)) return set
-  if (seen.hasOwnProperty(data.path)) return set
-  seen[data.path] = true
-  var pass
-  if (!args.length) pass = true // rebuild everything
-  else if (data.name && data._id) {
-    for (var i = 0, l = args.length; i < l; i ++) {
-      var arg = args[i]
-        , nv = arg.split("@")
-        , n = nv.shift()
-        , v = nv.join("@")
-      if (n !== data.name) continue
-      if (!semver.satisfies(data.version, v, true)) continue
-      pass = true
-      break
-    }
-  }
-  if (pass && data._id) {
-    log.verbose("rebuild", "path, id", [data.path, data._id])
-    set[data.path] = data._id
-  }
-  // need to also dive through kids, always.
-  // since this isn't an install these won't get auto-built unless
-  // they're not dependencies.
-  Object.keys(data.dependencies || {}).forEach(function (d) {
-    // return
-    var dep = data.dependencies[d]
-    if (typeof dep === "string") return
-    filter(dep, args, set, seen)
-  })
-  return set
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/repo.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,37 +0,0 @@
-
-module.exports = repo
-
-repo.usage = "npm repo <pkgname>"
-
-repo.completion = function (opts, cb) {
-  if (opts.conf.argv.remain.length > 2) return cb()
-  registry.get("/-/short", 60000, function (er, list) {
-    return cb(null, list || [])
-  })
-}
-
-var npm = require("./npm.js")
-  , registry = npm.registry
-  , log = require("npmlog")
-  , opener = require("opener")
-  , github = require('github-url-from-git')
-  , githubUserRepo = require("github-url-from-username-repo")
-
-function repo (args, cb) {
-  if (!args.length) return cb(repo.usage)
-  var n = args[0].split("@").shift()
-  registry.get(n + "/latest", 3600, function (er, d) {
-    if (er) return cb(er)
-    var r = d.repository;
-    if (!r) return cb(new Error('no repository'));
-    // XXX remove this when npm@v1.3.10 from node 0.10 is deprecated
-    // from https://github.com/isaacs/npm-www/issues/418
-    if (githubUserRepo(r.url))
-      r.url = githubUserRepo(r.url)
-
-    var url = github(r.url)
-    if (!url)
-      return cb(new Error('no repository: could not get url'))
-    opener(url, { command: npm.config.get("browser") }, cb)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/restart.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require("./utils/lifecycle.js").cmd("restart")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/root.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,11 +0,0 @@
-module.exports = root
-
-var npm = require("./npm.js")
-
-root.usage = "npm root\nnpm root -g\n(just prints the root folder)"
-
-function root (args, silent, cb) {
-  if (typeof cb !== "function") cb = silent, silent = false
-  if (!silent) console.log(npm.dir)
-  process.nextTick(cb.bind(this, null, npm.dir))
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/run-script.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,102 +0,0 @@
-
-module.exports = runScript
-
-var lifecycle = require("./utils/lifecycle.js")
-  , npm = require("./npm.js")
-  , path = require("path")
-  , readJson = require("read-package-json")
-  , log = require("npmlog")
-  , chain = require("slide").chain
-  , fs = require("graceful-fs")
-  , asyncMap = require("slide").asyncMap
-
-runScript.usage = "npm run-script [<pkg>] <command>"
-
-runScript.completion = function (opts, cb) {
-
-  // see if there's already a package specified.
-  var argv = opts.conf.argv.remain
-    , installedShallow = require("./utils/completion/installed-shallow.js")
-
-  if (argv.length >= 4) return cb()
-
-  if (argv.length === 3) {
-    // either specified a script locally, in which case, done,
-    // or a package, in which case, complete against its scripts
-    var json = path.join(npm.prefix, "package.json")
-    return readJson(json, function (er, d) {
-      if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er)
-      if (er) d = {}
-      var scripts = Object.keys(d.scripts || {})
-      console.error("local scripts", scripts)
-      if (scripts.indexOf(argv[2]) !== -1) return cb()
-      // ok, try to find out which package it was, then
-      var pref = npm.config.get("global") ? npm.config.get("prefix")
-               : npm.prefix
-      var pkgDir = path.resolve( pref, "node_modules"
-                               , argv[2], "package.json" )
-      readJson(pkgDir, function (er, d) {
-        if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er)
-        if (er) d = {}
-        var scripts = Object.keys(d.scripts || {})
-        return cb(null, scripts)
-      })
-    })
-  }
-
-  // complete against the installed-shallow, and the pwd's scripts.
-  // but only packages that have scripts
-  var installed
-    , scripts
-  installedShallow(opts, function (d) {
-    return d.scripts
-  }, function (er, inst) {
-    installed = inst
-    next()
-  })
-
-  if (npm.config.get("global")) scripts = [], next()
-  else readJson(path.join(npm.prefix, "package.json"), function (er, d) {
-    if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er)
-    d = d || {}
-    scripts = Object.keys(d.scripts || {})
-    next()
-  })
-
-  function next () {
-    if (!installed || !scripts) return
-    return cb(null, scripts.concat(installed))
-  }
-}
-
-function runScript (args, cb) {
-  if (!args.length) return cb(runScript.usage)
-  var pkgdir = args.length === 1 ? process.cwd()
-             : path.resolve(npm.dir, args[0])
-    , cmd = args.pop()
-
-  readJson(path.resolve(pkgdir, "package.json"), function (er, d) {
-    if (er) return cb(er)
-    run(d, pkgdir, cmd, cb)
-  })
-}
-
-function run (pkg, wd, cmd, cb) {
-  var cmds = []
-  if (!pkg.scripts) pkg.scripts = {}
-  if (cmd === "restart") {
-    cmds = ["prestop","stop","poststop"
-           ,"restart"
-           ,"prestart","start","poststart"]
-  } else {
-    cmds = [cmd]
-  }
-  if (!cmd.match(/^(pre|post)/)) {
-    cmds = ["pre"+cmd].concat(cmds).concat("post"+cmd)
-  }
-  log.verbose("run-script", cmds)
-  chain(cmds.map(function (c) {
-    // when running scripts explicitly, assume that they're trusted.
-    return [lifecycle, pkg, c, wd, true]
-  }), cb)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/search.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,262 +0,0 @@
-
-module.exports = exports = search
-
-var npm = require("./npm.js")
-  , registry = npm.registry
-
-search.usage = "npm search [some search terms ...]"
-
-search.completion = function (opts, cb) {
-  var compl = {}
-    , partial = opts.partialWord
-    , ipartial = partial.toLowerCase()
-    , plen = partial.length
-
-  // get the batch of data that matches so far.
-  // this is an example of using npm.commands.search programmatically
-  // to fetch data that has been filtered by a set of arguments.
-  search(opts.conf.argv.remain.slice(2), true, function (er, data) {
-    if (er) return cb(er)
-    Object.keys(data).forEach(function (name) {
-      data[name].words.split(" ").forEach(function (w) {
-        if (w.toLowerCase().indexOf(ipartial) === 0) {
-          compl[partial + w.substr(plen)] = true
-        }
-      })
-    })
-    cb(null, Object.keys(compl))
-  })
-}
-
-function search (args, silent, staleness, cb) {
-  if (typeof cb !== "function") cb = staleness, staleness = 600
-  if (typeof cb !== "function") cb = silent, silent = false
-
-  var searchopts = npm.config.get("searchopts")
-    , searchexclude = npm.config.get("searchexclude")
-  if (typeof searchopts !== "string") searchopts = ""
-  searchopts = searchopts.split(/\s+/)
-  if (typeof searchexclude === "string") {
-    searchexclude = searchexclude.split(/\s+/)
-  } else searchexclude = []
-  var opts = searchopts.concat(args).map(function (s) {
-    return s.toLowerCase()
-  }).filter(function (s) { return s })
-  searchexclude = searchexclude.map(function (s) {
-    return s.toLowerCase()
-  })
-  getFilteredData( staleness, opts, searchexclude, function (er, data) {
-    // now data is the list of data that we want to show.
-    // prettify and print it, and then provide the raw
-    // data to the cb.
-    if (er || silent) return cb(er, data)
-    console.log(prettify(data, args))
-    cb(null, data)
-  })
-}
-
-function getFilteredData (staleness, args, notArgs, cb) {
-  registry.get( "/-/all", staleness, false
-              , true, function (er, data) {
-    if (er) return cb(er)
-    return cb(null, filter(data, args, notArgs))
-  })
-}
-
-function filter (data, args, notArgs) {
-  // data={<name>:{package data}}
-  return Object.keys(data).map(function (d) {
-    return data[d]
-  }).filter(function (d) {
-    return typeof d === "object"
-  }).map(stripData).map(getWords).filter(function (data) {
-    return filterWords(data, args, notArgs)
-  }).reduce(function (l, r) {
-    l[r.name] = r
-    return l
-  }, {})
-}
-
-function stripData (data) {
-  return { name: data.name
-         , description: npm.config.get("description") ? data.description : ""
-         , maintainers: (data.maintainers || []).map(function (m) {
-             return "=" + m.name
-           })
-         , url: !Object.keys(data.versions || {}).length ? data.url : null
-         , keywords: data.keywords || []
-         , version: Object.keys(data.versions || {})[0] || []
-         , time: data.time
-                 && data.time.modified
-                 && (new Date(data.time.modified).toISOString()
-                     .split("T").join(" ")
-                     .replace(/:[0-9]{2}\.[0-9]{3}Z$/, ""))
-                 || "(prehistoric)"
-         }
-}
-
-function getWords (data) {
-  data.words = [ data.name ]
-               .concat(data.description)
-               .concat(data.maintainers)
-               .concat(data.url && ("<" + data.url + ">"))
-               .concat(data.keywords)
-               .map(function (f) { return f && f.trim && f.trim() })
-               .filter(function (f) { return f })
-               .join(" ")
-               .toLowerCase()
-  return data
-}
-
-function filterWords (data, args, notArgs) {
-  var words = data.words
-  for (var i = 0, l = args.length; i < l; i ++) {
-    if (!match(words, args[i])) return false
-  }
-  for (var i = 0, l = notArgs.length; i < l; i ++) {
-    if (match(words, notArgs[i])) return false
-  }
-  return true
-}
-
-function match (words, arg) {
-  if (arg.charAt(0) === "/") {
-    arg = arg.replace(/\/$/, "")
-    arg = new RegExp(arg.substr(1, arg.length - 1))
-    return words.match(arg)
-  }
-  return words.indexOf(arg) !== -1
-}
-
-function prettify (data, args) {
-  try {
-    var tty = require("tty")
-      , stdout = process.stdout
-      , cols = !tty.isatty(stdout.fd) ? Infinity
-             : process.stdout.getWindowSize()[0]
-      cols = (cols == 0) ? Infinity : cols
-  } catch (ex) { cols = Infinity }
-
-  // name, desc, author, keywords
-  var longest = []
-    , spaces
-    , maxLen = npm.config.get("description")
-             ? [20, 60, 20, 20, 10, Infinity]
-             : [20, 20, 20, 10, Infinity]
-    , headings = npm.config.get("description")
-               ? ["NAME", "DESCRIPTION", "AUTHOR", "DATE", "VERSION", "KEYWORDS"]
-               : ["NAME", "AUTHOR", "DATE", "VERSION", "KEYWORDS"]
-    , lines
-    , searchsort = (npm.config.get("searchsort") || "NAME").toLowerCase()
-    , sortFields = { name: 0
-                   , description: 1
-                   , author: 2
-                   , date: 3
-                   , version: 4
-                   , keywords: 5 }
-    , searchRev = searchsort.charAt(0) === "-"
-    , sortField = sortFields[searchsort.replace(/^\-+/, "")]
-
-  lines = Object.keys(data).map(function (d) {
-    return data[d]
-  }).map(function (data) {
-    // turn a pkg data into a string
-    // [name,who,desc,targets,keywords] tuple
-    // also set longest to the longest name
-    if (typeof data.keywords === "string") {
-      data.keywords = data.keywords.split(/[,\s]+/)
-    }
-    if (!Array.isArray(data.keywords)) data.keywords = []
-    var l = [ data.name
-            , data.description || ""
-            , data.maintainers.join(" ")
-            , data.time
-            , data.version || ""
-            , (data.keywords || []).join(" ")
-            ]
-    l.forEach(function (s, i) {
-      var len = s.length
-      longest[i] = Math.min(maxLen[i] || Infinity
-                           ,Math.max(longest[i] || 0, len))
-      if (len > longest[i]) {
-        l._undent = l._undent || []
-        l._undent[i] = len - longest[i]
-      }
-      l[i] = ('' + l[i]).replace(/\s+/g, " ")
-    })
-    return l
-  }).sort(function (a, b) {
-    // a and b are "line" objects of [name, desc, maint, time, kw]
-    var aa = a[sortField].toLowerCase()
-      , bb = b[sortField].toLowerCase()
-    return aa === bb ? 0
-         : aa < bb ? (searchRev ? 1 : -1)
-         : (searchRev ? -1 : 1)
-  }).map(function (line) {
-    return line.map(function (s, i) {
-      spaces = spaces || longest.map(function (n) {
-        return new Array(n + 2).join(" ")
-      })
-      var len = s.length
-      if (line._undent && line._undent[i - 1]) {
-        len += line._undent[i - 1] - 1
-      }
-      return s + spaces[i].substr(len)
-    }).join(" ").substr(0, cols).trim()
-  }).map(function (line) {
-    // colorize!
-    args.forEach(function (arg, i) {
-      line = addColorMarker(line, arg, i)
-    })
-    return colorize(line).trim()
-  })
-
-  if (lines.length === 0) {
-    return "No match found for "+(args.map(JSON.stringify).join(" "))
-  }
-
-  // build the heading padded to the longest in each field
-  return headings.map(function (h, i) {
-    var space = Math.max(2, 3 + (longest[i] || 0) - h.length)
-    return h + (new Array(space).join(" "))
-  }).join("").substr(0, cols).trim() + "\n" + lines.join("\n")
-
-}
-
-var colors = [31, 33, 32, 36, 34, 35 ]
-  , cl = colors.length
-function addColorMarker (str, arg, i) {
-  var m = i % cl + 1
-    , markStart = String.fromCharCode(m)
-    , markEnd = String.fromCharCode(0)
-
-  if (arg.charAt(0) === "/") {
-    //arg = arg.replace(/\/$/, "")
-    return str.replace( new RegExp(arg.substr(1, arg.length - 1), "gi")
-                      , function (bit) { return markStart + bit + markEnd } )
-
-  }
-
-  // just a normal string, do the split/map thing
-  var pieces = str.toLowerCase().split(arg.toLowerCase())
-    , p = 0
-
-  return pieces.map(function (piece, i) {
-    piece = str.substr(p, piece.length)
-    var mark = markStart
-             + str.substr(p+piece.length, arg.length)
-             + markEnd
-    p += piece.length + arg.length
-    return piece + mark
-  }).join("")
-}
-
-function colorize (line) {
-  for (var i = 0; i < cl; i ++) {
-    var m = i + 1
-    var color = npm.color ? "\033["+colors[i]+"m" : ""
-    line = line.split(String.fromCharCode(m)).join(color)
-  }
-  var uncolor = npm.color ? "\033[0m" : ""
-  return line.split("\u0000").join(uncolor)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/set.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-
-module.exports = set
-
-set.usage = "npm set <key> <value> (See `npm config`)"
-
-var npm = require("./npm.js")
-
-set.completion = npm.commands.config.completion
-
-function set (args, cb) {
-  if (!args.length) return cb(set.usage)
-  npm.commands.config(["set"].concat(args), cb)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/shrinkwrap.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,69 +0,0 @@
-// emit JSON describing versions of all packages currently installed (for later
-// use with shrinkwrap install)
-
-module.exports = exports = shrinkwrap
-
-var npm = require("./npm.js")
-  , log = require("npmlog")
-  , fs = require("fs")
-  , path = require("path")
-  , readJson = require("read-package-json")
-
-shrinkwrap.usage = "npm shrinkwrap"
-
-function shrinkwrap (args, silent, cb) {
-  if (typeof cb !== "function") cb = silent, silent = false
-
-  if (args.length) {
-    log.warn("shrinkwrap", "doesn't take positional args")
-  }
-
-  npm.commands.ls([], true, function (er, _, pkginfo) {
-    if (er) return cb(er)
-    shrinkwrap_(pkginfo, silent, npm.config.get("dev"), cb)
-  })
-}
-
-function shrinkwrap_ (pkginfo, silent, dev, cb) {
-  if (pkginfo.problems) {
-    return cb(new Error("Problems were encountered\n"
-                       +"Please correct and try again.\n"
-                       +pkginfo.problems.join("\n")))
-  }
-
-  if (!dev) {
-    // remove dev deps unless the user does --dev
-    readJson(path.resolve(npm.prefix, "package.json"), function (er, data) {
-      if (er)
-        return cb(er)
-      if (data.devDependencies) {
-        Object.keys(data.devDependencies).forEach(function (dep) {
-          log.warn("shrinkwrap", "Excluding devDependency: %s", dep)
-          delete pkginfo.dependencies[dep]
-        })
-      }
-      save(pkginfo, silent, cb)
-    })
-  } else {
-    save(pkginfo, silent, cb)
-  }
-}
-
-
-function save (pkginfo, silent, cb) {
-  try {
-    var swdata = JSON.stringify(pkginfo, null, 2) + "\n"
-  } catch (er) {
-    log.error("shrinkwrap", "Error converting package info to json")
-    return cb(er)
-  }
-
-  var file = path.resolve(npm.prefix, "npm-shrinkwrap.json")
-
-  fs.writeFile(file, swdata, function (er) {
-    if (er) return cb(er)
-    if (silent) return cb(null, pkginfo)
-    console.log("wrote npm-shrinkwrap.json")
-    cb(null, pkginfo)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/star.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,33 +0,0 @@
-
-module.exports = star
-
-var npm = require("./npm.js")
-  , registry = npm.registry
-  , log = require("npmlog")
-  , asyncMap = require("slide").asyncMap
-
-star.usage = "npm star <package> [pkg, pkg, ...]\n"
-           + "npm unstar <package> [pkg, pkg, ...]"
-
-star.completion = function (opts, cb) {
-  registry.get("/-/short", 60000, function (er, list) {
-    return cb(null, list || [])
-  })
-}
-
-function star (args, cb) {
-  if (!args.length) return cb(star.usage)
-  var s = npm.config.get("unicode") ? "\u2605 " : "(*)"
-    , u = npm.config.get("unicode") ? "\u2606 " : "( )"
-    , using = !(npm.command.match(/^un/))
-  if (!using) s = u
-  asyncMap(args, function (pkg, cb) {
-    registry.star(pkg, using, function (er, data, raw, req) {
-      if (!er) {
-        console.log(s + " "+pkg)
-        log.verbose("star", data)
-      }
-      cb(er, data, raw, req)
-    })
-  }, cb)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/stars.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-module.exports = stars
-
-stars.usage = "npm stars [username]"
-
-var npm = require("./npm.js")
-  , registry = npm.registry
-  , log = require("npmlog")
-
-function stars (args, cb) {
-  var name = args.length === 1 ? args[0] : npm.config.get("username")
-  registry.stars(name, showstars)
-
-  function showstars (er, data) {
-    if (er) {
-      return cb(er)
-    }
-
-    if (data.rows.length === 0) {
-      log.warn('stars', 'user has not starred any packages.')
-    } else {
-      data.rows.forEach(function(a) {
-        console.log(a.value)
-      })
-    }
-    cb()
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/start.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require("./utils/lifecycle.js").cmd("start")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/stop.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require("./utils/lifecycle.js").cmd("stop")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/submodule.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,119 +0,0 @@
-// npm submodule <pkg>
-// Check the package contents for a git repository url.
-// If there is one, then create a git submodule in the node_modules folder.
-
-module.exports = submodule
-
-var npm = require("./npm.js")
-  , exec = require("child_process").execFile
-  , cache = require("./cache.js")
-  , asyncMap = require("slide").asyncMap
-  , chain = require("slide").chain
-  , which = require("which")
-
-submodule.usage = "npm submodule <pkg>"
-
-submodule.completion = require("./docs.js").completion
-
-function submodule (args, cb) {
-  if (npm.config.get("global")) {
-    return cb(new Error("Cannot use submodule command in global mode."))
-  }
-
-  if (args.length === 0) return cb(submodule.usage)
-
-  asyncMap(args, function (arg, cb) {
-    cache.add(arg, cb)
-  }, function (er, pkgs) {
-    if (er) return cb(er)
-    chain(pkgs.map(function (pkg) { return function (cb) {
-      submodule_(pkg, cb)
-    }}), cb)
-  })
-
-}
-
-function submodule_ (pkg, cb) {
-  if (!pkg.repository
-      || pkg.repository.type !== "git"
-      || !pkg.repository.url) {
-    return cb(new Error(pkg._id + ": No git repository listed"))
-  }
-
-  // prefer https:// github urls
-  pkg.repository.url = pkg.repository.url
-    .replace(/^(git:\/\/)?(git@)?github.com[:\/]/, "https://github.com/")
-
-  // first get the list of submodules, and update if it's already there.
-  getSubmodules(function (er, modules) {
-    if (er) return cb(er)
-    // if there's already a submodule, then just update it.
-    if (modules.indexOf(pkg.name) !== -1) {
-      return updateSubmodule(pkg.name, cb)
-    }
-    addSubmodule(pkg.name, pkg.repository.url, cb)
-  })
-}
-
-function updateSubmodule (name, cb) {
-  var git = npm.config.get("git")
-  var args = [ "submodule", "update", "--init", "node_modules/", name ]
-
-  // check for git
-  which(git, function (err) {
-    if (err) {
-      err.code = "ENOGIT"
-      return cb(err)
-    }
-
-    exec(git, args, cb)
-  })
-}
-
-function addSubmodule (name, url, cb) {
-  var git = npm.config.get("git")
-  var args = [ "submodule", "add", url, "node_modules/", name ]
-
-  // check for git
-  which(git, function (err) {
-    if (err) {
-      err.code = "ENOGIT"
-      return cb(err)
-    }
-
-    exec(git, args, function (er) {
-      if (er) return cb(er)
-      updateSubmodule(name, cb)
-    })
-  })
-}
-
-
-var getSubmodules = function getSubmodules (cb) {
-  var git = npm.config.get("git")
-  var args = [ "submodule", "status" ]
-
-  // check for git
-  which(git, function (err) {
-    if (err) {
-      err.code = "ENOGIT"
-      return cb(err)
-    }
-    exec(git, args, function (er, stdout, stderr) {
-      if (er) return cb(er)
-      res = stdout.trim().split(/\n/).map(function (line) {
-        return line.trim().split(/\s+/)[1]
-      }).filter(function (line) {
-        // only care about submodules in the node_modules folder.
-        return line && line.match(/^node_modules\//)
-      }).map(function (line) {
-        return line.replace(/^node_modules\//g, "")
-      })
-
-      // memoize.
-      getSubmodules = function (cb) { return cb(null, res) }
-
-      cb(null, res)
-    })
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/substack.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,20 +0,0 @@
-module.exports = substack
-var npm = require("./npm.js")
-
-var isms =
-  [ "\033[32mbeep \033[35mboop\033[m"
-  , "Replace your configs with services"
-  , "SEPARATE ALL THE CONCERNS!"
-  , "MODULE ALL THE THINGS!"
-  , "\\o/"
-  , "but first, burritos"
-  , "full time mad scientist here"
-  , "c/,,\\" ]
-
-function substack (args, cb) {
-  var i = Math.floor(Math.random() * isms.length)
-  console.log(isms[i])
-  var c = args.shift()
-  if (c) npm.commands[c](args, cb)
-  else cb()
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/tag.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-// turns out tagging isn't very complicated
-// all the smarts are in the couch.
-module.exports = tag
-tag.usage = "npm tag <project>@<version> [<tag>]"
-
-tag.completion = require("./unpublish.js").completion
-
-var npm = require("./npm.js")
-  , registry = npm.registry
-
-function tag (args, cb) {
-  var thing = (args.shift() || "").split("@")
-    , project = thing.shift()
-    , version = thing.join("@")
-    , t = args.shift() || npm.config.get("tag")
-  if (!project || !version || !t) return cb("Usage:\n"+tag.usage)
-  registry.tag(project, version, t, cb)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/test.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,14 +0,0 @@
-module.exports = test
-
-var testCmd = require("./utils/lifecycle.js").cmd("test")
-  , log = require("npmlog")
-
-function test (args, cb) {
-  testCmd(args, function (er) {
-    if (!er) return cb()
-    if (er.code === "ELIFECYCLE") {
-      return cb("Test failed.  See above for more details.")
-    }
-    return cb(er)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/unbuild.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,112 +0,0 @@
-module.exports = unbuild
-unbuild.usage = "npm unbuild <folder>\n(this is plumbing)"
-
-var readJson = require("read-package-json")
-  , rm = require("rimraf")
-  , gentlyRm = require("./utils/gently-rm.js")
-  , npm = require("./npm.js")
-  , path = require("path")
-  , fs = require("graceful-fs")
-  , lifecycle = require("./utils/lifecycle.js")
-  , asyncMap = require("slide").asyncMap
-  , chain = require("slide").chain
-  , log = require("npmlog")
-  , build = require("./build.js")
-
-// args is a list of folders.
-// remove any bins/etc, and then delete the folder.
-function unbuild (args, silent, cb) {
-  if (typeof silent === 'function') cb = silent, silent = false
-  asyncMap(args, unbuild_(silent), cb)
-}
-
-function unbuild_ (silent) { return function (folder, cb_) {
-  function cb (er) {
-    cb_(er, path.relative(npm.root, folder))
-  }
-  folder = path.resolve(folder)
-  delete build._didBuild[folder]
-  log.info(folder, "unbuild")
-  readJson(path.resolve(folder, "package.json"), function (er, pkg) {
-    // if there's no package.json, just remove the folder; skip lifecycle scripts.
-    if (er) return rm(folder, cb)
-    readJson.cache.del(folder)
-    chain
-      ( [ [lifecycle, pkg, "preuninstall", folder, false, true]
-        , [lifecycle, pkg, "uninstall", folder, false, true]
-        , !silent && function(cb) {
-            console.log("unbuild " + pkg._id)
-            cb()
-          }
-        , [rmStuff, pkg, folder]
-        , [lifecycle, pkg, "postuninstall", folder, false, true]
-        , [rm, folder] ]
-      , cb )
-  })
-}}
-
-function rmStuff (pkg, folder, cb) {
-  // if it's global, and folder is in {prefix}/node_modules,
-  // then bins are in {prefix}/bin
-  // otherwise, then bins are in folder/../.bin
-  var parent = path.dirname(folder)
-    , gnm = npm.dir
-    , top = gnm === parent
-
-  readJson.cache.del(path.resolve(folder, "package.json"))
-
-  log.verbose([top, gnm, parent], "unbuild " + pkg._id)
-  asyncMap([rmBins, rmMans], function (fn, cb) {
-    fn(pkg, folder, parent, top, cb)
-  }, cb)
-}
-
-function rmBins (pkg, folder, parent, top, cb) {
-  if (!pkg.bin) return cb()
-  var binRoot = top ? npm.bin : path.resolve(parent, ".bin")
-  log.verbose([binRoot, pkg.bin], "binRoot")
-  asyncMap(Object.keys(pkg.bin), function (b, cb) {
-    if (process.platform === "win32") {
-      chain([ [rm, path.resolve(binRoot, b) + ".cmd"]
-            , [rm, path.resolve(binRoot, b) ] ], cb)
-    } else {
-      gentlyRm( path.resolve(binRoot, b)
-              , !npm.config.get("force") && folder
-              , cb )
-    }
-  }, cb)
-}
-
-function rmMans (pkg, folder, parent, top, cb) {
-  if (!pkg.man
-      || !top
-      || process.platform === "win32"
-      || !npm.config.get("global")) {
-    return cb()
-  }
-  var manRoot = path.resolve(npm.config.get("prefix"), "share", "man")
-  asyncMap(pkg.man, function (man, cb) {
-    if (Array.isArray(man)) {
-      man.forEach(rm)
-    } else {
-      rm(man)
-    }
-
-    function rm(man) {
-      var parseMan = man.match(/(.*)\.([0-9]+)(\.gz)?$/)
-        , stem = parseMan[1]
-        , sxn = parseMan[2]
-        , gz = parseMan[3] || ""
-        , bn = path.basename(stem)
-        , manDest = path.join( manRoot
-                            , "man"+sxn
-                            , (bn.indexOf(pkg.name) === 0 ? bn
-                              : pkg.name + "-" + bn)
-                              + "." + sxn + gz
-                            )
-      gentlyRm( manDest
-              , !npm.config.get("force") && folder
-              , cb )
-    }
-  }, cb)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/uninstall.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,127 +0,0 @@
-
-// remove a package.
-
-module.exports = uninstall
-
-uninstall.usage = "npm uninstall <name>[@<version>] [<name>[@<version>] ...]"
-                + "\nnpm rm <name>[@<version>] [<name>[@<version>] ...]"
-
-uninstall.completion = require("./utils/completion/installed-shallow.js")
-
-var fs = require("graceful-fs")
-  , log = require("npmlog")
-  , readJson = require("read-package-json")
-  , path = require("path")
-  , npm = require("./npm.js")
-  , asyncMap = require("slide").asyncMap
-
-function uninstall (args, cb) {
-  // this is super easy
-  // get the list of args that correspond to package names
-  // in either the global or local npm.dir,
-  // then call unbuild on all those folders to pull out their bins
-  // and mans and whatnot, and then delete the folder.
-
-  var nm = npm.dir
-  if (args.length === 1 && args[0] === ".") args = []
-  if (args.length) return uninstall_(args, nm, cb)
-
-  // with no args in local mode, uninstall the package named in cwd's package.json
-  if (npm.config.get("global")) return cb(uninstall.usage)
-  readJson(path.resolve(npm.prefix, "package.json"), function (er, pkg) {
-    if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er)
-    if (er) return cb(uninstall.usage)
-    uninstall_( [pkg.name]
-              , npm.dir
-              , cb )
-  })
-
-}
-
-function uninstall_ (args, nm, cb) {
-  // if we've been asked to --save or --save-dev or --save-optional,
-  // then also remove it from the associated dependencies hash.
-  var s = npm.config.get('save')
-    , d = npm.config.get('save-dev')
-    , o = npm.config.get('save-optional')
-  if (s || d || o) {
-    cb = saver(args, nm, cb)
-  }
-
-  asyncMap(args, function (arg, cb) {
-    // uninstall .. should not delete /usr/local/lib/node_modules/..
-    var p = path.join(path.resolve(nm), path.join("/", arg))
-    if (path.resolve(p) === nm) {
-      log.warn("uninstall", "invalid argument: %j", arg)
-      return cb(null, [])
-    }
-    fs.lstat(p, function (er) {
-      if (er) {
-        log.warn("uninstall", "not installed in %s: %j", nm, arg)
-        return cb(null, [])
-      }
-      cb(null, p)
-    })
-  }, function (er, folders) {
-    if (er) return cb(er)
-    asyncMap(folders, npm.commands.unbuild, cb)
-  })
-}
-
-function saver (args, nm, cb_) {
-  return cb
-  function cb (er, data) {
-    var s = npm.config.get('save')
-      , d = npm.config.get('save-dev')
-      , o = npm.config.get('save-optional')
-    if (er || !(s || d || o)) return cb_(er, data)
-    var pj = path.resolve(nm, '..', 'package.json')
-    // don't use readJson here, because we don't want all the defaults
-    // filled in, for mans and other bs.
-    fs.readFile(pj, 'utf8', function (er, json) {
-      try {
-        var pkg = JSON.parse(json)
-      } catch (_) {}
-      if (!pkg) return cb_(null, data)
-
-      var bundle
-      if (npm.config.get('save-bundle')) {
-        bundle = pkg.bundleDependencies || pkg.bundledDependencies
-        if (!Array.isArray(bundle)) bundle = undefined
-      }
-
-      var changed = false
-      args.forEach(function (a) {
-        ; [ [s, 'dependencies']
-          , [o, 'optionalDependencies']
-          , [d, 'devDependencies'] ].forEach(function (f) {
-            var flag = f[0]
-              , field = f[1]
-            if (!flag || !pkg[field] || !pkg[field].hasOwnProperty(a)) return
-            changed = true
-
-            if (bundle) {
-              var i = bundle.indexOf(a)
-              if (i !== -1) bundle.splice(i, 1)
-            }
-
-            delete pkg[field][a]
-          })
-      })
-      if (!changed) return cb_(null, data)
-
-      if (bundle) {
-        delete pkg.bundledDependencies
-        if (bundle.length) {
-          pkg.bundleDependencies = bundle
-        } else {
-          delete pkg.bundleDependencies
-        }
-      }
-
-      fs.writeFile(pj, JSON.stringify(pkg, null, 2) + "\n", function (er) {
-        return cb_(er, data)
-      })
-    })
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/unpublish.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,83 +0,0 @@
-
-module.exports = unpublish
-
-var log = require("npmlog")
-  , npm = require("./npm.js")
-  , registry = npm.registry
-  , readJson = require("read-package-json")
-  , path = require("path")
-
-unpublish.usage = "npm unpublish <project>[@<version>]"
-
-unpublish.completion = function (opts, cb) {
-  if (opts.conf.argv.remain.length >= 3) return cb()
-  var un = encodeURIComponent(npm.config.get("username"))
-  if (!un) return cb()
-  registry.get("/-/by-user/"+un, function (er, pkgs) {
-    // do a bit of filtering at this point, so that we don't need
-    // to fetch versions for more than one thing, but also don't
-    // accidentally unpublish a whole project.
-    pkgs = pkgs[un]
-    if (!pkgs || !pkgs.length) return cb()
-    var partial = opts.partialWord.split("@")
-      , pp = partial.shift()
-      , pv = partial.join("@")
-    pkgs = pkgs.filter(function (p) {
-      return p.indexOf(pp) === 0
-    })
-    if (pkgs.length > 1) return cb(null, pkgs)
-    registry.get(pkgs[0], function (er, d) {
-      if (er) return cb(er)
-      var vers = Object.keys(d.versions)
-      if (!vers.length) return cb(null, pkgs)
-      return cb(null, vers.map(function (v) {
-        return pkgs[0]+"@"+v
-      }))
-    })
-  })
-}
-
-function unpublish (args, cb) {
-
-  if (args.length > 1) return cb(unpublish.usage)
-
-  var thing = args.length ? args.shift().split("@") : []
-    , project = thing.shift()
-    , version = thing.join("@")
-
-  if (!version && !npm.config.get("force")) {
-    return cb("Refusing to delete entire project.\n"
-             +"Run with --force to do this.\n"
-             +unpublish.usage)
-  }
-
-  if (!project || path.resolve(project) === npm.prefix) {
-    // if there's a package.json in the current folder, then
-    // read the package name and version out of that.
-    var cwdJson = path.join(process.cwd(), "package.json")
-    return readJson(cwdJson, function (er, data) {
-      if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er)
-      if (er) return cb("Usage:\n"+unpublish.usage)
-      gotProject(data.name, data.version, cb)
-    })
-  }
-  return gotProject(project, version, cb)
-}
-
-function gotProject (project, version, cb_) {
-  function cb (er) {
-    if (er) return cb_(er)
-    console.log("- " + project + (version ? "@" + version : ""))
-    cb_()
-  }
-
-  // remove from the cache first
-  npm.commands.cache(["clean", project, version], function (er) {
-    if (er) {
-      log.error("unpublish", "Failed to clean cache")
-      return cb(er)
-    }
-
-    registry.unpublish(project, version, cb)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/update.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,43 +0,0 @@
-/*
-for each pkg in prefix that isn't a git repo
-  look for a new version of pkg that satisfies dep
-  if so, install it.
-  if not, then update it
-*/
-
-module.exports = update
-
-update.usage = "npm update [pkg]"
-
-var npm = require("./npm.js")
-  , lifecycle = require("./utils/lifecycle.js")
-  , asyncMap = require("slide").asyncMap
-  , log = require("npmlog")
-
-  // load these, just so that we know that they'll be available, in case
-  // npm itself is getting overwritten.
-  , install = require("./install.js")
-  , build = require("./build.js")
-
-update.completion = npm.commands.outdated.completion
-
-function update (args, cb) {
-  npm.commands.outdated(args, true, function (er, outdated) {
-    log.info("outdated", "updating", outdated)
-    if (er) return cb(er)
-
-    asyncMap(outdated, function (ww, cb) {
-      // [[ dir, dep, has, want, req ]]
-      var where = ww[0]
-        , dep = ww[1]
-        , want = ww[3]
-        , what = dep + "@" + want
-        , req = ww[4]
-        , url = require('url')
-
-      // use the initial installation method (repo, tar, git) for updating
-      if (url.parse(req).protocol) what = req
-      npm.commands.install(where, what, cb)
-    }, cb)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/completion.sh	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-#!/bin/bash
-###-begin-npm-completion-###
-#
-# npm command completion script
-#
-# Installation: npm completion >> ~/.bashrc  (or ~/.zshrc)
-# Or, maybe: npm completion > /usr/local/etc/bash_completion.d/npm
-#
-
-COMP_WORDBREAKS=${COMP_WORDBREAKS/=/}
-COMP_WORDBREAKS=${COMP_WORDBREAKS/@/}
-export COMP_WORDBREAKS
-
-if type complete &>/dev/null; then
-  _npm_completion () {
-    local si="$IFS"
-    IFS=$'\n' COMPREPLY=($(COMP_CWORD="$COMP_CWORD" \
-                           COMP_LINE="$COMP_LINE" \
-                           COMP_POINT="$COMP_POINT" \
-                           npm completion -- "${COMP_WORDS[@]}" \
-                           2>/dev/null)) || return $?
-    IFS="$si"
-  }
-  complete -F _npm_completion npm
-elif type compdef &>/dev/null; then
-  _npm_completion() {
-    si=$IFS
-    compadd -- $(COMP_CWORD=$((CURRENT-1)) \
-                 COMP_LINE=$BUFFER \
-                 COMP_POINT=0 \
-                 npm completion -- "${words[@]}" \
-                 2>/dev/null)
-    IFS=$si
-  }
-  compdef _npm_completion npm
-elif type compctl &>/dev/null; then
-  _npm_completion () {
-    local cword line point words si
-    read -Ac words
-    read -cn cword
-    let cword-=1
-    read -l line
-    read -ln point
-    si="$IFS"
-    IFS=$'\n' reply=($(COMP_CWORD="$cword" \
-                       COMP_LINE="$line" \
-                       COMP_POINT="$point" \
-                       npm completion -- "${words[@]}" \
-                       2>/dev/null)) || return $?
-    IFS="$si"
-  }
-  compctl -K _npm_completion npm
-fi
-###-end-npm-completion-###
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/completion/file-completion.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,26 +0,0 @@
-module.exports = fileCompletion
-
-var mkdir = require("mkdirp")
-  , path = require("path")
-  , fs = require("graceful-fs")
-  , glob = require("glob")
-
-function fileCompletion (root, req, depth, cb) {
-  if (typeof cb !== "function") cb = depth, depth = Infinity
-  mkdir(root, function (er) {
-    if (er) return cb(er)
-
-    // can be either exactly the req, or a descendant
-    var pattern = root + "/{" + req + "," + req + "/**/*}"
-      , opts = { mark: true, dot: true, maxDepth: depth }
-    glob(pattern, opts, function (er, files) {
-      if (er) return cb(er)
-      return cb(null, (files || []).map(function (f) {
-        return path.join(req, f.substr(root.length + 1)
-                               .substr((f === req ? path.dirname(req)
-                                                  : req).length)
-                               .replace(/^\//, ""))
-      }))
-    })
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/completion/installed-deep.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,46 +0,0 @@
-module.exports = installedDeep
-
-var npm = require("../../npm.js")
-  , readInstalled = require("read-installed")
-
-function installedDeep (opts, cb) {
-  var local
-    , global
-  if (npm.config.get("global")) local = [], next()
-  else readInstalled(npm.prefix, npm.config.get("depth"), function (er, data) {
-    local = getNames(data || {})
-    next()
-  })
-  readInstalled(npm.config.get("prefix"), npm.config.get("depth"), function (er, data) {
-    global = getNames(data || {})
-    next()
-  })
-
-  function getNames_ (d, n) {
-    if (d.realName && n) {
-      if (n[d.realName]) return n
-      n[d.realName] = true
-    }
-    if (!n) n = {}
-    Object.keys(d.dependencies || {}).forEach(function (dep) {
-      getNames_(d.dependencies[dep], n)
-    })
-    return n
-  }
-  function getNames (d) {
-    return Object.keys(getNames_(d))
-  }
-
-  function next () {
-    if (!local || !global) return
-    if (!npm.config.get("global")) {
-      global = global.map(function (g) {
-        return [g, "-g"]
-      })
-    }
-    var names = local.concat(global)
-    return cb(null, names)
-  }
-
-}
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/completion/installed-shallow.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,79 +0,0 @@
-
-module.exports = installedShallow
-
-var npm = require("../../npm.js")
-  , fs = require("graceful-fs")
-  , path = require("path")
-  , readJson = require("read-package-json")
-  , asyncMap = require("slide").asyncMap
-
-function installedShallow (opts, filter, cb) {
-  if (typeof cb !== "function") cb = filter, filter = null
-  var conf = opts.conf
-    , args = conf.argv.remain
-  if (args.length > 3) return cb()
-  var local
-    , global
-    , localDir = npm.dir
-    , globalDir = npm.globalDir
-  if (npm.config.get("global")) local = [], next()
-  else fs.readdir(localDir, function (er, pkgs) {
-    local = (pkgs || []).filter(function (p) {
-      return p.charAt(0) !== "."
-    })
-    next()
-  })
-  fs.readdir(globalDir, function (er, pkgs) {
-    global = (pkgs || []).filter(function (p) {
-      return p.charAt(0) !== "."
-    })
-    next()
-  })
-  function next () {
-    if (!local || !global) return
-    filterInstalled(local, global, filter, cb)
-  }
-}
-
-function filterInstalled (local, global, filter, cb) {
-  var fl
-    , fg
-
-  if (!filter) {
-    fl = local
-    fg = global
-    return next()
-  }
-
-  asyncMap(local, function (p, cb) {
-    readJson(path.join(npm.dir, p, "package.json"), function (er, d) {
-      if (!d || !filter(d)) return cb(null, [])
-      return cb(null, d.name)
-    })
-  }, function (er, local) {
-    fl = local || []
-    next()
-  })
-
-  var globalDir = npm.globalDir
-  asyncMap(global, function (p, cb) {
-    readJson(path.join(globalDir, p, "package.json"), function (er, d) {
-      if (!d || !filter(d)) return cb(null, [])
-      return cb(null, d.name)
-    })
-  }, function (er, global) {
-    fg = global || []
-    next()
-  })
-
-  function next () {
-    if (!fg || !fl) return
-    if (!npm.config.get("global")) {
-      fg = fg.map(function (g) {
-        return [g, "-g"]
-      })
-    }
-    console.error("filtered", fl, fg)
-    return cb(null, fl.concat(fg))
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/error-handler.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,341 +0,0 @@
-
-module.exports = errorHandler
-
-var cbCalled = false
-  , log = require("npmlog")
-  , npm = require("../npm.js")
-  , rm = require("rimraf")
-  , itWorked = false
-  , path = require("path")
-  , wroteLogFile = false
-  , exitCode = 0
-
-
-process.on("exit", function (code) {
-  // console.error("exit", code)
-  if (!npm.config.loaded) return
-  if (code) itWorked = false
-  if (itWorked) log.info("ok")
-  else {
-    if (!cbCalled) {
-      log.error("", "cb() never called!")
-    }
-
-    if (wroteLogFile) {
-      log.error("", [""
-                ,"Additional logging details can be found in:"
-                ,"    " + path.resolve("npm-debug.log")
-                ].join("\n"))
-      wroteLogFile = false
-    }
-    log.error("not ok", "code", code)
-  }
-
-  var doExit = npm.config.get("_exit")
-  if (doExit) {
-    // actually exit.
-    if (exitCode === 0 && !itWorked) {
-      exitCode = 1
-    }
-    if (exitCode !== 0) process.exit(exitCode)
-  } else {
-    itWorked = false // ready for next exit
-  }
-})
-
-function exit (code, noLog) {
-  exitCode = exitCode || process.exitCode || code
-
-  var doExit = npm.config.get("_exit")
-  log.verbose("exit", [code, doExit])
-  if (log.level === "silent") noLog = true
-
-  if (code && !noLog) writeLogFile(reallyExit)
-  else rm("npm-debug.log", function () { rm(npm.tmp, reallyExit) })
-
-  function reallyExit() {
-    // truncate once it's been written.
-    log.record.length = 0
-
-    itWorked = !code
-
-    // just emit a fake exit event.
-    // if we're really exiting, then let it exit on its own, so that
-    // in-process stuff can finish or clean up first.
-    if (!doExit) process.emit("exit", code)
-  }
-}
-
-
-function errorHandler (er) {
-  var printStack = false
-  // console.error("errorHandler", er)
-  if (!npm.config.loaded) {
-    // logging won't work unless we pretend that it's ready
-    er = er || new Error("Exit prior to config file resolving.")
-    console.error(er.stack || er.message)
-  }
-
-  if (cbCalled) {
-    er = er || new Error("Callback called more than once.")
-  }
-
-  cbCalled = true
-  if (!er) return exit(0)
-  if (typeof er === "string") {
-    log.error("", er)
-    return exit(1, true)
-  } else if (!(er instanceof Error)) {
-    log.error("weird error", er)
-    return exit(1, true)
-  }
-
-  var m = er.message.match(/^(?:Error: )?(E[A-Z]+)/)
-  if (m && !er.code) er.code = m[1]
-
-  switch (er.code) {
-  case "ECONNREFUSED":
-    log.error("", er)
-    log.error("", ["\nIf you are behind a proxy, please make sure that the"
-              ,"'proxy' config is set properly.  See: 'npm help config'"
-              ].join("\n"))
-    printStack = true
-    break
-
-  case "EACCES":
-  case "EPERM":
-    log.error("", er)
-    log.error("", ["\nPlease try running this command again as root/Administrator."
-              ].join("\n"))
-    printStack = true
-    break
-
-  case "ELIFECYCLE":
-    er.code = "ELIFECYCLE"
-    log.error("", er.message)
-    log.error("", ["","Failed at the "+er.pkgid+" "+er.stage+" script."
-              ,"This is most likely a problem with the "+er.pkgname+" package,"
-              ,"not with npm itself."
-              ,"Tell the author that this fails on your system:"
-              ,"    "+er.script
-              ,"You can get their info via:"
-              ,"    npm owner ls "+er.pkgname
-              ,"There is likely additional logging output above."
-              ].join("\n"))
-    break
-
-  case "ENOGIT":
-    er.code = "ENOGIT"
-    log.error("", er.message)
-    log.error("", ["","Failed using git."
-              ,"This is most likely not a problem with npm itself."
-              ,"Please check if you have git installed and in your PATH."
-              ].join("\n"))
-    break
-
-  case "EJSONPARSE":
-    er.code = "EJSONPARSE"
-    log.error("", er.message)
-    log.error("", "File: "+er.file)
-    log.error("", ["Failed to parse package.json data."
-              ,"package.json must be actual JSON, not just JavaScript."
-              ,"","This is not a bug in npm."
-              ,"Tell the package author to fix their package.json file."
-              ].join("\n"), "JSON.parse")
-    break
-
-  case "E404":
-    er.code = "E404"
-    if (er.pkgid && er.pkgid !== "-") {
-      var msg = ["'"+er.pkgid+"' is not in the npm registry."
-                ,"You should bug the author to publish it"]
-      if (er.pkgid.match(/^node[\.\-]|[\.\-]js$/)) {
-        var s = er.pkgid.replace(/^node[\.\-]|[\.\-]js$/g, "")
-        if (s !== er.pkgid) {
-          s = s.replace(/[^a-z0-9]/g, ' ')
-          msg.push("\nMaybe try 'npm search " + s + "'")
-        }
-      }
-      msg.push("\nNote that you can also install from a"
-              ,"tarball, folder, http url, or git url.")
-      log.error("404", msg.join("\n"))
-    }
-    break
-
-  case "EPUBLISHCONFLICT":
-    er.code = "EPUBLISHCONFLICT"
-    log.error("publish fail", ["Cannot publish over existing version."
-              ,"Update the 'version' field in package.json and try again."
-              ,""
-              ,"If the previous version was published in error, see:"
-              ,"    npm help unpublish"
-              ,""
-              ,"To automatically increment version numbers, see:"
-              ,"    npm help version"
-              ].join("\n"))
-    break
-
-  case "EISGIT":
-    er.code = "EISGIT"
-    log.error("git", [er.message
-              ,"    "+er.path
-              ,"Refusing to remove it. Update manually,"
-              ,"or move it out of the way first."
-              ].join("\n"))
-    break
-
-  case "ECYCLE":
-    er.code = "ECYCLE"
-    log.error("cycle", [er.message
-              ,"While installing: "+er.pkgid
-              ,"Found a pathological dependency case that npm cannot solve."
-              ,"Please report this to the package author."
-              ].join("\n"))
-    break
-
-  case "EBADPLATFORM":
-    er.code = "EBADPLATFORM"
-    log.error("notsup", [er.message
-              ,"Not compatible with your operating system or architecture: "+er.pkgid
-              ,"Valid OS:    "+er.os.join(",")
-              ,"Valid Arch:  "+er.cpu.join(",")
-              ,"Actual OS:   "+process.platform
-              ,"Actual Arch: "+process.arch
-              ].join("\n"))
-    break
-
-  case "EEXIST":
-    log.error([er.message
-              ,"File exists: "+er.path
-              ,"Move it away, and try again."].join("\n"))
-    break
-
-  case "ENEEDAUTH":
-    log.error("need auth", [er.message
-              ,"You need to authorize this machine using `npm adduser`"
-              ].join("\n"))
-    break
-
-  case "EPEERINVALID":
-    var peerErrors = Object.keys(er.peersDepending).map(function (peer) {
-      return "Peer " + peer + " wants " + er.packageName + "@"
-        + er.peersDepending[peer]
-    })
-    log.error("peerinvalid", [er.message].concat(peerErrors).join("\n"))
-    break
-
-  case "ECONNRESET":
-  case "ENOTFOUND":
-  case "ETIMEDOUT":
-    log.error("network", [er.message
-              ,"This is most likely not a problem with npm itself"
-              ,"and is related to network connectivity."
-              ,"In most cases you are behind a proxy or have bad network settings."
-              ,"\nIf you are behind a proxy, please make sure that the"
-              ,"'proxy' config is set properly.  See: 'npm help config'"
-              ].join("\n"))
-    break
-
-  case "ENOPACKAGEJSON":
-    log.error("package.json", [er.message
-              ,"This is most likely not a problem with npm itself."
-              ,"npm can't find a package.json file in your current directory."
-              ].join("\n"))
-    break
-
-  case "ETARGET":
-    log.error("notarget", [er.message
-              ,"This is most likely not a problem with npm itself."
-              ,"In most cases you or one of your dependencies are requesting"
-              ,"a package version that doesn't exist."
-              ].join("\n"))
-    break
-
-  case "ENOTSUP":
-    if (er.required) {
-      log.error("notsup", [er.message
-                ,"Not compatible with your version of node/npm: "+er.pkgid
-                ,"Required: "+JSON.stringify(er.required)
-                ,"Actual:   "
-                +JSON.stringify({npm:npm.version
-                                ,node:npm.config.get("node-version")})
-                ].join("\n"))
-      break
-    } // else passthrough
-
-  default:
-    log.error("", er.stack || er.message || er)
-    log.error("", ["If you need help, you may report this log at:"
-                  ,"    <http://github.com/isaacs/npm/issues>"
-                  ,"or email it to:"
-                  ,"    <npm-@googlegroups.com>"
-                  ].join("\n"))
-    printStack = false
-    break
-  }
-
-  var os = require("os")
-  // just a line break
-  console.error("")
-  log.error("System", os.type() + " " + os.release())
-  log.error("command", process.argv
-            .map(JSON.stringify).join(" "))
-  log.error("cwd", process.cwd())
-  log.error("node -v", process.version)
-  log.error("npm -v", npm.version)
-
-  ; [ "file"
-    , "path"
-    , "type"
-    , "syscall"
-    , "fstream_path"
-    , "fstream_unc_path"
-    , "fstream_type"
-    , "fstream_class"
-    , "fstream_finish_call"
-    , "fstream_linkpath"
-    , "code"
-    , "errno"
-    , "stack"
-    , "fstream_stack"
-    ].forEach(function (k) {
-      var v = er[k]
-      if (k === "stack") {
-        if (!printStack) return
-        if (!v) v = er.message
-      }
-      if (!v) return
-      if (k === "fstream_stack") v = v.join("\n")
-      log.error(k, v)
-    })
-
-  exit(typeof er.errno === "number" ? er.errno : 1)
-}
-
-var writingLogFile = false
-function writeLogFile (cb) {
-  if (writingLogFile) return cb()
-  writingLogFile = true
-  wroteLogFile = true
-
-  var fs = require("graceful-fs")
-    , fstr = fs.createWriteStream("npm-debug.log")
-    , util = require("util")
-    , os = require("os")
-    , out = ""
-
-  log.record.forEach(function (m) {
-    var pref = [m.id, m.level]
-    if (m.prefix) pref.push(m.prefix)
-    pref = pref.join(' ')
-
-    m.message.trim().split(/\r?\n/).map(function (line) {
-      return (pref + ' ' + line).trim()
-    }).forEach(function (line) {
-      out += line + os.EOL
-    })
-  })
-
-  fstr.end(out)
-  fstr.on("close", cb)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/fetch.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,93 +0,0 @@
-/**
- * Fetch an HTTP url to a local file.
- **/
-
-var request = require("request")
-  , fs = require("graceful-fs")
-  , npm = require("../npm.js")
-  , url = require("url")
-  , log = require("npmlog")
-  , path = require("path")
-  , mkdir = require("mkdirp")
-  , chownr = require("chownr")
-  , regHost
-  , once = require("once")
-
-module.exports = fetch
-
-function fetch (remote, local, headers, cb) {
-  if (typeof cb !== "function") cb = headers, headers = {}
-  cb = once(cb)
-  log.verbose("fetch", "to=", local)
-  mkdir(path.dirname(local), function (er, made) {
-    if (er) return cb(er)
-    fetch_(remote, local, headers, cb)
-  })
-}
-
-function fetch_ (remote, local, headers, cb) {
-  var fstr = fs.createWriteStream(local, { mode : npm.modes.file })
-  var response = null
-
-  fstr.on("error", function (er) {
-    cb(er)
-    fstr.destroy()
-  })
-
-  var req = makeRequest(remote, fstr, headers)
-  req.on("response", function (res) {
-    log.http(res.statusCode, remote)
-    response = res
-    response.resume()
-    // Work around bug in node v0.10.0 where the CryptoStream
-    // gets stuck and never starts reading again.
-    if (process.version === "v0.10.0") {
-      response.resume = function (orig) { return function() {
-        var ret = orig.apply(response, arguments)
-        if (response.socket.encrypted)
-          response.socket.encrypted.read(0)
-        return ret
-      }}(response.resume)
-    }
-  })
-
-  fstr.on("close", function () {
-    var er
-    if (response && response.statusCode && response.statusCode >= 400) {
-      er = new Error(response.statusCode + " "
-                    + require("http").STATUS_CODES[response.statusCode])
-    }
-    cb(er, response)
-  })
-}
-
-function makeRequest (remote, fstr, headers) {
-  remote = url.parse(remote)
-  log.http("GET", remote.href)
-  regHost = regHost || url.parse(npm.config.get("registry")).host
-
-  if (remote.host === regHost && npm.config.get("always-auth")) {
-    remote.auth = new Buffer( npm.config.get("_auth")
-                            , "base64" ).toString("utf8")
-    if (!remote.auth) return fstr.emit("error", new Error(
-      "Auth required and none provided. Please run 'npm adduser'"))
-  }
-
-  var proxy
-  if (remote.protocol !== "https:" || !(proxy = npm.config.get("https-proxy"))) {
-    proxy = npm.config.get("proxy")
-  }
-
-  var opts = { url: remote
-             , proxy: proxy
-             , strictSSL: npm.config.get("strict-ssl")
-             , rejectUnauthorized: npm.config.get("strict-ssl")
-             , ca: remote.host === regHost ? npm.config.get("ca") : undefined
-             , headers: { "user-agent": npm.config.get("user-agent") }}
-  var req = request(opts)
-  req.on("error", function (er) {
-    fstr.emit("error", er)
-  })
-  req.pipe(fstr)
-  return req
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/find-prefix.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,57 +0,0 @@
-// try to find the most reasonable prefix to use
-
-module.exports = findPrefix
-
-var fs = require("graceful-fs")
-  , path = require("path")
-  , npm = require("../npm.js")
-
-function findPrefix (p, cb_) {
-  function cb (er, p) {
-    process.nextTick(function () {
-      cb_(er, p)
-    })
-  }
-
-  p = path.resolve(p)
-  // if there's no node_modules folder, then
-  // walk up until we hopefully find one.
-  // if none anywhere, then use cwd.
-  var walkedUp = false
-  while (path.basename(p) === "node_modules") {
-    p = path.dirname(p)
-    walkedUp = true
-  }
-  if (walkedUp) return cb(null, p)
-
-  findPrefix_(p, p, cb)
-}
-
-function findPrefix_ (p, original, cb) {
-  if (p === "/"
-      || (process.platform === "win32" && p.match(/^[a-zA-Z]:(\\|\/)?$/))) {
-    return cb(null, original)
-  }
-  fs.readdir(p, function (er, files) {
-    // an error right away is a bad sign.
-    // unless the prefix was simply a
-    // non-existent directory.
-    if (er && p === original) {
-      if (er.code === "ENOENT") return cb(null, original);
-      return cb(er)
-    }
-
-    // walked up too high or something.
-    if (er) return cb(null, original)
-
-    if (files.indexOf("node_modules") !== -1
-        || files.indexOf("package.json") !== -1) {
-      return cb(null, p)
-    }
-
-    var d = path.dirname(p)
-    if (d === p) return cb(null, original)
-
-    return findPrefix_(d, original, cb)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/gently-rm.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,45 +0,0 @@
-// only remove the thing if it's a symlink into a specific folder.
-// This is a very common use-case of npm's, but not so common elsewhere.
-
-module.exports = gentlyRm
-
-var rimraf = require("rimraf")
-  , fs = require("graceful-fs")
-  , npm = require("../npm.js")
-  , path = require("path")
-
-function gentlyRm (p, gently, cb) {
-  if (npm.config.get("force") || !gently) {
-    return rimraf(p, cb)
-  }
-
-  gently = path.resolve(gently)
-
-  // lstat it, see if it's a symlink.
-  fs.lstat(p, function (er, s) {
-    if (er) return rimraf(p, cb)
-    if (!s.isSymbolicLink()) return next(null, path.resolve(p))
-    realish(p, next)
-  })
-
-  function next (er, rp) {
-    if (rp && rp.indexOf(gently) !== 0) {
-      return clobberFail(p, gently, cb)
-    }
-    rimraf(p, cb)
-  }
-}
-
-function realish (p, cb) {
-  fs.readlink(p, function (er, r) {
-    if (er) return cb(er)
-    return cb(null, path.resolve(path.dirname(p), r))
-  })
-}
-
-function clobberFail (p, g, cb) {
-  var er = new Error("Refusing to delete: "+p+" not in "+g)
-  er.code = "EEXIST"
-  er.path = p
-  return cb(er)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/is-git-url.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-module.exports = isGitUrl
-
-function isGitUrl (url) {
-  switch (url.protocol) {
-    case "git:":
-    case "git+http:":
-    case "git+https:":
-    case "git+rsync:":
-    case "git+ftp:":
-    case "git+ssh:":
-      return true
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/lifecycle.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,357 +0,0 @@
-
-exports = module.exports = lifecycle
-exports.cmd = cmd
-
-var log = require("npmlog")
-  , spawn = require("child_process").spawn
-  , npm = require("../npm.js")
-  , path = require("path")
-  , fs = require("graceful-fs")
-  , chain = require("slide").chain
-  , Stream = require("stream").Stream
-  , PATH = "PATH"
-  , uidNumber = require("uid-number")
-
-// windows calls its path "Path" usually, but this is not guaranteed.
-if (process.platform === "win32") {
-  PATH = "Path"
-  Object.keys(process.env).forEach(function (e) {
-    if (e.match(/^PATH$/i)) {
-      PATH = e
-    }
-  })
-}
-
-function lifecycle (pkg, stage, wd, unsafe, failOk, cb) {
-  if (typeof cb !== "function") cb = failOk, failOk = false
-  if (typeof cb !== "function") cb = unsafe, unsafe = false
-  if (typeof cb !== "function") cb = wd, wd = null
-
-  while (pkg && pkg._data) pkg = pkg._data
-  if (!pkg) return cb(new Error("Invalid package data"))
-
-  log.info(stage, pkg._id)
-  if (!pkg.scripts) pkg.scripts = {}
-
-  validWd(wd || path.resolve(npm.dir, pkg.name), function (er, wd) {
-    if (er) return cb(er)
-
-    unsafe = unsafe || npm.config.get("unsafe-perm")
-
-    if ((wd.indexOf(npm.dir) !== 0 || path.basename(wd) !== pkg.name)
-        && !unsafe && pkg.scripts[stage]) {
-      log.warn( "cannot run in wd", "%s %s (wd=%s)"
-              , pkg._id, pkg.scripts[stage], wd)
-      return cb()
-    }
-
-    // set the env variables, then run scripts as a child process.
-    var env = makeEnv(pkg)
-    env.npm_lifecycle_event = stage
-    env.npm_node_execpath = env.NODE = env.NODE || process.execPath
-    env.npm_execpath = require.main.filename
-
-    // "nobody" typically doesn't have permission to write to /tmp
-    // even if it's never used, sh freaks out.
-    if (!npm.config.get("unsafe-perm")) env.TMPDIR = wd
-
-    lifecycle_(pkg, stage, wd, env, unsafe, failOk, cb)
-  })
-}
-
-function checkForLink (pkg, cb) {
-  var f = path.join(npm.dir, pkg.name)
-  fs.lstat(f, function (er, s) {
-    cb(null, !(er || !s.isSymbolicLink()))
-  })
-}
-
-function lifecycle_ (pkg, stage, wd, env, unsafe, failOk, cb) {
-  var pathArr = []
-    , p = wd.split("node_modules")
-    , acc = path.resolve(p.shift())
-
-  // first add the directory containing the `node` executable currently
-  // running, so that any lifecycle script that invokes "node" will execute
-  // this same one.
-  pathArr.unshift(path.dirname(process.execPath))
-
-  p.forEach(function (pp) {
-    pathArr.unshift(path.join(acc, "node_modules", ".bin"))
-    acc = path.join(acc, "node_modules", pp)
-  })
-  pathArr.unshift(path.join(acc, "node_modules", ".bin"))
-
-  // we also unshift the bundled node-gyp-bin folder so that
-  // the bundled one will be used for installing things.
-  pathArr.unshift(path.join(__dirname, "..", "..", "bin", "node-gyp-bin"))
-
-  if (env[PATH]) pathArr.push(env[PATH])
-  env[PATH] = pathArr.join(process.platform === "win32" ? ";" : ":")
-
-  var packageLifecycle = pkg.scripts && pkg.scripts.hasOwnProperty(stage)
-
-  if (packageLifecycle) {
-    // define this here so it's available to all scripts.
-    env.npm_lifecycle_script = pkg.scripts[stage]
-  }
-
-  if (failOk) {
-    cb = (function (cb_) { return function (er) {
-      if (er) log.warn("continuing anyway", er.message)
-      cb_()
-    }})(cb)
-  }
-
-  if (npm.config.get("force")) {
-    cb = (function (cb_) { return function (er) {
-      if (er) log.info("forced, continuing", er)
-      cb_()
-    }})(cb)
-  }
-
-  chain
-    ( [ packageLifecycle && [runPackageLifecycle, pkg, env, wd, unsafe]
-      , [runHookLifecycle, pkg, env, wd, unsafe] ]
-    , cb )
-}
-
-function validWd (d, cb) {
-  fs.stat(d, function (er, st) {
-    if (er || !st.isDirectory()) {
-      var p = path.dirname(d)
-      if (p === d) {
-        return cb(new Error("Could not find suitable wd"))
-      }
-      return validWd(p, cb)
-    }
-    return cb(null, d)
-  })
-}
-
-function runPackageLifecycle (pkg, env, wd, unsafe, cb) {
-  // run package lifecycle scripts in the package root, or the nearest parent.
-  var stage = env.npm_lifecycle_event
-    , cmd = env.npm_lifecycle_script
-
-  var note = "\n> " + pkg._id + " " + stage + " " + wd
-           + "\n> " + cmd + "\n"
-  runCmd(note, cmd, pkg, env, stage, wd, unsafe, cb)
-}
-
-
-var running = false
-var queue = []
-function dequeue() {
-  running = false
-  if (queue.length) {
-    var r = queue.shift()
-    runCmd.apply(null, r)
-  }
-}
-
-function runCmd (note, cmd, pkg, env, stage, wd, unsafe, cb) {
-  if (running) {
-    queue.push([note, cmd, pkg, env, stage, wd, unsafe, cb])
-    return
-  }
-
-  running = true
-  log.pause()
-  var user = unsafe ? null : npm.config.get("user")
-    , group = unsafe ? null : npm.config.get("group")
-
-  console.log(note)
-  log.verbose("unsafe-perm in lifecycle", unsafe)
-
-  if (process.platform === "win32") {
-    unsafe = true
-  }
-
-  if (unsafe) {
-    runCmd_(cmd, pkg, env, wd, stage, unsafe, 0, 0, cb)
-  } else {
-    uidNumber(user, group, function (er, uid, gid) {
-      runCmd_(cmd, pkg, env, wd, stage, unsafe, uid, gid, cb)
-    })
-  }
-}
-
-function runCmd_ (cmd, pkg, env, wd, stage, unsafe, uid, gid, cb_) {
-
-  function cb (er) {
-    cb_.apply(null, arguments)
-    log.resume()
-    process.nextTick(dequeue)
-  }
-
-  var sh = "sh"
-  var shFlag = "-c"
-
-  if (process.platform === "win32") {
-    sh = "cmd"
-    shFlag = "/c"
-  }
-
-  var conf = { cwd: wd
-             , env: env
-             , stdio: [ 0, 1, 2 ]
-             }
-
-  if (!unsafe) {
-    conf.uid = uid ^ 0
-    conf.gid = gid ^ 0
-  }
-
-  var proc = spawn(sh, [shFlag, cmd], conf)
-  proc.on("close", function (code) {
-    if (code) {
-      var er = new Error("Exit status " + code)
-    }
-    if (er && !npm.ROLLBACK) {
-      log.info(pkg._id, "Failed to exec "+stage+" script")
-      er.message = pkg._id + " "
-                 + stage + ": `" + cmd +"`\n"
-                 + er.message
-      if (er.code !== "EPERM") {
-        er.code = "ELIFECYCLE"
-      }
-      er.pkgid = pkg._id
-      er.stage = stage
-      er.script = cmd
-      er.pkgname = pkg.name
-      return cb(er)
-    } else if (er) {
-      log.error(pkg._id+"."+stage, er)
-      log.error(pkg._id+"."+stage, "continuing anyway")
-      return cb()
-    }
-    cb(er)
-  })
-}
-
-
-function runHookLifecycle (pkg, env, wd, unsafe, cb) {
-  // check for a hook script, run if present.
-  var stage = env.npm_lifecycle_event
-    , hook = path.join(npm.dir, ".hooks", stage)
-    , user = unsafe ? null : npm.config.get("user")
-    , group = unsafe ? null : npm.config.get("group")
-    , cmd = hook
-
-  fs.stat(hook, function (er) {
-    if (er) return cb()
-    var note = "\n> " + pkg._id + " " + stage + " " + wd
-             + "\n> " + cmd
-    runCmd(note, hook, pkg, env, stage, wd, unsafe, cb)
-  })
-}
-
-function makeEnv (data, prefix, env) {
-  prefix = prefix || "npm_package_"
-  if (!env) {
-    env = {}
-    for (var i in process.env) if (!i.match(/^npm_/)) {
-      env[i] = process.env[i]
-    }
-
-    // npat asks for tap output
-    if (npm.config.get("npat")) env.TAP = 1
-
-    // express and others respect the NODE_ENV value.
-    if (npm.config.get("production")) env.NODE_ENV = "production"
-
-  } else if (!data.hasOwnProperty("_lifecycleEnv")) {
-    Object.defineProperty(data, "_lifecycleEnv",
-      { value : env
-      , enumerable : false
-      })
-  }
-
-  for (var i in data) if (i.charAt(0) !== "_") {
-    var envKey = (prefix+i).replace(/[^a-zA-Z0-9_]/g, '_')
-    if (i === "readme") {
-      continue
-    }
-    if (data[i] && typeof(data[i]) === "object") {
-      try {
-        // quick and dirty detection for cyclical structures
-        JSON.stringify(data[i])
-        makeEnv(data[i], envKey+"_", env)
-      } catch (ex) {
-        // usually these are package objects.
-        // just get the path and basic details.
-        var d = data[i]
-        makeEnv( { name: d.name, version: d.version, path:d.path }
-               , envKey+"_", env)
-      }
-    } else {
-      env[envKey] = String(data[i])
-      env[envKey] = -1 !== env[envKey].indexOf("\n")
-                  ? JSON.stringify(env[envKey])
-                  : env[envKey]
-    }
-
-  }
-
-  if (prefix !== "npm_package_") return env
-
-  prefix = "npm_config_"
-  var pkgConfig = {}
-    , keys = npm.config.keys
-    , pkgVerConfig = {}
-    , namePref = data.name + ":"
-    , verPref = data.name + "@" + data.version + ":"
-
-  keys.forEach(function (i) {
-    if (i.charAt(0) === "_" && i.indexOf("_"+namePref) !== 0) {
-      return
-    }
-    var value = npm.config.get(i)
-    if (value instanceof Stream || Array.isArray(value)) return
-    if (!value) value = ""
-    else if (typeof value !== "string") value = JSON.stringify(value)
-
-    value = -1 !== value.indexOf("\n")
-          ? JSON.stringify(value)
-          : value
-    i = i.replace(/^_+/, "")
-    if (i.indexOf(namePref) === 0) {
-      var k = i.substr(namePref.length).replace(/[^a-zA-Z0-9_]/g, "_")
-      pkgConfig[ k ] = value
-    } else if (i.indexOf(verPref) === 0) {
-      var k = i.substr(verPref.length).replace(/[^a-zA-Z0-9_]/g, "_")
-      pkgVerConfig[ k ] = value
-    }
-    var envKey = (prefix+i).replace(/[^a-zA-Z0-9_]/g, "_")
-    env[envKey] = value
-  })
-
-  prefix = "npm_package_config_"
-  ;[pkgConfig, pkgVerConfig].forEach(function (conf) {
-    for (var i in conf) {
-      var envKey = (prefix+i)
-      env[envKey] = conf[i]
-    }
-  })
-
-  return env
-}
-
-function cmd (stage) {
-  function CMD (args, cb) {
-    if (args.length) {
-      chain(args.map(function (p) {
-        return [npm.commands, "run-script", [p, stage]]
-      }), cb)
-    } else npm.commands["run-script"]([stage], cb)
-  }
-  CMD.usage = "npm "+stage+" <name>"
-  var installedShallow = require("./completion/installed-shallow.js")
-  CMD.completion = function (opts, cb) {
-    installedShallow(opts, function (d) {
-      return d.scripts && d.scripts[stage]
-    }, cb)
-  }
-  return CMD
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/link.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,39 +0,0 @@
-
-module.exports = link
-link.ifExists = linkIfExists
-
-var fs = require("graceful-fs")
-  , chain = require("slide").chain
-  , mkdir = require("mkdirp")
-  , rm = require("./gently-rm.js")
-  , path = require("path")
-  , npm = require("../npm.js")
-
-function linkIfExists (from, to, gently, cb) {
-  fs.stat(from, function (er) {
-    if (er) return cb()
-    link(from, to, gently, cb)
-  })
-}
-
-function link (from, to, gently, cb) {
-  if (typeof cb !== "function") cb = gently, gently = null
-  if (npm.config.get("force")) gently = false
-
-  to = path.resolve(to)
-  var target = from = path.resolve(from)
-  if (process.platform !== "win32") {
-    // junctions on windows must be absolute
-    target = path.relative(path.dirname(to), from)
-    // if there is no folder in common, then it will be much
-    // longer, and using a relative link is dumb.
-    if (target.length >= from.length) target = from
-  }
-
-  chain
-    ( [ [fs, "stat", from]
-      , [rm, to, gently]
-      , [mkdir, path.dirname(to)]
-      , [fs, "symlink", target, to, "junction"] ]
-    , cb)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/utils/tar.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,323 +0,0 @@
-// commands for packing and unpacking tarballs
-// this file is used by lib/cache.js
-
-var npm = require("../npm.js")
-  , fs = require("graceful-fs")
-  , path = require("path")
-  , log = require("npmlog")
-  , uidNumber = require("uid-number")
-  , rm = require("rimraf")
-  , readJson = require("read-package-json")
-  , cache = require("../cache.js")
-  , myUid = process.getuid && process.getuid()
-  , myGid = process.getgid && process.getgid()
-  , tar = require("tar")
-  , zlib = require("zlib")
-  , fstream = require("fstream")
-  , Packer = require("fstream-npm")
-  , lifecycle = require("./lifecycle.js")
-
-function lock(path, cb) {
-  return cache.lock('tar://' + path, cb)
-}
-
-function unlock(path, cb) {
-  return cache.unlock('tar://' + path, cb)
-}
-
-if (process.env.SUDO_UID && myUid === 0) {
-  if (!isNaN(process.env.SUDO_UID)) myUid = +process.env.SUDO_UID
-  if (!isNaN(process.env.SUDO_GID)) myGid = +process.env.SUDO_GID
-}
-
-exports.pack = pack
-exports.unpack = unpack
-
-function pack (tarball, folder, pkg, dfc, cb) {
-  log.verbose("tar pack", [tarball, folder])
-  if (typeof cb !== "function") cb = dfc, dfc = false
-
-  log.verbose("tarball", tarball)
-  log.verbose("folder", folder)
-
-  if (dfc) {
-    // do fancy crap
-    return lifecycle(pkg, "prepublish", folder, function (er) {
-      if (er) return cb(er)
-      pack_(tarball, folder, pkg, cb)
-    })
-  } else {
-    pack_(tarball, folder, pkg, cb)
-  }
-}
-
-function pack_ (tarball, folder, pkg, cb_) {
-  var tarballLock = false
-    , folderLock = false
-
-  function cb (er) {
-    if (folderLock)
-      unlock(folder, function() {
-        folderLock = false
-        cb(er)
-      })
-    else if (tarballLock)
-      unlock(tarball, function() {
-        tarballLock = false
-        cb(er)
-      })
-    else
-      cb_(er)
-  }
-
-  lock(folder, function(er) {
-    if (er) return cb(er)
-    folderLock = true
-    next()
-  })
-
-  lock(tarball, function (er) {
-    if (er) return cb(er)
-    tarballLock = true
-    next()
-  })
-
-  function next () {
-    if (!tarballLock || !folderLock) return
-
-    new Packer({ path: folder, type: "Directory", isDirectory: true })
-      .on("error", function (er) {
-        if (er) log.error("tar pack", "Error reading " + folder)
-        return cb(er)
-      })
-
-      // By default, npm includes some proprietary attributes in the
-      // package tarball.  This is sane, and allowed by the spec.
-      // However, npm *itself* excludes these from its own package,
-      // so that it can be more easily bootstrapped using old and
-      // non-compliant tar implementations.
-      .pipe(tar.Pack({ noProprietary: !npm.config.get("proprietary-attribs") }))
-      .on("error", function (er) {
-        if (er) log.error("tar.pack", "tar creation error", tarball)
-        cb(er)
-      })
-      .pipe(zlib.Gzip())
-      .on("error", function (er) {
-        if (er) log.error("tar.pack", "gzip error "+tarball)
-        cb(er)
-      })
-      .pipe(fstream.Writer({ type: "File", path: tarball }))
-      .on("error", function (er) {
-        if (er) log.error("tar.pack", "Could not write "+tarball)
-        cb(er)
-      })
-      .on("close", cb)
-  }
-}
-
-
-function unpack (tarball, unpackTarget, dMode, fMode, uid, gid, cb) {
-  log.verbose("tar unpack", tarball)
-  if (typeof cb !== "function") cb = gid, gid = null
-  if (typeof cb !== "function") cb = uid, uid = null
-  if (typeof cb !== "function") cb = fMode, fMode = npm.modes.file
-  if (typeof cb !== "function") cb = dMode, dMode = npm.modes.exec
-
-  uidNumber(uid, gid, function (er, uid, gid) {
-    if (er) return cb(er)
-    unpack_(tarball, unpackTarget, dMode, fMode, uid, gid, cb)
-  })
-}
-
-function unpack_ ( tarball, unpackTarget, dMode, fMode, uid, gid, cb_ ) {
-  var parent = path.dirname(unpackTarget)
-    , base = path.basename(unpackTarget)
-    , folderLock
-    , tarballLock
-
-  function cb (er) {
-    if (folderLock)
-      unlock(unpackTarget, function() {
-        folderLock = false
-        cb(er)
-      })
-    else if (tarballLock)
-      unlock(tarball, function() {
-        tarballLock = false
-        cb(er)
-      })
-    else
-      cb_(er)
-  }
-
-  lock(unpackTarget, function (er) {
-    if (er) return cb(er)
-    folderLock = true
-    next()
-  })
-
-  lock(tarball, function (er) {
-    if (er) return cb(er)
-    tarballLock = true
-    next()
-  })
-
-  function next() {
-    if (!tarballLock || !folderLock) return
-    rmGunz()
-  }
-
-  function rmGunz () {
-    rm(unpackTarget, function (er) {
-      if (er) return cb(er)
-      gtp()
-    })
-  }
-
-  function gtp () {
-    // gzip {tarball} --decompress --stdout \
-    //   | tar -mvxpf - --strip-components=1 -C {unpackTarget}
-    gunzTarPerm( tarball, unpackTarget
-               , dMode, fMode
-               , uid, gid
-               , function (er, folder) {
-      if (er) return cb(er)
-      readJson(path.resolve(folder, "package.json"), cb)
-    })
-  }
-}
-
-
-function gunzTarPerm (tarball, target, dMode, fMode, uid, gid, cb_) {
-  if (!dMode) dMode = npm.modes.exec
-  if (!fMode) fMode = npm.modes.file
-  log.silly("gunzTarPerm", "modes", [dMode.toString(8), fMode.toString(8)])
-
-  var cbCalled = false
-  function cb (er) {
-    if (cbCalled) return
-    cbCalled = true
-    cb_(er, target)
-  }
-
-  var fst = fs.createReadStream(tarball)
-
-  // figure out who we're supposed to be, if we're not pretending
-  // to be a specific user.
-  if (npm.config.get("unsafe-perm") && process.platform !== "win32") {
-    uid = myUid
-    gid = myGid
-  }
-
-  function extractEntry (entry) {
-    log.silly("gunzTarPerm", "extractEntry", entry.path)
-    // never create things that are user-unreadable,
-    // or dirs that are user-un-listable. Only leads to headaches.
-    var originalMode = entry.mode = entry.mode || entry.props.mode
-    entry.mode = entry.mode | (entry.type === "Directory" ? dMode : fMode)
-    entry.mode = entry.mode & (~npm.modes.umask)
-    entry.props.mode = entry.mode
-    if (originalMode !== entry.mode) {
-      log.silly( "gunzTarPerm", "modified mode"
-               , [entry.path, originalMode, entry.mode])
-    }
-
-    // if there's a specific owner uid/gid that we want, then set that
-    if (process.platform !== "win32" &&
-        typeof uid === "number" &&
-        typeof gid === "number") {
-      entry.props.uid = entry.uid = uid
-      entry.props.gid = entry.gid = gid
-    }
-  }
-
-  var extractOpts = { type: "Directory", path: target, strip: 1 }
-
-  if (process.platform !== "win32" &&
-      typeof uid === "number" &&
-      typeof gid === "number") {
-    extractOpts.uid = uid
-    extractOpts.gid = gid
-  }
-
-  extractOpts.filter = function () {
-    // symbolic links are not allowed in packages.
-    if (this.type.match(/^.*Link$/)) {
-      log.warn( "excluding symbolic link"
-              , this.path.substr(target.length + 1)
-              + " -> " + this.linkpath )
-      return false
-    }
-    return true
-  }
-
-
-  fst.on("error", function (er) {
-    if (er) log.error("tar.unpack", "error reading "+tarball)
-    cb(er)
-  })
-  fst.on("data", function OD (c) {
-    // detect what it is.
-    // Then, depending on that, we'll figure out whether it's
-    // a single-file module, gzipped tarball, or naked tarball.
-    // gzipped files all start with 1f8b08
-    if (c[0] === 0x1F &&
-        c[1] === 0x8B &&
-        c[2] === 0x08) {
-      fst
-        .pipe(zlib.Unzip())
-        .on("error", function (er) {
-          if (er) log.error("tar.unpack", "unzip error "+tarball)
-          cb(er)
-        })
-        .pipe(tar.Extract(extractOpts))
-        .on("entry", extractEntry)
-        .on("error", function (er) {
-          if (er) log.error("tar.unpack", "untar error "+tarball)
-          cb(er)
-        })
-        .on("close", cb)
-    } else if (c.toString().match(/^package\//)) {
-      // naked tar
-      fst
-        .pipe(tar.Extract(extractOpts))
-        .on("entry", extractEntry)
-        .on("error", function (er) {
-          if (er) log.error("tar.unpack", "untar error "+tarball)
-          cb(er)
-        })
-        .on("close", cb)
-    } else {
-      // naked js file
-      var jsOpts = { path: path.resolve(target, "index.js") }
-
-      if (process.platform !== "win32" &&
-          typeof uid === "number" &&
-          typeof gid === "number") {
-        jsOpts.uid = uid
-        jsOpts.gid = gid
-      }
-
-      fst
-        .pipe(fstream.Writer(jsOpts))
-        .on("error", function (er) {
-          if (er) log.error("tar.unpack", "copy error "+tarball)
-          cb(er)
-        })
-        .on("close", function () {
-          var j = path.resolve(target, "package.json")
-          readJson(j, function (er, d) {
-            if (er) {
-              log.error("not a package", tarball)
-              return cb(er)
-            }
-            fs.writeFile(j, JSON.stringify(d) + "\n", cb)
-          })
-        })
-    }
-
-    // now un-hook, and re-emit the chunk
-    fst.removeListener("data", OD)
-    fst.emit("data", c)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/version.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,119 +0,0 @@
-// npm version <newver>
-
-module.exports = version
-
-var exec = require("child_process").execFile
-  , semver = require("semver")
-  , path = require("path")
-  , fs = require("graceful-fs")
-  , chain = require("slide").chain
-  , log = require("npmlog")
-  , which = require("which")
-  , npm = require("./npm.js")
-
-version.usage = "npm version [<newversion> | major | minor | patch]\n"
-              + "\n(run in package dir)\n"
-              + "'npm -v' or 'npm --version' to print npm version "
-              + "("+npm.version+")\n"
-              + "'npm view <pkg> version' to view a package's "
-              + "published version\n"
-              + "'npm ls' to inspect current package/dependency versions"
-
-function version (args, silent, cb_) {
-  if (typeof cb_ !== "function") cb_ = silent, silent = false
-  if (args.length > 1) return cb_(version.usage)
-  fs.readFile(path.join(process.cwd(), "package.json"), function (er, data) {
-    if (!args.length) {
-      var v = {}
-      Object.keys(process.versions).forEach(function (k) {
-        v[k] = process.versions[k]
-      })
-      v.npm = npm.version
-      try {
-        data = JSON.parse(data.toString())
-      } catch (er) {
-        data = null
-      }
-      if (data && data.name && data.version) {
-        v[data.name] = data.version
-      }
-      console.log(v)
-      return cb_()
-    }
-
-    if (er) {
-      log.error("version", "No package.json found")
-      return cb_(er)
-    }
-
-    try {
-      data = JSON.parse(data)
-    } catch (er) {
-      log.error("version", "Bad package.json data")
-      return cb_(er)
-    }
-
-    var newVer = semver.valid(args[0])
-    if (!newVer) newVer = semver.inc(data.version, args[0])
-    if (!newVer) return cb_(version.usage)
-    if (data.version === newVer) return cb_(new Error("Version not changed"))
-    data.version = newVer
-
-    fs.stat(path.join(process.cwd(), ".git"), function (er, s) {
-      function cb (er) {
-        if (!er && !silent) console.log("v" + newVer)
-        cb_(er)
-      }
-
-      var doGit = !er && s.isDirectory()
-      if (!doGit) return write(data, cb)
-      else checkGit(data, cb)
-    })
-  })
-}
-
-function checkGit (data, cb) {
-  var git = npm.config.get("git")
-  var args = [ "status", "--porcelain" ]
-  var env = process.env
-
-  // check for git
-  which(git, function (err) {
-    if (err) {
-      err.code = "ENOGIT"
-      return cb(err)
-    }
-
-    gitFound()
-  })
-
-  function gitFound () {
-    exec(git, args, {env: env}, function (er, stdout, stderr) {
-      var lines = stdout.trim().split("\n").filter(function (line) {
-        return line.trim() && !line.match(/^\?\? /)
-      }).map(function (line) {
-        return line.trim()
-      })
-      if (lines.length) return cb(new Error(
-        "Git working directory not clean.\n"+lines.join("\n")))
-      write(data, function (er) {
-        if (er) return cb(er)
-        var message = npm.config.get("message").replace(/%s/g, data.version)
-          , sign = npm.config.get("sign-git-tag")
-          , flag = sign ? "-sm" : "-am"
-        chain
-          ( [ [ exec, git, [ "add", "package.json" ], {env: process.env} ]
-            , [ exec, git, [ "commit", "-m", message ], {env: process.env} ]
-            , [ exec, git, [ "tag", "v" + data.version, flag, message ]
-              , {env: process.env} ] ]
-          , cb )
-      })
-    })
-  }
-}
-
-function write (data, cb) {
-  fs.writeFile( path.join(process.cwd(), "package.json")
-              , new Buffer(JSON.stringify(data, null, 2) + "\n")
-              , cb )
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/view.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,246 +0,0 @@
-// npm view [pkg [pkg ...]]
-
-module.exports = view
-view.usage = "npm view pkg[@version] [<field>[.subfield]...]"
-
-view.completion = function (opts, cb) {
-  if (opts.conf.argv.remain.length <= 2) {
-    return registry.get("/-/short", cb)
-  }
-  // have the package, get the fields.
-  var tag = npm.config.get("tag")
-  registry.get(opts.conf.argv.remain[2], function (er, d) {
-    if (er) return cb(er)
-    var dv = d.versions[d["dist-tags"][tag]]
-      , fields = []
-    d.versions = Object.keys(d.versions).sort(semver.compareLoose)
-    fields = getFields(d).concat(getFields(dv))
-    cb(null, fields)
-  })
-
-  function getFields (d, f, pref) {
-    f = f || []
-    if (!d) return f
-    pref = pref || []
-    Object.keys(d).forEach(function (k) {
-      if (k.charAt(0) === "_" || k.indexOf(".") !== -1) return
-      var p = pref.concat(k).join(".")
-      f.push(p)
-      if (Array.isArray(d[k])) {
-        return d[k].forEach(function (val, i) {
-          var pi = p + "[" + i + "]"
-          if (val && typeof val === "object") getFields(val, f, [p])
-          else f.push(pi)
-        })
-      }
-      if (typeof d[k] === "object") getFields(d[k], f, [p])
-    })
-    return f
-  }
-}
-
-var npm = require("./npm.js")
-  , registry = npm.registry
-  , log = require("npmlog")
-  , util = require("util")
-  , semver = require("semver")
-
-function view (args, silent, cb) {
-  if (typeof cb !== "function") cb = silent, silent = false
-  if (!args.length) return cb("Usage: "+view.usage)
-  var pkg = args.shift()
-    , nv = pkg.split("@")
-    , name = nv.shift()
-    , version = nv.join("@") || npm.config.get("tag")
-
-  if (name === ".") return cb(view.usage)
-
-  // get the data about this package
-  registry.get(name, 600, function (er, data) {
-    if (er) return cb(er)
-    if (data["dist-tags"].hasOwnProperty(version)) {
-      version = data["dist-tags"][version]
-    }
-    var results = []
-      , error = null
-      , versions = data.versions
-    data.versions = Object.keys(data.versions).sort(semver.compareLoose)
-    if (!args.length) args = [""]
-
-    // remove readme unless we asked for it
-    if (-1 === args.indexOf("readme")) {
-      delete data.readme
-    }
-
-    Object.keys(versions).forEach(function (v) {
-      if (semver.satisfies(v, version, true)) args.forEach(function (args) {
-        // remove readme unless we asked for it
-        if (-1 === args.indexOf("readme")) {
-          delete versions[v].readme
-        }
-        results.push(showFields(data, versions[v], args))
-      })
-    })
-    results = results.reduce(reducer, {})
-    var retval = results
-
-    if (args.length === 1 && args[0] === "") {
-      retval = cleanBlanks(retval)
-      log.silly("cleanup", retval)
-    }
-
-    if (error || silent) cb(error, retval)
-    else printData(results, data._id, cb.bind(null, error, retval))
-  })
-}
-
-function cleanBlanks (obj) {
-  var clean = {}
-  Object.keys(obj).forEach(function (version) {
-    clean[version] = obj[version][""]
-  })
-  return clean
-}
-
-function reducer (l, r) {
-  if (r) Object.keys(r).forEach(function (v) {
-    l[v] = l[v] || {}
-    Object.keys(r[v]).forEach(function (t) {
-      l[v][t] = r[v][t]
-    })
-  })
-  return l
-}
-
-// return whatever was printed
-function showFields (data, version, fields) {
-  var o = {}
-  ;[data, version].forEach(function (s) {
-    Object.keys(s).forEach(function (k) {
-      o[k] = s[k]
-    })
-  })
-  return search(o, fields.split("."), version.version, fields)
-}
-
-function search (data, fields, version, title) {
-  var field
-    , tail = fields
-  while (!field && fields.length) field = tail.shift()
-  fields = [field].concat(tail)
-  if (!field && !tail.length) {
-    var o = {}
-    o[version] = {}
-    o[version][title] = data
-    return o
-  }
-  var index = field.match(/(.+)\[([^\]]+)\]$/)
-  if (index) {
-    field = index[1]
-    index = index[2]
-    if (data.field && data.field.hasOwnProperty(index)) {
-      return search(data[field][index], tail, version, title)
-    } else {
-      field = field + "[" + index + "]"
-    }
-  }
-  if (Array.isArray(data)) {
-    if (data.length === 1) {
-      return search(data[0], fields, version, title)
-    }
-    var results = []
-      , res = null
-    data.forEach(function (data, i) {
-      var tl = title.length
-        , newt = title.substr(0, tl-(fields.join(".").length) - 1)
-               + "["+i+"]" + [""].concat(fields).join(".")
-      results.push(search(data, fields.slice(), version, newt))
-    })
-    results = results.reduce(reducer, {})
-    return results
-  }
-  if (!data.hasOwnProperty(field)) {
-    return
-  }
-  data = data[field]
-  if (tail.length) {
-    if (typeof data === "object") {
-      // there are more fields to deal with.
-      return search(data, tail, version, title)
-    } else {
-      return new Error("Not an object: "+data)
-    }
-  }
-  var o = {}
-  o[version] = {}
-  o[version][title] = data
-  return o
-}
-
-function printData (data, name, cb) {
-  var versions = Object.keys(data)
-    , msg = ""
-    , showVersions = versions.length > 1
-    , showFields
-
-  versions.forEach(function (v, i) {
-    var fields = Object.keys(data[v])
-    showFields = showFields || (fields.length > 1)
-    fields.forEach(function (f) {
-      var d = cleanup(data[v][f])
-      if (showVersions || showFields || typeof d !== "string") {
-        d = cleanup(data[v][f])
-        d = npm.config.get("json")
-          ? JSON.stringify(d, null, 2)
-          : util.inspect(d, false, 5, npm.color)
-      } else if (typeof d === "string" && npm.config.get("json")) {
-        d = JSON.stringify(d)
-      }
-      if (f && showFields) f += " = "
-      if (d.indexOf("\n") !== -1) d = "\n" + d
-      msg += (showVersions ? name + "@" + v + " " : "")
-           + (showFields ? f : "") + d + "\n"
-    })
-  })
-
-  console.log(msg)
-  cb(null, data)
-}
-function cleanup (data) {
-  if (Array.isArray(data)) {
-    if (data.length === 1) {
-      data = data[0]
-    } else {
-      return data.map(cleanup)
-    }
-  }
-  if (!data || typeof data !== "object") return data
-
-  if (typeof data.versions === "object"
-      && data.versions
-      && !Array.isArray(data.versions)) {
-    data.versions = Object.keys(data.versions || {})
-  }
-
-  var keys = Object.keys(data)
-  keys.forEach(function (d) {
-    if (d.charAt(0) === "_") delete data[d]
-    else if (typeof data[d] === "object") data[d] = cleanup(data[d])
-  })
-  keys = Object.keys(data)
-  if (keys.length <= 3
-      && data.name
-      && (keys.length === 1
-          || keys.length === 3 && data.email && data.url
-          || keys.length === 2 && (data.email || data.url))) {
-    data = unparsePerson(data)
-  }
-  return data
-}
-function unparsePerson (d) {
-  if (typeof d === "string") return d
-  return d.name
-       + (d.email ? " <"+d.email+">" : "")
-       + (d.url ? " ("+d.url+")" : "")
-}
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/visnup.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-module.exports = visnup
-var npm = require("./npm.js")
-
-var handsomeFace = [
-  [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 232, 237, 236, 236, 232, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 235, 236, 235, 233, 237, 235, 233, 232, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 0, 0, 0, 0, 0, 232, 235, 233, 232, 235, 235, 234, 233, 236, 232, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 0, 0, 0, 0, 237, 235, 232, 232, 234, 233, 233, 232, 232, 233, 232, 232, 235, 232, 233, 234, 234, 0, 0, 0, 0, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 0, 0, 0, 232, 232, 232, 239, 238, 235, 233, 232, 232, 232, 232, 232, 232, 232, 233, 235, 232, 233, 233, 232, 0, 0, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 234, 234, 232, 233, 234, 233, 234, 235, 233, 235, 60, 238, 238, 234, 234, 233, 234, 233, 238, 251, 246, 233, 233, 232, 0, 0, 0, 0, 0, 0]
- ,[0, 0, 233, 233, 233, 232, 232, 239, 249, 251, 252, 231, 231, 188, 250, 254, 59, 60, 255, 231, 231, 231, 252, 235, 239, 235, 232, 233, 0, 0, 0, 0, 0, 0]
- ,[0, 0, 232, 233, 232, 232, 232, 248, 231, 231, 231, 231, 231, 231, 231, 254, 238, 254, 231, 231, 231, 231, 231, 252, 233, 235, 237, 233, 234, 0, 0, 0, 0, 0]
- ,[0, 0, 233, 232, 232, 232, 248, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 251, 233, 233, 233, 236, 233, 0, 0, 0, 0]
- ,[232, 233, 233, 232, 232, 246, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 249, 233, 234, 234, 0, 0, 0, 0]
- ,[232, 232, 232, 232, 233, 249, 231, 255, 255, 255, 255, 254, 109, 60, 239, 237, 238, 237, 235, 235, 235, 235, 236, 235, 235, 235, 234, 232, 232, 232, 232, 232, 233, 0]
- ,[0, 232, 232, 233, 233, 233, 233, 233, 233, 233, 233, 233, 235, 236, 238, 238, 235, 188, 254, 254, 145, 236, 252, 254, 254, 254, 254, 249, 236, 235, 232, 232, 233, 0]
- ,[0, 0, 233, 237, 249, 239, 233, 252, 231, 231, 231, 231, 231, 231, 254, 235, 235, 254, 231, 231, 251, 235, 237, 231, 231, 231, 231, 7, 237, 235, 232, 233, 233, 0]
- ,[0, 0, 0, 0, 233, 248, 239, 233, 231, 231, 231, 231, 254, 233, 233, 235, 254, 255, 231, 254, 237, 236, 254, 239, 235, 235, 233, 233, 232, 232, 233, 232, 0, 0]
- ,[0, 0, 0, 232, 233, 246, 255, 255, 236, 236, 236, 236, 236, 255, 231, 231, 231, 231, 231, 231, 252, 234, 248, 231, 231, 231, 231, 248, 232, 232, 232, 0, 0, 0]
- ,[0, 0, 0, 0, 235, 237, 7, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 255, 238, 235, 7, 231, 231, 231, 246, 232, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 0, 235, 103, 188, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 252, 232, 238, 231, 231, 255, 244, 232, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 0, 235, 236, 103, 146, 253, 255, 231, 231, 231, 231, 231, 253, 251, 250, 250, 250, 246, 232, 235, 152, 255, 146, 66, 233, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 0, 0, 233, 103, 146, 146, 146, 146, 254, 231, 231, 231, 109, 103, 146, 255, 188, 239, 240, 103, 255, 253, 103, 238, 234, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 0, 0, 232, 235, 109, 146, 146, 146, 146, 146, 252, 152, 146, 146, 146, 146, 146, 146, 146, 146, 146, 146, 103, 235, 233, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 0, 0, 0, 235, 235, 103, 146, 146, 146, 146, 146, 146, 188, 188, 188, 188, 188, 188, 152, 146, 146, 146, 66, 235, 0, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 0, 0, 0, 0, 233, 235, 66, 146, 146, 146, 146, 152, 255, 146, 240, 239, 241, 109, 146, 146, 146, 103, 233, 0, 0, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 0, 0, 0, 0, 0, 234, 237, 109, 146, 146, 146, 146, 146, 254, 231, 231, 188, 146, 146, 146, 103, 233, 0, 0, 0, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 233, 237, 60, 103, 146, 146, 146, 146, 146, 103, 66, 60, 235, 232, 0, 0, 0, 0, 0, 0, 0, 0, 0]
- ,[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 232, 233, 233, 236, 235, 237, 235, 237, 237, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]]
-
-
-function visnup (args, cb) {
-  handsomeFace.forEach(function (line) {
-    console.log(line.map(function (ch) {
-      return "\033[" + (ch ? "48;5;" + ch : ch) + "m"
-    }).join(' '))
-  })
-
-  var c = args.shift()
-  if (c) npm.commands[c](args, cb)
-  else cb()
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/whoami.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-module.exports = whoami
-
-var npm = require("./npm.js")
-
-whoami.usage = "npm whoami\n(just prints the 'username' config)"
-
-function whoami (args, silent, cb) {
-  if (typeof cb !== "function") cb = silent, silent = false
-  var me = npm.config.get("username")
-  msg = me ? me : "Not authed.  Run 'npm adduser'"
-  if (!silent) console.log(msg)
-  process.nextTick(cb.bind(this, null, me))
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/lib/xmas.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,56 +0,0 @@
-// happy xmas
-var npm = require("./npm.js")
-  , log = require("npmlog")
-
-module.exports = function (args, cb) {
-var s = process.platform === "win32" ? " *" : " \u2605"
-  , f = "\uFF0F"
-  , b = "\uFF3C"
-  , x = process.platform === "win32" ? " " : ""
-  , o = [ "\u0069" , "\u0020", "\u0020", "\u0020", "\u0020", "\u0020"
-        , "\u0020", "\u0020", "\u0020", "\u0020", "\u0020", "\u0020"
-        , "\u0020", "\u2E1B","\u2042","\u2E2E","&","@","\uFF61" ]
-  , oc = [21,33,34,35,36,37]
-  , l = "\u005e"
-
-function w (s) { process.stderr.write(s) }
-
-w("\n")
-;(function T (H) {
-  for (var i = 0; i < H; i ++) w(" ")
-  w(x+"\033[33m"+s+"\n")
-  var M = H * 2 - 1
-  for (L = 1; L <= H; L ++) {
-    var O = L * 2 - 2
-    var S = (M - O) / 2
-    for (var i = 0; i < S; i ++) w(" ")
-    w(x+"\033[32m"+f)
-    for (var i = 0; i < O; i ++) w(
-      "\033["+oc[Math.floor(Math.random()*oc.length)]+"m"+
-      o[Math.floor(Math.random() * o.length)]
-    )
-    w(x+"\033[32m"+b+"\n")
-  }
-  w(" ")
-  for (var i = 1; i < H; i ++) w("\033[32m"+l)
-  w("| "+x+" |")
-  for (var i = 1; i < H; i ++) w("\033[32m"+l)
-  if (H > 10) {
-    w("\n ")
-    for (var i = 1; i < H; i ++) w(" ")
-    w("| "+x+" |")
-    for (var i = 1; i < H; i ++) w(" ")
-  }
-})(20)
-w("\n\n")
-log.heading = ''
-log.addLevel('npm', 100000, log.headingStyle)
-log.npm("loves you", "Happy Xmas, Noders!")
-cb()
-}
-var dg=false
-Object.defineProperty(module.exports, "usage", {get:function () {
-  if (dg) module.exports([], function () {})
-  dg = true
-  return " "
-}})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/make.bat	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,3 +0,0 @@
-:: The tests run "make doc" in the prepublish script,
-:: so this file gives windows something that'll exit
-:: successfully, without having to install make.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-README.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,324 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm\fR \-\- node package manager
-.
-.SH "SYNOPSIS"
-This is just enough info to get you up and running\.
-.
-.P
-Much more info available via \fBnpm help\fR once it\'s installed\.
-.
-.SH "IMPORTANT"
-\fBYou need node v0\.8 or higher to run this program\.\fR
-.
-.P
-To install an old \fBand unsupported\fR version of npm that works on node 0\.3
-and prior, clone the git repo and dig through the old tags and branches\.
-.
-.SH "Super Easy Install"
-npm comes with node now\.
-.
-.SS "Windows Computers"
-Get the MSI\.  npm is in it\.
-.
-.SS "Apple Macintosh Computers"
-Get the pkg\.  npm is in it\.
-.
-.SS "Other Sorts of Unices"
-Run \fBmake install\fR\|\.  npm will be installed with node\.
-.
-.P
-If you want a more fancy pants install (a different version, customized
-paths, etc\.) then read on\.
-.
-.SH "Fancy Install (Unix)"
-There\'s a pretty robust install script at \fIhttps://npmjs\.org/install\.sh\fR\|\.  You can download that and run it\.
-.
-.SS "Slightly Fancier"
-You can set any npm configuration params with that script:
-.
-.IP "" 4
-.
-.nf
-npm_config_prefix=/some/path sh install\.sh
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Or, you can run it in uber\-debuggery mode:
-.
-.IP "" 4
-.
-.nf
-npm_debug=1 sh install\.sh
-.
-.fi
-.
-.IP "" 0
-.
-.SS "Even Fancier"
-Get the code with git\.  Use \fBmake\fR to build the docs and do other stuff\.
-If you plan on hacking on npm, \fBmake link\fR is your friend\.
-.
-.P
-If you\'ve got the npm source code, you can also semi\-permanently set
-arbitrary config keys using the \fB\|\./configure \-\-key=val \.\.\.\fR, and then
-run npm commands by doing \fBnode cli\.js <cmd> <args>\fR\|\.  (This is helpful
-for testing, or running stuff without actually installing npm itself\.)
-.
-.SH "Fancy Windows Install"
-You can download a zip file from \fIhttps://npmjs\.org/dist/\fR, and unpack it
-in the same folder where node\.exe lives\.
-.
-.P
-If that\'s not fancy enough for you, then you can fetch the code with
-git, and mess with it directly\.
-.
-.SH "Installing on Cygwin"
-No\.
-.
-.SH "Permissions when Using npm to Install Other Stuff"
-\fBtl;dr\fR
-.
-.IP "\(bu" 4
-Use \fBsudo\fR for greater safety\.  Or don\'t, if you prefer not to\.
-.
-.IP "\(bu" 4
-npm will downgrade permissions if it\'s root before running any build
-scripts that package authors specified\.
-.
-.IP "" 0
-.
-.SS "More details\.\.\."
-As of version 0\.3, it is recommended to run npm as root\.
-This allows npm to change the user identifier to the \fBnobody\fR user prior
-to running any package build or test commands\.
-.
-.P
-If you are not the root user, or if you are on a platform that does not
-support uid switching, then npm will not attempt to change the userid\.
-.
-.P
-If you would like to ensure that npm \fBalways\fR runs scripts as the
-"nobody" user, and have it fail if it cannot downgrade permissions, then
-set the following configuration param:
-.
-.IP "" 4
-.
-.nf
-npm config set unsafe\-perm false
-.
-.fi
-.
-.IP "" 0
-.
-.P
-This will prevent running in unsafe mode, even as non\-root users\.
-.
-.SH "Uninstalling"
-So sad to see you go\.
-.
-.IP "" 4
-.
-.nf
-sudo npm uninstall npm \-g
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Or, if that fails,
-.
-.IP "" 4
-.
-.nf
-sudo make uninstall
-.
-.fi
-.
-.IP "" 0
-.
-.SH "More Severe Uninstalling"
-Usually, the above instructions are sufficient\.  That will remove
-npm, but leave behind anything you\'ve installed\.
-.
-.P
-If you would like to remove all the packages that you have installed,
-then you can use the \fBnpm ls\fR command to find them, and then \fBnpm rm\fR to
-remove them\.
-.
-.P
-To remove cruft left behind by npm 0\.x, you can use the included \fBclean\-old\.sh\fR script file\.  You can run it conveniently like this:
-.
-.IP "" 4
-.
-.nf
-npm explore npm \-g \-\- sh scripts/clean\-old\.sh
-.
-.fi
-.
-.IP "" 0
-.
-.P
-npm uses two configuration files, one for per\-user configs, and another
-for global (every\-user) configs\.  You can view them by doing:
-.
-.IP "" 4
-.
-.nf
-npm config get userconfig   # defaults to ~/\.npmrc
-npm config get globalconfig # defaults to /usr/local/etc/npmrc
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Uninstalling npm does not remove configuration files by default\.  You
-must remove them yourself manually if you want them gone\.  Note that
-this means that future npm installs will not remember the settings that
-you have chosen\.
-.
-.SH "Using npm Programmatically"
-If you would like to use npm programmatically, you can do that\.
-It\'s not very well documented, but it \fIis\fR rather simple\.
-.
-.P
-Most of the time, unless you actually want to do all the things that
-npm does, you should try using one of npm\'s dependencies rather than
-using npm itself, if possible\.
-.
-.P
-Eventually, npm will be just a thin cli wrapper around the modules
-that it depends on, but for now, there are some things that you must
-use npm itself to do\.
-.
-.IP "" 4
-.
-.nf
-var npm = require("npm")
-npm\.load(myConfigObject, function (er) {
-  if (er) return handlError(er)
-  npm\.commands\.install(["some", "args"], function (er, data) {
-    if (er) return commandFailed(er)
-    // command succeeded, and data might have some info
-  })
-  npm\.on("log", function (message) { \.\.\.\. })
-})
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The \fBload\fR function takes an object hash of the command\-line configs\.
-The various \fBnpm\.commands\.<cmd>\fR functions take an \fBarray\fR of
-positional argument \fBstrings\fR\|\.  The last argument to any \fBnpm\.commands\.<cmd>\fR function is a callback\.  Some commands take other
-optional arguments\.  Read the source\.
-.
-.P
-You cannot set configs individually for any single npm function at this
-time\.  Since \fBnpm\fR is a singleton, any call to \fBnpm\.config\.set\fR will
-change the value for \fIall\fR npm commands in that process\.
-.
-.P
-See \fB\|\./bin/npm\-cli\.js\fR for an example of pulling config values off of the
-command line arguments using nopt\.  You may also want to check out \fBnpm
-help config\fR to learn about all the options you can set there\.
-.
-.SH "More Docs"
-Check out the docs \fIhttps://npmjs\.org/doc/\fR,
-especially the faq \fIhttps://npmjs\.org/doc/faq\.html\fR\|\.
-.
-.P
-You can use the \fBnpm help\fR command to read any of them\.
-.
-.P
-If you\'re a developer, and you want to use npm to publish your program,
-you should read this \fIhttps://npmjs\.org/doc/developers\.html\fR
-.
-.SH "Legal Stuff"
-"npm" and "the npm registry" are owned by Isaac Z\. Schlueter\.
-All rights reserved\.  See the included LICENSE file for more details\.
-.
-.P
-"Node\.js" and "node" are trademarks owned by Joyent, Inc\.  npm is not
-officially part of the Node\.js project, and is neither owned by nor
-officially affiliated with Joyent, Inc\.
-.
-.P
-The packages in the npm registry are not part of npm itself, and are the
-sole property of their respective maintainers\.  While every effort is
-made to ensure accountability, there is absolutely no guarantee,
-warrantee, or assertion made as to the quality, fitness for a specific
-purpose, or lack of malice in any given npm package\.  Modules
-published on the npm registry are not affiliated with or endorsed by
-Joyent, Inc\., Isaac Z\. Schlueter, Ryan Dahl, or the Node\.js project\.
-.
-.P
-If you have a complaint about a package in the npm registry, and cannot
-resolve it with the package owner, please express your concerns to
-Isaac Z\. Schlueter at \fIi@izs\.me\fR\|\.
-.
-.SS "In plain english"
-This is mine; not my employer\'s, not Node\'s, not Joyent\'s, not Ryan
-Dahl\'s\.
-.
-.P
-If you publish something, it\'s yours, and you are solely accountable
-for it\.  Not me, not Node, not Joyent, not Ryan Dahl\.
-.
-.P
-If other people publish something, it\'s theirs\.  Not mine, not Node\'s,
-not Joyent\'s, not Ryan Dahl\'s\.
-.
-.P
-Yes, you can publish something evil\.  It will be removed promptly if
-reported, and we\'ll lose respect for you\.  But there is no vetting
-process for published modules\.
-.
-.P
-If this concerns you, inspect the source before using packages\.
-.
-.SH "BUGS"
-When you find issues, please report them:
-.
-.IP "\(bu" 4
-web: \fIhttps://github\.com/isaacs/npm/issues\fR
-.
-.IP "\(bu" 4
-email: \fInpm\-@googlegroups\.com\fR
-.
-.IP "" 0
-.
-.P
-Be sure to include \fIall\fR of the output from the npm command that didn\'t work
-as expected\.  The \fBnpm\-debug\.log\fR file is also helpful to provide\.
-.
-.P
-You can also look for isaacs in #node\.js on irc://irc\.freenode\.net\.  He
-will no doubt tell you to put the output in a gist or email\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help npm
-.
-.IP "\(bu" 4
-npm help  faq
-.
-.IP "\(bu" 4
-npm help help
-.
-.IP "\(bu" 4
-npm help  index
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-adduser.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,63 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-ADDUSER" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-adduser\fR \-\- Add a registry user account
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm adduser
-.
-.fi
-.
-.SH "DESCRIPTION"
-Create or verify a user named \fB<username>\fR in the npm registry, and
-save the credentials to the \fB\|\.npmrc\fR file\.
-.
-.P
-The username, password, and email are read in from prompts\.
-.
-.P
-You may use this command to change your email address, but not username
-or password\.
-.
-.P
-To reset your password, go to \fIhttp://admin\.npmjs\.org/\fR
-.
-.P
-You may use this command multiple times with the same user account to
-authorize on a new machine\.
-.
-.SH "CONFIGURATION"
-.
-.SS "registry"
-Default: http://registry\.npmjs\.org/
-.
-.P
-The base URL of the npm package registry\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help owner
-.
-.IP "\(bu" 4
-npm help whoami
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-bin.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,40 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-BIN" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-bin\fR \-\- Display npm bin folder
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm bin
-.
-.fi
-.
-.SH "DESCRIPTION"
-Print the folder where npm will install executables\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help prefix
-.
-.IP "\(bu" 4
-npm help root
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-bugs.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,76 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-BUGS" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-bugs\fR \-\- Bugs for a package in a web browser maybe
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm bugs <pkgname>
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command tries to guess at the likely location of a package\'s
-bug tracker URL, and then tries to open it using the \fB\-\-browser\fR
-config param\.
-.
-.SH "CONFIGURATION"
-.
-.SS "browser"
-.
-.IP "\(bu" 4
-Default: OS X: \fB"open"\fR, Windows: \fB"start"\fR, Others: \fB"xdg\-open"\fR
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-The browser that is called by the \fBnpm bugs\fR command to open websites\.
-.
-.SS "registry"
-.
-.IP "\(bu" 4
-Default: https://registry\.npmjs\.org/
-.
-.IP "\(bu" 4
-Type: url
-.
-.IP "" 0
-.
-.P
-The base URL of the npm package registry\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help docs
-.
-.IP "\(bu" 4
-npm help view
-.
-.IP "\(bu" 4
-npm help publish
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-build.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,43 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-BUILD" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-build\fR \-\- Build a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm build <package\-folder>
-.
-.fi
-.
-.IP "\(bu" 4
-\fB<package\-folder>\fR:
-A folder containing a \fBpackage\.json\fR file in its root\.
-.
-.IP "" 0
-.
-.SH "DESCRIPTION"
-This is the plumbing command called by \fBnpm link\fR and \fBnpm install\fR\|\.
-.
-.P
-It should generally not be called directly\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help link
-.
-.IP "\(bu" 4
-npm help  scripts
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-bundle.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-BUNDLE" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-bundle\fR \-\- REMOVED
-.
-.SH "DESCRIPTION"
-The \fBnpm bundle\fR command has been removed in 1\.0, for the simple reason
-that it is no longer necessary, as the default behavior is now to
-install packages into the local space\.
-.
-.P
-Just use \fBnpm install\fR now to do what \fBnpm bundle\fR used to do\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-cache.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,104 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-CACHE" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-cache\fR \-\- Manipulates packages cache
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm cache add <tarball file>
-npm cache add <folder>
-npm cache add <tarball url>
-npm cache add <name>@<version>
-npm cache ls [<path>]
-npm cache clean [<path>]
-.
-.fi
-.
-.SH "DESCRIPTION"
-Used to add, list, or clear the npm cache folder\.
-.
-.IP "\(bu" 4
-add:
-Add the specified package to the local cache\.  This command is primarily
-intended to be used internally by npm, but it can provide a way to
-add data to the local installation cache explicitly\.
-.
-.IP "\(bu" 4
-ls:
-Show the data in the cache\.  Argument is a path to show in the cache
-folder\.  Works a bit like the \fBfind\fR program, but limited by the \fBdepth\fR config\.
-.
-.IP "\(bu" 4
-clean:
-Delete data out of the cache folder\.  If an argument is provided, then
-it specifies a subpath to delete\.  If no argument is provided, then
-the entire cache is cleared\.
-.
-.IP "" 0
-.
-.SH "DETAILS"
-npm stores cache data in the directory specified in \fBnpm config get cache\fR\|\.
-For each package that is added to the cache, three pieces of information are
-stored in \fB{cache}/{name}/{version}\fR:
-.
-.IP "\(bu" 4
-\|\.\.\./package/:
-A folder containing the package contents as they appear in the tarball\.
-.
-.IP "\(bu" 4
-\|\.\.\./package\.json:
-The package\.json file, as npm sees it, with overlays applied and a _id attribute\.
-.
-.IP "\(bu" 4
-\|\.\.\./package\.tgz:
-The tarball for that version\.
-.
-.IP "" 0
-.
-.P
-Additionally, whenever a registry request is made, a \fB\|\.cache\.json\fR file
-is placed at the corresponding URI, to store the ETag and the requested
-data\.
-.
-.P
-Commands that make non\-essential registry requests (such as \fBsearch\fR and \fBview\fR, or the completion scripts) generally specify a minimum timeout\.
-If the \fB\|\.cache\.json\fR file is younger than the specified timeout, then
-they do not make an HTTP request to the registry\.
-.
-.SH "CONFIGURATION"
-.
-.SS "cache"
-Default: \fB~/\.npm\fR on Posix, or \fB%AppData%/npm\-cache\fR on Windows\.
-.
-.P
-The root cache folder\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help publish
-.
-.IP "\(bu" 4
-npm help pack
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-completion.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,47 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-COMPLETION" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-completion\fR \-\- Tab Completion for npm
-.
-.SH "SYNOPSIS"
-.
-.nf
-\|\. <(npm completion)
-.
-.fi
-.
-.SH "DESCRIPTION"
-Enables tab\-completion in all npm commands\.
-.
-.P
-The synopsis above
-loads the completions into your current shell\.  Adding it to
-your ~/\.bashrc or ~/\.zshrc will make the completions available
-everywhere\.
-.
-.P
-You may of course also pipe the output of npm completion to a file
-such as \fB/usr/local/etc/bash_completion\.d/npm\fR if you have a system
-that will read that file for you\.
-.
-.P
-When \fBCOMP_CWORD\fR, \fBCOMP_LINE\fR, and \fBCOMP_POINT\fR are defined in the
-environment, \fBnpm completion\fR acts in "plumbing mode", and outputs
-completions based on the arguments\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  developers
-.
-.IP "\(bu" 4
-npm help  faq
-.
-.IP "\(bu" 4
-npm help npm
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-config.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,113 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-CONFIG" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-config\fR \-\- Manage the npm configuration files
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm config set <key> <value> [\-\-global]
-npm config get <key>
-npm config delete <key>
-npm config list
-npm config edit
-npm c [set|get|delete|list]
-npm get <key>
-npm set <key> <value> [\-\-global]
-.
-.fi
-.
-.SH "DESCRIPTION"
-npm gets its config settings from the command line, environment
-variables, \fBnpmrc\fR files, and in some cases, the \fBpackage\.json\fR file\.
-.
-.P
-npm help  See npmrc for more information about the npmrc files\.
-.
-.P
-npm help  See \fBnpm\-config\fR for a more thorough discussion of the mechanisms
-involved\.
-.
-.P
-The \fBnpm config\fR command can be used to update and edit the contents
-of the user and global npmrc files\.
-.
-.SH "Sub\-commands"
-Config supports the following sub\-commands:
-.
-.SS "set"
-.
-.nf
-npm config set key value
-.
-.fi
-.
-.P
-Sets the config key to the value\.
-.
-.P
-If value is omitted, then it sets it to "true"\.
-.
-.SS "get"
-.
-.nf
-npm config get key
-.
-.fi
-.
-.P
-Echo the config value to stdout\.
-.
-.SS "list"
-.
-.nf
-npm config list
-.
-.fi
-.
-.P
-Show all the config settings\.
-.
-.SS "delete"
-.
-.nf
-npm config delete key
-.
-.fi
-.
-.P
-Deletes the key from all configuration files\.
-.
-.SS "edit"
-.
-.nf
-npm config edit
-.
-.fi
-.
-.P
-Opens the config file in an editor\.  Use the \fB\-\-global\fR flag to edit the
-global config\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help npm
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-dedupe.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,96 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-DEDUPE" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-dedupe\fR \-\- Reduce duplication
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm dedupe [package names\.\.\.]
-npm ddp [package names\.\.\.]
-.
-.fi
-.
-.SH "DESCRIPTION"
-Searches the local package tree and attempts to simplify the overall
-structure by moving dependencies further up the tree, where they can
-be more effectively shared by multiple dependent packages\.
-.
-.P
-For example, consider this dependency graph:
-.
-.IP "" 4
-.
-.nf
-a
-+\-\- b <\-\- depends on c@1\.0\.x
-|   `\-\- c@1\.0\.3
-`\-\- d <\-\- depends on c@~1\.0\.9
-    `\-\- c@1\.0\.10
-.
-.fi
-.
-.IP "" 0
-.
-.P
-npm help In this case, \fBnpm\-dedupe\fR will transform the tree to:
-.
-.IP "" 4
-.
-.nf
-a
-+\-\- b
-+\-\- d
-`\-\- c@1\.0\.10
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Because of the hierarchical nature of node\'s module lookup, b and d
-will both get their dependency met by the single c package at the root
-level of the tree\.
-.
-.P
-If a suitable version exists at the target location in the tree
-already, then it will be left untouched, but the other duplicates will
-be deleted\.
-.
-.P
-If no suitable version can be found, then a warning is printed, and
-nothing is done\.
-.
-.P
-If any arguments are supplied, then they are filters, and only the
-named packages will be touched\.
-.
-.P
-Note that this operation transforms the dependency tree, and may
-result in packages getting updated versions, perhaps from the npm
-registry\.
-.
-.P
-This feature is experimental, and may change in future versions\.
-.
-.P
-The \fB\-\-tag\fR argument will apply to all of the affected dependencies\. If a
-tag with the given name exists, the tagged version is preferred over newer
-versions\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help ls
-.
-.IP "\(bu" 4
-npm help update
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-deprecate.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,48 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-DEPRECATE" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-deprecate\fR \-\- Deprecate a version of a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm deprecate <name>[@<version>] <message>
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command will update the npm registry entry for a package, providing
-a deprecation warning to all who attempt to install it\.
-.
-.P
-It works on version ranges as well as specific versions, so you can do
-something like this:
-.
-.IP "" 4
-.
-.nf
-npm deprecate my\-thing@"< 0\.2\.3" "critical bug fixed in v0\.2\.3"
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Note that you must be the package owner to deprecate something\.  See the \fBowner\fR and \fBadduser\fR help topics\.
-.
-.P
-To un\-deprecate a package, specify an empty string (\fB""\fR) for the \fBmessage\fR argument\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help publish
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-docs.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,74 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-DOCS" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-docs\fR \-\- Docs for a package in a web browser maybe
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm docs <pkgname>
-npm home <pkgname>
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command tries to guess at the likely location of a package\'s
-documentation URL, and then tries to open it using the \fB\-\-browser\fR
-config param\.
-.
-.SH "CONFIGURATION"
-.
-.SS "browser"
-.
-.IP "\(bu" 4
-Default: OS X: \fB"open"\fR, Windows: \fB"start"\fR, Others: \fB"xdg\-open"\fR
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-The browser that is called by the \fBnpm docs\fR command to open websites\.
-.
-.SS "registry"
-.
-.IP "\(bu" 4
-Default: https://registry\.npmjs\.org/
-.
-.IP "\(bu" 4
-Type: url
-.
-.IP "" 0
-.
-.P
-The base URL of the npm package registry\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help view
-.
-.IP "\(bu" 4
-npm help publish
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-edit.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,66 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-EDIT" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-edit\fR \-\- Edit an installed package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm edit <name>[@<version>]
-.
-.fi
-.
-.SH "DESCRIPTION"
-Opens the package folder in the default editor (or whatever you\'ve
-configured as the npm \fBeditor\fR config \-\- see \fBnpm help config\fR\|\.)
-.
-.P
-After it has been edited, the package is rebuilt so as to pick up any
-changes in compiled packages\.
-.
-.P
-For instance, you can do \fBnpm install connect\fR to install connect
-into your package, and then \fBnpm edit connect\fR to make a few
-changes to your locally installed copy\.
-.
-.SH "CONFIGURATION"
-.
-.SS "editor"
-.
-.IP "\(bu" 4
-Default: \fBEDITOR\fR environment variable if set, or \fB"vi"\fR on Posix,
-or \fB"notepad"\fR on Windows\.
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-The command to run for \fBnpm edit\fR or \fBnpm config edit\fR\|\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help explore
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-explore.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,76 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-EXPLORE" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-explore\fR \-\- Browse an installed package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm explore <name>[@<version>] [ \-\- <cmd>]
-.
-.fi
-.
-.SH "DESCRIPTION"
-Spawn a subshell in the directory of the installed package specified\.
-.
-.P
-If a command is specified, then it is run in the subshell, which then
-immediately terminates\.
-.
-.P
-This is particularly handy in the case of git submodules in the \fBnode_modules\fR folder:
-.
-.IP "" 4
-.
-.nf
-npm explore some\-dependency \-\- git pull origin master
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Note that the package is \fInot\fR automatically rebuilt afterwards, so be
-sure to use \fBnpm rebuild <pkg>\fR if you make any changes\.
-.
-.SH "CONFIGURATION"
-.
-.SS "shell"
-.
-.IP "\(bu" 4
-Default: SHELL environment variable, or "bash" on Posix, or "cmd" on
-Windows
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-The shell to run for the \fBnpm explore\fR command\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help submodule
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help edit
-.
-.IP "\(bu" 4
-npm help rebuild
-.
-.IP "\(bu" 4
-npm help build
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-help-search.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-HELP\-SEARCH" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-help-search\fR \-\- Search npm help documentation
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm help\-search some search terms
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command will search the npm markdown documentation files for the
-terms provided, and then list the results, sorted by relevance\.
-.
-.P
-If only one result is found, then it will show that help topic\.
-.
-.P
-If the argument to \fBnpm help\fR is not a known help topic, then it will
-call \fBhelp\-search\fR\|\.  It is rarely if ever necessary to call this
-command directly\.
-.
-.SH "CONFIGURATION"
-.
-.SS "long"
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "" 0
-.
-.P
-If true, the "long" flag will cause help\-search to output context around
-where the terms were found in the documentation\.
-.
-.P
-If false, then help\-search will just list out the help topics found\.
-.
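The behavior described above can be sketched as a simple relevance sort: rank documentation topics by how often the search terms appear. This is an illustrative toy (npm's real scoring is more involved), and the `docs` contents are made-up stand-ins.

```python
# Toy sketch of help-search: rank doc files by term frequency.
# Ties are broken by topic name (descending), a simplification.

def help_search(terms, docs):
    """docs maps topic name -> body text; returns topics sorted by relevance."""
    scored = []
    for topic, body in docs.items():
        hits = sum(body.lower().count(t.lower()) for t in terms)
        if hits:                      # drop topics with no matches at all
            scored.append((hits, topic))
    return [topic for hits, topic in sorted(scored, reverse=True)]

docs = {
    "npm-install": "install a package and its dependencies",
    "npm-link": "symlink a package folder for development",
    "npm-rm": "remove a package",
}
print(help_search(["package", "install"], docs))
```

As the text notes, when only one topic matches, npm goes further and shows that help page directly.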
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help npm
-.
-.IP "\(bu" 4
-npm help  faq
-.
-.IP "\(bu" 4
-npm help help
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-help.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,77 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-HELP" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-help\fR \-\- Get help on npm
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm help <topic>
-npm help some search terms
-.
-.fi
-.
-.SH "DESCRIPTION"
-If supplied a topic, then show the appropriate documentation page\.
-.
-.P
-If the topic does not exist, or if multiple terms are provided, then run
-the \fBhelp\-search\fR command to find a match\.  Note that, if \fBhelp\-search\fR
-finds a single subject, then it will run \fBhelp\fR on that topic, so unique
-matches are equivalent to specifying a topic name\.
-.
-.SH "CONFIGURATION"
-.
-.SS "viewer"
-.
-.IP "\(bu" 4
-Default: "man" on Posix, "browser" on Windows
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-The program to use to view help content\.
-.
-.P
-Set to \fB"browser"\fR to view html help content in the default web browser\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help npm
-.
-.IP "\(bu" 4
-README
-.
-.IP "\(bu" 4
-npm help  faq
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-npm help help\-search
-.
-.IP "\(bu" 4
-npm help  index
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-init.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,43 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-INIT" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-init\fR \-\- Interactively create a package\.json file
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm init
-.
-.fi
-.
-.SH "DESCRIPTION"
-This will ask you a bunch of questions, and then write a package\.json for you\.
-.
-.P
-It attempts to make reasonable guesses about what you want things to be set to,
-and then writes a package\.json file with the options you\'ve selected\.
-.
-.P
-If you already have a package\.json file, it\'ll read that first, and default to
-the options in there\.
-.
-.P
-It is strictly additive, so it does not delete options from your package\.json
-without a really good reason to do so\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-\fIhttps://github\.com/isaacs/init\-package\-json\fR
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-npm help version
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-install.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,430 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-INSTALL" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-install\fR \-\- Install a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm install (with no args in a package dir)
-npm install <tarball file>
-npm install <tarball url>
-npm install <folder>
-npm install <name> [\-\-save|\-\-save\-dev|\-\-save\-optional]
-npm install <name>@<tag>
-npm install <name>@<version>
-npm install <name>@<version range>
-npm i (with any of the previous argument usage)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command installs a package, and any packages that it depends on\. If the
-package has a shrinkwrap file, the installation of dependencies will be driven
-by that\. See \fBnpm help shrinkwrap\fR\.
-.
-.P
-A \fBpackage\fR is:
-.
-.IP "\(bu" 4
-a) a folder containing a program described by a package\.json file
-.
-.IP "\(bu" 4
-b) a gzipped tarball containing (a)
-.
-.IP "\(bu" 4
-c) a url that resolves to (b)
-.
-.IP "\(bu" 4
-d) a \fB<name>@<version>\fR that is published on the registry with (c)
-.
-.IP "\(bu" 4
-e) a \fB<name>@<tag>\fR that points to (d)
-.
-.IP "\(bu" 4
-f) a \fB<name>\fR that has a "latest" tag satisfying (e)
-.
-.IP "\(bu" 4
-g) a \fB<git remote url>\fR that resolves to (b)
-.
-.IP "" 0
-.
-.P
-Even if you never publish your package, you can still get a lot of
-benefits of using npm if you just want to write a node program (a), and
-perhaps if you also want to be able to easily install it elsewhere
-after packing it up into a tarball (b)\.
-.
-.IP "\(bu" 4
-\fBnpm install\fR (in package directory, no arguments):
-.
-.IP
-Install the dependencies in the local node_modules folder\.
-.
-.IP
-In global mode (ie, with \fB\-g\fR or \fB\-\-global\fR appended to the command),
-it installs the current package context (ie, the current working
-directory) as a global package\.
-.
-.IP
-By default, \fBnpm install\fR will install all modules listed as
-dependencies\. With the \fB\-\-production\fR flag,
-npm will not install modules listed in \fBdevDependencies\fR\|\.
-.
-.IP "\(bu" 4
-\fBnpm install <folder>\fR:
-.
-.IP
-Install a package that is sitting in a folder on the filesystem\.
-.
-.IP "\(bu" 4
-\fBnpm install <tarball file>\fR:
-.
-.IP
-Install a package that is sitting on the filesystem\.  Note: if you just want
-to link a dev directory into your npm root, you can do this more easily by
-using \fBnpm link\fR\|\.
-.
-.IP
-Example:
-.
-.IP "" 4
-.
-.nf
-  npm install \./package\.tgz
-.
-.fi
-.
-.IP "" 0
-
-.
-.IP "\(bu" 4
-\fBnpm install <tarball url>\fR:
-.
-.IP
-Fetch the tarball url, and then install it\.  In order to distinguish between
-this and other options, the argument must start with "http://" or "https://"
-.
-.IP
-Example:
-.
-.IP "" 4
-.
-.nf
-  npm install https://github\.com/indexzero/forever/tarball/v0\.5\.6
-.
-.fi
-.
-.IP "" 0
-
-.
-.IP "\(bu" 4
-\fBnpm install <name> [\-\-save|\-\-save\-dev|\-\-save\-optional]\fR:
-.
-.IP
-Do a \fB<name>@<tag>\fR install, where \fB<tag>\fR is the "tag" config\. (See \fBnpm help config\fR\|\.)
-.
-.IP
-In most cases, this will install the latest version
-of the module published on npm\.
-.
-.IP
-Example:
-.
-.IP
-      npm install sax
-.
-.IP
-\fBnpm install\fR takes 3 exclusive, optional flags which save or update
-the package version in your main package\.json:
-.
-.IP "\(bu" 4
-\fB\-\-save\fR: Package will appear in your \fBdependencies\fR\|\.
-.
-.IP "\(bu" 4
-\fB\-\-save\-dev\fR: Package will appear in your \fBdevDependencies\fR\|\.
-.
-.IP "\(bu" 4
-\fB\-\-save\-optional\fR: Package will appear in your \fBoptionalDependencies\fR\|\.
-.
-.IP
-Examples:
-.
-.IP
-  npm install sax \-\-save
-  npm install node\-tap \-\-save\-dev
-  npm install dtrace\-provider \-\-save\-optional
-.
-.IP
-\fBNote\fR: If there is a file or folder named \fB<name>\fR in the current
-working directory, then it will try to install that, and only try to
-fetch the package by name if it is not valid\.
-.
-.IP "" 0
-
-.
-.IP "\(bu" 4
-\fBnpm install <name>@<tag>\fR:
-.
-.IP
-Install the version of the package that is referenced by the specified tag\.
-If the tag does not exist in the registry data for that package, then this
-will fail\.
-.
-.IP
-Example:
-.
-.IP "" 4
-.
-.nf
-  npm install sax@latest
-.
-.fi
-.
-.IP "" 0
-
-.
-.IP "\(bu" 4
-\fBnpm install <name>@<version>\fR:
-.
-.IP
-Install the specified version of the package\.  This will fail if the version
-has not been published to the registry\.
-.
-.IP
-Example:
-.
-.IP "" 4
-.
-.nf
-  npm install sax@0\.1\.1
-.
-.fi
-.
-.IP "" 0
-
-.
-.IP "\(bu" 4
-\fBnpm install <name>@<version range>\fR:
-.
-.IP
-Install a version of the package matching the specified version range\.  This
-will follow the same rules for resolving dependencies described in \fBnpm help package\.json\fR\|\.
-.
-.IP
-Note that most version ranges must be put in quotes so that your shell will
-treat it as a single argument\.
-.
-.IP
-Example:
-.
-.IP
-      npm install sax@">=0\.1\.0 <0\.2\.0"
-.
-.IP "\(bu" 4
-\fBnpm install <git remote url>\fR:
-.
-.IP
-Install a package by cloning a git remote url\.  The format of the git
-url is:
-.
-.IP
-      <protocol>://[<user>@]<hostname><separator><path>[#<commit\-ish>]
-.
-.IP
-\fB<protocol>\fR is one of \fBgit\fR, \fBgit+ssh\fR, \fBgit+http\fR, or \fBgit+https\fR\|\.  If no \fB<commit\-ish>\fR is specified, then \fBmaster\fR is
-used\.
-.
-.IP
-Examples:
-.
-.IP "" 4
-.
-.nf
-  git+ssh://git@github\.com:isaacs/npm\.git#v1\.0\.27
-  git+https://isaacs@github\.com/isaacs/npm\.git
-  git://github\.com/isaacs/npm\.git#v1\.0\.27
-.
-.fi
-.
-.IP "" 0
-
-.
-.IP "" 0
-.
-.P
-You may combine multiple arguments, and even multiple types of arguments\.
-For example:
-.
-.IP "" 4
-.
-.nf
-npm install sax@">=0\.1\.0 <0\.2\.0" bench supervisor
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The \fB\-\-tag\fR argument will apply to all of the specified install targets\. If a
-tag with the given name exists, the tagged version is preferred over newer
-versions\.
-.
-.P
-The \fB\-\-force\fR argument will force npm to fetch remote resources even if a
-local copy exists on disk\.
-.
-.IP "" 4
-.
-.nf
-npm install sax \-\-force
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The \fB\-\-global\fR argument will cause npm to install the package globally
-rather than locally\.  See \fBnpm help folders\fR\|\.
-.
-.P
-The \fB\-\-link\fR argument will cause npm to link global installs into the
-local space in some cases\.
-.
-.P
-The \fB\-\-no\-bin\-links\fR argument will prevent npm from creating symlinks for
-any binaries the package might contain\.
-.
-.P
-The \fB\-\-no\-shrinkwrap\fR argument, which will ignore an available
-shrinkwrap file and use the package\.json instead\.
-.
-.P
-The \fB\-\-nodedir=/path/to/node/source\fR argument will allow npm to find the
-node source code so that npm can compile native modules\.
-.
-.P
-See \fBnpm help config\fR\|\.  Many of the configuration params have some
-effect on installation, since that\'s most of what npm does\.
-.
-.SH "ALGORITHM"
-To install a package, npm uses the following algorithm:
-.
-.IP "" 4
-.
-.nf
-install(where, what, family, ancestors)
-fetch what, unpack to <where>/node_modules/<what>
-for each dep in what\.dependencies
-  resolve dep to precise version
-for each dep@version in what\.dependencies
-    not in <where>/node_modules/<what>/node_modules/*
-    and not in <family>
-  add precise version deps to <family>
-  install(<where>/node_modules/<what>, dep, family)
-.
-.fi
-.
-.IP "" 0
-.
-.P
-For this \fBpackage{dep}\fR structure: \fBA{B,C}, B{C}, C{D}\fR,
-this algorithm produces:
-.
-.IP "" 4
-.
-.nf
-A
-+\-\- B
-`\-\- C
-    `\-\- D
-.
-.fi
-.
-.IP "" 0
-.
-.P
-That is, the dependency from B to C is satisfied by the fact that A
-already caused C to be installed at a higher level\.
-.
-.P
-See \fBnpm help folders\fR for a more detailed description of the specific
-folder structures that npm creates\.
-.
-.SS "Limitations of npm\'s Install Algorithm"
-There are some very rare and pathological edge\-cases where a cycle can
-cause npm to try to install a never\-ending tree of packages\.  Here is
-the simplest case:
-.
-.IP "" 4
-.
-.nf
-A \-> B \-> A\' \-> B\' \-> A \-> B \-> A\' \-> B\' \-> A \-> \.\.\.
-.
-.fi
-.
-.IP "" 0
-.
-.P
-where \fBA\fR is some version of a package, and \fBA\'\fR is a different version
-of the same package\.  Because \fBB\fR depends on a different version of \fBA\fR
-than the one that is already in the tree, it must install a separate
-copy\.  The same is true of \fBA\'\fR, which must install \fBB\'\fR\|\.  Because \fBB\'\fR
-depends on the original version of \fBA\fR, which has been overridden, the
-cycle falls into infinite regress\.
-.
-.P
-To avoid this situation, npm flat\-out refuses to install any \fBname@version\fR that is already present anywhere in the tree of package
-folder ancestors\.  A more correct, but more complex, solution would be
-to symlink the existing version into the new location\.  If this ever
-affects a real use\-case, it will be investigated\.
-.
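The ALGORITHM pseudocode above can be rendered as a few lines of Python. One assumption is baked in for brevity: each dependency name resolves to exactly one precise version, supplied in a `registry` dict (real npm resolves semver ranges against the registry). The `d not in family` filter is also the refusal rule from the Limitations section: a name already present among the ancestors is never installed again, which is what stops the A -> B -> A' regress.

```python
# Rough Python rendering of npm's install algorithm (sketch, one
# precise version per name; real npm resolves ranges first).

def install(what, registry, family, tree):
    tree[what] = {}                          # fetch what, unpack under node_modules
    deps = [d for d in registry.get(what, []) if d not in family]
    family = family | set(deps)              # "add precise version deps to <family>"
    for dep in deps:                         # recurse one level down the tree
        install(dep, registry, family, tree[what])

# The package{dep} structure from the text: A{B,C}, B{C}, C{D}
registry = {"A": ["B", "C"], "B": ["C"], "C": ["D"], "D": []}
tree = {}
install("A", registry, {"A"}, tree)
print(tree)  # B's dependency on C is satisfied by the copy A installed
```

This produces the flat `A{B, C{D}}` shape shown above: C lands beside B rather than under it, because A's own dependency list already placed C in the family before B was recursed into.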
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help update
-.
-.IP "\(bu" 4
-npm help link
-.
-.IP "\(bu" 4
-npm help rebuild
-.
-.IP "\(bu" 4
-npm help  scripts
-.
-.IP "\(bu" 4
-npm help build
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help tag
-.
-.IP "\(bu" 4
-npm help rm
-.
-.IP "\(bu" 4
-npm help shrinkwrap
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-link.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,119 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-LINK" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-link\fR \-\- Symlink a package folder
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm link (in package folder)
-npm link <pkgname>
-npm ln (with any of the previous argument usage)
-.
-.fi
-.
-.SH "DESCRIPTION"
-Package linking is a two\-step process\.
-.
-.P
-First, \fBnpm link\fR in a package folder will create a globally\-installed
-symbolic link from \fBprefix/package\-name\fR to the current folder\.
-.
-.P
-Next, in some other location, \fBnpm link package\-name\fR will create a
-symlink from the local \fBnode_modules\fR folder to the global symlink\.
-.
-.P
-Note that \fBpackage\-name\fR is taken from \fBpackage\.json\fR,
-not from directory name\.
-.
-.P
-When creating tarballs for \fBnpm publish\fR, the linked packages are
-"snapshotted" to their current state by resolving the symbolic links\.
-.
-.P
-This is
-handy for installing your own stuff, so that you can work on it and test it
-iteratively without having to continually rebuild\.
-.
-.P
-For example:
-.
-.IP "" 4
-.
-.nf
-cd ~/projects/node\-redis    # go into the package directory
-npm link                    # creates global link
-cd ~/projects/node\-bloggy   # go into some other package directory\.
-npm link redis              # link\-install the package
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Now, any changes to ~/projects/node\-redis will be reflected in
-~/projects/node\-bloggy/node_modules/redis/
-.
-.P
-You may also shortcut the two steps in one\.  For example, to do the
-above use\-case in a shorter way:
-.
-.IP "" 4
-.
-.nf
-cd ~/projects/node\-bloggy  # go into the dir of your main project
-npm link \.\./node\-redis     # link the dir of your dependency
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The second line is the equivalent of doing:
-.
-.IP "" 4
-.
-.nf
-(cd \.\./node\-redis; npm link)
-npm link redis
-.
-.fi
-.
-.IP "" 0
-.
-.P
-That is, it first creates a global link, and then links the global
-installation target into your project\'s \fBnode_modules\fR folder\.
-.
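The two-step linking described above amounts to two symlinks, which can be demonstrated with the standard library alone. The directory names and the "global prefix" here are made up for the demo; real npm also links bin scripts and uses its configured prefix.

```python
# Sketch of npm link's two symlink steps (paths are illustrative only).
import os, tempfile

root = tempfile.mkdtemp()
pkg = os.path.join(root, "projects", "node-redis")            # package being developed
prefix = os.path.join(root, "prefix", "lib", "node_modules")  # "global" install area
app = os.path.join(root, "projects", "node-bloggy", "node_modules")
for d in (pkg, prefix, app):
    os.makedirs(d)

# Step 1: `npm link` in the package folder -> global symlink
os.symlink(pkg, os.path.join(prefix, "redis"))
# Step 2: `npm link redis` in the app -> local symlink to the global one
os.symlink(os.path.join(prefix, "redis"), os.path.join(app, "redis"))

# Resolving the chain of links lands back at the development folder.
print(os.path.realpath(os.path.join(app, "redis")) == os.path.realpath(pkg))
```

Any edit made in the development folder is immediately visible through `node_modules/redis`, which is the whole point of the iterate-without-rebuilding workflow described above.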
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  developers
-.
-.IP "\(bu" 4
-npm help  faq
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-ls.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,136 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-LS" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-ls\fR \-\- List installed packages
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm list [<pkg> \.\.\.]
-npm ls [<pkg> \.\.\.]
-npm la [<pkg> \.\.\.]
-npm ll [<pkg> \.\.\.]
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command will print to stdout all the versions of packages that are
-installed, as well as their dependencies, in a tree\-structure\.
-.
-.P
-Positional arguments are \fBname@version\-range\fR identifiers, which will
-limit the results to only the paths to the packages named\.  Note that
-nested packages will \fIalso\fR show the paths to the specified packages\.
-For example, running \fBnpm ls promzard\fR in npm\'s source tree will show:
-.
-.IP "" 4
-.
-.nf
-npm@1.3.14 /path/to/npm
-└─┬ init\-package\-json@0\.0\.4
-  └── promzard@0\.1\.5
-.
-.fi
-.
-.IP "" 0
-.
-.P
-It will print out extraneous, missing, and invalid packages\.
-.
-.P
-If a project specifies git urls for dependencies these are shown
-in parentheses after the name@version to make it easier for users to
-recognize potential forks of a project\.
-.
-.P
-When run as \fBll\fR or \fBla\fR, it shows extended information by default\.
-.
-.SH "CONFIGURATION"
-.
-.SS "json"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Show information in JSON format\.
-.
-.SS "long"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Show extended information\.
-.
-.SS "parseable"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Show parseable output instead of tree view\.
-.
-.SS "global"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-List packages in the global install prefix instead of in the current
-project\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help link
-.
-.IP "\(bu" 4
-npm help prune
-.
-.IP "\(bu" 4
-npm help outdated
-.
-.IP "\(bu" 4
-npm help update
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-outdated.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,37 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-OUTDATED" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-outdated\fR \-\- Check for outdated packages
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm outdated [<name> [<name> \.\.\.]]
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command will check the registry to see if any (or, specific) installed
-packages are currently outdated\.
-.
-.P
-The resulting field \'wanted\' shows the latest version according to the
-version specified in the package\.json, the field \'latest\' the very latest
-version of the package\.
-.
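The 'wanted' vs 'latest' distinction above can be made concrete: 'wanted' is the newest published version that still satisfies the range in package.json, while 'latest' is simply the newest published version. This sketch uses a toy tilde-range matcher, not real semver.

```python
# Sketch of npm outdated's 'wanted' vs 'latest' fields (toy ~range only).

def parse(v):
    return tuple(int(p) for p in v.split("."))

def wanted_and_latest(rng, published):
    """rng is a '~x.y.z' range; published is a list of version strings."""
    base = parse(rng.lstrip("~"))
    ok = [v for v in published
          if parse(v)[:2] == base[:2] and parse(v) >= base]  # same major.minor
    latest = max(published, key=parse)
    return (max(ok, key=parse) if ok else None, latest)

# package.json asks for ~1.2.0; the registry has a newer minor release too
print(wanted_and_latest("~1.2.0", ["1.2.0", "1.2.5", "1.3.1"]))  # ('1.2.5', '1.3.1')
```

Here `npm outdated` would report `wanted: 1.2.5` (what `npm update` would install) and `latest: 1.3.1` (what you would get by bumping the range).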
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help update
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-owner.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,58 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-OWNER" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-owner\fR \-\- Manage package owners
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm owner ls <package name>
-npm owner add <user> <package name>
-npm owner rm <user> <package name>
-.
-.fi
-.
-.SH "DESCRIPTION"
-Manage ownership of published packages\.
-.
-.IP "\(bu" 4
-ls:
-List all the users who have access to modify a package and push new versions\.
-Handy when you need to know who to bug for help\.
-.
-.IP "\(bu" 4
-add:
-Add a new user as a maintainer of a package\.  This user is enabled to modify
-metadata, publish new versions, and add other owners\.
-.
-.IP "\(bu" 4
-rm:
-Remove a user from the package owner list\.  This immediately revokes their
-privileges\.
-.
-.IP "" 0
-.
-.P
-Note that there is only one level of access\.  Either you can modify a package,
-or you can\'t\.  Future versions may contain more fine\-grained access levels, but
-that is not implemented at this time\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help publish
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help adduser
-.
-.IP "\(bu" 4
-npm help  disputes
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-pack.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,48 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-PACK" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-pack\fR \-\- Create a tarball from a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm pack [<pkg> [<pkg> \.\.\.]]
-.
-.fi
-.
-.SH "DESCRIPTION"
-For anything that\'s installable (that is, a package folder, tarball,
-tarball url, name@tag, name@version, or name), this command will fetch
-it to the cache, and then copy the tarball to the current working
-directory as \fB<name>\-<version>\.tgz\fR, and then write the filenames out to
-stdout\.
-.
-.P
-If the same package is specified multiple times, then the file will be
-overwritten the second time\.
-.
-.P
-If no arguments are supplied, then npm packs the current package folder\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help cache
-.
-.IP "\(bu" 4
-npm help publish
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-prefix.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,40 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-PREFIX" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-prefix\fR \-\- Display prefix
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm prefix
-.
-.fi
-.
-.SH "DESCRIPTION"
-Print the prefix to standard out\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help root
-.
-.IP "\(bu" 4
-npm help bin
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-prune.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-PRUNE" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-prune\fR \-\- Remove extraneous packages
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm prune [<name> [<name> \.\.\.]]
-npm prune [<name> [<name> \.\.\.]] [\-\-production]
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command removes "extraneous" packages\.  If a package name is
-provided, then only packages matching one of the supplied names are
-removed\.
-.
-.P
-Extraneous packages are packages that are not listed on the parent
-package\'s dependencies list\.
-.
-.P
-If the \fB\-\-production\fR flag is specified, this command will remove the
-packages specified in your \fBdevDependencies\fR\|\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help rm
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help ls
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-publish.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-PUBLISH" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-publish\fR \-\- Publish a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm publish <tarball>
-npm publish <folder>
-.
-.fi
-.
-.SH "DESCRIPTION"
-Publishes a package to the registry so that it can be installed by name\.
-.
-.IP "\(bu" 4
-\fB<folder>\fR:
-A folder containing a package\.json file
-.
-.IP "\(bu" 4
-\fB<tarball>\fR:
-A url or file path to a gzipped tar archive containing a single folder
-with a package\.json file inside\.
-.
-.IP "" 0
-.
-.P
-Fails if the package name and version combination already exists in
-the registry\.  Overwrites when the "\-\-force" flag is set\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help adduser
-.
-.IP "\(bu" 4
-npm help owner
-.
-.IP "\(bu" 4
-npm help deprecate
-.
-.IP "\(bu" 4
-npm help tag
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-rebuild.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,37 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-REBUILD" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-rebuild\fR \-\- Rebuild a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm rebuild [<name> [<name> \.\.\.]]
-npm rb [<name> [<name> \.\.\.]]
-.
-.fi
-.
-.IP "\(bu" 4
-\fB<name>\fR:
-The package to rebuild
-.
-.IP "" 0
-.
-.SH "DESCRIPTION"
-This command runs the \fBnpm build\fR command on the matched folders\.  This is useful
-when you install a new version of node, and must recompile all your C++ addons with
-the new binary\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help build
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-restart.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-RESTART" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-restart\fR \-\- Restart a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm restart <name>
-.
-.fi
-.
-.SH "DESCRIPTION"
-This runs a package\'s "restart" script, if one was provided\.
-Otherwise it runs the package\'s "stop" script, if one was provided, and then
-the "start" script\.
-.
-.P
-If no version is specified, then it restarts the "active" version\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help run\-script
-.
-.IP "\(bu" 4
-npm help  scripts
-.
-.IP "\(bu" 4
-npm help test
-.
-.IP "\(bu" 4
-npm help start
-.
-.IP "\(bu" 4
-npm help stop
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-rm.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,44 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-RM" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-rm\fR \-\- Remove a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm rm <name>
-npm r <name>
-npm uninstall <name>
-npm un <name>
-.
-.fi
-.
-.SH "DESCRIPTION"
-This uninstalls a package, completely removing everything npm installed
-on its behalf\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help prune
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-root.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,40 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-ROOT" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-root\fR \-\- Display npm root
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm root
-.
-.fi
-.
-.SH "DESCRIPTION"
-Print the effective \fBnode_modules\fR folder to standard out\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help prefix
-.
-.IP "\(bu" 4
-npm help bin
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-run-script.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,41 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-RUN\-SCRIPT" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-run-script\fR \-\- Run arbitrary package scripts
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm run\-script <script> <name>
-.
-.fi
-.
-.SH "DESCRIPTION"
-This runs an arbitrary command from a package\'s "scripts" object\.
-.
-.P
-It is used by the test, start, restart, and stop commands, but can be
-called directly, as well\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  scripts
-.
-.IP "\(bu" 4
-npm help test
-.
-.IP "\(bu" 4
-npm help start
-.
-.IP "\(bu" 4
-npm help restart
-.
-.IP "\(bu" 4
-npm help stop
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-search.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,44 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-SEARCH" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-search\fR \-\- Search for packages
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm search [search terms \.\.\.]
-npm s [search terms \.\.\.]
-npm se [search terms \.\.\.]
-.
-.fi
-.
-.SH "DESCRIPTION"
-Search the registry for packages matching the search terms\.
-.
-.P
-If a term starts with \fB/\fR, then it\'s interpreted as a regular expression\.
-A trailing \fB/\fR will be ignored in this case\.  (Note that many regular
-expression characters must be escaped or quoted in most shells\.)
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help view
-.
-.IP "" 0
-
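The regex-versus-plain-term rule in the npm-search page above is easy to mis-read, so here is a minimal Python sketch of the matching behavior exactly as the page describes it (a leading `/` makes the term a regex, a trailing `/` is ignored). This is an illustration only, not npm's actual search implementation.

```python
import re

def matches(term: str, text: str) -> bool:
    """Match one search term the way npm-search(1) describes it:
    a term starting with '/' is a regular expression (any trailing
    '/' is ignored); anything else is a plain substring match."""
    if term.startswith("/"):
        pattern = term[1:].rstrip("/")  # strip leading and trailing '/'
        return re.search(pattern, text) is not None
    return term in text

print(matches("express", "express web framework"))  # plain substring hit
print(matches("/^conn/", "connect middleware"))     # regex hit
print(matches("/^conn/", "disconnect helper"))      # regex miss: no anchor match
```

Note that, as the page warns, a shell would eat most of the regex characters unless the term is quoted (e.g. `npm search '/^conn/'`).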
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-shrinkwrap.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,275 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-SHRINKWRAP" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-shrinkwrap\fR \-\- Lock down dependency versions
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm shrinkwrap
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command locks down the versions of a package\'s dependencies so
-that you can control exactly which versions of each dependency will be
-used when your package is installed\. The "package\.json" file is still
-required if you want to use "npm install"\.
-.
-.P
-By default, "npm install" recursively installs the target\'s
-dependencies (as specified in package\.json), choosing the latest
-available version that satisfies the dependency\'s semver pattern\. In
-some situations, particularly when shipping software where each change
-is tightly managed, it\'s desirable to fully specify each version of
-each dependency recursively so that subsequent builds and deploys do
-not inadvertently pick up newer versions of a dependency that satisfy
-the semver pattern\. Specifying specific semver patterns in each
-dependency\'s package\.json would facilitate this, but that\'s not always
-possible or desirable, as when another author owns the npm package\.
-It\'s also possible to check dependencies directly into source control,
-but that may be undesirable for other reasons\.
-.
-.P
-As an example, consider package A:
-.
-.IP "" 4
-.
-.nf
-{
-  "name": "A",
-  "version": "0\.1\.0",
-  "dependencies": {
-    "B": "<0\.1\.0"
-  }
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-package B:
-.
-.IP "" 4
-.
-.nf
-{
-  "name": "B",
-  "version": "0\.0\.1",
-  "dependencies": {
-    "C": "<0\.1\.0"
-  }
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-and package C:
-.
-.IP "" 4
-.
-.nf
-{
-  "name": "C",
-  "version": "0\.0\.1"
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-If these are the only versions of A, B, and C available in the
-registry, then a normal "npm install A" will install:
-.
-.IP "" 4
-.
-.nf
-A@0\.1\.0
-`\-\- B@0\.0\.1
-    `\-\- C@0\.0\.1
-.
-.fi
-.
-.IP "" 0
-.
-.P
-However, if B@0\.0\.2 is published, then a fresh "npm install A" will
-install:
-.
-.IP "" 4
-.
-.nf
-A@0\.1\.0
-`\-\- B@0\.0\.2
-    `\-\- C@0\.0\.1
-.
-.fi
-.
-.IP "" 0
-.
-.P
-assuming the new version did not modify B\'s dependencies\. Of course,
-the new version of B could include a new version of C and any number
-of new dependencies\. If such changes are undesirable, the author of A
-could specify a dependency on B@0\.0\.1\. However, if A\'s author and B\'s
-author are not the same person, there\'s no way for A\'s author to say
-that he or she does not want to pull in newly published versions of C
-when B hasn\'t changed at all\.
-.
-.P
-In this case, A\'s author can run
-.
-.IP "" 4
-.
-.nf
-npm shrinkwrap
-.
-.fi
-.
-.IP "" 0
-.
-.P
-This generates npm\-shrinkwrap\.json, which will look something like this:
-.
-.IP "" 4
-.
-.nf
-{
-  "name": "A",
-  "version": "0\.1\.0",
-  "dependencies": {
-    "B": {
-      "version": "0\.0\.1",
-      "dependencies": {
-        "C": {
-          "version": "0\.0\.1"
-        }
-      }
-    }
-  }
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The shrinkwrap command has locked down the dependencies based on
-what\'s currently installed in node_modules\.  When "npm install"
-installs a package with a npm\-shrinkwrap\.json file in the package
-root, the shrinkwrap file (rather than package\.json files) completely
-drives the installation of that package and all of its dependencies
-(recursively)\.  So now the author publishes A@0\.1\.0, and subsequent
-installs of this package will use B@0\.0\.1 and C@0\.0\.1, regardless of the
-dependencies and versions listed in A\'s, B\'s, and C\'s package\.json
-files\.
-.
-.SS "Using shrinkwrapped packages"
-Using a shrinkwrapped package is no different than using any other
-package: you can "npm install" it by hand, or add a dependency to your
-package\.json file and "npm install" it\.
-.
-.SS "Building shrinkwrapped packages"
-To shrinkwrap an existing package:
-.
-.IP "1" 4
-Run "npm install" in the package root to install the current
-versions of all dependencies\.
-.
-.IP "2" 4
-Validate that the package works as expected with these versions\.
-.
-.IP "3" 4
-Run "npm shrinkwrap", add npm\-shrinkwrap\.json to git, and publish
-your package\.
-.
-.IP "" 0
-.
-.P
-To add or update a dependency in a shrinkwrapped package:
-.
-.IP "1" 4
-Run "npm install" in the package root to install the current
-versions of all dependencies\.
-.
-.IP "2" 4
-Add or update dependencies\. "npm install" each new or updated
-package individually and then update package\.json\.  Note that they
-must be explicitly named in order to be installed: running \fBnpm
-install\fR with no arguments will merely reproduce the existing
-shrinkwrap\.
-.
-.IP "3" 4
-Validate that the package works as expected with the new
-dependencies\.
-.
-.IP "4" 4
-Run "npm shrinkwrap", commit the new npm\-shrinkwrap\.json, and
-publish your package\.
-.
-.IP "" 0
-.
-.P
-You can use \fBnpm\-outdated\fR to view dependencies with newer versions
-available\.
-.
-.SS "Other Notes"
-A shrinkwrap file must be consistent with the package\'s package\.json
-file\. "npm shrinkwrap" will fail if required dependencies are not
-already installed, since that would result in a shrinkwrap that
-wouldn\'t actually work\. Similarly, the command will fail if there are
-extraneous packages (not referenced by package\.json), since that would
-indicate that package\.json is not correct\.
-.
-.P
-Since "npm shrinkwrap" is intended to lock down your dependencies for
-production use, \fBdevDependencies\fR will not be included unless you
-explicitly set the \fB\-\-dev\fR flag when you run \fBnpm shrinkwrap\fR\|\.  If
-installed \fBdevDependencies\fR are excluded, then npm will print a
-warning\.  If you want them to be installed with your module by
-default, please consider adding them to \fBdependencies\fR instead\.
-.
-.P
-If shrinkwrapped package A depends on shrinkwrapped package B, B\'s
-shrinkwrap will not be used as part of the installation of A\. However,
-because A\'s shrinkwrap is constructed from a valid installation of B
-and recursively specifies all dependencies, the contents of B\'s
-shrinkwrap will implicitly be included in A\'s shrinkwrap\.
-.
-.SS "Caveats"
-Shrinkwrap files only lock down package versions, not actual package
-contents\.  While discouraged, a package author can republish an
-existing version of a package, causing shrinkwrapped packages using
-that version to pick up different code than they were before\. If you
-want to avoid any risk that a byzantine author replaces a package
-you\'re using with code that breaks your application, you could modify
-the shrinkwrap file to use git URL references rather than version
-numbers so that npm always fetches all packages from git\.
-.
-.P
-If you wish to lock down the specific bytes included in a package, for
-example to have 100% confidence in being able to reproduce a
-deployment or build, then you ought to check your dependencies into
-source control, or pursue some other mechanism that can verify
-contents rather than versions\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-npm help ls
-.
-.IP "" 0
-
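The A/B/C walkthrough in the npm-shrinkwrap page above boils down to one idea: record the exact version of every package currently installed, recursively. The following Python sketch shows only that data shape, using the man page's own example tree; it is not npm's implementation, and the `(version, subtree)` encoding of the installed tree is an assumption made for the illustration.

```python
import json

def lock(tree):
    """Pin an installed dependency tree the way npm-shrinkwrap(1)
    describes: record the exact version found on disk for every
    package, recursively. `tree` maps name -> (version, subtree)."""
    return {name: {"version": version,
                   **({"dependencies": lock(subtree)} if subtree else {})}
            for name, (version, subtree) in tree.items()}

# The installed tree from the man page: A@0.1.0 -> B@0.0.1 -> C@0.0.1
installed = {"A": ("0.1.0", {"B": ("0.0.1", {"C": ("0.0.1", {})})})}
print(json.dumps(lock(installed), indent=2))
```

A later `npm install` of A then reads these pinned versions instead of re-resolving semver ranges, which is why a newly published B@0.0.2 no longer changes the result.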
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-star.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,39 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-STAR" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-star\fR \-\- Mark your favorite packages
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm star <pkgname> [<pkg>, \.\.\.]
-npm unstar <pkgname> [<pkg>, \.\.\.]
-.
-.fi
-.
-.SH "DESCRIPTION"
-"Starring" a package means that you have some interest in it\.  It\'s
-a vaguely positive way to show that you care\.
-.
-.P
-"Unstarring" is the same thing, but in reverse\.
-.
-.P
-It\'s a boolean thing\.  Starring repeatedly has no additional effect\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help view
-.
-.IP "\(bu" 4
-npm help whoami
-.
-.IP "\(bu" 4
-npm help adduser
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-stars.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,40 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-STARS" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-stars\fR \-\- View packages marked as favorites
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm stars
-npm stars [username]
-.
-.fi
-.
-.SH "DESCRIPTION"
-If you have starred a lot of neat things and want to find them again
-quickly, this command lets you do just that\.
-.
-.P
-You may also want to see your friend\'s favorite packages, in this case
-you will most certainly enjoy this command\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help star
-.
-.IP "\(bu" 4
-npm help view
-.
-.IP "\(bu" 4
-npm help whoami
-.
-.IP "\(bu" 4
-npm help adduser
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-start.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,37 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-START" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-start\fR \-\- Start a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm start <name>
-.
-.fi
-.
-.SH "DESCRIPTION"
-This runs a package\'s "start" script, if one was provided\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help run\-script
-.
-.IP "\(bu" 4
-npm help  scripts
-.
-.IP "\(bu" 4
-npm help test
-.
-.IP "\(bu" 4
-npm help restart
-.
-.IP "\(bu" 4
-npm help stop
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-stop.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,37 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-STOP" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-stop\fR \-\- Stop a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm stop <name>
-.
-.fi
-.
-.SH "DESCRIPTION"
-This runs a package\'s "stop" script, if one was provided\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help run\-script
-.
-.IP "\(bu" 4
-npm help  scripts
-.
-.IP "\(bu" 4
-npm help test
-.
-.IP "\(bu" 4
-npm help start
-.
-.IP "\(bu" 4
-npm help restart
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-submodule.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-SUBMODULE" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-submodule\fR \-\- Add a package as a git submodule
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm submodule <pkg>
-.
-.fi
-.
-.SH "DESCRIPTION"
-If the specified package has a git repository url in its package\.json
-description, then this command will add it as a git submodule at \fBnode_modules/<pkg name>\fR\|\.
-.
-.P
-This is a convenience only\.  From then on, it\'s up to you to manage
-updates by using the appropriate git commands\.  npm will stubbornly
-refuse to update, modify, or remove anything with a \fB\|\.git\fR subfolder
-in it\.
-.
-.P
-This command also does not install missing dependencies, if the package
-does not include them in its git repository\.  If \fBnpm ls\fR reports that
-things are missing, you can either install, link, or submodule them yourself,
-or you can do \fBnpm explore <pkgname> \-\- npm install\fR to install the
-dependencies into the submodule folder\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-git help submodule
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-tag.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,74 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-TAG" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-tag\fR \-\- Tag a published version
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm tag <name>@<version> [<tag>]
-.
-.fi
-.
-.SH "DESCRIPTION"
-Tags the specified version of the package with the specified tag, or the \fB\-\-tag\fR config if not specified\.
-.
-.P
-A tag can be used when installing packages as a reference to a version instead
-of using a specific version number:
-.
-.IP "" 4
-.
-.nf
-npm install <name>@<tag>
-.
-.fi
-.
-.IP "" 0
-.
-.P
-When installing dependencies, a preferred tagged version may be specified:
-.
-.IP "" 4
-.
-.nf
-npm install \-\-tag <tag>
-.
-.fi
-.
-.IP "" 0
-.
-.P
-This also applies to \fBnpm dedupe\fR\|\.
-.
-.P
-Publishing a package always sets the "latest" tag to the published version\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help publish
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help dedupe
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-test.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-TEST" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-test\fR \-\- Test a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm test <name>
-npm tst <name>
-.
-.fi
-.
-.SH "DESCRIPTION"
-This runs a package\'s "test" script, if one was provided\.
-.
-.P
-To run tests as a condition of installation, set the \fBnpat\fR config to
-true\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help run\-script
-.
-.IP "\(bu" 4
-npm help  scripts
-.
-.IP "\(bu" 4
-npm help start
-.
-.IP "\(bu" 4
-npm help restart
-.
-.IP "\(bu" 4
-npm help stop
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-uninstall.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-RM" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-rm\fR \-\- Remove a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm rm <name>
-npm uninstall <name>
-.
-.fi
-.
-.SH "DESCRIPTION"
-This uninstalls a package, completely removing everything npm installed
-on its behalf\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help prune
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-unpublish.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-UNPUBLISH" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-unpublish\fR \-\- Remove a package from the registry
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm unpublish <name>[@<version>]
-.
-.fi
-.
-.SH "WARNING"
-\fBIt is generally considered bad behavior to remove versions of a library
-that others are depending on!\fR
-.
-.P
-Consider using the \fBdeprecate\fR command
-instead, if your intent is to encourage users to upgrade\.
-.
-.P
-There is plenty of room on the registry\.
-.
-.SH "DESCRIPTION"
-This removes a package version from the registry, deleting its
-entry and removing the tarball\.
-.
-.P
-If no version is specified, or if all versions are removed, then
-the root package entry is removed from the registry entirely\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help deprecate
-.
-.IP "\(bu" 4
-npm help publish
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help adduser
-.
-.IP "\(bu" 4
-npm help owner
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-update.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,45 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-UPDATE" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-update\fR \-\- Update a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm update [\-g] [<name> [<name> \.\.\.]]
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command will update all the packages listed to the latest version
-(specified by the \fBtag\fR config)\.
-.
-.P
-It will also install missing packages\.
-.
-.P
-If the \fB\-g\fR flag is specified, this command will update globally installed packages\.
-If no package name is specified, all packages in the specified location (global or local) will be updated\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help outdated
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help ls
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-version.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,75 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-VERSION" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-version\fR \-\- Bump a package version
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm version [<newversion> | major | minor | patch | build]
-.
-.fi
-.
-.SH "DESCRIPTION"
-Run this in a package directory to bump the version and write the new
-data back to the package\.json file\.
-.
-.P
-The \fBnewversion\fR argument should be a valid semver string, \fIor\fR a valid
-second argument to semver\.inc (one of "build", "patch", "minor", or
-"major")\. In the second case, the existing version will be incremented
-by 1 in the specified field\.
-.
-.P
-If run in a git repo, it will also create a version commit and tag, and
-fail if the repo is not clean\.
-.
-.P
-If supplied with \fB\-\-message\fR (shorthand: \fB\-m\fR) config option, npm will
-use it as a commit message when creating a version commit\.  If the \fBmessage\fR config contains \fB%s\fR then that will be replaced with the
-resulting version number\.  For example:
-.
-.IP "" 4
-.
-.nf
-npm version patch \-m "Upgrade to %s for reasons"
-.
-.fi
-.
-.IP "" 0
-.
-.P
-If the \fBsign\-git\-tag\fR config is set, then the tag will be signed using
-the \fB\-s\fR flag to git\.  Note that you must have a default GPG key set up
-in your git config for this to work properly\.  For example:
-.
-.IP "" 4
-.
-.nf
-$ npm config set sign\-git\-tag true
-$ npm version patch
-You need a passphrase to unlock the secret key for
-user: "isaacs (http://blog\.izs\.me/) <i@izs\.me>"
-2048\-bit RSA key, ID 6C481CF6, created 2010\-08\-31
-Enter passphrase:
-.
-.fi
-.
-.IP "" 0
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help init
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-npm help  semver
-.
-.IP "" 0
-
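The semver\.inc behavior that npm-version relies on can be sketched for the three common release types. This is a hedged illustration only, not the real semver library, which also handles the "build" type and prerelease identifiers mentioned in the synopsis.

```python
def bump(version: str, release: str) -> str:
    """Increment the named field by one and zero out the fields
    below it, as semver.inc does for major/minor/patch."""
    major, minor, patch = (int(x) for x in version.split("."))
    if release == "major":
        return f"{major + 1}.0.0"
    if release == "minor":
        return f"{major}.{minor + 1}.0"
    if release == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown release type: {release}")

print(bump("1.3.14", "patch"))  # 1.3.15
print(bump("1.3.14", "minor"))  # 1.4.0
```

So `npm version patch` in a package at 1.3.14 writes 1.3.15 back to package\.json (and, in a clean git repo, commits and tags it).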
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-view.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,186 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-VIEW" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-view\fR \-\- View registry info
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm view <name>[@<version>] [<field>[\.<subfield>]\.\.\.]
-npm v <name>[@<version>] [<field>[\.<subfield>]\.\.\.]
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command shows data about a package and prints it to the stream
-referenced by the \fBoutfd\fR config, which defaults to stdout\.
-.
-.P
-To show the package registry entry for the \fBconnect\fR package, you can do
-this:
-.
-.IP "" 4
-.
-.nf
-npm view connect
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The default version is "latest" if unspecified\.
-.
-.P
-Field names can be specified after the package descriptor\.
-For example, to show the dependencies of the \fBronn\fR package at version
-0\.3\.5, you could do the following:
-.
-.IP "" 4
-.
-.nf
-npm view ronn@0\.3\.5 dependencies
-.
-.fi
-.
-.IP "" 0
-.
-.P
-You can view child fields by separating them with a period\.
-To view the git repository URL for the latest version of npm, you could
-do this:
-.
-.IP "" 4
-.
-.nf
-npm view npm repository\.url
-.
-.fi
-.
-.IP "" 0
-.
-.P
-This makes it easy to view information about a dependency with a bit of
-shell scripting\.  For example, to view all the data about the version of
-opts that ronn depends on, you can do this:
-.
-.IP "" 4
-.
-.nf
-npm view opts@$(npm view ronn dependencies\.opts)
-.
-.fi
-.
-.IP "" 0
-.
-.P
-For fields that are arrays, requesting a non\-numeric field will return
-all of the values from the objects in the list\.  For example, to get all
-the contributor email addresses for the "express" project, you can do this:
-.
-.IP "" 4
-.
-.nf
-npm view express contributors\.email
-.
-.fi
-.
-.IP "" 0
-.
-.P
-You may also use numeric indices in square braces to specifically select
-an item in an array field\.  To just get the email address of the first
-contributor in the list, you can do this:
-.
-.IP "" 4
-.
-.nf
-npm view express contributors[0]\.email
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Multiple fields may be specified, and will be printed one after another\.
-For example, to get all the contributor names and email addresses, you
-can do this:
-.
-.IP "" 4
-.
-.nf
-npm view express contributors\.name contributors\.email
-.
-.fi
-.
-.IP "" 0
-.
-.P
-"Person" fields are shown as a string rather than as an
-object\.  So, for example, this will show the list of npm contributors in
-the shortened string format\.  (See \fBpackage\.json\fR for more on this\.)
-.
-.IP "" 4
-.
-.nf
-npm view npm contributors
-.
-.fi
-.
-.IP "" 0
-.
-.P
-If a version range is provided, then data will be printed for every
-matching version of the package\.  This will show which version of jsdom
-was required by each matching version of yui3:
-.
-.IP "" 4
-.
-.nf
-npm view yui3@\'>0\.5\.4\' dependencies\.jsdom
-.
-.fi
-.
-.IP "" 0
-.
-.SH "OUTPUT"
-If only a single string field for a single version is output, then it
-will not be colorized or quoted, so as to enable piping the output to
-another command\. If the field is an object, it will be output as a JavaScript object literal\.
-.
-.P
-If the \-\-json flag is given, the output will be JSON\.
-.
-.P
-If the version range matches multiple versions, then each printed value
-will be prefixed with the version it applies to\.
-.
-.P
-If multiple fields are requested, then each of them is prefixed with
-the field name\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help search
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help docs
-.
-.IP "" 0
-
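The field-selection rules in the npm-view page above (dotted subfields, `[n]` indices, fan-out of a non-numeric field across every object in an array) can be sketched compactly. The package metadata and email addresses below are made-up placeholders, and this is an illustration of the selection semantics rather than npm-view's actual parser.

```python
import re

def pick(data, field):
    """Select a field the way npm-view(1) describes: split on '.',
    allow a [n] index into arrays, and fan a plain field name out
    across every object in a list (e.g. contributors.email)."""
    for part in field.split("."):
        m = re.fullmatch(r"(\w+)\[(\d+)\]", part)
        if m:
            data = data[m.group(1)][int(m.group(2))]
        elif isinstance(data, list):
            data = [item[part] for item in data]
        else:
            data = data[part]
    return data

pkg = {"repository": {"url": "git://github.com/senchalabs/connect.git"},
       "contributors": [{"name": "TJ", "email": "tj@example.invalid"},
                        {"name": "Tim", "email": "tim@example.invalid"}]}
print(pick(pkg, "repository.url"))
print(pick(pkg, "contributors.email"))     # fans out over the array
print(pick(pkg, "contributors[0].email"))  # indexed access
```

These three calls correspond to `npm view npm repository.url`, `npm view express contributors.email`, and `npm view express contributors[0].email` from the page.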
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm-whoami.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,34 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-WHOAMI" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-whoami\fR \-\- Display npm username
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm whoami
-.
-.fi
-.
-.SH "DESCRIPTION"
-Print the \fBusername\fR config to standard output\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help adduser
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/npm.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,233 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm\fR \-\- node package manager
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm <command> [args]
-.
-.fi
-.
-.SH "VERSION"
-1.3.14
-.
-.SH "DESCRIPTION"
-npm is the package manager for the Node JavaScript platform\.  It puts
-modules in place so that node can find them, and manages dependency
-conflicts intelligently\.
-.
-.P
-It is extremely configurable to support a wide variety of use cases\.
-Most commonly, it is used to publish, discover, install, and develop node
-programs\.
-.
-.P
-Run \fBnpm help\fR to get a list of available commands\.
-.
-.SH "INTRODUCTION"
-You probably got npm because you want to install stuff\.
-.
-.P
-Use \fBnpm install blerg\fR to install the latest version of "blerg"\.  Check out \fBnpm\-install\fR for more info\.  It can do a lot of stuff\.
-.
-.P
-Use the \fBnpm search\fR command to show everything that\'s available\.
-Use \fBnpm ls\fR to show everything you\'ve installed\.
-.
-.SH "DIRECTORIES"
-See \fBnpm\-folders\fR to learn about where npm puts stuff\.
-.
-.P
-In particular, npm has two modes of operation:
-.
-.IP "\(bu" 4
-global mode:
-.
-.br
-npm installs packages into the install prefix at \fBprefix/lib/node_modules\fR and bins are installed in \fBprefix/bin\fR\|\.
-.
-.IP "\(bu" 4
-local mode:
-.
-.br
-npm installs packages into the current project directory, which
-defaults to the current working directory\.  Packages are installed to \fB\|\./node_modules\fR, and bins are installed to \fB\|\./node_modules/\.bin\fR\|\.
-.
-.IP "" 0
-.
-.P
-Local mode is the default\.  Use \fB\-\-global\fR or \fB\-g\fR on any command to
-operate in global mode instead\.
-.
-.SH "DEVELOPER USAGE"
-If you\'re using npm to develop and publish your code, check out the
-following help topics:
-.
-.IP "\(bu" 4
-json:
-Make a package\.json file\.  See \fBpackage\.json\fR\|\.
-.
-.IP "\(bu" 4
-link:
-For linking your current working code into Node\'s path, so that you
-don\'t have to reinstall every time you make a change\.  Use \fBnpm link\fR to do this\.
-.
-.IP "\(bu" 4
-install:
-It\'s a good idea to install things if you don\'t need the symbolic link\.
-In particular, installing other people\'s code from the registry is done via \fBnpm install\fR\|\.
-.
-.IP "\(bu" 4
-adduser:
-Create an account or log in\.  Credentials are stored in the
-user config file\.
-.
-.IP "\(bu" 4
-publish:
-Use the \fBnpm publish\fR command to upload your code to the registry\.
-.
-.IP "" 0
-.
-.SH "CONFIGURATION"
-npm is extremely configurable\.  It reads its configuration options from
-5 places\.
-.
-.IP "\(bu" 4
-Command line switches:
-.
-.br
-Set a config with \fB\-\-key val\fR\|\.  All keys take a value, even if they
-are booleans (the config parser doesn\'t know what the options are at
-the time of parsing\.)  If no value is provided, then the option is set
-to boolean \fBtrue\fR\|\.
-.
-.IP "\(bu" 4
-Environment Variables:
-.
-.br
-Set any config by prefixing the name in an environment variable with \fBnpm_config_\fR\|\.  For example, \fBexport npm_config_key=val\fR\|\.
-.
-.IP "\(bu" 4
-User Configs:
-.
-.br
-The file at $HOME/\.npmrc is an ini\-formatted list of configs\.  If
-present, it is parsed\.  If the \fBuserconfig\fR option is set in the cli
-or env, then that will be used instead\.
-.
-.IP "\(bu" 4
-Global Configs:
-.
-.br
-The file found at \.\./etc/npmrc (from the node executable, by default
-this resolves to /usr/local/etc/npmrc) will be parsed if it is found\.
-If the \fBglobalconfig\fR option is set in the cli, env, or user config,
-then that file is parsed instead\.
-.
-.IP "\(bu" 4
-Defaults:
-.
-.br
-npm\'s default configuration options are defined in
-lib/utils/config\-defs\.js\.  These must not be changed\.
-.
-.IP "" 0
-.
-.P
-See \fBnpm\-config\fR for much, much more information\.
-.
-.SH "CONTRIBUTIONS"
-Patches welcome!
-.
-.IP "\(bu" 4
-code:
-Read through \fBnpm\-coding\-style\fR if you plan to submit code\.
-You don\'t have to agree with it, but you do have to follow it\.
-.
-.IP "\(bu" 4
-docs:
-If you find an error in the documentation, edit the appropriate markdown
-file in the "doc" folder\.  (Don\'t worry about generating the man page\.)
-.
-.IP "" 0
-.
-.P
-Contributors are listed in npm\'s \fBpackage\.json\fR file\.  You can view them
-easily by doing \fBnpm view npm contributors\fR\|\.
-.
-.P
-If you would like to contribute, but don\'t know what to work on, check
-the issues list or ask on the mailing list\.
-.
-.IP "\(bu" 4
-\fIhttp://github\.com/isaacs/npm/issues\fR
-.
-.IP "\(bu" 4
-\fInpm\-@googlegroups\.com\fR
-.
-.IP "" 0
-.
-.SH "BUGS"
-When you find issues, please report them:
-.
-.IP "\(bu" 4
-web: \fIhttp://github\.com/isaacs/npm/issues\fR
-.
-.IP "\(bu" 4
-email: \fInpm\-@googlegroups\.com\fR
-.
-.IP "" 0
-.
-.P
-Be sure to include \fIall\fR of the output from the npm command that didn\'t work
-as expected\.  The \fBnpm\-debug\.log\fR file is also helpful to provide\.
-.
-.P
-You can also look for isaacs in #node\.js on irc://irc\.freenode\.net\.  He
-will no doubt tell you to put the output in a gist or email\.
-.
-.SH "HISTORY"
-See \fBnpm\-changelog\fR\|\.
-.
-.SH "AUTHOR"
-Isaac Z\. Schlueter \fIhttp://blog\.izs\.me/\fR :: isaacs \fIhttps://github\.com/isaacs/\fR :: @izs \fIhttp://twitter\.com/izs\fR :: \fIi@izs\.me\fR
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help help
-.
-.IP "\(bu" 4
-npm help  faq
-.
-.IP "\(bu" 4
-README
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help  index
-.
-.IP "\(bu" 4
-npm apihelp npm
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man1/repo.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,45 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-REPO" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-repo\fR \-\- Open package repository page in the browser
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm repo <pkgname>
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command tries to guess at the likely location of a package\'s
-repository URL, and then tries to open it using the \fB\-\-browser\fR
-config param\.
-.
-.SH "CONFIGURATION"
-.
-.SS "browser"
-.
-.IP "\(bu" 4
-Default: OS X: \fB"open"\fR, Windows: \fB"start"\fR, Others: \fB"xdg\-open"\fR
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-The browser that is called by the \fBnpm repo\fR command to open websites\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help docs
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-bin.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,21 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-BIN" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-bin\fR \-\- Display npm bin folder
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.bin(args, cb)
-.
-.fi
-.
-.SH "DESCRIPTION"
-Print the folder where npm will install executables\.
-.
-.P
-This function should not be used programmatically\.  Instead, just refer
-to the \fBnpm\.bin\fR member\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-bugs.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-BUGS" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-bugs\fR \-\- Bugs for a package in a web browser maybe
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.bugs(package, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command tries to guess at the likely location of a package\'s
-bug tracker URL, and then tries to open it using the \fB\-\-browser\fR
-config param\.
-.
-.P
-Like other commands, the first parameter is an array\. This command only
-uses the first element, which is expected to be a package name with an
-optional version number\.
-.
-.P
-This command will launch a browser, so this command may not be the most
-friendly for programmatic use\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-commands.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-COMMANDS" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-commands\fR \-\- npm commands
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands[<command>](args, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-npm comes with a full set of commands, and each of the commands takes a
-similar set of arguments\.
-.
-.P
-In general, all commands on the command object take an \fBarray\fR of positional
-argument \fBstrings\fR\|\. The last argument to any function is a callback\. Some
-commands are special and take other optional arguments\.
-.
-.P
-All commands have their own man page\. See \fBman npm\-<command>\fR for command\-line
-usage, or \fBman 3 npm\-<command>\fR for programmatic usage\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  index
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-config.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,69 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-CONFIG" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-config\fR \-\- Manage the npm configuration files
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.config(args, callback)
-var val = npm\.config\.get(key)
-npm\.config\.set(key, val)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This function acts much the same way as the command\-line version\.  The first
-element in the array tells config what to do\. Possible values are:
-.
-.IP "\(bu" 4
-\fBset\fR
-.
-.IP
-Sets a config parameter\.  The second element in \fBargs\fR is interpreted as the
-key, and the third element is interpreted as the value\.
-.
-.IP "\(bu" 4
-\fBget\fR
-.
-.IP
-Gets the value of a config parameter\. The second element in \fBargs\fR is the
-key to get the value of\.
-.
-.IP "\(bu" 4
-\fBdelete\fR (\fBrm\fR or \fBdel\fR)
-.
-.IP
-Deletes a parameter from the config\. The second element in \fBargs\fR is the
-key to delete\.
-.
-.IP "\(bu" 4
-\fBlist\fR (\fBls\fR)
-.
-.IP
-Show all configs that aren\'t secret\. No parameters necessary\.
-.
-.IP "\(bu" 4
-\fBedit\fR:
-.
-.IP
-Opens the config file in the default editor\. This command isn\'t very useful
-programmatically, but it is made available\.
-.
-.IP "" 0
-.
-.P
-To programmatically access npm configuration settings, or set them for
-the duration of a program, use the \fBnpm\.config\.set\fR and \fBnpm\.config\.get\fR
-functions instead\.
-.
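The action dispatch described above can be sketched with a small in-memory stand-in (an illustration only — the real command reads and writes the npmrc files, and `configCommand` is a hypothetical name):

```javascript
// Minimal stand-in for the dispatch described above: the first element
// of `args` selects the action, later elements are the key and value.
// Illustration only -- npm's real command persists to the npmrc files.
function configCommand(store, args, callback) {
  switch (args[0]) {
    case "set":
      store[args[1]] = args[2];
      return callback(null);
    case "get":
      return callback(null, store[args[1]]);
    case "delete":
    case "rm":
    case "del":
      delete store[args[1]];
      return callback(null);
    case "list":
    case "ls":
      return callback(null, Object.keys(store));
    default:
      return callback(new Error("unknown config action: " + args[0]));
  }
}
```

The same get/set shapes are what `npm.config.get` and `npm.config.set` expose directly.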
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm apihelp npm
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-deprecate.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,57 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-DEPRECATE" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-deprecate\fR \-\- Deprecate a version of a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.deprecate(args, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command will update the npm registry entry for a package, providing
-a deprecation warning to all who attempt to install it\.
-.
-.P
-The \'args\' parameter must have exactly two elements:
-.
-.IP "\(bu" 4
-\fBpackage[@version]\fR
-.
-.IP
-The \fBversion\fR portion is optional, and may be either a range, or a
-specific version, or a tag\.
-.
-.IP "\(bu" 4
-\fBmessage\fR
-.
-.IP
-The warning message that will be printed whenever a user attempts to
-install the package\.
-.
-.IP "" 0
-.
-.P
-Note that you must be the package owner to deprecate something\.  See the \fBowner\fR and \fBadduser\fR help topics\.
-.
-.P
-To un\-deprecate a package, specify an empty string (\fB""\fR) for the \fBmessage\fR argument\.
-.
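The two-element \'args\' shape can be illustrated with a small parser for the \fBpackage[@version]\fR spec (a hypothetical helper, not part of npm):

```javascript
// Hypothetical helper illustrating the two-element `args` shape:
// args[0] is "package" or "package@range", args[1] is the message.
// An empty message un-deprecates, as described above.
function parseDeprecateArgs(args) {
  if (args.length !== 2) {
    throw new Error("deprecate expects exactly two elements");
  }
  var at = args[0].indexOf("@");
  return {
    name: at === -1 ? args[0] : args[0].slice(0, at),
    range: at === -1 ? "*" : args[0].slice(at + 1), // assumed default range
    message: args[1],
    undeprecate: args[1] === ""
  };
}
```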
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm apihelp publish
-.
-.IP "\(bu" 4
-npm apihelp unpublish
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-docs.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-DOCS" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-docs\fR \-\- Docs for a package in a web browser maybe
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.docs(package, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command tries to guess at the likely location of a package\'s
-documentation URL, and then tries to open it using the \fB\-\-browser\fR
-config param\.
-.
-.P
-Like other commands, the first parameter is an array\. This command only
-uses the first element, which is expected to be a package name with an
-optional version number\.
-.
-.P
-This command will launch a browser, so this command may not be the most
-friendly for programmatic use\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-edit.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-EDIT" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-edit\fR \-\- Edit an installed package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.edit(package, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-Opens the package folder in the default editor (or whatever you\'ve
-configured as the npm \fBeditor\fR config \-\- see \fBnpm help config\fR\|\.)
-.
-.P
-After it has been edited, the package is rebuilt so as to pick up any
-changes in compiled packages\.
-.
-.P
-For instance, you can do \fBnpm install connect\fR to install connect
-into your package, and then \fBnpm\.commands\.edit(["connect"], callback)\fR
-to make a few changes to your locally installed copy\.
-.
-.P
-The first parameter is a string array with a single element, the package
-to open\. The package can optionally have a version number attached\.
-.
-.P
-Since this command opens an editor in a new process, be careful about where
-and how this is used\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-explore.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-EXPLORE" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-explore\fR \-\- Browse an installed package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.explore(args, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-Spawn a subshell in the directory of the installed package specified\.
-.
-.P
-If a command is specified, then it is run in the subshell, which then
-immediately terminates\.
-.
-.P
-Note that the package is \fInot\fR automatically rebuilt afterwards, so be
-sure to use \fBnpm rebuild <pkg>\fR if you make any changes\.
-.
-.P
-The first element in the \'args\' parameter must be a package name\.  After
-that is the optional command, which can be any number of strings\.  All of
-the strings will be combined into one, space\-delimited command\.
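The argument handling just described — a package name followed by any number of command strings joined with spaces — can be sketched as (hypothetical helper, not npm's code):

```javascript
// Sketch of the `args` handling described above: the first element is
// the package, the rest are joined into one space-delimited command.
function parseExploreArgs(args) {
  return {
    pkg: args[0],
    command: args.slice(1).join(" ") || null // null means interactive shell
  };
}
```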
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-help-search.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,51 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-HELP\-SEARCH" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-help-search\fR \-\- Search the help pages
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.helpSearch(args, [silent,] callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command is rarely useful, but it exists in the rare case that it is\.
-.
-.P
-This command takes an array of search terms and returns the help pages that
-match in order of best match\.
-.
-.P
-If there is only one match, then npm displays that help section\. If there
-are multiple results, the results are printed to the screen formatted and the
-array of results is returned\. Each result is an object with these properties:
-.
-.IP "\(bu" 4
-hits:
-A map of args to number of hits on that arg\. For example, {"npm": 3}
-.
-.IP "\(bu" 4
-found:
-Total number of unique args that matched\.
-.
-.IP "\(bu" 4
-totalHits:
-Total number of hits\.
-.
-.IP "\(bu" 4
-lines:
-An array of all matching lines (and some adjacent lines)\.
-.
-.IP "\(bu" 4
-file:
-Name of the file that matched
-.
-.IP "" 0
-.
-.P
-The silent parameter is currently not used, but it may be in the future\.
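The result shape listed above (hits, found, totalHits, lines, file) can be illustrated by counting term matches over a file's lines (a sketch, not npm's scoring code):

```javascript
// Sketch of the result object described above: count how many lines
// each search term appears in. Not npm's actual scoring code.
function scoreFile(file, lines, terms) {
  var hits = {};
  terms.forEach(function (term) {
    var n = lines.filter(function (line) {
      return line.toLowerCase().indexOf(term.toLowerCase()) !== -1;
    }).length;
    if (n > 0) hits[term] = n;
  });
  var total = Object.keys(hits).reduce(function (sum, k) {
    return sum + hits[k];
  }, 0);
  return {
    file: file,
    hits: hits,                       // e.g. {"npm": 3}
    found: Object.keys(hits).length,  // unique args that matched
    totalHits: total,
    lines: lines.filter(function (line) {
      return terms.some(function (t) {
        return line.toLowerCase().indexOf(t.toLowerCase()) !== -1;
      });
    })
  };
}
```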
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-init.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,39 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "INIT" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBinit\fR \-\- Interactively create a package\.json file
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.init(args, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This will ask you a bunch of questions, and then write a package\.json for you\.
-.
-.P
-It attempts to make reasonable guesses about what you want things to be set to,
-and then writes a package\.json file with the options you\'ve selected\.
-.
-.P
-If you already have a package\.json file, it\'ll read that first, and default to
-the options in there\.
-.
-.P
-It is strictly additive, so it does not delete options from your package\.json
-without a really good reason to do so\.
-.
-.P
-Since this function expects to be run on the command\-line, it doesn\'t work very
-well programmatically\. The best option is to roll your own, and since
-JavaScript makes it stupid simple to output formatted JSON, that is the
-preferred method\. If you\'re sure you want to handle command\-line prompting,
-then go ahead and use this programmatically\.
-.
-.SH "SEE ALSO"
-npm help  package\.json
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-install.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,29 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-INSTALL" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-install\fR \-\- install a package programmatically
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.install([where,] packages, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This acts much the same ways as installing on the command\-line\.
-.
-.P
-The \'where\' parameter is optional and only used internally, and it specifies
-where the packages should be installed to\.
-.
-.P
-The \'packages\' parameter is an array of strings\. Each element in the array is
-the name of a package to be installed\.
-.
-.P
-Finally, \'callback\' is a function that will be called when all packages have been
-installed or when an error has been encountered\.
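The optional leading \'where\' parameter follows a common Node argument-shifting pattern, sketched here (illustrative only; the default directory is an assumption of the sketch):

```javascript
// Illustrates the optional leading `where` argument described above:
// when the first parameter is the packages array, the arguments shift
// and the install directory falls back to a default.
function install(where, packages, callback) {
  if (Array.isArray(where)) {   // called as install(packages, callback)
    callback = packages;
    packages = where;
    where = process.cwd();      // assumed default for this sketch
  }
  return { where: where, packages: packages, callback: callback };
}
```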
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-link.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-LINK" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-link\fR \-\- Symlink a package folder
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.link(callback)
-npm\.commands\.link(packages, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-Package linking is a two\-step process\.
-.
-.P
-Without parameters, link will create a globally\-installed
-symbolic link from \fBprefix/package\-name\fR to the current folder\.
-.
-.P
-With parameters, link will create a symlink from the local \fBnode_modules\fR
-folder to the global symlink\.
-.
-.P
-When creating tarballs for \fBnpm publish\fR, the linked packages are
-"snapshotted" to their current state by resolving the symbolic links\.
-.
-.P
-This is
-handy for installing your own stuff, so that you can work on it and test it
-iteratively without having to continually rebuild\.
-.
-.P
-For example:
-.
-.IP "" 4
-.
-.nf
-npm\.commands\.link(cb)           # creates global link from the cwd
-                                # (say redis package)
-npm\.commands\.link(\'redis\', cb)  # link\-install the package
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Now, any changes to the redis package will be reflected in
-the package in the current working directory\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-load.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,44 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-LOAD" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-load\fR \-\- Load config settings
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.load(conf, cb)
-.
-.fi
-.
-.SH "DESCRIPTION"
-npm\.load() must be called before any other function call\.  Both parameters are
-optional, but the second is recommended\.
-.
-.P
-The first parameter is an object hash of command\-line config params, and the
-second parameter is a callback that will be called when npm is loaded and
-ready to serve\.
-.
-.P
-The first parameter should follow a similar structure as the package\.json
-config object\.
-.
-.P
-For example, to emulate the \-\-dev flag, pass an object that looks like this:
-.
-.IP "" 4
-.
-.nf
-{
-  "dev": true
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-For a list of all the available command\-line configs, see \fBnpm help config\fR
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-ls.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,86 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-LS" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-ls\fR \-\- List installed packages
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.ls(args, [silent,] callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command will print to stdout all the versions of packages that are
-installed, as well as their dependencies, in a tree\-structure\. It will also
-return that data using the callback\.
-.
-.P
-This command does not take any arguments, but args must be defined\.
-Beyond that, if any arguments are passed in, npm will politely warn that it
-does not take positional arguments, though you may set config flags
-like with any other command, such as \fBglobal\fR to list global packages\.
-.
-.P
-It will print out extraneous, missing, and invalid packages\.
-.
-.P
-If the silent parameter is set to true, nothing will be output to the screen,
-but the data will still be returned\.
-.
-.P
-Callback is provided an error if one occurred, the full data about which
-packages are installed and which dependencies they will receive, and a
-"lite" data object which just shows which versions are installed where\.
-Note that the full data object is a circular structure, so care must be
-taken if it is serialized to JSON\.
-.
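Because the full data object is circular, a plain \fBJSON\.stringify\fR will throw a TypeError; one safe approach is a replacer that drops back-references (a generic sketch, unrelated to npm internals):

```javascript
// JSON.stringify throws on circular structures, so use a replacer
// that replaces already-seen objects. Generic sketch, not npm code.
function safeStringify(obj) {
  var seen = [];
  return JSON.stringify(obj, function (key, value) {
    if (typeof value === "object" && value !== null) {
      if (seen.indexOf(value) !== -1) return "[Circular]";
      seen.push(value);
    }
    return value;
  });
}

// Usage: a tiny circular tree shaped like the ls data object.
var tree = { name: "root", dependencies: {} };
tree.dependencies.child = { name: "child", parent: tree };
console.log(safeStringify(tree));
```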
-.SH "CONFIGURATION"
-.
-.SS "long"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Show extended information\.
-.
-.SS "parseable"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Show parseable output instead of tree view\.
-.
-.SS "global"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-List packages in the global install prefix instead of in the current
-project\.
-.
-.P
-Note, if parseable is set or long isn\'t set, then duplicates will be trimmed\.
-This means that if a submodule has the same dependency as a parent module, then the
-dependency will only be output once\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-outdated.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,21 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-OUTDATED" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-outdated\fR \-\- Check for outdated packages
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.outdated([packages,] callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command will check the registry to see if the specified packages are
-currently outdated\.
-.
-.P
-If the \'packages\' parameter is left out, npm will check all packages\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-owner.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,52 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-OWNER" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-owner\fR \-\- Manage package owners
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.owner(args, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-The first element of the \'args\' parameter defines what to do, and the subsequent
-elements depend on the action\. Possible values for the action are (the order of
-parameters is given in parentheses):
-.
-.IP "\(bu" 4
-ls (package):
-List all the users who have access to modify a package and push new versions\.
-Handy when you need to know who to bug for help\.
-.
-.IP "\(bu" 4
-add (user, package):
-Add a new user as a maintainer of a package\.  This user is enabled to modify
-metadata, publish new versions, and add other owners\.
-.
-.IP "\(bu" 4
-rm (user, package):
-Remove a user from the package owner list\.  This immediately revokes their
-privileges\.
-.
-.IP "" 0
-.
-.P
-Note that there is only one level of access\.  Either you can modify a package,
-or you can\'t\.  Future versions may contain more fine\-grained access levels, but
-that is not implemented at this time\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm apihelp publish
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-pack.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-PACK" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-pack\fR \-\- Create a tarball from a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.pack([packages,] callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-For anything that\'s installable (that is, a package folder, tarball,
-tarball url, name@tag, name@version, or name), this command will fetch
-it to the cache, and then copy the tarball to the current working
-directory as \fB<name>\-<version>\.tgz\fR, and then write the filenames out to
-stdout\.
-.
-.P
-If the same package is specified multiple times, then the file will be
-overwritten the second time\.
-.
-.P
-If no arguments are supplied, then npm packs the current package folder\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-prefix.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-PREFIX" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-prefix\fR \-\- Display prefix
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.prefix(args, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-Print the prefix to standard out\.
-.
-.P
-\'args\' is never used and callback is never called with data\.
-\'args\' must be present or things will break\.
-.
-.P
-This function is not useful programmatically\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-prune.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-PRUNE" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-prune\fR \-\- Remove extraneous packages
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.prune([packages,] callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command removes "extraneous" packages\.
-.
-.P
-The first parameter is optional, and it specifies packages to be removed\.
-.
-.P
-If no packages are specified, then all packages will be checked\.
-.
-.P
-Extraneous packages are packages that are not listed on the parent
-package\'s dependencies list\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-publish.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,51 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-PUBLISH" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-publish\fR \-\- Publish a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.publish([packages,] callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-Publishes a package to the registry so that it can be installed by name\.
-Possible values in the \'packages\' array are:
-.
-.IP "\(bu" 4
-\fB<folder>\fR:
-A folder containing a package\.json file
-.
-.IP "\(bu" 4
-\fB<tarball>\fR:
-A url or file path to a gzipped tar archive containing a single folder
-with a package\.json file inside\.
-.
-.IP "" 0
-.
-.P
-If the package array is empty, npm will try to publish something in the
-current working directory\.
-.
-.P
-This command will fail if one of the packages specified already exists in
-the registry\.  It overwrites when the "force" config is set\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help adduser
-.
-.IP "\(bu" 4
-npm apihelp owner
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-rebuild.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-REBUILD" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-rebuild\fR \-\- Rebuild a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.rebuild([packages,] callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command runs the \fBnpm build\fR command on each of the matched packages\.  This is useful
-when you install a new version of node, and must recompile all your C++ addons with
-the new binary\. If no \'packages\' parameter is specified, every package will be rebuilt\.
-.
-.SH "CONFIGURATION"
-See \fBnpm help build\fR
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-restart.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,37 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-RESTART" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-restart\fR \-\- Restart a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.restart(packages, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This runs a package\'s "restart" script, if one was provided\.
-Otherwise it runs the package\'s "stop" script, if one was provided, and then
-the "start" script\.
-.
-.P
-If no version is specified, then it restarts the "active" version\.
-.
-.P
-npm can restart multiple packages\. Just specify multiple packages
-in the \fBpackages\fR parameter\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm apihelp start
-.
-.IP "\(bu" 4
-npm apihelp stop
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-root.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-ROOT" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-root\fR \-\- Display npm root
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.root(args, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-Print the effective \fBnode_modules\fR folder to standard out\.
-.
-.P
-\'args\' is never used and callback is never called with data\.
-\'args\' must be present or things will break\.
-.
-.P
-This function is not useful programmatically\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-run-script.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,48 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-RUN\-SCRIPT" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-run-script\fR \-\- Run arbitrary package scripts
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.run\-script(args, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This runs an arbitrary command from a package\'s "scripts" object\.
-.
-.P
-It is used by the test, start, restart, and stop commands, but can be
-called directly, as well\.
-.
-.P
-The \'args\' parameter is an array of strings\. Behavior depends on the number
-of elements\.  If there is only one element, npm assumes that the element
-represents a command to be run on the local repository\. If there is more than
-one element, then the first is assumed to be the package and the second is
-assumed to be the command to run\. All other elements are ignored\.
-.
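The one-element versus two-element behavior described above can be sketched as (hypothetical helper, not npm's code):

```javascript
// Sketch of the argument interpretation described above: one element
// means "run this script in the current package"; with two or more,
// the first is the package and the second is the script.
function parseRunArgs(args) {
  if (args.length === 1) return { pkg: ".", script: args[0] };
  return { pkg: args[0], script: args[1] }; // extra elements ignored
}
```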
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  scripts
-.
-.IP "\(bu" 4
-npm apihelp test
-.
-.IP "\(bu" 4
-npm apihelp start
-.
-.IP "\(bu" 4
-npm apihelp restart
-.
-.IP "\(bu" 4
-npm apihelp stop
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-search.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,64 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-SEARCH" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-search\fR \-\- Search for packages
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.search(searchTerms, [silent,] [staleness,] callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-Search the registry for packages matching the search terms\. The available parameters are:
-.
-.IP "\(bu" 4
-searchTerms:
-Array of search terms\. These terms are case\-insensitive\.
-.
-.IP "\(bu" 4
-silent:
-If true, npm will not log anything to the console\.
-.
-.IP "\(bu" 4
-staleness:
-This is the threshold for stale packages\. "Fresh" packages are not refreshed
-from the registry\. This value is measured in seconds\.
-.
-.IP "\(bu" 4
-callback:
-Returns an object where each key is the name of a package, and the value
-is information about that package along with a \'words\' property, which is
-a space\-delimited string of all of the interesting words in that package\.
-The only properties included are those that are searched, which generally include:
-.
-.IP "\(bu" 4
-name
-.
-.IP "\(bu" 4
-description
-.
-.IP "\(bu" 4
-maintainers
-.
-.IP "\(bu" 4
-url
-.
-.IP "\(bu" 4
-keywords
-.
-.IP "" 0
-
-.
-.IP "" 0
-.
-.P
-A search on the registry excludes any result that does not match all of the
-search terms\. It also removes any items from the results that contain an
-excluded term (the "searchexclude" config)\. The search is case insensitive
-and doesn\'t try to read your mind (it doesn\'t do any verb tense matching or the
-like)\.
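The matching rule above — every search term must appear, no excluded term may appear, case-insensitively — can be sketched against a package's \'words\' string (illustration only, not npm's search code):

```javascript
// Sketch of the matching rule described above: a package matches only
// if its "words" string contains every search term and none of the
// excluded terms, case-insensitively. Not npm's actual search code.
function matches(words, terms, excluded) {
  var w = words.toLowerCase();
  var hasAll = terms.every(function (t) {
    return w.indexOf(t.toLowerCase()) !== -1;
  });
  var hasExcluded = excluded.some(function (t) {
    return w.indexOf(t.toLowerCase()) !== -1;
  });
  return hasAll && !hasExcluded;
}
```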
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-shrinkwrap.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,30 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-SHRINKWRAP" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-shrinkwrap\fR \-\- programmatically generate package shrinkwrap file
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.shrinkwrap(args, [silent,] callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This acts much the same ways as shrinkwrapping on the command\-line\.
-.
-.P
-This command does not take any arguments, but \'args\' must be defined\.
-Beyond that, if any arguments are passed in, npm will politely warn that it
-does not take positional arguments\.
-.
-.P
-If the \'silent\' parameter is set to true, nothing will be output to the screen,
-but the shrinkwrap file will still be written\.
-.
-.P
-Finally, \'callback\' is a function that will be called when the shrinkwrap has
-been saved\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-start.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,21 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-START" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-start\fR \-\- Start a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.start(packages, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This runs a package\'s "start" script, if one was provided\.
-.
-.P
-npm can start multiple packages\. Just specify multiple packages
-in the \fBpackages\fR parameter\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-stop.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,21 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-STOP" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-stop\fR \-\- Stop a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.stop(packages, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This runs a package\'s "stop" script, if one was provided\.
-.
-.P
-npm can run stop on multiple packages\. Just specify multiple packages
-in the \fBpackages\fR parameter\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-submodule.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-SUBMODULE" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-submodule\fR \-\- Add a package as a git submodule
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.submodule(packages, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-For each package specified, npm will check if it has a git repository url
-in its package\.json description then add it as a git submodule at \fBnode_modules/<pkg name>\fR\|\.
-.
-.P
-This is a convenience only\.  From then on, it\'s up to you to manage
-updates by using the appropriate git commands\.  npm will stubbornly
-refuse to update, modify, or remove anything with a \fB\|\.git\fR subfolder
-in it\.
-.
-.P
-This command also does not install missing dependencies, if the package
-does not include them in its git repository\.  If \fBnpm ls\fR reports that
-things are missing, you can either install, link, or submodule them yourself,
-or you can do \fBnpm explore <pkgname> \-\- npm install\fR to install the
-dependencies into the submodule folder\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help json
-.
-.IP "\(bu" 4
-git help submodule
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-tag.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,31 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-TAG" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-tag\fR \-\- Tag a published version
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.tag(package@version, tag, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-Tags the specified version of the package with the specified tag, or the \fB\-\-tag\fR config if not specified\.
-.
-.P
-The \'package@version\' is an array of strings, but only the first two elements are
-currently used\.
-.
-.P
-The first element must be in the form package@version, where package
-is the package name and version is the version number (much like installing a
-specific version)\.
-.
-.P
-The second element is the name of the tag to tag this version with\. If this
-parameter is missing or falsey (empty), the default from the config will be
-used\. For more information about how to set this config, check \fBman 3 npm\-config\fR for programmatic usage or \fBman npm\-config\fR for cli usage\.
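The page above says the first element of the array must have the form `package@version`. As an illustrative sketch only (the helper name `splitPackageArg` is hypothetical, not part of npm's API), that split can be done like so:

```javascript
// Hypothetical helper splitting a "package@version" argument, as the
// npm-tag man page describes.  Not npm's actual implementation.
function splitPackageArg (arg) {
  // Split on the last "@" so a missing version is detected cleanly.
  var i = arg.lastIndexOf('@')
  if (i <= 0) return { name: arg, version: null }
  return { name: arg.slice(0, i), version: arg.slice(i + 1) }
}

console.log(splitPackageArg('connect@2.0.1'))
// { name: 'connect', version: '2.0.1' }
```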
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-test.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-TEST" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-test\fR \-\- Test a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-  npm\.commands\.test(packages, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This runs a package\'s "test" script, if one was provided\.
-.
-.P
-To run tests as a condition of installation, set the \fBnpat\fR config to
-true\.
-.
-.P
-npm can run tests on multiple packages\. Just specify multiple packages
-in the \fBpackages\fR parameter\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-uninstall.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-UNINSTALL" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-uninstall\fR \-\- uninstall a package programmatically
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.uninstall(packages, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This acts much the same way as uninstalling on the command\-line\.
-.
-.P
-The \'packages\' parameter is an array of strings\. Each element in the array is
-the name of a package to be uninstalled\.
-.
-.P
-Finally, \'callback\' is a function that will be called when all packages have been
-uninstalled or when an error has been encountered\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-unpublish.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,30 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-UNPUBLISH" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-unpublish\fR \-\- Remove a package from the registry
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.unpublish(package, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This removes a package version from the registry, deleting its
-entry and removing the tarball\.
-.
-.P
-The package parameter must be defined\.
-.
-.P
-Only the first element in the package parameter is used\.  If there is no first
-element, then npm assumes that the package at the current working directory
-is what is meant\.
-.
-.P
-If no version is specified, or if all versions are removed then
-the root package entry is removed from the registry entirely\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-update.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-UPDATE" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-update\fR \-\- Update a package
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.update(packages, callback)
-.
-.fi
-Updates a package, upgrading it to the latest version\. It also installs any missing packages\.
-.
-.P
-The \'packages\' argument is an array of packages to update\. The \'callback\' parameter will be called when done or when an error occurs\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-version.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-VERSION" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-version\fR \-\- Bump a package version
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.version(newversion, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-Run this in a package directory to bump the version and write the new
-data back to the package\.json file\.
-.
-.P
-If run in a git repo, it will also create a version commit and tag, and
-fail if the repo is not clean\.
-.
-.P
-Like all other commands, this function takes a string array as its first
-parameter\. The difference, however, is that this function will fail if it does
-not have exactly one element\. The only element should be a version number\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-view.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,176 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-VIEW" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-view\fR \-\- View registry info
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.view(args, [silent,] callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command shows data about a package and prints it to the stream
-referenced by the \fBoutfd\fR config, which defaults to stdout\.
-.
-.P
-The "args" parameter is an ordered list that closely resembles the command\-line
-usage\. The elements should be ordered such that the first element is
-the package and version (package@version)\. The version is optional\. After that,
-the rest of the parameters are fields with optional subfields ("field\.subfield")
-which can be used to get only the information desired from the registry\.
-.
-.P
-The callback will be passed all of the data returned by the query\.
-.
-.P
-For example, to get the package registry entry for the \fBconnect\fR package,
-you can do this:
-.
-.IP "" 4
-.
-.nf
-npm\.commands\.view(["connect"], callback)
-.
-.fi
-.
-.IP "" 0
-.
-.P
-If no version is specified, "latest" is assumed\.
-.
-.P
-Field names can be specified after the package descriptor\.
-For example, to show the dependencies of the \fBronn\fR package at version
-0\.3\.5, you could do the following:
-.
-.IP "" 4
-.
-.nf
-npm\.commands\.view(["ronn@0\.3\.5", "dependencies"], callback)
-.
-.fi
-.
-.IP "" 0
-.
-.P
-You can view child fields by separating them with a period\.
-To view the git repository URL for the latest version of npm, you could
-do this:
-.
-.IP "" 4
-.
-.nf
-npm\.commands\.view(["npm", "repository\.url"], callback)
-.
-.fi
-.
-.IP "" 0
-.
-.P
-For fields that are arrays, requesting a non\-numeric field will return
-all of the values from the objects in the list\.  For example, to get all
-the contributor email addresses for the "express" project, you can do this:
-.
-.IP "" 4
-.
-.nf
-npm\.commands\.view(["express", "contributors\.email"], callback)
-.
-.fi
-.
-.IP "" 0
-.
-.P
-You may also use numeric indices in square braces to specifically select
-an item in an array field\.  To just get the email address of the first
-contributor in the list, you can do this:
-.
-.IP "" 4
-.
-.nf
-npm\.commands\.view(["express", "contributors[0]\.email"], callback)
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Multiple fields may be specified, and will be printed one after another\.
-For example, to get all the contributor names and email addresses, you
-can do this:
-.
-.IP "" 4
-.
-.nf
-npm\.commands\.view(["express", "contributors\.name", "contributors\.email"], callback)
-.
-.fi
-.
-.IP "" 0
-.
-.P
-"Person" fields are shown as a string rather than as an
-object\.  So, for example, this will show the list of npm contributors in
-the shortened string format\.  (See \fBnpm help json\fR for more on this\.)
-.
-.IP "" 4
-.
-.nf
-npm\.commands\.view(["npm", "contributors"], callback)
-.
-.fi
-.
-.IP "" 0
-.
-.P
-If a version range is provided, then data will be printed for every
-matching version of the package\.  This will show which version of jsdom
-was required by each matching version of yui3:
-.
-.IP "" 4
-.
-.nf
-npm\.commands\.view(["yui3@\'>0\.5\.4\'", "dependencies\.jsdom"], callback)
-.
-.fi
-.
-.IP "" 0
-.
-.SH "OUTPUT"
-If only a single string field for a single version is output, then it
-will not be colorized or quoted, so as to enable piping the output to
-another command\.
-.
-.P
-If the version range matches multiple versions, then each printed value
-will be prefixed with the version it applies to\.
-.
-.P
-If multiple fields are requested, then each of them is prefixed with
-the field name\.
-.
-.P
-Console output can be disabled by setting the \'silent\' parameter to true\.
-.
-.SH "RETURN VALUE"
-The data returned will be an object in this format:
-.
-.IP "" 4
-.
-.nf
-{ <version>:
-  { <field>: <value>
-  , \.\.\. }
-, \.\.\. }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-corresponding to the list of fields selected\.
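The field selection that npm-view describes (dotted subfields, `[n]` indices, and mapping a non-numeric field over array elements) can be sketched in plain JavaScript. This is an illustration only; `getField` is a hypothetical helper, not npm's implementation:

```javascript
// Hypothetical sketch of npm-view's field paths, e.g. "contributors[0].email".
function getField (obj, path) {
  // Normalize "[0]" index syntax into dotted segments: a[0].b -> a.0.b
  var parts = path.replace(/\[(\d+)\]/g, '.$1').split('.')
  return parts.reduce(function (cur, key) {
    if (cur == null) return undefined
    // On arrays, a non-numeric field maps over every element, matching
    // the "all of the values from the objects in the list" behaviour.
    if (Array.isArray(cur) && !/^\d+$/.test(key)) {
      return cur.map(function (el) { return el[key] })
    }
    return cur[key]
  }, obj)
}

var pkg = { contributors: [ { name: 'a', email: 'a@x' },
                            { name: 'b', email: 'b@x' } ] }
console.log(getField(pkg, 'contributors.email'))    // [ 'a@x', 'b@x' ]
console.log(getField(pkg, 'contributors[0].email')) // a@x
```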
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm-whoami.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-WHOAMI" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-whoami\fR \-\- Display npm username
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.whoami(args, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-Print the \fBusername\fR config to standard output\.
-.
-.P
-\'args\' is never used and callback is never called with data\.
-\'args\' must be present or things will break\.
-.
-.P
-This function is not useful programmatically\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/npm.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,162 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm\fR \-\- node package manager
-.
-.SH "SYNOPSIS"
-.
-.nf
-var npm = require("npm")
-npm\.load([configObject], function (er, npm) {
-  // use the npm object, now that it\'s loaded\.
-  npm\.config\.set(key, val)
-  val = npm\.config\.get(key)
-  console\.log("prefix = %s", npm\.prefix)
-  npm\.commands\.install(["package"], cb)
-})
-.
-.fi
-.
-.SH "VERSION"
-1.3.14
-.
-.SH "DESCRIPTION"
-This is the API documentation for npm\.
-To find documentation of the command line
-client, see \fBnpm(1)\fR\|\.
-.
-.P
-Prior to using npm\'s commands, \fBnpm\.load()\fR must be called\.
-If you provide \fBconfigObject\fR as an object hash of top\-level
-configs, they override the values stored in the various config
-locations\. In the npm command line client, this set of configs
-is parsed from the command line options\. Additional configuration
-params are loaded from two configuration files\. See \fBnpm\-config(1)\fR, \fBnpm\-config(7)\fR, and \fBnpmrc(5)\fR for more information\.
-.
-.P
-After that, each of the functions is accessible in the
-\fBcommands\fR object: \fBnpm\.commands\.<cmd>\fR\|\.  See \fBnpm\-index\fR for a list of
-all possible commands\.
-.
-.P
-All commands on the command object take an \fBarray\fR of positional argument \fBstrings\fR\|\. The last argument to any function is a callback\. Some
-commands take other optional arguments\.
-.
-.P
-Configs cannot currently be set on a per function basis, as each call to
-npm\.config\.set will change the value for \fIall\fR npm commands in that process\.
-.
-.P
-To find API documentation for a specific command, run the \fBnpm apihelp\fR
-command\.
-.
-.SH "METHODS AND PROPERTIES"
-.
-.IP "\(bu" 4
-\fBnpm\.load(configs, cb)\fR
-.
-.IP
-Load the configuration params, and call the \fBcb\fR function once the
-globalconfig and userconfig files have been loaded as well, or on
-nextTick if they\'ve already been loaded\.
-.
-.IP "\(bu" 4
-\fBnpm\.config\fR
-.
-.IP
-An object for accessing npm configuration parameters\.
-.
-.IP "\(bu" 4
-\fBnpm\.config\.get(key)\fR
-.
-.IP "\(bu" 4
-\fBnpm\.config\.set(key, val)\fR
-.
-.IP "\(bu" 4
-\fBnpm\.config\.del(key)\fR
-.
-.IP "" 0
-
-.
-.IP "\(bu" 4
-\fBnpm\.dir\fR or \fBnpm\.root\fR
-.
-.IP
-The \fBnode_modules\fR directory where npm will operate\.
-.
-.IP "\(bu" 4
-\fBnpm\.prefix\fR
-.
-.IP
-The prefix where npm is operating\.  (Most often the current working
-directory\.)
-.
-.IP "\(bu" 4
-\fBnpm\.cache\fR
-.
-.IP
-The place where npm keeps JSON and tarballs it fetches from the
-registry (or uploads to the registry)\.
-.
-.IP "\(bu" 4
-\fBnpm\.tmp\fR
-.
-.IP
-npm\'s temporary working directory\.
-.
-.IP "\(bu" 4
-\fBnpm\.deref\fR
-.
-.IP
-Get the "real" name for a command that has either an alias or
-abbreviation\.
-.
-.IP "" 0
-.
-.SH "MAGIC"
-For each of the methods in the \fBnpm\.commands\fR hash, a method is added to
-the npm object, which takes a set of positional string arguments rather
-than an array and a callback\.
-.
-.P
-If the last argument is a callback, then it will use the supplied
-callback\.  However, if no callback is provided, then it will print out
-the error or results\.
-.
-.P
-For example, this would work in a node repl:
-.
-.IP "" 4
-.
-.nf
-> npm = require("npm")
-> npm\.load()  // wait a sec\.\.\.
-> npm\.install("dnode", "express")
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Note that that \fIwon\'t\fR work in a node program, since the \fBinstall\fR
-method will get called before the configuration load is completed\.
-.
-.SH "ABBREVS"
-In order to support \fBnpm ins foo\fR instead of \fBnpm install foo\fR, the \fBnpm\.commands\fR object has a set of abbreviations as well as the full
-method names\.  Use the \fBnpm\.deref\fR method to find the real name\.
-.
-.P
-For example:
-.
-.IP "" 4
-.
-.nf
-var cmd = npm\.deref("unp") // cmd === "unpublish"
-.
-.fi
-.
-.IP "" 0
-
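The "MAGIC" section above says each array-taking command gains a positional-argument method on the npm object, falling back to printing when no callback is supplied. A minimal sketch of that wrapping, using a stand-in `commands` hash rather than real npm code:

```javascript
// Hypothetical sketch of the positional-argument wrapping the npm(3)
// "MAGIC" section describes.  The commands hash is a stand-in.
var commands = {
  install: function (packages, cb) {
    cb(null, 'installed ' + packages.join(', '))
  }
}

var npmLike = {}
Object.keys(commands).forEach(function (name) {
  npmLike[name] = function () {
    var args = [].slice.call(arguments)
    // If the caller supplied a trailing callback, use it; otherwise
    // fall back to printing the error or result, as the man page says.
    var cb = typeof args[args.length - 1] === 'function'
      ? args.pop()
      : function (er, res) { console.log(er || res) }
    commands[name](args, cb)
  }
})

npmLike.install('dnode', 'express', function (er, res) {
  console.log(res) // installed dnode, express
})
```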
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man3/repo.3	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-REPO" "3" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-repo\fR \-\- Open package repository page in the browser
-.
-.SH "SYNOPSIS"
-.
-.nf
-npm\.commands\.repo(package, callback)
-.
-.fi
-.
-.SH "DESCRIPTION"
-This command tries to guess at the likely location of a package\'s
-repository URL, and then tries to open it using the \fB\-\-browser\fR
-config param\.
-.
-.P
-Like other commands, the first parameter is an array\. This command only
-uses the first element, which is expected to be a package name with an
-optional version number\.
-.
-.P
-This command will launch a browser, so it may not be the most
-friendly command for programmatic use\.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man5/npm-folders.5	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,264 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-FOLDERS" "5" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-folders\fR \-\- Folder Structures Used by npm
-.
-.SH "DESCRIPTION"
-npm puts various things on your computer\.  That\'s its job\.
-.
-.P
-This document will tell you what it puts where\.
-.
-.SS "tl;dr"
-.
-.IP "\(bu" 4
-Local install (default): puts stuff in \fB\|\./node_modules\fR of the current
-package root\.
-.
-.IP "\(bu" 4
-Global install (with \fB\-g\fR): puts stuff in /usr/local or wherever node
-is installed\.
-.
-.IP "\(bu" 4
-Install it \fBlocally\fR if you\'re going to \fBrequire()\fR it\.
-.
-.IP "\(bu" 4
-Install it \fBglobally\fR if you\'re going to run it on the command line\.
-.
-.IP "\(bu" 4
-If you need both, then install it in both places, or use \fBnpm link\fR\|\.
-.
-.IP "" 0
-.
-.SS "prefix Configuration"
-The \fBprefix\fR config defaults to the location where node is installed\.
-On most systems, this is \fB/usr/local\fR, and most of the time is the same
-as node\'s \fBprocess\.installPrefix\fR\|\.
-.
-.P
-On Windows, this is the exact location of the node\.exe binary\.  On Unix
-systems, it\'s one level up, since node is typically installed at \fB{prefix}/bin/node\fR rather than \fB{prefix}/node\.exe\fR\|\.
-.
-.P
-When the \fBglobal\fR flag is set, npm installs things into this prefix\.
-When it is not set, it uses the root of the current package, or the
-current working directory if not in a package already\.
-.
-.SS "Node Modules"
-Packages are dropped into the \fBnode_modules\fR folder under the \fBprefix\fR\|\.
-When installing locally, this means that you can \fBrequire("packagename")\fR to load its main module, or \fBrequire("packagename/lib/path/to/sub/module")\fR to load other modules\.
-.
-.P
-Global installs on Unix systems go to \fB{prefix}/lib/node_modules\fR\|\.
-Global installs on Windows go to \fB{prefix}/node_modules\fR (that is, no \fBlib\fR folder\.)
-.
-.P
-If you wish to \fBrequire()\fR a package, then install it locally\.
-.
-.SS "Executables"
-When in global mode, executables are linked into \fB{prefix}/bin\fR on Unix,
-or directly into \fB{prefix}\fR on Windows\.
-.
-.P
-When in local mode, executables are linked into \fB\|\./node_modules/\.bin\fR so that they can be made available to scripts run
-through npm\.  (For example, so that a test runner will be in the path
-when you run \fBnpm test\fR\|\.)
-.
-.SS "Man Pages"
-When in global mode, man pages are linked into \fB{prefix}/share/man\fR\|\.
-.
-.P
-When in local mode, man pages are not installed\.
-.
-.P
-Man pages are not installed on Windows systems\.
-.
-.SS "Cache"
-See \fBnpm\-cache\fR\|\.  Cache files are stored in \fB~/\.npm\fR on Posix, or \fB~/npm\-cache\fR on Windows\.
-.
-.P
-This is controlled by the \fBcache\fR configuration param\.
-.
-.SS "Temp Files"
-Temporary files are stored by default in the folder specified by the \fBtmp\fR config, which defaults to the TMPDIR, TMP, or TEMP environment
-variables, or \fB/tmp\fR on Unix and \fBc:\\windows\\temp\fR on Windows\.
-.
-.P
-Temp files are given a unique folder under this root for each run of the
-program, and are deleted upon successful exit\.
-.
-.SH "More Information"
-When installing locally, npm first tries to find an appropriate \fBprefix\fR folder\.  This is so that \fBnpm install foo@1\.2\.3\fR will install
-to the sensible root of your package, even if you happen to have \fBcd\fRed
-into some other folder\.
-.
-.P
-Starting at the $PWD, npm will walk up the folder tree checking for a
-folder that contains either a \fBpackage\.json\fR file, or a \fBnode_modules\fR
-folder\.  If such a thing is found, then that is treated as the effective
-"current directory" for the purpose of running npm commands\.  (This
-behavior is inspired by and similar to git\'s \.git\-folder seeking
-logic when running git commands in a working dir\.)
-.
-.P
-If no package root is found, then the current folder is used\.
-.
-.P
-When you run \fBnpm install foo@1\.2\.3\fR, then the package is loaded into
-the cache, and then unpacked into \fB\|\./node_modules/foo\fR\|\.  Then, any of
-foo\'s dependencies are similarly unpacked into \fB\|\./node_modules/foo/node_modules/\.\.\.\fR\|\.
-.
-.P
-Any bin files are symlinked to \fB\|\./node_modules/\.bin/\fR, so that they may
-be found by npm scripts when necessary\.
-.
-.SS "Global Installation"
-If the \fBglobal\fR configuration is set to true, then npm will
-install packages "globally"\.
-.
-.P
-For global installation, packages are installed roughly the same way,
-but using the folders described above\.
-.
-.SS "Cycles, Conflicts, and Folder Parsimony"
-Cycles are handled using the property of node\'s module system that it
-walks up the directories looking for \fBnode_modules\fR folders\.  So, at every
-stage, if a package is already installed in an ancestor \fBnode_modules\fR
-folder, then it is not installed at the current location\.
-.
-.P
-Consider the case above, where \fBfoo \-> bar \-> baz\fR\|\.  Imagine if, in
-addition to that, baz depended on bar, so you\'d have: \fBfoo \-> bar \-> baz \-> bar \-> baz \.\.\.\fR\|\.  However, since the folder
-structure is: \fBfoo/node_modules/bar/node_modules/baz\fR, there\'s no need to
-put another copy of bar into \fB\|\.\.\./baz/node_modules\fR, since when it calls
-require("bar"), it will get the copy that is installed in \fBfoo/node_modules/bar\fR\|\.
-.
-.P
-This shortcut is only used if the exact same
-version would be installed in multiple nested \fBnode_modules\fR folders\.  It
-is still possible to have \fBa/node_modules/b/node_modules/a\fR if the two
-"a" packages are different versions\.  However, without repeating the
-exact same package multiple times, an infinite regress will always be
-prevented\.
-.
-.P
-Another optimization can be made by installing dependencies at the
-highest level possible, below the localized "target" folder\.
-.
-.SS "\fIExample\fR"
-Consider this dependency graph:
-.
-.IP "" 4
-.
-.nf
-foo
-+\-\- blerg@1\.2\.5
-+\-\- bar@1\.2\.3
-|   +\-\- blerg@1\.x (latest=1\.3\.7)
-|   +\-\- baz@2\.x
-|   |   `\-\- quux@3\.x
-|   |       `\-\- bar@1\.2\.3 (cycle)
-|   `\-\- asdf@*
-`\-\- baz@1\.2\.3
-    `\-\- quux@3\.x
-        `\-\- bar
-.
-.fi
-.
-.IP "" 0
-.
-.P
-In this case, we might expect a folder structure like this:
-.
-.IP "" 4
-.
-.nf
-foo
-+\-\- node_modules
-    +\-\- blerg (1\.2\.5) <\-\-\-[A]
-    +\-\- bar (1\.2\.3) <\-\-\-[B]
-    |   `\-\- node_modules
-    |       +\-\- baz (2\.0\.2) <\-\-\-[C]
-    |       |   `\-\- node_modules
-    |       |       `\-\- quux (3\.2\.0)
-    |       `\-\- asdf (2\.3\.4)
-    `\-\- baz (1\.2\.3) <\-\-\-[D]
-        `\-\- node_modules
-            `\-\- quux (3\.2\.0) <\-\-\-[E]
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Since foo depends directly on \fBbar@1\.2\.3\fR and \fBbaz@1\.2\.3\fR, those are
-installed in foo\'s \fBnode_modules\fR folder\.
-.
-.P
-Even though the latest copy of blerg is 1\.3\.7, foo has a specific
-dependency on version 1\.2\.5\.  So, that gets installed at [A]\.  Since the
-parent installation of blerg satisfies bar\'s dependency on \fBblerg@1\.x\fR,
-it does not install another copy under [B]\.
-.
-.P
-Bar [B] also has dependencies on baz and asdf, so those are installed in
-bar\'s \fBnode_modules\fR folder\.  Because it depends on \fBbaz@2\.x\fR, it cannot
-re\-use the \fBbaz@1\.2\.3\fR installed in the parent \fBnode_modules\fR folder [D],
-and must install its own copy [C]\.
-.
-.P
-Underneath bar, the \fBbaz \-> quux \-> bar\fR dependency creates a cycle\.
-However, because bar is already in quux\'s ancestry [B], it does not
-unpack another copy of bar into that folder\.
-.
-.P
-Underneath \fBfoo \-> baz\fR [D], quux\'s [E] folder tree is empty, because its
-dependency on bar is satisfied by the parent folder copy installed at [B]\.
-.
-.P
-For a graphical breakdown of what is installed where, use \fBnpm ls\fR\|\.
-.
-.SS "Publishing"
-Upon publishing, npm will look in the \fBnode_modules\fR folder\.  If any of
-the items there are not in the \fBbundledDependencies\fR array, then they will
-not be included in the package tarball\.
-.
-.P
-This allows a package maintainer to install all of their dependencies
-(and dev dependencies) locally, but only re\-publish those items that
-cannot be found elsewhere\.  See \fBpackage\.json\fR for more information\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  faq
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help pack
-.
-.IP "\(bu" 4
-npm help cache
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help publish
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man5/npm-global.5	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,264 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-FOLDERS" "5" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-folders\fR \-\- Folder Structures Used by npm
-.
-.SH "DESCRIPTION"
-npm puts various things on your computer\.  That\'s its job\.
-.
-.P
-This document will tell you what it puts where\.
-.
-.SS "tl;dr"
-.
-.IP "\(bu" 4
-Local install (default): puts stuff in \fB\|\./node_modules\fR of the current
-package root\.
-.
-.IP "\(bu" 4
-Global install (with \fB\-g\fR): puts stuff in /usr/local or wherever node
-is installed\.
-.
-.IP "\(bu" 4
-Install it \fBlocally\fR if you\'re going to \fBrequire()\fR it\.
-.
-.IP "\(bu" 4
-Install it \fBglobally\fR if you\'re going to run it on the command line\.
-.
-.IP "\(bu" 4
-If you need both, then install it in both places, or use \fBnpm link\fR\|\.
-.
-.IP "" 0
-.
-.SS "prefix Configuration"
-The \fBprefix\fR config defaults to the location where node is installed\.
-On most systems, this is \fB/usr/local\fR, and most of the time is the same
-as node\'s \fBprocess\.installPrefix\fR\|\.
-.
-.P
-On Windows, this is the exact location of the node\.exe binary\.  On Unix
-systems, it\'s one level up, since node is typically installed at \fB{prefix}/bin/node\fR rather than \fB{prefix}/node\.exe\fR\|\.
-.
-.P
-When the \fBglobal\fR flag is set, npm installs things into this prefix\.
-When it is not set, it uses the root of the current package, or the
-current working directory if not in a package already\.
-.
-.SS "Node Modules"
-Packages are dropped into the \fBnode_modules\fR folder under the \fBprefix\fR\|\.
-When installing locally, this means that you can \fBrequire("packagename")\fR to load its main module, or \fBrequire("packagename/lib/path/to/sub/module")\fR to load other modules\.
-.
-.P
-Global installs on Unix systems go to \fB{prefix}/lib/node_modules\fR\|\.
-Global installs on Windows go to \fB{prefix}/node_modules\fR (that is, no \fBlib\fR folder\.)
-.
-.P
-If you wish to \fBrequire()\fR a package, then install it locally\.
-.
-.SS "Executables"
-When in global mode, executables are linked into \fB{prefix}/bin\fR on Unix,
-or directly into \fB{prefix}\fR on Windows\.
-.
-.P
-When in local mode, executables are linked into \fB\|\./node_modules/\.bin\fR so that they can be made available to scripts run
-through npm\.  (For example, so that a test runner will be in the path
-when you run \fBnpm test\fR\|\.)
-.
-.SS "Man Pages"
-When in global mode, man pages are linked into \fB{prefix}/share/man\fR\|\.
-.
-.P
-When in local mode, man pages are not installed\.
-.
-.P
-Man pages are not installed on Windows systems\.
-.
-.SS "Cache"
-See \fBnpm\-cache\fR\|\.  Cache files are stored in \fB~/\.npm\fR on Posix, or \fB~/npm\-cache\fR on Windows\.
-.
-.P
-This is controlled by the \fBcache\fR configuration param\.
-.
-.SS "Temp Files"
-Temporary files are stored by default in the folder specified by the \fBtmp\fR config, which defaults to the TMPDIR, TMP, or TEMP environment
-variables, or \fB/tmp\fR on Unix and \fBc:\\windows\\temp\fR on Windows\.
-.
-.P
-Temp files are given a unique folder under this root for each run of the
-program, and are deleted upon successful exit\.
-.
-.SH "More Information"
-When installing locally, npm first tries to find an appropriate \fBprefix\fR folder\.  This is so that \fBnpm install foo@1\.2\.3\fR will install
-to the sensible root of your package, even if you happen to have \fBcd\fRed
-into some other folder\.
-.
-.P
-Starting at the $PWD, npm will walk up the folder tree checking for a
-folder that contains either a \fBpackage\.json\fR file, or a \fBnode_modules\fR
-folder\.  If such a thing is found, then that is treated as the effective
-"current directory" for the purpose of running npm commands\.  (This
-behavior is inspired by and similar to git\'s \.git\-folder seeking
-logic when running git commands in a working dir\.)
-.
-.P
-If no package root is found, then the current folder is used\.
-.
-.P
-When you run \fBnpm install foo@1\.2\.3\fR, then the package is loaded into
-the cache, and then unpacked into \fB\|\./node_modules/foo\fR\|\.  Then, any of
-foo\'s dependencies are similarly unpacked into \fB\|\./node_modules/foo/node_modules/\.\.\.\fR\|\.
-.
-.P
-Any bin files are symlinked to \fB\|\./node_modules/\.bin/\fR, so that they may
-be found by npm scripts when necessary\.
-.
-.SS "Global Installation"
-If the \fBglobal\fR configuration is set to true, then npm will
-install packages "globally"\.
-.
-.P
-For global installation, packages are installed roughly the same way,
-but using the folders described above\.
-.
-.SS "Cycles, Conflicts, and Folder Parsimony"
-Cycles are handled using the property of node\'s module system that it
-walks up the directories looking for \fBnode_modules\fR folders\.  So, at every
-stage, if a package is already installed in an ancestor \fBnode_modules\fR
-folder, then it is not installed at the current location\.
-.
-.P
-Consider the case above, where \fBfoo \-> bar \-> baz\fR\|\.  Imagine if, in
-addition to that, baz depended on bar, so you\'d have: \fBfoo \-> bar \-> baz \-> bar \-> baz \.\.\.\fR\|\.  However, since the folder
-structure is: \fBfoo/node_modules/bar/node_modules/baz\fR, there\'s no need to
-put another copy of bar into \fB\|\.\.\./baz/node_modules\fR, since when it calls
-require("bar"), it will get the copy that is installed in \fBfoo/node_modules/bar\fR\|\.
-.
-.P
-This shortcut is only used if the exact same
-version would be installed in multiple nested \fBnode_modules\fR folders\.  It
-is still possible to have \fBa/node_modules/b/node_modules/a\fR if the two
-"a" packages are different versions\.  However, without repeating the
-exact same package multiple times, an infinite regress will always be
-prevented\.
-.
-.P
-Another optimization can be made by installing dependencies at the
-highest level possible, below the localized "target" folder\.
-.
-.SS "\fIExample\fR"
-Consider this dependency graph:
-.
-.IP "" 4
-.
-.nf
-foo
-+\-\- blerg@1\.2\.5
-+\-\- bar@1\.2\.3
-|   +\-\- blerg@1\.x (latest=1\.3\.7)
-|   +\-\- baz@2\.x
-|   |   `\-\- quux@3\.x
-|   |       `\-\- bar@1\.2\.3 (cycle)
-|   `\-\- asdf@*
-`\-\- baz@1\.2\.3
-    `\-\- quux@3\.x
-        `\-\- bar
-.
-.fi
-.
-.IP "" 0
-.
-.P
-In this case, we might expect a folder structure like this:
-.
-.IP "" 4
-.
-.nf
-foo
-+\-\- node_modules
-    +\-\- blerg (1\.2\.5) <\-\-\-[A]
-    +\-\- bar (1\.2\.3) <\-\-\-[B]
-    |   `\-\- node_modules
-    |       +\-\- baz (2\.0\.2) <\-\-\-[C]
-    |       |   `\-\- node_modules
-    |       |       `\-\- quux (3\.2\.0)
-    |       `\-\- asdf (2\.3\.4)
-    `\-\- baz (1\.2\.3) <\-\-\-[D]
-        `\-\- node_modules
-            `\-\- quux (3\.2\.0) <\-\-\-[E]
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Since foo depends directly on \fBbar@1\.2\.3\fR and \fBbaz@1\.2\.3\fR, those are
-installed in foo\'s \fBnode_modules\fR folder\.
-.
-.P
-Even though the latest copy of blerg is 1\.3\.7, foo has a specific
-dependency on version 1\.2\.5\.  So, that gets installed at [A]\.  Since the
-parent installation of blerg satisfies bar\'s dependency on \fBblerg@1\.x\fR,
-it does not install another copy under [B]\.
-.
-.P
-Bar [B] also has dependencies on baz and asdf, so those are installed in
-bar\'s \fBnode_modules\fR folder\.  Because it depends on \fBbaz@2\.x\fR, it cannot
-re\-use the \fBbaz@1\.2\.3\fR installed in the parent \fBnode_modules\fR folder [D],
-and must install its own copy [C]\.
-.
-.P
-Underneath bar, the \fBbaz \-> quux \-> bar\fR dependency creates a cycle\.
-However, because bar is already in quux\'s ancestry [B], it does not
-unpack another copy of bar into that folder\.
-.
-.P
-Underneath \fBfoo \-> baz\fR [D], quux\'s [E] folder tree is empty, because its
-dependency on bar is satisfied by the parent folder copy installed at [B]\.
-.
-.P
-For a graphical breakdown of what is installed where, use \fBnpm ls\fR\|\.
-.
-.SS "Publishing"
-Upon publishing, npm will look in the \fBnode_modules\fR folder\.  If any of
-the items there are not in the \fBbundledDependencies\fR array, then they will
-not be included in the package tarball\.
-.
-.P
-This allows a package maintainer to install all of their dependencies
-(and dev dependencies) locally, but only re\-publish those items that
-cannot be found elsewhere\.  See npm help package\.json for more information\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  faq
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help pack
-.
-.IP "\(bu" 4
-npm help cache
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help publish
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man5/npm-json.5	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,825 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "PACKAGE\.JSON" "5" "November 2013" "" ""
-.
-.SH "NAME"
-\fBpackage.json\fR \-\- Specifics of npm\'s package\.json handling
-.
-.SH "DESCRIPTION"
-This document is all you need to know about what\'s required in your package\.json
-file\.  It must be actual JSON, not just a JavaScript object literal\.
-.
-.P
-A lot of the behavior described in this document is affected by the config
-settings described in npm help config\|\.
-.
-.SH "DEFAULT VALUES"
-npm will default some values based on package contents\.
-.
-.IP "\(bu" 4
-\fB"scripts": {"start": "node server\.js"}\fR
-.
-.IP
-If there is a \fBserver\.js\fR file in the root of your package, then npm
-will default the \fBstart\fR command to \fBnode server\.js\fR\|\.
-.
-.IP "\(bu" 4
-\fB"scripts":{"preinstall": "node\-waf clean || true; node\-waf configure build"}\fR
-.
-.IP
-If there is a \fBwscript\fR file in the root of your package, npm will
-default the \fBpreinstall\fR command to compile using node\-waf\.
-.
-.IP "\(bu" 4
-\fB"scripts":{"preinstall": "node\-gyp rebuild"}\fR
-.
-.IP
-If there is a \fBbinding\.gyp\fR file in the root of your package, npm will
-default the \fBpreinstall\fR command to compile using node\-gyp\.
-.
-.IP "\(bu" 4
-\fB"contributors": [\.\.\.]\fR
-.
-.IP
-If there is an \fBAUTHORS\fR file in the root of your package, npm will
-treat each line as a \fBName <email> (url)\fR format, where email and url
-are optional\.  Lines which start with a \fB#\fR or are blank will be
-ignored\.
-.
-.IP "" 0
-.
-.SH "name"
-The \fImost\fR important things in your package\.json are the name and version fields\.
-Those are actually required, and your package won\'t install without
-them\.  The name and version together form an identifier that is assumed
-to be completely unique\.  Changes to the package should come along with
-changes to the version\.
-.
-.P
-The name is what your thing is called\.  Some tips:
-.
-.IP "\(bu" 4
-Don\'t put "js" or "node" in the name\.  It\'s assumed that it\'s js, since you\'re
-writing a package\.json file, and you can specify the engine using the "engines"
-field\.  (See below\.)
-.
-.IP "\(bu" 4
-The name ends up being part of a URL, an argument on the command line, and a
-folder name\. Any name with non\-url\-safe characters will be rejected\.
-Also, it can\'t start with a dot or an underscore\.
-.
-.IP "\(bu" 4
-The name will probably be passed as an argument to require(), so it should
-be something short, but also reasonably descriptive\.
-.
-.IP "\(bu" 4
-You may want to check the npm registry to see if there\'s something by that name
-already, before you get too attached to it\.  http://registry\.npmjs\.org/
-.
-.IP "" 0
-.
-.SH "version"
-The \fImost\fR important things in your package\.json are the name and version fields\.
-Those are actually required, and your package won\'t install without
-them\.  The name and version together form an identifier that is assumed
-to be completely unique\.  Changes to the package should come along with
-changes to the version\.
-.
-.P
-Version must be parseable by node\-semver \fIhttps://github\.com/isaacs/node\-semver\fR, which is bundled
-with npm as a dependency\.  (\fBnpm install semver\fR to use it yourself\.)
-.
-.P
-More on version numbers and ranges at npm help semver\.
-.
-.SH "description"
-Put a description in it\.  It\'s a string\.  This helps people discover your
-package, as it\'s listed in \fBnpm search\fR\|\.
-.
-.SH "keywords"
-Put keywords in it\.  It\'s an array of strings\.  This helps people
-discover your package as it\'s listed in \fBnpm search\fR\|\.
-.
-.SH "homepage"
-The url to the project homepage\.
-.
-.P
-\fBNOTE\fR: This is \fInot\fR the same as "url"\.  If you put a "url" field,
-then the registry will think it\'s a redirection to your package that has
-been published somewhere else, and spit at you\.
-.
-.P
-Literally\.  Spit\.  I\'m so not kidding\.
-.
-.SH "bugs"
-The url to your project\'s issue tracker and / or the email address to which
-issues should be reported\. These are helpful for people who encounter issues
-with your package\.
-.
-.P
-It should look like this:
-.
-.IP "" 4
-.
-.nf
-{ "url" : "http://github\.com/owner/project/issues"
-, "email" : "project@hostname\.com"
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-You can specify either one or both values\. If you want to provide only a url,
-you can specify the value for "bugs" as a simple string instead of an object\.
-.
-.P
-If a url is provided, it will be used by the \fBnpm bugs\fR command\.
-.
-.SH "license"
-You should specify a license for your package so that people know how they are
-permitted to use it, and any restrictions you\'re placing on it\.
-.
-.P
-The simplest way, assuming you\'re using a common license such as BSD or MIT, is
-to just specify the name of the license you\'re using, like this:
-.
-.IP "" 4
-.
-.nf
-{ "license" : "BSD" }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-If you have more complex licensing terms, or you want to provide more detail
-in your package\.json file, you can use the more verbose plural form, like this:
-.
-.IP "" 4
-.
-.nf
-"licenses" : [
-  { "type" : "MyLicense"
-  , "url" : "http://github\.com/owner/project/path/to/license"
-  }
-]
-.
-.fi
-.
-.IP "" 0
-.
-.P
-It\'s also a good idea to include a license file at the top level in your package\.
-.
-.SH "people fields: author, contributors"
-The "author" is one person\.  "contributors" is an array of people\.  A "person"
-is an object with a "name" field and optionally "url" and "email", like this:
-.
-.IP "" 4
-.
-.nf
-{ "name" : "Barney Rubble"
-, "email" : "b@rubble\.com"
-, "url" : "http://barnyrubble\.tumblr\.com/"
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Or you can shorten that all into a single string, and npm will parse it for you:
-.
-.IP "" 4
-.
-.nf
-"Barney Rubble <b@rubble\.com> (http://barnyrubble\.tumblr\.com/)
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Both email and url are optional either way\.
-.
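The single-string form above can be parsed back into an object with a small regex. This is a rough sketch of the idea, not npm's actual parser (which is more forgiving); `parsePerson` is an illustrative name.

```javascript
// Parse "Name <email> (url)" into { name, email, url }.
// Both <email> and (url) are optional, matching the format above.
function parsePerson(s) {
  const m = /^\s*([^<(]*?)\s*(?:<([^>]*)>)?\s*(?:\(([^)]*)\))?\s*$/.exec(s);
  if (!m) return null;
  const person = { name: m[1] };
  if (m[2]) person.email = m[2];
  if (m[3]) person.url = m[3];
  return person;
}
```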
-.P
-npm also sets a top\-level "maintainers" field with your npm user info\.
-.
-.SH "files"
-The "files" field is an array of files to include in your project\.  If
-you name a folder in the array, then it will also include the files
-inside that folder\. (Unless they would be ignored by another rule\.)
-.
-.P
-You can also provide a "\.npmignore" file in the root of your package,
-which will keep files from being included, even if they would be picked
-up by the files array\.  The "\.npmignore" file works just like a
-"\.gitignore"\.
-.
-.SH "main"
-The main field is a module ID that is the primary entry point to your program\.
-That is, if your package is named \fBfoo\fR, and a user installs it, and then does \fBrequire("foo")\fR, then your main module\'s exports object will be returned\.
-.
-.P
-This should be a module ID relative to the root of your package folder\.
-.
-.P
-For most modules, it makes the most sense to have a main script and often not
-much else\.
-.
-.SH "bin"
-A lot of packages have one or more executable files that they\'d like to
-install into the PATH\. npm makes this pretty easy (in fact, it uses this
-feature to install the "npm" executable\.)
-.
-.P
-To use this, supply a \fBbin\fR field in your package\.json which is a map of
-command name to local file name\. On install, npm will symlink that file into \fBprefix/bin\fR for global installs, or \fB\|\./node_modules/\.bin/\fR for local
-installs\.
-.
-.P
-For example, npm has this:
-.
-.IP "" 4
-.
-.nf
-{ "bin" : { "npm" : "\./cli\.js" } }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-So, when you install npm, it\'ll create a symlink from the \fBcli\.js\fR script to \fB/usr/local/bin/npm\fR\|\.
-.
-.P
-If you have a single executable, and its name should be the name
-of the package, then you can just supply it as a string\.  For example:
-.
-.IP "" 4
-.
-.nf
-{ "name": "my\-program"
-, "version": "1\.2\.5"
-, "bin": "\./path/to/program" }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-would be the same as this:
-.
-.IP "" 4
-.
-.nf
-{ "name": "my\-program"
-, "version": "1\.2\.5"
-, "bin" : { "my\-program" : "\./path/to/program" } }
-.
-.fi
-.
-.IP "" 0
-.
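The equivalence of the two forms above amounts to a one-line normalization: a string `bin` is shorthand for a single-entry map keyed by the package name. A minimal sketch (the helper name is illustrative):

```javascript
// Normalize the "bin" field: a bare string becomes { <pkg name>: <path> },
// matching the equivalence shown in the two examples above.
function normalizeBin(pkg) {
  if (typeof pkg.bin === 'string') return { [pkg.name]: pkg.bin };
  return pkg.bin || {};
}
```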
-.SH "man"
-Specify either a single file or an array of filenames to put in place for the \fBman\fR program to find\.
-.
-.P
-If only a single file is provided, then it\'s installed such that it is the
-result from \fBman <pkgname>\fR, regardless of its actual filename\.  For example:
-.
-.IP "" 4
-.
-.nf
-{ "name" : "foo"
-, "version" : "1\.2\.3"
-, "description" : "A packaged foo fooer for fooing foos"
-, "main" : "foo\.js"
-, "man" : "\./man/doc\.1"
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-would link the \fB\|\./man/doc\.1\fR file so that it is the target for \fBman foo\fR\|\.
-.
-.P
-If the filename doesn\'t start with the package name, then it\'s prefixed\.
-So, this:
-.
-.IP "" 4
-.
-.nf
-{ "name" : "foo"
-, "version" : "1\.2\.3"
-, "description" : "A packaged foo fooer for fooing foos"
-, "main" : "foo\.js"
-, "man" : [ "\./man/foo\.1", "\./man/bar\.1" ]
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-will create files to do \fBman foo\fR and \fBman foo\-bar\fR\|\.
-.
-.P
-Man files must end with a number, and optionally a \fB\|\.gz\fR suffix if they are
-compressed\.  The number dictates which man section the file is installed into\.
-.
-.IP "" 4
-.
-.nf
-{ "name" : "foo"
-, "version" : "1\.2\.3"
-, "description" : "A packaged foo fooer for fooing foos"
-, "main" : "foo\.js"
-, "man" : [ "\./man/foo\.1", "\./man/foo\.2" ]
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-will create entries for \fBman foo\fR and \fBman 2 foo\fR
-.
-.SH "directories"
-The CommonJS Packages \fIhttp://wiki\.commonjs\.org/wiki/Packages/1\.0\fR spec details a
-few ways that you can indicate the structure of your package using a \fBdirectories\fR
-hash\. If you look at npm\'s package\.json \fIhttp://registry\.npmjs\.org/npm/latest\fR,
-you\'ll see that it has directories for doc, lib, and man\.
-.
-.P
-In the future, this information may be used in other creative ways\.
-.
-.SS "directories\.lib"
-Tell people where the bulk of your library is\.  Nothing special is done
-with the lib folder in any way, but it\'s useful meta info\.
-.
-.SS "directories\.bin"
-If you specify a "bin" directory, then all the files in that folder will
-be used as the "bin" hash\.
-.
-.P
-If you have a "bin" hash already, then this has no effect\.
-.
-.SS "directories\.man"
-A folder that is full of man pages\.  Sugar to generate a "man" array by
-walking the folder\.
-.
-.SS "directories\.doc"
-Put markdown files in here\.  Eventually, these will be displayed nicely,
-maybe, someday\.
-.
-.SS "directories\.example"
-Put example scripts in here\.  Someday, it might be exposed in some clever way\.
-.
-.SH "repository"
-Specify the place where your code lives\. This is helpful for people who
-want to contribute\.  If the git repo is on github, then the \fBnpm docs\fR
-command will be able to find you\.
-.
-.P
-Do it like this:
-.
-.IP "" 4
-.
-.nf
-"repository" :
-  { "type" : "git"
-  , "url" : "http://github\.com/isaacs/npm\.git"
-  }
-"repository" :
-  { "type" : "svn"
-  , "url" : "http://v8\.googlecode\.com/svn/trunk/"
-  }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The URL should be a publicly available (perhaps read\-only) url that can be handed
-directly to a VCS program without any modification\.  It should not be a url to an
-html project page that you put in your browser\.  It\'s for computers\.
-.
-.SH "scripts"
-The "scripts" member is an object hash of script commands that are run
-at various times in the lifecycle of your package\.  The key is the lifecycle
-event, and the value is the command to run at that point\.
-.
-.P
-See npm help scripts to find out more about writing package scripts\.
-.
-.SH "config"
-A "config" hash can be used to set configuration
-parameters used in package scripts that persist across upgrades\.  For
-instance, if a package had the following:
-.
-.IP "" 4
-.
-.nf
-{ "name" : "foo"
-, "config" : { "port" : "8080" } }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-and then had a "start" command that then referenced the \fBnpm_package_config_port\fR environment variable, then the user could
-override that by doing \fBnpm config set foo:port 8001\fR\|\.
-.
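A "start" script might read that value as sketched below. npm exposes each "config" key to lifecycle scripts as an environment variable named `npm_package_config_<key>`; the fallback covers running the script directly, outside npm. The function name is illustrative.

```javascript
// Read the configured port from the script's environment, falling back
// to the package.json "config" default when run outside npm.
function configPort(env) {
  return env.npm_package_config_port || '8080';
}

var port = configPort(process.env);
```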
-.P
-See npm help config and npm help scripts for more on package
-configs\.
-.
-.SH "dependencies"
-Dependencies are specified with a simple hash of package name to
-version range\. The version range is a string which has one or more
-space\-separated descriptors\.  Dependencies can also be identified with
-a tarball or git URL\.
-.
-.P
-\fBPlease do not put test harnesses or transpilers in your \fBdependencies\fR hash\.\fR  See \fBdevDependencies\fR, below\.
-.
-.P
-See npm help semver for more details about specifying version ranges\.
-.
-.IP "\(bu" 4
-\fBversion\fR Must match \fBversion\fR exactly
-.
-.IP "\(bu" 4
-\fB>version\fR Must be greater than \fBversion\fR
-.
-.IP "\(bu" 4
-\fB>=version\fR etc
-.
-.IP "\(bu" 4
-\fB<version\fR
-.
-.IP "\(bu" 4
-\fB<=version\fR
-.
-.IP "\(bu" 4
-\fB~version\fR "Approximately equivalent to version"\.  See npm help semver\.
-.
-.IP "\(bu" 4
-\fB1\.2\.x\fR 1\.2\.0, 1\.2\.1, etc\., but not 1\.3\.0
-.
-.IP "\(bu" 4
-\fBhttp://\.\.\.\fR See \'URLs as Dependencies\' below
-.
-.IP "\(bu" 4
-\fB*\fR Matches any version
-.
-.IP "\(bu" 4
-\fB""\fR (just an empty string) Same as \fB*\fR
-.
-.IP "\(bu" 4
-\fBversion1 \- version2\fR Same as \fB>=version1 <=version2\fR\|\.
-.
-.IP "\(bu" 4
-\fBrange1 || range2\fR Passes if either range1 or range2 are satisfied\.
-.
-.IP "\(bu" 4
-\fBgit\.\.\.\fR See \'Git URLs as Dependencies\' below
-.
-.IP "\(bu" 4
-\fBuser/repo\fR See \'GitHub URLs\' below
-.
-.IP "" 0
-.
-.P
-For example, these are all valid:
-.
-.IP "" 4
-.
-.nf
-{ "dependencies" :
-  { "foo" : "1\.0\.0 \- 2\.9999\.9999"
-  , "bar" : ">=1\.0\.2 <2\.1\.2"
-  , "baz" : ">1\.0\.2 <=2\.3\.4"
-  , "boo" : "2\.0\.1"
-  , "qux" : "<1\.0\.0 || >=2\.3\.1 <2\.4\.5 || >=2\.5\.2 <3\.0\.0"
-  , "asd" : "http://asdf\.com/asdf\.tar\.gz"
-  , "til" : "~1\.2"
-  , "elf" : "~1\.2\.3"
-  , "two" : "2\.x"
-  , "thr" : "3\.3\.x"
-  }
-}
-.
-.fi
-.
-.IP "" 0
-.
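As an illustration of just one descriptor from the list above, an "x-range" like `1.2.x` matches any version that shares the stated leading components. This sketch is a deliberate simplification, not npm's semver implementation; real range matching (`~`, `>=`, `||`, and so on) is handled by the bundled semver package.

```javascript
// Match an x-range like "1.2.x" or "2.x" against a concrete version:
// each stated component must be equal, and "x" matches anything.
function matchesXRange(range, version) {
  var want = range.split('.');
  var have = version.split('.');
  return want.every(function (part, i) {
    return part === 'x' || part === have[i];
  });
}
```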
-.SS "URLs as Dependencies"
-You may specify a tarball URL in place of a version range\.
-.
-.P
-This tarball will be downloaded and installed locally to your package at
-install time\.
-.
-.SS "Git URLs as Dependencies"
-Git urls can be of the form:
-.
-.IP "" 4
-.
-.nf
-git://github\.com/user/project\.git#commit\-ish
-git+ssh://user@hostname:project\.git#commit\-ish
-git+ssh://user@hostname/project\.git#commit\-ish
-git+http://user@hostname/project/blah\.git#commit\-ish
-git+https://user@hostname/project/blah\.git#commit\-ish
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The \fBcommit\-ish\fR can be any tag, sha, or branch which can be supplied as
-an argument to \fBgit checkout\fR\|\.  The default is \fBmaster\fR\|\.
-.
-.SH "GitHub URLs"
-As of version 1\.1\.65, you can refer to GitHub urls as just "foo": "user/foo\-project"\. For example:
-.
-.IP "" 4
-.
-.nf
-{
-  "name": "foo",
-  "version": "0\.0\.0",
-  "dependencies": {
-    "express": "visionmedia/express"
-  }
-}
-.
-.fi
-.
-.IP "" 0
-.
-.SH "devDependencies"
-If someone is planning on downloading and using your module in their
-program, then they probably don\'t want or need to download and build
-the external test or documentation framework that you use\.
-.
-.P
-In this case, it\'s best to list these additional items in a \fBdevDependencies\fR hash\.
-.
-.P
-These things will be installed when doing \fBnpm link\fR or \fBnpm install\fR
-from the root of a package, and can be managed like any other npm
-configuration param\.  See npm help config for more on the topic\.
-.
-.P
-For build steps that are not platform\-specific, such as compiling
-CoffeeScript or other languages to JavaScript, use the \fBprepublish\fR
-script to do this, and make the required package a devDependency\.
-.
-.P
-For example:
-.
-.IP "" 4
-.
-.nf
-{ "name": "ethopia\-waza",
-  "description": "a delightfully fruity coffee varietal",
-  "version": "1\.2\.3",
-  "devDependencies": {
-    "coffee\-script": "~1\.6\.3"
-  },
-  "scripts": {
-    "prepublish": "coffee \-o lib/ \-c src/waza\.coffee"
-  },
-  "main": "lib/waza\.js"
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The \fBprepublish\fR script will be run before publishing, so that users
-can consume the functionality without requiring them to compile it
-themselves\.  In dev mode (ie, locally running \fBnpm install\fR), it\'ll
-run this script as well, so that you can test it easily\.
-.
-.SH "bundledDependencies"
-Array of package names that will be bundled when publishing the package\.
-.
-.P
-If this is spelled \fB"bundleDependencies"\fR, then that is also honorable\.
-.
-.SH "optionalDependencies"
-If a dependency can be used, but you would like npm to proceed if it
-cannot be found or fails to install, then you may put it in the \fBoptionalDependencies\fR hash\.  This is a map of package name to version
-or url, just like the \fBdependencies\fR hash\.  The difference is that
-failure is tolerated\.
-.
-.P
-It is still your program\'s responsibility to handle the lack of the
-dependency\.  For example, something like this:
-.
-.IP "" 4
-.
-.nf
-try {
-  var foo = require(\'foo\')
-  var fooVersion = require(\'foo/package\.json\')\.version
-} catch (er) {
-  foo = null
-}
-if ( notGoodFooVersion(fooVersion) ) {
-  foo = null
-}
-// \.\. then later in your program \.\.
-if (foo) {
-  foo\.doFooThings()
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Entries in \fBoptionalDependencies\fR will override entries of the same name in \fBdependencies\fR, so it\'s usually best to only put it in one place\.
-.
-.SH "engines"
-You can specify the version of node that your stuff works on:
-.
-.IP "" 4
-.
-.nf
-{ "engines" : { "node" : ">=0\.10\.3 <0\.12" } }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-And, like with dependencies, if you don\'t specify the version (or if you
-specify "*" as the version), then any version of node will do\.
-.
-.P
-If you specify an "engines" field, then npm will require that "node" be
-somewhere on that list\. If "engines" is omitted, then npm will just assume
-that it works on node\.
-.
-.P
-You can also use the "engines" field to specify which versions of npm
-are capable of properly installing your program\.  For example:
-.
-.IP "" 4
-.
-.nf
-{ "engines" : { "npm" : "~1\.0\.20" } }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Note that, unless the user has set the \fBengine\-strict\fR config flag, this
-field is advisory only\.
-.
-.SH "engineStrict"
-If you are sure that your module will \fIdefinitely not\fR run properly on
-versions of Node/npm other than those specified in the \fBengines\fR hash,
-then you can set \fB"engineStrict": true\fR in your package\.json file\.
-This will override the user\'s \fBengine\-strict\fR config setting\.
-.
-.P
-Please do not do this unless you are really very very sure\.  If your
-engines hash is something overly restrictive, you can quite easily and
-inadvertently lock yourself into obscurity and prevent your users from
-updating to new versions of Node\.  Consider this choice carefully\.  If
-people abuse it, it will be removed in a future version of npm\.
-.
-.SH "os"
-You can specify which operating systems your
-module will run on:
-.
-.IP "" 4
-.
-.nf
-"os" : [ "darwin", "linux" ]
-.
-.fi
-.
-.IP "" 0
-.
-.P
-You can also blacklist instead of whitelist operating systems,
-just prepend the blacklisted os with a \'!\':
-.
-.IP "" 4
-.
-.nf
-"os" : [ "!win32" ]
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The host operating system is determined by \fBprocess\.platform\fR
-.
-.P
-It is allowed to both blacklist and whitelist, although there isn\'t any
-good reason to do this\.
-.
-.SH "cpu"
-If your code only runs on certain cpu architectures,
-you can specify which ones\.
-.
-.IP "" 4
-.
-.nf
-"cpu" : [ "x64", "ia32" ]
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Like the \fBos\fR option, you can also blacklist architectures:
-.
-.IP "" 4
-.
-.nf
-"cpu" : [ "!arm", "!mips" ]
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The host architecture is determined by \fBprocess\.arch\fR
-.
-.SH "preferGlobal"
-If your package is primarily a command\-line application that should be
-installed globally, then set this value to \fBtrue\fR to provide a warning
-if it is installed locally\.
-.
-.P
-It doesn\'t actually prevent users from installing it locally, but it
-does help prevent some confusion if it doesn\'t work as expected\.
-.
-.SH "private"
-If you set \fB"private": true\fR in your package\.json, then npm will refuse
-to publish it\.
-.
-.P
-This is a way to prevent accidental publication of private repositories\.
-If you would like to ensure that a given package is only ever published
-to a specific registry (for example, an internal registry),
-then use the \fBpublishConfig\fR hash described below
-to override the \fBregistry\fR config param at publish\-time\.
-.
-.SH "publishConfig"
-This is a set of config values that will be used at publish\-time\.  It\'s
-especially handy if you want to set the tag or registry, so that you can
-ensure that a given package is not tagged with "latest" or published to
-the global public registry by default\.
-.
-.P
-Any config values can be overridden, but of course only "tag" and
-"registry" probably matter for the purposes of publishing\.
-.
-.P
-See npm help config for the list of config options that can be
-overridden\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  semver
-.
-.IP "\(bu" 4
-npm help init
-.
-.IP "\(bu" 4
-npm help version
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help help
-.
-.IP "\(bu" 4
-npm help  faq
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help publish
-.
-.IP "\(bu" 4
-npm help rm
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man5/npmrc.5	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,89 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPMRC" "5" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpmrc\fR \-\- The npm config files
-.
-.SH "DESCRIPTION"
-npm gets its config settings from the command line, environment
-variables, and \fBnpmrc\fR files\.
-.
-.P
-The \fBnpm config\fR command can be used to update and edit the contents
-of the user and global npmrc files\.
-.
-.P
-For a list of available configuration options, see npm help config\.
-.
-.SH "FILES"
-The three relevant files are:
-.
-.IP "\(bu" 4
-per\-user config file (~/\.npmrc)
-.
-.IP "\(bu" 4
-global config file ($PREFIX/npmrc)
-.
-.IP "\(bu" 4
-npm builtin config file (/path/to/npm/npmrc)
-.
-.IP "" 0
-.
-.P
-All npm config files are an ini\-formatted list of \fBkey = value\fR
-parameters\.  Environment variables can be replaced using \fB${VARIABLE_NAME}\fR\|\. For example:
-.
-.IP "" 4
-.
-.nf
-prefix = ${HOME}/\.npm\-packages
-.
-.fi
-.
-.IP "" 0
-.
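The `${VARIABLE_NAME}` replacement described above can be sketched for a single npmrc value as follows. This is a simplification relative to npm's real config loader; the function name is illustrative.

```javascript
// Expand ${VARIABLE_NAME} references in an npmrc value against an
// environment map, leaving unknown references untouched.
function expandEnv(value, env) {
  return value.replace(/\$\{([^}]+)\}/g, function (match, name) {
    return name in env ? env[name] : match;
  });
}
```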
-.P
-Each of these files is loaded, and config options are resolved in
-priority order\.  For example, a setting in the userconfig file would
-override the setting in the globalconfig file\.
-.
-.SS "Per\-user config file"
-\fB$HOME/\.npmrc\fR (or the \fBuserconfig\fR param, if set in the environment
-or on the command line)
-.
-.SS "Global config file"
-\fB$PREFIX/etc/npmrc\fR (or the \fBglobalconfig\fR param, if set above):
-This file is an ini\-file formatted list of \fBkey = value\fR parameters\.
-Environment variables can be replaced as above\.
-.
-.SS "Built\-in config file"
-\fBpath/to/npm/itself/npmrc\fR
-.
-.P
-This is an unchangeable "builtin" configuration file that npm keeps
-consistent across updates\.  Set fields in here using the \fB\|\./configure\fR
-script that comes with npm\.  This is primarily for distribution
-maintainers to override default configs in a standard and consistent
-manner\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-npm help npm
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man5/package.json.5	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,825 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "PACKAGE\.JSON" "5" "November 2013" "" ""
-.
-.SH "NAME"
-\fBpackage.json\fR \-\- Specifics of npm\'s package\.json handling
-.
-.SH "DESCRIPTION"
-This document is all you need to know about what\'s required in your package\.json
-file\.  It must be actual JSON, not just a JavaScript object literal\.
-.
-.P
-A lot of the behavior described in this document is affected by the config
-settings described in npm help config\|\.
-.
-.SH "DEFAULT VALUES"
-npm will default some values based on package contents\.
-.
-.IP "\(bu" 4
-\fB"scripts": {"start": "node server\.js"}\fR
-.
-.IP
-If there is a \fBserver\.js\fR file in the root of your package, then npm
-will default the \fBstart\fR command to \fBnode server\.js\fR\|\.
-.
-.IP "\(bu" 4
-\fB"scripts":{"preinstall": "node\-waf clean || true; node\-waf configure build"}\fR
-.
-.IP
-If there is a \fBwscript\fR file in the root of your package, npm will
-default the \fBpreinstall\fR command to compile using node\-waf\.
-.
-.IP "\(bu" 4
-\fB"scripts":{"preinstall": "node\-gyp rebuild"}\fR
-.
-.IP
-If there is a \fBbinding\.gyp\fR file in the root of your package, npm will
-default the \fBpreinstall\fR command to compile using node\-gyp\.
-.
-.IP "\(bu" 4
-\fB"contributors": [\.\.\.]\fR
-.
-.IP
-If there is an \fBAUTHORS\fR file in the root of your package, npm will
-treat each line as a \fBName <email> (url)\fR format, where email and url
-are optional\.  Lines which start with a \fB#\fR or are blank will be
-ignored\.
-.
-.IP "" 0
-.
-.SH "name"
-The \fImost\fR important things in your package\.json are the name and version fields\.
-Those are actually required, and your package won\'t install without
-them\.  The name and version together form an identifier that is assumed
-to be completely unique\.  Changes to the package should come along with
-changes to the version\.
-.
-.P
-The name is what your thing is called\.  Some tips:
-.
-.IP "\(bu" 4
-Don\'t put "js" or "node" in the name\.  It\'s assumed that it\'s js, since you\'re
-writing a package\.json file, and you can specify the engine using the "engines"
-field\.  (See below\.)
-.
-.IP "\(bu" 4
-The name ends up being part of a URL, an argument on the command line, and a
-folder name\. Any name with non\-url\-safe characters will be rejected\.
-Also, it can\'t start with a dot or an underscore\.
-.
-.IP "\(bu" 4
-The name will probably be passed as an argument to require(), so it should
-be something short, but also reasonably descriptive\.
-.
-.IP "\(bu" 4
-You may want to check the npm registry to see if there\'s something by that name
-already, before you get too attached to it\.  http://registry\.npmjs\.org/
-.
-.IP "" 0
-.
-.SH "version"
-The \fImost\fR important things in your package\.json are the name and version fields\.
-Those are actually required, and your package won\'t install without
-them\.  The name and version together form an identifier that is assumed
-to be completely unique\.  Changes to the package should come along with
-changes to the version\.
-.
-.P
-Version must be parseable by node\-semver \fIhttps://github\.com/isaacs/node\-semver\fR, which is bundled
-with npm as a dependency\.  (\fBnpm install semver\fR to use it yourself\.)
-.
-.P
-More on version numbers and ranges at npm help semver\.
-.
-.SH "description"
-Put a description in it\.  It\'s a string\.  This helps people discover your
-package, as it\'s listed in \fBnpm search\fR\|\.
-.
-.SH "keywords"
-Put keywords in it\.  It\'s an array of strings\.  This helps people
-discover your package as it\'s listed in \fBnpm search\fR\|\.
-.
-.SH "homepage"
-The url to the project homepage\.
-.
-.P
-\fBNOTE\fR: This is \fInot\fR the same as "url"\.  If you put a "url" field,
-then the registry will think it\'s a redirection to your package that has
-been published somewhere else, and spit at you\.
-.
-.P
-Literally\.  Spit\.  I\'m so not kidding\.
-.
-.SH "bugs"
-The url to your project\'s issue tracker and / or the email address to which
-issues should be reported\. These are helpful for people who encounter issues
-with your package\.
-.
-.P
-It should look like this:
-.
-.IP "" 4
-.
-.nf
-{ "url" : "http://github\.com/owner/project/issues"
-, "email" : "project@hostname\.com"
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-You can specify either one or both values\. If you want to provide only a url,
-you can specify the value for "bugs" as a simple string instead of an object\.
-.
-.P
-If a url is provided, it will be used by the \fBnpm bugs\fR command\.
-.
-.SH "license"
-You should specify a license for your package so that people know how they are
-permitted to use it, and any restrictions you\'re placing on it\.
-.
-.P
-The simplest way, assuming you\'re using a common license such as BSD or MIT, is
-to just specify the name of the license you\'re using, like this:
-.
-.IP "" 4
-.
-.nf
-{ "license" : "BSD" }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-If you have more complex licensing terms, or you want to provide more detail
-in your package\.json file, you can use the more verbose plural form, like this:
-.
-.IP "" 4
-.
-.nf
-"licenses" : [
-  { "type" : "MyLicense"
-  , "url" : "http://github\.com/owner/project/path/to/license"
-  }
-]
-.
-.fi
-.
-.IP "" 0
-.
-.P
-It\'s also a good idea to include a license file at the top level in your package\.
-.
-.SH "people fields: author, contributors"
-The "author" is one person\.  "contributors" is an array of people\.  A "person"
-is an object with a "name" field and optionally "url" and "email", like this:
-.
-.IP "" 4
-.
-.nf
-{ "name" : "Barney Rubble"
-, "email" : "b@rubble\.com"
-, "url" : "http://barnyrubble\.tumblr\.com/"
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Or you can shorten that all into a single string, and npm will parse it for you:
-.
-.IP "" 4
-.
-.nf
-"Barney Rubble <b@rubble\.com> (http://barnyrubble\.tumblr\.com/)"
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Both email and url are optional either way\.
-.
-.P
-npm also sets a top\-level "maintainers" field with your npm user info\.
-.
-.SH "files"
-The "files" field is an array of files to include in your project\.  If
-you name a folder in the array, then it will also include the files
-inside that folder\. (Unless they would be ignored by another rule\.)
-.
-.P
-You can also provide a "\.npmignore" file in the root of your package,
-which will keep files from being included, even if they would be picked
-up by the files array\.  The "\.npmignore" file works just like a
-"\.gitignore"\.
-.
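As a sketch, a hypothetical "files" field (the package name and paths here are invented for illustration) might look like this, shown as a JavaScript object literal in npm's comma-first style:

```javascript
// Hypothetical package.json contents, as a JS object literal for
// illustration only (this is not npm's own manifest).  A folder named
// in "files" includes its contents, unless another rule ignores them.
var pkg = { "name" : "example-pkg"
          , "version" : "1.0.0"
          , "files" : [ "lib/"      // whole folder, recursively
                      , "index.js"  // a single file
                      , "README.md"
                      ]
          }
console.log(pkg.files.join(","))
```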
-.SH "main"
-The main field is a module ID that is the primary entry point to your program\.
-That is, if your package is named \fBfoo\fR, and a user installs it, and then does \fBrequire("foo")\fR, then your main module\'s exports object will be returned\.
-.
-.P
-This should be a module ID relative to the root of your package folder\.
-.
-.P
-For most modules, it makes the most sense to have a main script and often not
-much else\.
-.
-.SH "bin"
-A lot of packages have one or more executable files that they\'d like to
-install into the PATH\. npm makes this pretty easy (in fact, it uses this
-feature to install the "npm" executable\.)
-.
-.P
-To use this, supply a \fBbin\fR field in your package\.json which is a map of
-command name to local file name\. On install, npm will symlink that file into \fBprefix/bin\fR for global installs, or \fB\|\./node_modules/\.bin/\fR for local
-installs\.
-.
-.P
-For example, npm has this:
-.
-.IP "" 4
-.
-.nf
-{ "bin" : { "npm" : "\./cli\.js" } }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-So, when you install npm, it\'ll create a symlink from the \fBcli\.js\fR script to \fB/usr/local/bin/npm\fR\|\.
-.
-.P
-If you have a single executable, and its name should be the name
-of the package, then you can just supply it as a string\.  For example:
-.
-.IP "" 4
-.
-.nf
-{ "name": "my\-program"
-, "version": "1\.2\.5"
-, "bin": "\./path/to/program" }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-would be the same as this:
-.
-.IP "" 4
-.
-.nf
-{ "name": "my\-program"
-, "version": "1\.2\.5"
-, "bin" : { "my\-program" : "\./path/to/program" } }
-.
-.fi
-.
-.IP "" 0
-.
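The string/map equivalence above can be sketched as a small normalizing helper (a hypothetical function for illustration, not npm's actual implementation):

```javascript
// Sketch: a string "bin" is treated as a map from the package name to
// that path.  Hypothetical helper, not npm's own code.
function normalizeBin (pkg) {
  if (typeof pkg.bin !== "string") return pkg.bin
  var bin = {}
  bin[pkg.name] = pkg.bin
  return bin
}
var short = { name: "my-program", version: "1.2.5", bin: "./path/to/program" }
console.log(JSON.stringify(normalizeBin(short)))
```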
-.SH "man"
-Specify either a single file or an array of filenames to put in place for the \fBman\fR program to find\.
-.
-.P
-If only a single file is provided, then it\'s installed such that it is the
-result from \fBman <pkgname>\fR, regardless of its actual filename\.  For example:
-.
-.IP "" 4
-.
-.nf
-{ "name" : "foo"
-, "version" : "1\.2\.3"
-, "description" : "A packaged foo fooer for fooing foos"
-, "main" : "foo\.js"
-, "man" : "\./man/doc\.1"
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-would link the \fB\|\./man/doc\.1\fR file in such that it is the target for \fBman foo\fR
-.
-.P
-If the filename doesn\'t start with the package name, then it\'s prefixed\.
-So, this:
-.
-.IP "" 4
-.
-.nf
-{ "name" : "foo"
-, "version" : "1\.2\.3"
-, "description" : "A packaged foo fooer for fooing foos"
-, "main" : "foo\.js"
-, "man" : [ "\./man/foo\.1", "\./man/bar\.1" ]
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-will create files to do \fBman foo\fR and \fBman foo\-bar\fR\|\.
-.
-.P
-Man files must end with a number, and optionally a \fB\|\.gz\fR suffix if they are
-compressed\.  The number dictates which man section the file is installed into\.
-.
-.IP "" 4
-.
-.nf
-{ "name" : "foo"
-, "version" : "1\.2\.3"
-, "description" : "A packaged foo fooer for fooing foos"
-, "main" : "foo\.js"
-, "man" : [ "\./man/foo\.1", "\./man/foo\.2" ]
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-will create entries for \fBman foo\fR and \fBman 2 foo\fR
-.
-.SH "directories"
-The CommonJS Packages \fIhttp://wiki\.commonjs\.org/wiki/Packages/1\.0\fR spec details a
-few ways that you can indicate the structure of your package using a \fBdirectories\fR
-hash\. If you look at npm\'s package\.json \fIhttp://registry\.npmjs\.org/npm/latest\fR,
-you\'ll see that it has directories for doc, lib, and man\.
-.
-.P
-In the future, this information may be used in other creative ways\.
-.
-.SS "directories\.lib"
-Tell people where the bulk of your library is\.  Nothing special is done
-with the lib folder in any way, but it\'s useful meta info\.
-.
-.SS "directories\.bin"
-If you specify a "bin" directory, then all the files in that folder will
-be used as the "bin" hash\.
-.
-.P
-If you have a "bin" hash already, then this has no effect\.
-.
-.SS "directories\.man"
-A folder that is full of man pages\.  Sugar to generate a "man" array by
-walking the folder\.
-.
-.SS "directories\.doc"
-Put markdown files in here\.  Eventually, these will be displayed nicely,
-maybe, someday\.
-.
-.SS "directories\.example"
-Put example scripts in here\.  Someday, it might be exposed in some clever way\.
-.
-.SH "repository"
-Specify the place where your code lives\. This is helpful for people who
-want to contribute\.  If the git repo is on github, then the \fBnpm docs\fR
-command will be able to find you\.
-.
-.P
-Do it like this:
-.
-.IP "" 4
-.
-.nf
-"repository" :
-  { "type" : "git"
-  , "url" : "http://github\.com/isaacs/npm\.git"
-  }
-"repository" :
-  { "type" : "svn"
-  , "url" : "http://v8\.googlecode\.com/svn/trunk/"
-  }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The URL should be a publicly available (perhaps read\-only) url that can be handed
-directly to a VCS program without any modification\.  It should not be a url to an
-html project page that you put in your browser\.  It\'s for computers\.
-.
-.SH "scripts"
-The "scripts" member is an object hash of script commands that are run
-at various times in the lifecycle of your package\.  The key is the lifecycle
-event, and the value is the command to run at that point\.
-.
-.P
-See \fBnpm\-scripts\fR(7) to find out more about writing package scripts\.
-.
-.SH "config"
-A "config" hash can be used to set configuration
-parameters used in package scripts that persist across upgrades\.  For
-instance, if a package had the following:
-.
-.IP "" 4
-.
-.nf
-{ "name" : "foo"
-, "config" : { "port" : "8080" } }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-and then had a "start" command that then referenced the \fBnpm_package_config_port\fR environment variable, then the user could
-override that by doing \fBnpm config set foo:port 8001\fR\|\.
-.
-.P
-See \fBnpm\-config\fR(7) and \fBnpm\-scripts\fR(7) for more on package
-configs\.
-.
-.SH "dependencies"
-Dependencies are specified with a simple hash of package name to
-version range\. The version range is a string which has one or more
-space\-separated descriptors\.  Dependencies can also be identified with
-a tarball or git URL\.
-.
-.P
-\fBPlease do not put test harnesses or transpilers in your "dependencies" hash\.\fR  See \fBdevDependencies\fR, below\.
-.
-.P
-See \fBsemver\fR(7) for more details about specifying version ranges\.
-.
-.IP "\(bu" 4
-\fBversion\fR Must match \fBversion\fR exactly
-.
-.IP "\(bu" 4
-\fB>version\fR Must be greater than \fBversion\fR
-.
-.IP "\(bu" 4
-\fB>=version\fR etc
-.
-.IP "\(bu" 4
-\fB<version\fR
-.
-.IP "\(bu" 4
-\fB<=version\fR
-.
-.IP "\(bu" 4
-\fB~version\fR "Approximately equivalent to version"  See \fBsemver\fR(7)
-.
-.IP "\(bu" 4
-\fB1\.2\.x\fR 1\.2\.0, 1\.2\.1, etc\., but not 1\.3\.0
-.
-.IP "\(bu" 4
-\fBhttp://\.\.\.\fR See \'URLs as Dependencies\' below
-.
-.IP "\(bu" 4
-\fB*\fR Matches any version
-.
-.IP "\(bu" 4
-\fB""\fR (just an empty string) Same as \fB*\fR
-.
-.IP "\(bu" 4
-\fBversion1 \- version2\fR Same as \fB>=version1 <=version2\fR\|\.
-.
-.IP "\(bu" 4
-\fBrange1 || range2\fR Passes if either range1 or range2 are satisfied\.
-.
-.IP "\(bu" 4
-\fBgit\.\.\.\fR See \'Git URLs as Dependencies\' below
-.
-.IP "\(bu" 4
-\fBuser/repo\fR See \'GitHub URLs\' below
-.
-.IP "" 0
-.
-.P
-For example, these are all valid:
-.
-.IP "" 4
-.
-.nf
-{ "dependencies" :
-  { "foo" : "1\.0\.0 \- 2\.9999\.9999"
-  , "bar" : ">=1\.0\.2 <2\.1\.2"
-  , "baz" : ">1\.0\.2 <=2\.3\.4"
-  , "boo" : "2\.0\.1"
-  , "qux" : "<1\.0\.0 || >=2\.3\.1 <2\.4\.5 || >=2\.5\.2 <3\.0\.0"
-  , "asd" : "http://asdf\.com/asdf\.tar\.gz"
-  , "til" : "~1\.2"
-  , "elf" : "~1\.2\.3"
-  , "two" : "2\.x"
-  , "thr" : "3\.3\.x"
-  }
-}
-.
-.fi
-.
-.IP "" 0
-.
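The hyphen-range descriptor above can be illustrated with a simplified numeric comparison (a sketch only; npm itself resolves ranges with the semver module, and this is not that implementation):

```javascript
// Sketch: why "1.0.0 - 2.9999.9999" behaves like
// ">=1.0.0 <=2.9999.9999".  Simplified: assumes plain x.y.z versions.
function cmp (a, b) {
  var pa = a.split('.').map(Number)
    , pb = b.split('.').map(Number)
  for (var i = 0; i < 3; i ++) {
    if (pa[i] !== pb[i]) return pa[i] - pb[i]
  }
  return 0
}
function inHyphenRange (v, lo, hi) {
  return cmp(v, lo) >= 0 && cmp(v, hi) <= 0
}
console.log(inHyphenRange("2.4.1", "1.0.0", "2.9999.9999")) // true
console.log(inHyphenRange("3.0.0", "1.0.0", "2.9999.9999")) // false
```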
-.SS "URLs as Dependencies"
-You may specify a tarball URL in place of a version range\.
-.
-.P
-This tarball will be downloaded and installed locally to your package at
-install time\.
-.
-.SS "Git URLs as Dependencies"
-Git urls can be of the form:
-.
-.IP "" 4
-.
-.nf
-git://github\.com/user/project\.git#commit\-ish
-git+ssh://user@hostname:project\.git#commit\-ish
-git+ssh://user@hostname/project\.git#commit\-ish
-git+http://user@hostname/project/blah\.git#commit\-ish
-git+https://user@hostname/project/blah\.git#commit\-ish
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The \fBcommit\-ish\fR can be any tag, sha, or branch which can be supplied as
-an argument to \fBgit checkout\fR\|\.  The default is \fBmaster\fR\|\.
-.
-.SH "GitHub URLs"
-As of version 1\.1\.65, you can refer to GitHub urls as just "foo": "user/foo\-project"\. For example:
-.
-.P
-.IP "" 4
-.
-.nf
-{
-  "name": "foo",
-  "version": "0\.0\.0",
-  "dependencies": {
-    "express": "visionmedia/express"
-  }
-}
-.
-.fi
-.
-.IP "" 0
-.
-.SH "devDependencies"
-If someone is planning on downloading and using your module in their
-program, then they probably don\'t want or need to download and build
-the external test or documentation framework that you use\.
-.
-.P
-In this case, it\'s best to list these additional items in a \fBdevDependencies\fR hash\.
-.
-.P
-These things will be installed when doing \fBnpm link\fR or \fBnpm install\fR
-from the root of a package, and can be managed like any other npm
-configuration param\.  See \fBnpm\-config\fR(7) for more on the topic\.
-.
-.P
-For build steps that are not platform\-specific, such as compiling
-CoffeeScript or other languages to JavaScript, use the \fBprepublish\fR
-script to do this, and make the required package a devDependency\.
-.
-.P
-For example:
-.
-.P
-.IP "" 4
-.
-.nf
-{ "name": "ethopia\-waza",
-  "description": "a delightfully fruity coffee varietal",
-  "version": "1\.2\.3",
-  "devDependencies": {
-    "coffee\-script": "~1\.6\.3"
-  },
-  "scripts": {
-    "prepublish": "coffee \-o lib/ \-c src/waza\.coffee"
-  },
-  "main": "lib/waza\.js"
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The \fBprepublish\fR script will be run before publishing, so that users
-can consume the functionality without requiring them to compile it
-themselves\.  In dev mode (ie, locally running \fBnpm install\fR), it\'ll
-run this script as well, so that you can test it easily\.
-.
-.SH "bundledDependencies"
-Array of package names that will be bundled when publishing the package\.
-.
-.P
-If this is spelled \fB"bundleDependencies"\fR, then that is also honorable\.
-.
-.SH "optionalDependencies"
-If a dependency can be used, but you would like npm to proceed if it
-cannot be found or fails to install, then you may put it in the \fBoptionalDependencies\fR hash\.  This is a map of package name to version
-or url, just like the \fBdependencies\fR hash\.  The difference is that
-failure is tolerated\.
-.
-.P
-It is still your program\'s responsibility to handle the lack of the
-dependency\.  For example, something like this:
-.
-.IP "" 4
-.
-.nf
-try {
-  var foo = require(\'foo\')
-  var fooVersion = require(\'foo/package\.json\')\.version
-} catch (er) {
-  foo = null
-}
-if ( notGoodFooVersion(fooVersion) ) {
-  foo = null
-}
-// \.\. then later in your program \.\.
-if (foo) {
-  foo\.doFooThings()
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Entries in \fBoptionalDependencies\fR will override entries of the same name in \fBdependencies\fR, so it\'s usually best to only put it in one place\.
-.
-.SH "engines"
-You can specify the version of node that your stuff works on:
-.
-.IP "" 4
-.
-.nf
-{ "engines" : { "node" : ">=0\.10\.3 <0\.12" } }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-And, like with dependencies, if you don\'t specify the version (or if you
-specify "*" as the version), then any version of node will do\.
-.
-.P
-If you specify an "engines" field, then npm will require that "node" be
-somewhere on that list\. If "engines" is omitted, then npm will just assume
-that it works on node\.
-.
-.P
-You can also use the "engines" field to specify which versions of npm
-are capable of properly installing your program\.  For example:
-.
-.IP "" 4
-.
-.nf
-{ "engines" : { "npm" : "~1\.0\.20" } }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Note that, unless the user has set the \fBengine\-strict\fR config flag, this
-field is advisory only\.
-.
-.SH "engineStrict"
-If you are sure that your module will \fIdefinitely not\fR run properly on
-versions of Node/npm other than those specified in the \fBengines\fR hash,
-then you can set \fB"engineStrict": true\fR in your package\.json file\.
-This will override the user\'s \fBengine\-strict\fR config setting\.
-.
-.P
-Please do not do this unless you are really very very sure\.  If your
-engines hash is something overly restrictive, you can quite easily and
-inadvertently lock yourself into obscurity and prevent your users from
-updating to new versions of Node\.  Consider this choice carefully\.  If
-people abuse it, it will be removed in a future version of npm\.
-.
-.SH "os"
-You can specify which operating systems your
-module will run on:
-.
-.IP "" 4
-.
-.nf
-"os" : [ "darwin", "linux" ]
-.
-.fi
-.
-.IP "" 0
-.
-.P
-You can also blacklist instead of whitelist operating systems,
-just prepend the blacklisted os with a \'!\':
-.
-.IP "" 4
-.
-.nf
-"os" : [ "!win32" ]
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The host operating system is determined by \fBprocess\.platform\fR
-.
-.P
-It is allowed to both blacklist and whitelist, although there isn\'t any
-good reason to do this\.
-.
-.SH "cpu"
-If your code only runs on certain cpu architectures,
-you can specify which ones\.
-.
-.IP "" 4
-.
-.nf
-"cpu" : [ "x64", "ia32" ]
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Like the \fBos\fR option, you can also blacklist architectures:
-.
-.IP "" 4
-.
-.nf
-"cpu" : [ "!arm", "!mips" ]
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The host architecture is determined by \fBprocess\.arch\fR
-.
-.SH "preferGlobal"
-If your package is primarily a command\-line application that should be
-installed globally, then set this value to \fBtrue\fR to provide a warning
-if it is installed locally\.
-.
-.P
-It doesn\'t actually prevent users from installing it locally, but it
-does help prevent some confusion if it doesn\'t work as expected\.
-.
-.SH "private"
-If you set \fB"private": true\fR in your package\.json, then npm will refuse
-to publish it\.
-.
-.P
-This is a way to prevent accidental publication of private repositories\.
-If you would like to ensure that a given package is only ever published
-to a specific registry (for example, an internal registry),
-then use the \fBpublishConfig\fR hash described below
-to override the \fBregistry\fR config param at publish\-time\.
-.
-.SH "publishConfig"
-This is a set of config values that will be used at publish\-time\.  It\'s
-especially handy if you want to set the tag or registry, so that you can
-ensure that a given package is not tagged with "latest" or published to
-the global public registry by default\.
-.
-.P
-Any config values can be overridden, but of course only "tag" and
-"registry" probably matter for the purposes of publishing\.
-.
-.P
-See \fBnpm\-config\fR(7) for the list of config options that can be
-overridden\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  semver
-.
-.IP "\(bu" 4
-npm help init
-.
-.IP "\(bu" 4
-npm help version
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help help
-.
-.IP "\(bu" 4
-npm help  faq
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "\(bu" 4
-npm help publish
-.
-.IP "\(bu" 4
-npm help rm
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-coding-style.7	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,254 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-CODING\-STYLE" "7" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-coding-style\fR \-\- npm\'s "funny" coding style
-.
-.SH "DESCRIPTION"
-npm\'s coding style is a bit unconventional\.  It is not different for
-difference\'s sake, but rather a carefully crafted style that is
-designed to reduce visual clutter and make bugs more apparent\.
-.
-.P
-If you want to contribute to npm (which is very encouraged), you should
-make your code conform to npm\'s style\.
-.
-.P
-Note: this concerns npm\'s code, not the specific packages at npmjs\.org\.
-.
-.SH "Line Length"
-Keep lines shorter than 80 characters\.  It\'s better for lines to be
-too short than to be too long\.  Break up long lists, objects, and other
-statements onto multiple lines\.
-.
-.SH "Indentation"
-Two\-spaces\.  Tabs are better, but they look like hell in web browsers
-(and on github), and node uses 2 spaces, so that\'s that\.
-.
-.P
-Configure your editor appropriately\.
-.
-.SH "Curly braces"
-Curly braces belong on the same line as the thing that necessitates them\.
-.
-.P
-Bad:
-.
-.IP "" 4
-.
-.nf
-function ()
-{
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Good:
-.
-.IP "" 4
-.
-.nf
-function () {
-.
-.fi
-.
-.IP "" 0
-.
-.P
-If a block needs to wrap to the next line, use a curly brace\.  Don\'t
-use it if it doesn\'t\.
-.
-.P
-Bad:
-.
-.IP "" 4
-.
-.nf
-if (foo) { bar() }
-while (foo)
-  bar()
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Good:
-.
-.IP "" 4
-.
-.nf
-if (foo) bar()
-while (foo) {
-  bar()
-}
-.
-.fi
-.
-.IP "" 0
-.
-.SH "Semicolons"
-Don\'t use them except in four situations:
-.
-.IP "\(bu" 4
-\fBfor (;;)\fR loops\.  They\'re actually required\.
-.
-.IP "\(bu" 4
-null loops like: \fBwhile (something) ;\fR (But you\'d better have a good
-reason for doing that\.)
-.
-.IP "\(bu" 4
-\fBcase "foo": doSomething(); break\fR
-.
-.IP "\(bu" 4
-In front of a leading \fB(\fR or \fB[\fR at the start of the line\.
-This prevents the expression from being interpreted
-as a function call or property access, respectively\.
-.
-.IP "" 0
-.
-.P
-Some examples of good semicolon usage:
-.
-.IP "" 4
-.
-.nf
-;(x || y)\.doSomething()
-;[a, b, c]\.forEach(doSomething)
-for (var i = 0; i < 10; i ++) {
-  switch (state) {
-    case "begin": start(); continue
-    case "end": finish(); break
-    default: throw new Error("unknown state")
-  }
-  end()
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Note that starting lines with \fB\-\fR and \fB+\fR also should be prefixed
-with a semicolon, but this is much less common\.
-.
-.SH "Comma First"
-If there is a list of things separated by commas, and it wraps
-across multiple lines, put the comma at the start of the next
-line, directly below the token that starts the list\.  Put the
-final token in the list on a line by itself\.  For example:
-.
-.IP "" 4
-.
-.nf
-var magicWords = [ "abracadabra"
-                 , "gesundheit"
-                 , "ventrilo"
-                 ]
-  , spells = { "fireball" : function () { setOnFire() }
-             , "water" : function () { putOut() }
-             }
-  , a = 1
-  , b = "abc"
-  , etc
-  , somethingElse
-.
-.fi
-.
-.IP "" 0
-.
-.SH "Whitespace"
-Put a single space in front of ( for anything other than a function call\.
-Also use a single space wherever it makes things more readable\.
-.
-.P
-Don\'t leave trailing whitespace at the end of lines\.  Don\'t indent empty
-lines\.  Don\'t use more spaces than are helpful\.
-.
-.SH "Functions"
-Use named functions\.  They make stack traces a lot easier to read\.
-.
-.SH "Callbacks, Sync/async Style"
-Use the asynchronous/non\-blocking versions of things as much as possible\.
-It might make more sense for npm to use the synchronous fs APIs, but this
-way, the fs and http and child process stuff all uses the same callback\-passing
-methodology\.
-.
-.P
-The callback should always be the last argument in the list\.  Its first
-argument is the Error or null\.
-.
-.P
-Be very careful never to ever ever throw anything\.  It\'s worse than useless\.
-Just send the error message back as the first argument to the callback\.
-.
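The callback convention above can be sketched with a hypothetical helper (invented for illustration; shown synchronous for brevity, though npm code would be async): callback last, Error or null first, errors passed back rather than thrown.

```javascript
// Sketch of error-first callback style: the callback is the last
// argument, its first argument is an Error or null, and failures are
// sent to the callback instead of being thrown.
function safeParse (json, cb) {
  var parsed
  try {
    parsed = JSON.parse(json)
  } catch (er) {
    return cb(er)   // hand the Error back; don't throw past this point
  }
  cb(null, parsed)
}
var result, error
safeParse('{"ok":true}', function (er, data) { result = data })
safeParse('not json', function (er) { error = er })
console.log(result.ok, error instanceof Error)
```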
-.SH "Errors"
-Always create a new Error object with your message\.  Don\'t just return a
-string message to the callback\.  Stack traces are handy\.
-.
-.SH "Logging"
-Logging is done using the npmlog \fIhttps://github\.com/isaacs/npmlog\fR
-utility\.
-.
-.P
-Please clean up logs when they are no longer helpful\.  In particular,
-logging the same object over and over again is not helpful\.  Logs should
-report what\'s happening so that it\'s easier to track down where a fault
-occurs\.
-.
-.P
-Use appropriate log levels\.  See \fBnpm\-config\fR(7) and search for
-"loglevel"\.
-.
-.SH "Case, naming, etc\."
-Use \fBlowerCamelCase\fR for multiword identifiers when they refer to objects,
-functions, methods, members, or anything not specified in this section\.
-.
-.P
-Use \fBUpperCamelCase\fR for class names (things that you\'d pass to "new")\.
-.
-.P
-Use \fBall\-lower\-hyphen\-css\-case\fR for multiword filenames and config keys\.
-.
-.P
-Use named functions\.  They make stack traces easier to follow\.
-.
-.P
-Use \fBCAPS_SNAKE_CASE\fR for constants, things that should never change
-and are rarely used\.
-.
-.P
-Use a single uppercase letter for function names where the function
-would normally be anonymous, but needs to call itself recursively\.  It
-makes it clear that it\'s a "throwaway" function\.
-.
-.SH "null, undefined, false, 0"
-Boolean variables and functions should always be either \fBtrue\fR or \fBfalse\fR\|\.  Don\'t set it to 0 unless it\'s supposed to be a number\.
-.
-.P
-When something is intentionally missing or removed, set it to \fBnull\fR\|\.
-.
-.P
-Don\'t set things to \fBundefined\fR\|\.  Reserve that value to mean "not yet
-set to anything\."
-.
-.P
-Boolean objects are verboten\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  developers
-.
-.IP "\(bu" 4
-npm help  faq
-.
-.IP "\(bu" 4
-npm help npm
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-config.7	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1454 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-CONFIG" "7" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-config\fR \-\- More than you probably want to know about npm configuration
-.
-.SH "DESCRIPTION"
-npm gets its configuration values from 6 sources, in this priority:
-.
-.SS "Command Line Flags"
-Putting \fB\-\-foo bar\fR on the command line sets the \fBfoo\fR configuration
-parameter to \fB"bar"\fR\|\.  A \fB\-\-\fR argument tells the cli parser to stop
-reading flags\.  A \fB\-\-flag\fR parameter that is at the \fIend\fR of the
-command will be given the value of \fBtrue\fR\|\.
-.
-.SS "Environment Variables"
-Any environment variables that start with \fBnpm_config_\fR will be
-interpreted as a configuration parameter\.  For example, putting \fBnpm_config_foo=bar\fR in your environment will set the \fBfoo\fR
-configuration parameter to \fBbar\fR\|\.  Any environment configurations that
-are not given a value will be given the value of \fBtrue\fR\|\.  Config
-values are case\-insensitive, so \fBNPM_CONFIG_FOO=bar\fR will work the
-same\.
-.
-.SS "npmrc Files"
-The three relevant files are:
-.
-.IP "\(bu" 4
-per\-user config file (~/\.npmrc)
-.
-.IP "\(bu" 4
-global config file ($PREFIX/npmrc)
-.
-.IP "\(bu" 4
-npm builtin config file (/path/to/npm/npmrc)
-.
-.IP "" 0
-.
-.P
-See \fBnpmrc\fR(5) for more details\.
-.
-.SS "Default Configs"
-A set of configuration parameters that are internal to npm, and are
-defaults if nothing else is specified\.
-.
-.SH "Shorthands and Other CLI Niceties"
-The following shorthands are parsed on the command\-line:
-.
-.IP "\(bu" 4
-\fB\-v\fR: \fB\-\-version\fR
-.
-.IP "\(bu" 4
-\fB\-h\fR, \fB\-?\fR, \fB\-\-help\fR, \fB\-H\fR: \fB\-\-usage\fR
-.
-.IP "\(bu" 4
-\fB\-s\fR, \fB\-\-silent\fR: \fB\-\-loglevel silent\fR
-.
-.IP "\(bu" 4
-\fB\-q\fR, \fB\-\-quiet\fR: \fB\-\-loglevel warn\fR
-.
-.IP "\(bu" 4
-\fB\-d\fR: \fB\-\-loglevel info\fR
-.
-.IP "\(bu" 4
-\fB\-dd\fR, \fB\-\-verbose\fR: \fB\-\-loglevel verbose\fR
-.
-.IP "\(bu" 4
-\fB\-ddd\fR: \fB\-\-loglevel silly\fR
-.
-.IP "\(bu" 4
-\fB\-g\fR: \fB\-\-global\fR
-.
-.IP "\(bu" 4
-\fB\-l\fR: \fB\-\-long\fR
-.
-.IP "\(bu" 4
-\fB\-m\fR: \fB\-\-message\fR
-.
-.IP "\(bu" 4
-\fB\-p\fR, \fB\-\-porcelain\fR: \fB\-\-parseable\fR
-.
-.IP "\(bu" 4
-\fB\-reg\fR: \fB\-\-registry\fR
-.
-.IP "\(bu" 4
-\fB\-v\fR: \fB\-\-version\fR
-.
-.IP "\(bu" 4
-\fB\-f\fR: \fB\-\-force\fR
-.
-.IP "\(bu" 4
-\fB\-desc\fR: \fB\-\-description\fR
-.
-.IP "\(bu" 4
-\fB\-S\fR: \fB\-\-save\fR
-.
-.IP "\(bu" 4
-\fB\-D\fR: \fB\-\-save\-dev\fR
-.
-.IP "\(bu" 4
-\fB\-O\fR: \fB\-\-save\-optional\fR
-.
-.IP "\(bu" 4
-\fB\-B\fR: \fB\-\-save\-bundle\fR
-.
-.IP "\(bu" 4
-\fB\-y\fR: \fB\-\-yes\fR
-.
-.IP "\(bu" 4
-\fB\-n\fR: \fB\-\-yes false\fR
-.
-.IP "\(bu" 4
-\fBll\fR and \fBla\fR commands: \fBls \-\-long\fR
-.
-.IP "" 0
-.
-.P
-If the specified configuration param resolves unambiguously to a known
-configuration parameter, then it is expanded to that configuration
-parameter\.  For example:
-.
-.IP "" 4
-.
-.nf
-npm ls \-\-par
-# same as:
-npm ls \-\-parseable
-.
-.fi
-.
-.IP "" 0
-.
-.P
-If multiple single\-character shorthands are strung together, and the
-resulting combination is unambiguously not some other configuration
-param, then it is expanded to its various component pieces\.  For
-example:
-.
-.IP "" 4
-.
-.nf
-npm ls \-gpld
-# same as:
-npm ls \-\-global \-\-parseable \-\-long \-\-loglevel info
-.
-.fi
-.
-.IP "" 0
-.
-.SH "Per\-Package Config Settings"
-When running scripts (see \fBnpm\-scripts\fR(7)) the package\.json "config"
-keys are overwritten in the environment if there is a config param of \fB<name>[@<version>]:<key>\fR\|\.  For example, if the package\.json has
-this:
-.
-.IP "" 4
-.
-.nf
-{ "name" : "foo"
-, "config" : { "port" : "8080" }
-, "scripts" : { "start" : "node server\.js" } }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-and the server\.js is this:
-.
-.IP "" 4
-.
-.nf
-http\.createServer(\.\.\.)\.listen(process\.env\.npm_package_config_port)
-.
-.fi
-.
-.IP "" 0
-.
-.P
-then the user could change the behavior by doing:
-.
-.IP "" 4
-.
-.nf
-npm config set foo:port 80
-.
-.fi
-.
-.IP "" 0
-.
-.P
-See \fBpackage\.json\fR(5) for more information\.
-.
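The mechanism above can be sketched in a few lines (the environment variable is set in-process here purely for illustration; when npm runs a package script it sets `npm_package_config_*` for you):

```javascript
// Sketch: a "config" value from package.json surfaces in package
// scripts as an npm_package_config_* environment variable.
process.env.npm_package_config_port = "8080"   // as set from { "config": { "port": "8080" } }
var port = process.env.npm_package_config_port
console.log("would listen on port " + port)
```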
-.SH "Config Settings"
-.
-.SS "always\-auth"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Force npm to always require authentication when accessing the registry,
-even for \fBGET\fR requests\.
-.
-.SS "bin\-links"
-.
-.IP "\(bu" 4
-Default: \fBtrue\fR
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Tells npm to create symlinks (or \fB\|\.cmd\fR shims on Windows) for package
-executables\.
-.
-.P
-Set to false to have it not do this\.  This can be used to work around
-the fact that some file systems don\'t support symlinks, even on
-ostensibly Unix systems\.
-.
-.SS "browser"
-.
-.IP "\(bu" 4
-Default: OS X: \fB"open"\fR, Windows: \fB"start"\fR, Others: \fB"xdg\-open"\fR
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-The browser that is called by the \fBnpm docs\fR command to open websites\.
-.
-.SS "ca"
-.
-.IP "\(bu" 4
-Default: The npm CA certificate
-.
-.IP "\(bu" 4
-Type: String or null
-.
-.IP "" 0
-.
-.P
-The Certificate Authority signing certificate that is trusted for SSL
-connections to the registry\.
-.
-.P
-Set to \fBnull\fR to only allow "known" registrars, or to a specific CA cert
-to trust only that specific signing authority\.
-.
-.P
-See also the \fBstrict\-ssl\fR config\.
-.
-.SS "cache"
-.
-.IP "\(bu" 4
-Default: Windows: \fB%AppData%\\npm\-cache\fR, Posix: \fB~/\.npm\fR
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-The location of npm\'s cache directory\.  See \fBnpm\-cache\fR(1)
-.
-.SS "cache\-lock\-stale"
-.
-.IP "\(bu" 4
-Default: 60000 (1 minute)
-.
-.IP "\(bu" 4
-Type: Number
-.
-.IP "" 0
-.
-.P
-The number of ms before cache folder lockfiles are considered stale\.
-.
-.SS "cache\-lock\-retries"
-.
-.IP "\(bu" 4
-Default: 10
-.
-.IP "\(bu" 4
-Type: Number
-.
-.IP "" 0
-.
-.P
-Number of times to retry to acquire a lock on cache folder lockfiles\.
-.
-.SS "cache\-lock\-wait"
-.
-.IP "\(bu" 4
-Default: 10000 (10 seconds)
-.
-.IP "\(bu" 4
-Type: Number
-.
-.IP "" 0
-.
-.P
-Number of ms to wait for cache lock files to expire\.
-.
-.SS "cache\-max"
-.
-.IP "\(bu" 4
-Default: Infinity
-.
-.IP "\(bu" 4
-Type: Number
-.
-.IP "" 0
-.
-.P
-The maximum time (in seconds) to keep items in the registry cache before
-re\-checking against the registry\.
-.
-.P
-Note that no purging is done unless the \fBnpm cache clean\fR command is
-explicitly used, and that only GET requests use the cache\.
-.
-.SS "cache\-min"
-.
-.IP "\(bu" 4
-Default: 10
-.
-.IP "\(bu" 4
-Type: Number
-.
-.IP "" 0
-.
-.P
-The minimum time (in seconds) to keep items in the registry cache before
-re\-checking against the registry\.
-.
-.P
-Note that no purging is done unless the \fBnpm cache clean\fR command is
-explicitly used, and that only GET requests use the cache\.
-.
-.SS "color"
-.
-.IP "\(bu" 4
-Default: true on Posix, false on Windows
-.
-.IP "\(bu" 4
-Type: Boolean or \fB"always"\fR
-.
-.IP "" 0
-.
-.P
-If false, never shows colors\.  If \fB"always"\fR then always shows colors\.
-If true, then only prints color codes for tty file descriptors\.
-.
-.SS "coverage"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-A flag to tell test harnesses to run with their coverage options enabled,
-if they respond to the \fBnpm_config_coverage\fR environment variable\.
-.
-.SS "depth"
-.
-.IP "\(bu" 4
-Default: Infinity
-.
-.IP "\(bu" 4
-Type: Number
-.
-.IP "" 0
-.
-.P
-The depth to go when recursing directories for \fBnpm ls\fR and \fBnpm cache ls\fR\|\.
-.
-.SS "description"
-.
-.IP "\(bu" 4
-Default: true
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Show the description in \fBnpm search\fR
-.
-.SS "dev"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Install \fBdev\-dependencies\fR along with packages\.
-.
-.P
-Note that \fBdev\-dependencies\fR are also installed if the \fBnpat\fR flag is
-set\.
-.
-.SS "editor"
-.
-.IP "\(bu" 4
-Default: \fBEDITOR\fR environment variable if set, or \fB"vi"\fR on Posix,
-or \fB"notepad"\fR on Windows\.
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-The command to run for \fBnpm edit\fR or \fBnpm config edit\fR\|\.
-.
-.SS "engine\-strict"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-If set to true, then npm will stubbornly refuse to install (or even
-consider installing) any package that claims to not be compatible with
-the current Node\.js version\.
-.
-.SS "force"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Makes various commands more forceful\.
-.
-.IP "\(bu" 4
-lifecycle script failure does not block progress\.
-.
-.IP "\(bu" 4
-publishing clobbers previously published versions\.
-.
-.IP "\(bu" 4
-skips cache when requesting from the registry\.
-.
-.IP "\(bu" 4
-prevents checks against clobbering non\-npm files\.
-.
-.IP "" 0
-.
-.SS "fetch\-retries"
-.
-.IP "\(bu" 4
-Default: 2
-.
-.IP "\(bu" 4
-Type: Number
-.
-.IP "" 0
-.
-.P
-The "retries" config for the \fBretry\fR module to use when fetching
-packages from the registry\.
-.
-.SS "fetch\-retry\-factor"
-.
-.IP "\(bu" 4
-Default: 10
-.
-.IP "\(bu" 4
-Type: Number
-.
-.IP "" 0
-.
-.P
-The "factor" config for the \fBretry\fR module to use when fetching
-packages\.
-.
-.SS "fetch\-retry\-mintimeout"
-.
-.IP "\(bu" 4
-Default: 10000 (10 seconds)
-.
-.IP "\(bu" 4
-Type: Number
-.
-.IP "" 0
-.
-.P
-The "minTimeout" config for the \fBretry\fR module to use when fetching
-packages\.
-.
-.SS "fetch\-retry\-maxtimeout"
-.
-.IP "\(bu" 4
-Default: 60000 (1 minute)
-.
-.IP "\(bu" 4
-Type: Number
-.
-.IP "" 0
-.
-.P
-The "maxTimeout" config for the \fBretry\fR module to use when fetching
-packages\.
-.
-.SS "git"
-.
-.IP "\(bu" 4
-Default: \fB"git"\fR
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-The command to use for git commands\.  If git is installed on the
-computer, but is not in the \fBPATH\fR, then set this to the full path to
-the git binary\.
-.
-.SS "global"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Operates in "global" mode, so that packages are installed into the \fBprefix\fR folder instead of the current working directory\.  See \fBnpm help folders\fR for more on the differences in behavior\.
-.
-.IP "\(bu" 4
-packages are installed into the \fB{prefix}/lib/node_modules\fR folder, instead of the
-current working directory\.
-.
-.IP "\(bu" 4
-bin files are linked to \fB{prefix}/bin\fR
-.
-.IP "\(bu" 4
-man pages are linked to \fB{prefix}/share/man\fR
-.
-.IP "" 0
-.
-.SS "globalconfig"
-.
-.IP "\(bu" 4
-Default: {prefix}/etc/npmrc
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-The config file to read for global config options\.
-.
-.SS "globalignorefile"
-.
-.IP "\(bu" 4
-Default: {prefix}/etc/npmignore
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-The config file to read for global ignore patterns to apply to all users
-and all projects\.
-.
-.P
-If not found, but there is a "gitignore" file in the
-same directory, then that will be used instead\.
-.
-.SS "group"
-.
-.IP "\(bu" 4
-Default: GID of the current process
-.
-.IP "\(bu" 4
-Type: String or Number
-.
-.IP "" 0
-.
-.P
-The group to use when running package scripts in global mode as the root
-user\.
-.
-.SS "https\-proxy"
-.
-.IP "\(bu" 4
-Default: the \fBHTTPS_PROXY\fR or \fBhttps_proxy\fR or \fBHTTP_PROXY\fR or \fBhttp_proxy\fR environment variables\.
-.
-.IP "\(bu" 4
-Type: url
-.
-.IP "" 0
-.
-.P
-A proxy to use for outgoing https requests\.
-.
-.SS "user\-agent"
-.
-.IP "\(bu" 4
-Default: node/{process\.version} {process\.platform} {process\.arch}
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-Sets the User\-Agent request header\.
-.
-.SS "ignore"
-.
-.IP "\(bu" 4
-Default: ""
-.
-.IP "\(bu" 4
-Type: string
-.
-.IP "" 0
-.
-.P
-A white\-space separated list of glob patterns of files to always exclude
-from packages when building tarballs\.
-.
-.SS "init\-module"
-.
-.IP "\(bu" 4
-Default: ~/\.npm\-init\.js
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-A module that will be loaded by the \fBnpm init\fR command\.  See the
-documentation for the init\-package\-json \fIhttps://github\.com/isaacs/init\-package\-json\fR module
-for more information, or \fBnpm help init\fR\.
-.
-.SS "init\.version"
-.
-.IP "\(bu" 4
-Default: "0\.0\.0"
-.
-.IP "\(bu" 4
-Type: semver
-.
-.IP "" 0
-.
-.P
-The value \fBnpm init\fR should use by default for the package version\.
-.
-.SS "init\.author\.name"
-.
-.IP "\(bu" 4
-Default: ""
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-The value \fBnpm init\fR should use by default for the package author\'s name\.
-.
-.SS "init\.author\.email"
-.
-.IP "\(bu" 4
-Default: ""
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-The value \fBnpm init\fR should use by default for the package author\'s email\.
-.
-.SS "init\.author\.url"
-.
-.IP "\(bu" 4
-Default: ""
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-The value \fBnpm init\fR should use by default for the package author\'s homepage\.
-.
-.SS "json"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Whether or not to output JSON data, rather than the normal output\.
-.
-.P
-This feature is currently experimental, and the output data structures
-for many commands are either not implemented in JSON yet, or subject to
-change\.  Only the output from \fBnpm ls \-\-json\fR is currently valid\.
-.
-.SS "link"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-If true, then local installs will link if there is a suitable globally
-installed package\.
-.
-.P
-Note that this means that local installs can cause things to be
-installed into the global space at the same time\.  The link is only done
-if one of the two conditions is met:
-.
-.IP "\(bu" 4
-The package is not already installed globally, or
-.
-.IP "\(bu" 4
-the globally installed version is identical to the version that is
-being installed locally\.
-.
-.IP "" 0
-.
-.SS "loglevel"
-.
-.IP "\(bu" 4
-Default: "http"
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "\(bu" 4
-Values: "silent", "win", "error", "warn", "http", "info", "verbose", "silly"
-.
-.IP "" 0
-.
-.P
-What level of logs to report\.  On failure, \fIall\fR logs are written to \fBnpm\-debug\.log\fR in the current working directory\.
-.
-.P
-Any logs of a higher level than the setting are shown\.
-The default is "http", which shows http, warn, and error output\.
-.
-.SS "logstream"
-.
-.IP "\(bu" 4
-Default: process\.stderr
-.
-.IP "\(bu" 4
-Type: Stream
-.
-.IP "" 0
-.
-.P
-This is the stream that is passed to the npmlog \fIhttps://github\.com/isaacs/npmlog\fR module at run time\.
-.
-.P
-It cannot be set from the command line, but if you are using npm
-programmatically, you may wish to send logs to somewhere other than
-stderr\.
-.
-.P
-If the \fBcolor\fR config is set to true, then this stream will receive
-colored output if it is a TTY\.
-.
-.SS "long"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Show extended information in \fBnpm ls\fR
-.
-.SS "message"
-.
-.IP "\(bu" 4
-Default: "%s"
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-Commit message which is used by \fBnpm version\fR when creating version commit\.
-.
-.P
-Any "%s" in the message will be replaced with the version number\.
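As a sketch of the substitution described above (the printf analogy is an illustration of the behavior, not npm's actual implementation):

```shell
# npm replaces "%s" in the commit message with the new version number.
# The effect is the same as a printf-style substitution:
version="1.2.4"                    # hypothetical new version
printf 'Release %s\n' "$version"   # prints: Release 1.2.4
```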
-.
-.SS "node\-version"
-.
-.IP "\(bu" 4
-Default: process\.version
-.
-.IP "\(bu" 4
-Type: semver or false
-.
-.IP "" 0
-.
-.P
-The node version to use when checking a package\'s "engines" hash\.
-.
-.SS "npat"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Run tests on installation and report results to the \fBnpaturl\fR\|\.
-.
-.SS "npaturl"
-.
-.IP "\(bu" 4
-Default: Not yet implemented
-.
-.IP "\(bu" 4
-Type: url
-.
-.IP "" 0
-.
-.P
-The url to report npat test results\.
-.
-.SS "onload\-script"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-A node module to \fBrequire()\fR when npm loads\.  Useful for programmatic
-usage\.
-.
-.SS "optional"
-.
-.IP "\(bu" 4
-Default: true
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Attempt to install packages in the \fBoptionalDependencies\fR hash\.  Note
-that if these packages fail to install, the overall installation
-process is not aborted\.
-.
-.SS "parseable"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Output parseable results from commands that write to
-standard output\.
-.
-.SS "prefix"
-.
-.IP "\(bu" 4
-Default: see \fBnpm help folders\fR
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-The location to install global items\.  If set on the command line, then
-it forces non\-global commands to run in the specified folder\.
-.
-.SS "production"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Set to true to run in "production" mode\.
-.
-.IP "1" 4
-devDependencies are not installed at the topmost level when running
-local \fBnpm install\fR without any arguments\.
-.
-.IP "2" 4
-\fBNODE_ENV="production"\fR is set for lifecycle scripts\.
-.
-.IP "" 0
-.
-.SS "proprietary\-attribs"
-.
-.IP "\(bu" 4
-Default: true
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Whether or not to include proprietary extended attributes in the
-tarballs created by npm\.
-.
-.P
-Unless you are expecting to unpack package tarballs with something other
-than npm \-\- particularly a very outdated tar implementation \-\- leave
-this as true\.
-.
-.SS "proxy"
-.
-.IP "\(bu" 4
-Default: \fBHTTP_PROXY\fR or \fBhttp_proxy\fR environment variable, or null
-.
-.IP "\(bu" 4
-Type: url
-.
-.IP "" 0
-.
-.P
-A proxy to use for outgoing http requests\.
-.
-.SS "rebuild\-bundle"
-.
-.IP "\(bu" 4
-Default: true
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Rebuild bundled dependencies after installation\.
-.
-.SS "registry"
-.
-.IP "\(bu" 4
-Default: https://registry\.npmjs\.org/
-.
-.IP "\(bu" 4
-Type: url
-.
-.IP "" 0
-.
-.P
-The base URL of the npm package registry\.
-.
-.SS "rollback"
-.
-.IP "\(bu" 4
-Default: true
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Remove failed installs\.
-.
-.SS "save"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Save installed packages to a package\.json file as dependencies\.
-.
-.P
-When used with the \fBnpm rm\fR command, it removes it from the dependencies
-hash\.
-.
-.P
-Only works if there is already a package\.json file present\.
-.
-.SS "save\-bundle"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-If a package would be saved at install time by the use of \fB\-\-save\fR, \fB\-\-save\-dev\fR, or \fB\-\-save\-optional\fR, then also put it in the \fBbundleDependencies\fR list\.
-.
-.P
-When used with the \fBnpm rm\fR command, it removes it from the
-bundledDependencies list\.
-.
-.SS "save\-dev"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Save installed packages to a package\.json file as devDependencies\.
-.
-.P
-When used with the \fBnpm rm\fR command, it removes it from the devDependencies
-hash\.
-.
-.P
-Only works if there is already a package\.json file present\.
-.
-.SS "save\-optional"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Save installed packages to a package\.json file as optionalDependencies\.
-.
-.P
-When used with the \fBnpm rm\fR command, it removes it from the
-optionalDependencies hash\.
-.
-.P
-Only works if there is already a package\.json file present\.
-.
-.SS "searchopts"
-.
-.IP "\(bu" 4
-Default: ""
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-Space\-separated options that are always passed to search\.
-.
-.SS "searchexclude"
-.
-.IP "\(bu" 4
-Default: ""
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-Space\-separated options that limit the results from search\.
-.
-.SS "searchsort"
-.
-.IP "\(bu" 4
-Default: "name"
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "\(bu" 4
-Values: "name", "\-name", "date", "\-date", "description",
-"\-description", "keywords", "\-keywords"
-.
-.IP "" 0
-.
-.P
-Indication of which field to sort search results by\.  Prefix with a \fB\-\fR
-character to indicate reverse sort\.
-.
-.SS "shell"
-.
-.IP "\(bu" 4
-Default: SHELL environment variable, or "bash" on Posix, or "cmd" on
-Windows
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-The shell to run for the \fBnpm explore\fR command\.
-.
-.SS "shrinkwrap"
-.
-.IP "\(bu" 4
-Default: true
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-If set to false, then ignore \fBnpm\-shrinkwrap\.json\fR files when
-installing\.
-.
-.SS "sign\-git\-tag"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-If set to true, then the \fBnpm version\fR command will tag the version
-using \fB\-s\fR to add a signature\.
-.
-.P
-Note that git requires you to have set up GPG keys in your git configs
-for this to work properly\.
-.
-.SS "strict\-ssl"
-.
-.IP "\(bu" 4
-Default: true
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Whether or not to do SSL key validation when making requests to the
-registry via https\.
-.
-.P
-See also the \fBca\fR config\.
-.
-.SS "tag"
-.
-.IP "\(bu" 4
-Default: latest
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-If you ask npm to install a package and don\'t tell it a specific version, then
-it will install the specified tag\.
-.
-.P
-Also the tag that is added to the package@version specified by the \fBnpm
-tag\fR command, if no explicit tag is given\.
-.
-.SS "tmp"
-.
-.IP "\(bu" 4
-Default: TMPDIR environment variable, or "/tmp"
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-Where to store temporary files and folders\.  All temp files are deleted
-on success, but left behind on failure for forensic purposes\.
-.
-.SS "unicode"
-.
-.IP "\(bu" 4
-Default: true
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-When set to true, npm uses unicode characters in the tree output\.  When
-false, it uses ascii characters to draw trees\.
-.
-.SS "unsafe\-perm"
-.
-.IP "\(bu" 4
-Default: false if running as root, true otherwise
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Set to true to suppress the UID/GID switching when running package
-scripts\.  If set explicitly to false, then installing as a non\-root user
-will fail\.
-.
-.SS "usage"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: Boolean
-.
-.IP "" 0
-.
-.P
-Set to show short usage output (like the \-H output)
-instead of complete help when doing \fBnpm help\fR\|\.
-.
-.SS "user"
-.
-.IP "\(bu" 4
-Default: "nobody"
-.
-.IP "\(bu" 4
-Type: String or Number
-.
-.IP "" 0
-.
-.P
-The UID to set to when running package scripts as root\.
-.
-.SS "username"
-.
-.IP "\(bu" 4
-Default: null
-.
-.IP "\(bu" 4
-Type: String
-.
-.IP "" 0
-.
-.P
-The username on the npm registry\.  Set with \fBnpm adduser\fR
-.
-.SS "userconfig"
-.
-.IP "\(bu" 4
-Default: ~/\.npmrc
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-The location of user\-level configuration settings\.
-.
-.SS "userignorefile"
-.
-.IP "\(bu" 4
-Default: ~/\.npmignore
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-The location of a user\-level ignore file to apply to all packages\.
-.
-.P
-If not found, but there is a \.gitignore file in the same directory, then
-that will be used instead\.
-.
-.SS "umask"
-.
-.IP "\(bu" 4
-Default: 022
-.
-.IP "\(bu" 4
-Type: Octal numeric string
-.
-.IP "" 0
-.
-.P
-The "umask" value to use when setting the file creation mode on files
-and folders\.
-.
-.P
-Folders and executables are given a mode which is \fB0777\fR masked against
-this value\.  Other files are given a mode which is \fB0666\fR masked against
-this value\.  Thus, the defaults are \fB0755\fR and \fB0644\fR respectively\.
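The masking arithmetic above can be checked in a POSIX shell (a sketch for illustration; npm does this internally in JavaScript):

```shell
# Directories/executables start from 0777, other files from 0666;
# both are masked against the umask (default 022).
umask_val=022
printf 'dirs: %o  files: %o\n' \
  $(( 0777 & ~0$umask_val )) \
  $(( 0666 & ~0$umask_val ))
# With the default umask of 022 this prints: dirs: 755  files: 644
```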
-.
-.SS "version"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: boolean
-.
-.IP "" 0
-.
-.P
-If true, output the npm version and exit successfully\.
-.
-.P
-Only relevant when specified explicitly on the command line\.
-.
-.SS "versions"
-.
-.IP "\(bu" 4
-Default: false
-.
-.IP "\(bu" 4
-Type: boolean
-.
-.IP "" 0
-.
-.P
-If true, output the npm version as well as node\'s \fBprocess\.versions\fR
-hash, and exit successfully\.
-.
-.P
-Only relevant when specified explicitly on the command line\.
-.
-.SS "viewer"
-.
-.IP "\(bu" 4
-Default: "man" on Posix, "browser" on Windows
-.
-.IP "\(bu" 4
-Type: path
-.
-.IP "" 0
-.
-.P
-The program to use to view help content\.
-.
-.P
-Set to \fB"browser"\fR to view html help content in the default web browser\.
-.
-.SS "yes"
-.
-.IP "\(bu" 4
-Default: null
-.
-.IP "\(bu" 4
-Type: Boolean or null
-.
-.IP "" 0
-.
-.P
-If set to \fBnull\fR, then prompt the user for responses in some
-circumstances\.
-.
-.P
-If set to \fBtrue\fR, then answer "yes" to any prompt\.  If set to \fBfalse\fR
-then answer "no" to any prompt\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help  scripts
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "\(bu" 4
-npm help npm
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-developers.7	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,335 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-DEVELOPERS" "7" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-developers\fR \-\- Developer Guide
-.
-.SH "DESCRIPTION"
-So, you\'ve decided to use npm to develop (and maybe publish/deploy)
-your project\.
-.
-.P
-Fantastic!
-.
-.P
-There are a few things that you need to do above the simple steps
-that your users will do to install your program\.
-.
-.SH "About These Documents"
-These are man pages\.  If you install npm, you should be able to
-then do \fBman npm\-thing\fR to get the documentation on a particular
-topic, or \fBnpm help thing\fR to see the same information\.
-.
-.SH "What is a package?"
-A package is:
-.
-.IP "\(bu" 4
-a) a folder containing a program described by a package\.json file
-.
-.IP "\(bu" 4
-b) a gzipped tarball containing (a)
-.
-.IP "\(bu" 4
-c) a url that resolves to (b)
-.
-.IP "\(bu" 4
-d) a \fB<name>@<version>\fR that is published on the registry with (c)
-.
-.IP "\(bu" 4
-e) a \fB<name>@<tag>\fR that points to (d)
-.
-.IP "\(bu" 4
-f) a \fB<name>\fR that has a "latest" tag satisfying (e)
-.
-.IP "\(bu" 4
-g) a \fBgit\fR url that, when cloned, results in (a)\.
-.
-.IP "" 0
-.
-.P
-Even if you never publish your package, you can still get a lot of
-benefits of using npm if you just want to write a node program (a), and
-perhaps if you also want to be able to easily install it elsewhere
-after packing it up into a tarball (b)\.
-.
-.P
-Git urls can be of the form:
-.
-.IP "" 4
-.
-.nf
-git://github\.com/user/project\.git#commit\-ish
-git+ssh://user@hostname:project\.git#commit\-ish
-git+http://user@hostname/project/blah\.git#commit\-ish
-git+https://user@hostname/project/blah\.git#commit\-ish
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The \fBcommit\-ish\fR can be any tag, sha, or branch which can be supplied as
-an argument to \fBgit checkout\fR\|\.  The default is \fBmaster\fR\|\.
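For example (hypothetical repository URLs; the fragment after "#" is the commit-ish):

```shell
# Install straight from git, pinning a tag, a sha, or a branch:
npm install git://github.com/user/project.git#v1.0.27
npm install git+ssh://git@github.com/user/project.git#some-branch
```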
-.
-.SH "The package\.json File"
-You need to have a \fBpackage\.json\fR file in the root of your project to do
-much of anything with npm\.  That is basically the whole interface\.
-.
-.P
-See \fBnpm help package\.json\fR for details about what goes in that file\.  At the very
-least, you need:
-.
-.IP "\(bu" 4
-name:
-This should be a string that identifies your project\.  Please do not
-use the name to specify that it runs on node, or is in JavaScript\.
-You can use the "engines" field to explicitly state the versions of
-node (or whatever else) that your program requires, and it\'s pretty
-well assumed that it\'s javascript\.
-.
-.IP
-It does not necessarily need to match your github repository name\.
-.
-.IP
-So, \fBnode\-foo\fR and \fBbar\-js\fR are bad names\.  \fBfoo\fR or \fBbar\fR are better\.
-.
-.IP "\(bu" 4
-version:
-A semver\-compatible version\.
-.
-.IP "\(bu" 4
-engines:
-Specify the versions of node (or whatever else) that your program
-runs on\.  The node API changes a lot, and there may be bugs or new
-functionality that you depend on\.  Be explicit\.
-.
-.IP "\(bu" 4
-author:
-Take some credit\.
-.
-.IP "\(bu" 4
-scripts:
-If you have a special compilation or installation script, then you
-should put it in the \fBscripts\fR hash\.  You should definitely have at
-least a basic smoke\-test command as the "scripts\.test" field\.
-See \fBnpm help scripts\fR\.
-.
-.IP "\(bu" 4
-main:
-If you have a single module that serves as the entry point to your
-program (like what the "foo" package gives you at require("foo")),
-then you need to specify that in the "main" field\.
-.
-.IP "\(bu" 4
-directories:
-This is a hash of folders\.  The best ones to include are "lib" and
-"doc", but if you specify a folder full of man pages in "man", then
-they\'ll get installed just like these ones\.
-.
-.IP "" 0
-.
-.P
-You can use \fBnpm init\fR in the root of your package in order to get
-started with a pretty basic package\.json file\.  See \fBnpm help init\fR for
-more info\.
-.
-.SH "Keeping files out of your package"
-Use a \fB\|\.npmignore\fR file to keep stuff out of your package\.  If there\'s
-no \fB\|\.npmignore\fR file, but there \fIis\fR a \fB\|\.gitignore\fR file, then npm will
-ignore the stuff matched by the \fB\|\.gitignore\fR file\.  If you \fIwant\fR to
-include something that is excluded by your \fB\|\.gitignore\fR file, you can
-create an empty \fB\|\.npmignore\fR file to override it\.
-.
-.P
-By default, the following paths and files are ignored, so there\'s no
-need to add them to \fB\|\.npmignore\fR explicitly:
-.
-.IP "\(bu" 4
-\fB\|\.*\.swp\fR
-.
-.IP "\(bu" 4
-\fB\|\._*\fR
-.
-.IP "\(bu" 4
-\fB\|\.DS_Store\fR
-.
-.IP "\(bu" 4
-\fB\|\.git\fR
-.
-.IP "\(bu" 4
-\fB\|\.hg\fR
-.
-.IP "\(bu" 4
-\fB\|\.lock\-wscript\fR
-.
-.IP "\(bu" 4
-\fB\|\.svn\fR
-.
-.IP "\(bu" 4
-\fB\|\.wafpickle\-*\fR
-.
-.IP "\(bu" 4
-\fBCVS\fR
-.
-.IP "\(bu" 4
-\fBnpm\-debug\.log\fR
-.
-.IP "" 0
-.
-.P
-Additionally, everything in \fBnode_modules\fR is ignored, except for
-bundled dependencies\. npm automatically handles this for you, so don\'t
-bother adding \fBnode_modules\fR to \fB\|\.npmignore\fR\|\.
-.
-.P
-The following paths and files are never ignored, so adding them to \fB\|\.npmignore\fR is pointless:
-.
-.IP "\(bu" 4
-\fBpackage\.json\fR
-.
-.IP "\(bu" 4
-\fBREADME\.*\fR
-.
-.IP "" 0
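As an illustration of the rules above, a hypothetical \fB.npmignore\fR might look like this (the paths are made up):

```shell
# Keep build scraps and private notes out of the published tarball.
# (package.json and README.* would be included regardless.)
cat > .npmignore <<'EOF'
*.tmp
test/fixtures/
docs/internal/
EOF
grep -c '' .npmignore   # counts the lines; prints: 3
```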
-.
-.SH "Link Packages"
-\fBnpm link\fR is designed to install a development package and see the
-changes in real time without having to keep re\-installing it\.  (You do
-need to either re\-link or \fBnpm rebuild \-g\fR to update compiled packages,
-of course\.)
-.
-.P
-More info at \fBnpm help link\fR\|\.
-.
-.SH "Before Publishing: Make Sure Your Package Installs and Works"
-\fBThis is important\.\fR
-.
-.P
-If you cannot install it locally, you\'ll have
-problems trying to publish it\.  Or, worse yet, you\'ll be able to
-publish it, but you\'ll be publishing a broken or pointless package\.
-So don\'t do that\.
-.
-.P
-In the root of your package, do this:
-.
-.IP "" 4
-.
-.nf
-npm install \. \-g
-.
-.fi
-.
-.IP "" 0
-.
-.P
-That\'ll show you that it\'s working\.  If you\'d rather just create a symlink
-package that points to your working directory, then do this:
-.
-.IP "" 4
-.
-.nf
-npm link
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Use \fBnpm ls \-g\fR to see if it\'s there\.
-.
-.P
-To test a local install, go into some other folder, and then do:
-.
-.IP "" 4
-.
-.nf
-cd \.\./some\-other\-folder
-npm install \.\./my\-package
-.
-.fi
-.
-.IP "" 0
-.
-.P
-to install it locally into the node_modules folder in that other place\.
-.
-.P
-Then go into the node\-repl, and try using require("my\-thing") to
-bring in your module\'s main module\.
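Instead of a REPL session, a one-liner also works (using the hypothetical package name from the example above, and assuming the install step succeeded):

```shell
# Load the locally installed package and report what it exports.
node -e 'var m = require("my-package"); console.log(typeof m)'
```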
-.
-.SH "Create a User Account"
-Create a user with the adduser command\.  It works like this:
-.
-.IP "" 4
-.
-.nf
-npm adduser
-.
-.fi
-.
-.IP "" 0
-.
-.P
-and then follow the prompts\.
-.
-.P
-This is documented better in \fBnpm help adduser\fR\.
-.
-.SH "Publish your package"
-This part\'s easy\.  In the root of your folder, do this:
-.
-.IP "" 4
-.
-.nf
-npm publish
-.
-.fi
-.
-.IP "" 0
-.
-.P
-You can give publish a url to a tarball, or a filename of a tarball,
-or a path to a folder\.
-.
-.P
-Note that pretty much \fBeverything in that folder will be exposed\fR
-by default\.  So, if you have secret stuff in there, use a \fB\|\.npmignore\fR file to list out the globs to ignore, or publish
-from a fresh checkout\.
-.
-.SH "Brag about it"
-Send emails, write blogs, blab in IRC\.
-.
-.P
-Tell the world how easy it is to install your program!
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  faq
-.
-.IP "\(bu" 4
-npm help npm
-.
-.IP "\(bu" 4
-npm help init
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-npm help  scripts
-.
-.IP "\(bu" 4
-npm help publish
-.
-.IP "\(bu" 4
-npm help adduser
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-disputes.7	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,145 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-DISPUTES" "7" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-disputes\fR \-\- Handling Module Name Disputes
-.
-.SH "SYNOPSIS"
-.
-.IP "1" 4
-Get the author email with \fBnpm owner ls <pkgname>\fR
-.
-.IP "2" 4
-Email the author, CC \fIi@izs\.me\fR\|\.
-.
-.IP "3" 4
-After a few weeks, if there\'s no resolution, we\'ll sort it out\.
-.
-.IP "" 0
-.
-.P
-Don\'t squat on package names\.  Publish code or move out of the way\.
-.
-.SH "DESCRIPTION"
-There sometimes arise cases where a user publishes a module, and then
-later, some other user wants to use that name\.  Here are some common
-ways that happens (each of these is based on actual events\.)
-.
-.IP "1" 4
-Joe writes a JavaScript module \fBfoo\fR, which is not node\-specific\.
-Joe doesn\'t use node at all\.  Bob wants to use \fBfoo\fR in node, so he
-wraps it in an npm module\.  Some time later, Joe starts using node,
-and wants to take over management of his program\.
-.
-.IP "2" 4
-Bob writes an npm module \fBfoo\fR, and publishes it\.  Perhaps much
-later, Joe finds a bug in \fBfoo\fR, and fixes it\.  He sends a pull
-request to Bob, but Bob doesn\'t have the time to deal with it,
-because he has a new job and a new baby and is focused on his new
-erlang project, and kind of not involved with node any more\.  Joe
-would like to publish a new \fBfoo\fR, but can\'t, because the name is
-taken\.
-.
-.IP "3" 4
-Bob writes a 10\-line flow\-control library, and calls it \fBfoo\fR, and
-publishes it to the npm registry\.  Being a simple little thing, it
-never really has to be updated\.  Joe works for Foo Inc, the makers
-of the critically acclaimed and widely\-marketed \fBfoo\fR JavaScript
-toolkit framework\.  They publish it to npm as \fBfoojs\fR, but people are
-routinely confused when \fBnpm install foo\fR is some different thing\.
-.
-.IP "4" 4
-Bob writes a parser for the widely\-known \fBfoo\fR file format, because
-he needs it for work\.  Then, he gets a new job, and never updates the
-prototype\.  Later on, Joe writes a much more complete \fBfoo\fR parser,
-but can\'t publish, because Bob\'s \fBfoo\fR is in the way\.
-.
-.IP "" 0
-.
-.P
-The validity of Joe\'s claim in each situation can be debated\.  However,
-Joe\'s appropriate course of action in each case is the same\.
-.
-.IP "1" 4
-\fBnpm owner ls foo\fR\|\.  This will tell Joe the email address of the
-owner (Bob)\.
-.
-.IP "2" 4
-Joe emails Bob, explaining the situation \fBas respectfully as possible\fR,
-and what he would like to do with the module name\.  He adds
-isaacs \fIi@izs\.me\fR to the CC list of the email\.  Mention in the email
-that Bob can run \fBnpm owner add joe foo\fR to add Joe as an owner of
-the \fBfoo\fR package\.
-.
-.IP "3" 4
-After a reasonable amount of time, if Bob has not responded, or if
-Bob and Joe can\'t come to any sort of resolution, email isaacs \fIi@izs\.me\fR and we\'ll sort it out\.  ("Reasonable" is usually about 4
-weeks, but extra time is allowed around common holidays\.)
-.
-.IP "" 0
-.
-.SH "REASONING"
-In almost every case so far, the parties involved have been able to reach
-an amicable resolution without any major intervention\.  Most people
-really do want to be reasonable, and are probably not even aware that
-they\'re in your way\.
-.
-.P
-Module ecosystems are most vibrant and powerful when they are as
-self\-directed as possible\.  If an admin one day deletes something you
-had worked on, then that is going to make most people quite upset,
-regardless of the justification\.  When humans solve their problems by
-talking to other humans with respect, everyone has the chance to end up
-feeling good about the interaction\.
-.
-.SH "EXCEPTIONS"
-Some things are not allowed, and will be removed without discussion if
-they are brought to the attention of the npm registry admins, including
-but not limited to:
-.
-.IP "1" 4
-Malware (that is, a package designed to exploit or harm the machine on
-which it is installed)\.
-.
-.IP "2" 4
-Violations of copyright or licenses (for example, cloning an
-MIT\-licensed program, and then removing or changing the copyright and
-license statement)\.
-.
-.IP "3" 4
-Illegal content\.
-.
-.IP "4" 4
-"Squatting" on a package name that you \fIplan\fR to use, but aren\'t
-actually using\.  Sorry, I don\'t care how great the name is, or how
-perfect a fit it is for the thing that someday might happen\.  If
-someone wants to use it today, and you\'re just taking up space with
-an empty tarball, you\'re going to be evicted\.
-.
-.IP "5" 4
-Putting empty packages in the registry\.  Packages must have SOME
-functionality\.  It can be silly, but it can\'t be \fInothing\fR\|\.  (See
-also: squatting\.)
-.
-.IP "6" 4
-Doing weird things with the registry, like using it as your own
-personal application database or otherwise putting non\-packagey
-things into it\.
-.
-.IP "" 0
-.
-.P
-If you see bad behavior like this, please report it right away\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help  registry
-.
-.IP "\(bu" 4
-npm help owner
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-faq.7	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,468 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-FAQ" "7" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-faq\fR \-\- Frequently Asked Questions
-.
-.SH "Where can I find these docs in HTML?"
-\fIhttps://npmjs\.org/doc/\fR, or run:
-.
-.IP "" 4
-.
-.nf
-npm config set viewer browser
-.
-.fi
-.
-.IP "" 0
-.
-.P
-to open these documents in your default web browser rather than \fBman\fR\|\.
-.
-.SH "It didn\'t work\."
-That\'s not really a question\.
-.
-.SH "Why didn\'t it work?"
-I don\'t know yet\.
-.
-.P
-Read the error output, and if you can\'t figure out what it means,
-do what it says and post a bug with all the information it asks for\.
-.
-.SH "Where does npm put stuff?"
-See \fBnpm help folders\fR
-.
-.P
-tl;dr:
-.
-.IP "\(bu" 4
-Use the \fBnpm root\fR command to see where modules go, and the \fBnpm bin\fR
-command to see where executables go
-.
-.IP "\(bu" 4
-Global installs are different from local installs\.  If you install
-something with the \fB\-g\fR flag, then its executables go in \fBnpm bin \-g\fR
-and its modules go in \fBnpm root \-g\fR\|\.
-.
-.IP "" 0
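The commands mentioned above look like this in practice (output paths vary by machine):

```shell
npm root        # where local modules go (./node_modules)
npm bin         # where local executables go
npm root -g     # where global modules go
npm bin -g      # where global executables go
```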
-.
-.SH "How do I install something on my computer in a central location?"
-Install it globally by tacking \fB\-g\fR or \fB\-\-global\fR to the command\.  (This
-is especially important for command line utilities that need to add
-their bins to the global system \fBPATH\fR\|\.)
-.
-.SH "I installed something globally, but I can\'t require() it"
-Install it locally\.
-.
-.P
-The global install location is a place for command\-line utilities
-to put their bins in the system \fBPATH\fR\|\.  It\'s not for use with \fBrequire()\fR\|\.
-.
-.P
-If you \fBrequire()\fR a module in your code, then that means it\'s a
-dependency, and a part of your program\.  You need to install it locally
-in your program\.
-.
-.SH "Why can\'t npm just put everything in one place, like other package managers?"
-Not every change is an improvement, but every improvement is a change\.
-This would be like asking git to do network IO for every commit\.  It\'s
-not going to happen, because it\'s a terrible idea that causes more
-problems than it solves\.
-.
-.P
-It is much harder to avoid dependency conflicts without nesting
-dependencies\.  This is fundamental to the way that npm works, and has
-proven to be an extremely successful approach\.  See \fBnpm help folders\fR
-for more details\.
-.
-.P
-If you want a package to be installed in one place, and have all your
-programs reference the same copy of it, then use the \fBnpm link\fR command\.
-That\'s what it\'s for\.  Install it globally, then link it into each
-program that uses it\.
-.
-.SH "Whatever, I really want the old style \'everything global\' style\."
-Write your own package manager, then\.  It\'s not that hard\.
-.
-.P
-npm will not help you do something that is known to be a bad idea\.
-.
-.SH "Should I check my node_modules folder into git?"
-Mikeal Rogers answered this question very well:
-.
-.P
-\fIhttp://www\.mikealrogers\.com/posts/nodemodules\-in\-git\.html\fR
-.
-.P
-tl;dr
-.
-.IP "\(bu" 4
-Check \fBnode_modules\fR into git for things you \fBdeploy\fR, such as
-websites and apps\.
-.
-.IP "\(bu" 4
-Do not check \fBnode_modules\fR into git for libraries and modules
-intended to be reused\.
-.
-.IP "\(bu" 4
-Use npm to manage dependencies in your dev environment, but not in
-your deployment scripts\.
-.
-.IP "" 0
-.
-.SH "Is it &#39;npm&#39; or &#39;NPM&#39; or &#39;Npm&#39;?"
-npm should never be capitalized unless it is being displayed in a
-location that is customarily all\-caps (such as the title of man pages\.)
-.
-.SH "If &#39;npm&#39; is an acronym, why is it never capitalized?"
-Contrary to the belief of many, "npm" is not in fact an abbreviation for
-"Node Package Manager"\.  It is a recursive bacronymic abbreviation for
-"npm is not an acronym"\.  (If it was "ninaa", then it would be an
-acronym, and thus incorrectly named\.)
-.
-.P
-"NPM", however, \fIis\fR an acronym (more precisely, a capitonym) for the
-National Association of Pastoral Musicians\.  You can learn more
-about them at \fIhttp://npm\.org/\fR\|\.
-.
-.P
-In software, "NPM" is a Non\-Parametric Mapping utility written by
-Chris Rorden\.  You can analyze pictures of brains with it\.  Learn more
-about the (capitalized) NPM program at \fIhttp://www\.cabiatl\.com/mricro/npm/\fR\|\.
-.
-.P
-The first seed that eventually grew into this flower was a bash utility
-named "pm", which was a shortened descendant of "pkgmakeinst", a
-bash function that was used to install various different things on different
-platforms, most often using Yahoo\'s \fByinst\fR\|\.  If \fBnpm\fR was ever an
-acronym for anything, it was \fBnode pm\fR or maybe \fBnew pm\fR\|\.
-.
-.P
-So, in all seriousness, the "npm" project is named after its command\-line
-utility, which was organically selected to be easily typed by a right\-handed
-programmer using a US QWERTY keyboard layout, ending with the
-right\-ring\-finger in a position to type the \fB\-\fR key for flags and
-other command\-line arguments\.  That command\-line utility is always
-lower\-case, though it starts most sentences it is a part of\.
-.
-.SH "How do I list installed packages?"
-\fBnpm ls\fR
-.
-.SH "How do I search for packages?"
-\fBnpm search\fR
-.
-.P
-Arguments are greps\.  \fBnpm search jsdom\fR shows jsdom packages\.
-.
-.SH "How do I update npm?"
-.
-.nf
-npm update npm \-g
-.
-.fi
-.
-.P
-You can also update all outdated local packages by doing \fBnpm update\fR without
-any arguments, or global packages by doing \fBnpm update \-g\fR\|\.
-.
-.P
-Occasionally, the version of npm will progress such that the current
-version cannot be properly installed with the version that you have
-installed already\.  (Consider, if there is ever a bug in the \fBupdate\fR
-command\.)
-.
-.P
-In those cases, you can do this:
-.
-.IP "" 4
-.
-.nf
-curl https://npmjs\.org/install\.sh | sh
-.
-.fi
-.
-.IP "" 0
-.
-.SH "What is a package?"
-A package is:
-.
-.IP "\(bu" 4
-a) a folder containing a program described by a package\.json file
-.
-.IP "\(bu" 4
-b) a gzipped tarball containing (a)
-.
-.IP "\(bu" 4
-c) a url that resolves to (b)
-.
-.IP "\(bu" 4
-d) a \fB<name>@<version>\fR that is published on the registry with (c)
-.
-.IP "\(bu" 4
-e) a \fB<name>@<tag>\fR that points to (d)
-.
-.IP "\(bu" 4
-f) a \fB<name>\fR that has a "latest" tag satisfying (e)
-.
-.IP "\(bu" 4
-g) a \fBgit\fR url that, when cloned, results in (a)\.
-.
-.IP "" 0
-.
-.P
-Even if you never publish your package, you can still get a lot of
-benefits of using npm if you just want to write a node program (a), and
-perhaps if you also want to be able to easily install it elsewhere
-after packing it up into a tarball (b)\.
-.
-.P
-Git urls can be of the form:
-.
-.IP "" 4
-.
-.nf
-git://github\.com/user/project\.git#commit\-ish
-git+ssh://user@hostname:project\.git#commit\-ish
-git+http://user@hostname/project/blah\.git#commit\-ish
-git+https://user@hostname/project/blah\.git#commit\-ish
-.
-.fi
-.
-.IP "" 0
-.
-.P
-The \fBcommit\-ish\fR can be any tag, sha, or branch which can be supplied as
-an argument to \fBgit checkout\fR\|\.  The default is \fBmaster\fR\|\.
-.
-.SH "What is a module?"
-A module is anything that can be loaded with \fBrequire()\fR in a Node\.js
-program\.  The following things are all examples of things that can be
-loaded as modules:
-.
-.IP "\(bu" 4
-A folder with a \fBpackage\.json\fR file containing a \fBmain\fR field\.
-.
-.IP "\(bu" 4
-A folder with an \fBindex\.js\fR file in it\.
-.
-.IP "\(bu" 4
-A JavaScript file\.
-.
-.IP "" 0
-.
-.P
-Most npm packages are modules, because they are libraries that you
-load with \fBrequire\fR\|\.  However, there\'s no requirement that an npm
-package be a module!  Some only contain an executable command\-line
-interface, and don\'t provide a \fBmain\fR field for use in Node programs\.
-.
-.P
-Almost all npm packages (at least, those that are Node programs) \fIcontain\fR many modules within them (because every file they load with \fBrequire()\fR is a module)\.
-.
-.P
-In the context of a Node program, the \fBmodule\fR is also the thing that
-was loaded \fIfrom\fR a file\.  For example, in the following program:
-.
-.IP "" 4
-.
-.nf
-var req = require(\'request\')
-.
-.fi
-.
-.IP "" 0
-.
-.P
-we might say that "The variable \fBreq\fR refers to the \fBrequest\fR module"\.
-.
-.SH "So, why is it the &quot;node_modules&quot; folder, but &quot;package\.json&quot; file?"
-The \fBpackage\.json\fR file defines the package\.  (See "What is a
-package?" above\.)
-.
-.P
-The \fBnode_modules\fR folder is the place Node\.js looks for modules\.
-(See "What is a module?" above\.)
-.
-.P
-For example, if you create a file at \fBnode_modules/foo\.js\fR and then
-had a program that did \fBvar f = require(\'foo\.js\')\fR then it would load
-the module\.  However, \fBfoo\.js\fR is not a "package" in this case,
-because it does not have a package\.json\.
-.
-.P
-Alternatively, if you create a package which does not have an \fBindex\.js\fR or a \fB"main"\fR field in the \fBpackage\.json\fR file, then it is
-not a module\.  Even if it\'s installed in \fBnode_modules\fR, it can\'t be
-an argument to \fBrequire()\fR\|\.
-.
-.SH "Can I configure npm to use a folder other than &quot;node_modules&quot;?"
-No\.  This will never happen\.  This question comes up sometimes,
-because it seems silly from the outside that npm couldn\'t just be
-configured to put stuff somewhere else, and then npm could load them
-from there\.  It\'s an arbitrary spelling choice, right?  What\'s the big
-deal?
-.
-.P
-At the time of this writing, the string \fB\'node_modules\'\fR appears 151
-times in 53 separate files in npm and node core (excluding tests and
-documentation)\.
-.
-.P
-Some of these references are in node\'s built\-in module loader\.  Since
-npm is not involved \fBat all\fR at run\-time, node itself would have to
-be configured to know where you\'ve decided to stick stuff\.  Complexity
-hurdle #1\.  Since the Node module system is locked, this cannot be
-changed, and is enough to kill this request\.  But I\'ll continue, in
-deference to your deity\'s delicate feelings regarding spelling\.
-.
-.P
-Many of the others are in dependencies that npm uses, which are not
-necessarily tightly coupled to npm (in the sense that they do not read
-npm\'s configuration files, etc\.)  Each of these would have to be
-configured to take the name of the \fBnode_modules\fR folder as a
-parameter\.  Complexity hurdle #2\.
-.
-.P
-Furthermore, npm has the ability to "bundle" dependencies by adding
-the dep names to the \fB"bundledDependencies"\fR list in package\.json,
-which causes the folder to be included in the package tarball\.  What
-if the author of a module bundles its dependencies, and they use a
-different spelling for \fBnode_modules\fR?  npm would have to rename the
-folder at publish time, and then be smart enough to unpack it using
-your locally configured name\.  Complexity hurdle #3\.
-.
-.P
-Furthermore, what happens when you \fIchange\fR this name?  Fine, it\'s
-easy enough the first time, just rename the \fBnode_modules\fR folders to \fB\|\./blergyblerp/\fR or whatever name you choose\.  But what about when you
-change it again?  npm doesn\'t currently track any state about past
-configuration settings, so this would be rather difficult to do
-properly\.  It would have to track every previous value for this
-config, and always accept any of them, or else yesterday\'s install may
-be broken tomorrow\.  Complexity hurdle #4\.
-.
-.P
-Never going to happen\.  The folder is named \fBnode_modules\fR\|\.  It is
-written indelibly in the Node Way, handed down from the ancient times
-of Node 0\.3\.
-.
-.SH "How do I install node with npm?"
-You don\'t\.  Try one of these node version managers:
-.
-.P
-Unix:
-.
-.IP "\(bu" 4
-\fIhttp://github\.com/isaacs/nave\fR
-.
-.IP "\(bu" 4
-\fIhttp://github\.com/visionmedia/n\fR
-.
-.IP "\(bu" 4
-\fIhttp://github\.com/creationix/nvm\fR
-.
-.IP "" 0
-.
-.P
-Windows:
-.
-.IP "\(bu" 4
-\fIhttp://github\.com/marcelklehr/nodist\fR
-.
-.IP "\(bu" 4
-\fIhttps://github\.com/hakobera/nvmw\fR
-.
-.IP "" 0
-.
-.SH "How can I use npm for development?"
-See \fBnpm help developers\fR and \fBnpm help package\.json\fR\|\.
-.
-.P
-You\'ll most likely want to \fBnpm link\fR your development folder\.  That\'s
-awesomely handy\.
-.
-.P
-To set up your own private registry, check out \fBnpm help registry\fR\|\.
-.
-.SH "Can I list a url as a dependency?"
-Yes\.  It should be a url to a gzipped tarball containing a single folder
-that has a package\.json in its root, or a git url\.
-(See "what is a package?" above\.)
-.
-.SH "How do I symlink to a dev folder so I don&#39;t have to keep re\-installing?"
-See \fBnpm help link\fR\|\.
-.
-.SH "The package registry website\.  What is that exactly?"
-See \fBnpm help registry\fR\|\.
-.
-.SH "I forgot my password, and can&#39;t publish\.  How do I reset it?"
-Go to \fIhttps://npmjs\.org/forgot\fR\|\.
-.
-.SH "I get ECONNREFUSED a lot\.  What&#39;s up?"
-Either the registry is down, or node\'s DNS isn\'t able to reach out\.
-.
-.P
-To check if the registry is down, open up \fIhttp://registry\.npmjs\.org/\fR
-in a web browser\.  This will also tell you if you are just unable to
-access the internet for some reason\.
-.
-.P
-If the registry IS down, let me know by emailing \fIi@izs\.me\fR or posting
-an issue at \fIhttps://github\.com/isaacs/npm/issues\fR\|\.  We\'ll have
-someone kick it or something\.
-.
-.SH "Why no namespaces?"
-Please see this discussion: \fIhttps://github\.com/isaacs/npm/issues/798\fR
-.
-.P
-tl;dr \- It doesn\'t actually make things better, and can make them worse\.
-.
-.P
-If you want to namespace your own packages, you may: simply use the \fB\-\fR character to separate the names\.  npm is a mostly anarchic system\.
-There is not sufficient need to impose namespace rules on everyone\.
-.
-.SH "Who does npm?"
-\fBnpm view npm author\fR
-.
-.P
-\fBnpm view npm contributors\fR
-.
-.SH "I have a question or request not addressed here\. Where should I put it?"
-Post an issue on the github project:
-.
-.IP "\(bu" 4
-\fIhttps://github\.com/isaacs/npm/issues\fR
-.
-.IP "" 0
-.
-.SH "Why does npm hate me?"
-npm is not capable of hatred\.  It loves everyone, especially you\.
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help npm
-.
-.IP "\(bu" 4
-npm help  developers
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help  folders
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-index.7	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,307 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-INDEX" "7" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-index\fR \-\- Index of all npm documentation
-.
-.SH "README"
-node package manager
-.
-.SH "npm(1)"
-node package manager
-.
-.SH "npm\-adduser(1)"
-Add a registry user account
-.
-.SH "npm\-bin(1)"
-Display npm bin folder
-.
-.SH "npm\-bugs(1)"
-Bugs for a package in a web browser maybe
-.
-.SH "npm\-build(1)"
-Build a package
-.
-.SH "npm\-bundle(1)"
-REMOVED
-.
-.SH "npm\-cache(1)"
-Manipulates packages cache
-.
-.SH "npm\-completion(1)"
-Tab Completion for npm
-.
-.SH "npm\-config(1)"
-Manage the npm configuration files
-.
-.SH "npm\-dedupe(1)"
-Reduce duplication
-.
-.SH "npm\-deprecate(1)"
-Deprecate a version of a package
-.
-.SH "npm\-docs(1)"
-Docs for a package in a web browser maybe
-.
-.SH "npm\-edit(1)"
-Edit an installed package
-.
-.SH "npm\-explore(1)"
-Browse an installed package
-.
-.SH "npm\-help\-search(1)"
-Search npm help documentation
-.
-.SH "npm\-help(1)"
-Get help on npm
-.
-.SH "npm\-init(1)"
-Interactively create a package\.json file
-.
-.SH "npm\-install(1)"
-Install a package
-.
-.SH "npm\-link(1)"
-Symlink a package folder
-.
-.SH "npm\-ls(1)"
-List installed packages
-.
-.SH "npm\-outdated(1)"
-Check for outdated packages
-.
-.SH "npm\-owner(1)"
-Manage package owners
-.
-.SH "npm\-pack(1)"
-Create a tarball from a package
-.
-.SH "npm\-prefix(1)"
-Display prefix
-.
-.SH "npm\-prune(1)"
-Remove extraneous packages
-.
-.SH "npm\-publish(1)"
-Publish a package
-.
-.SH "npm\-rebuild(1)"
-Rebuild a package
-.
-.SH "npm\-restart(1)"
-Start a package
-.
-.SH "npm\-rm(1)"
-Remove a package
-.
-.SH "npm\-root(1)"
-Display npm root
-.
-.SH "npm\-run\-script(1)"
-Run arbitrary package scripts
-.
-.SH "npm\-search(1)"
-Search for packages
-.
-.SH "npm\-shrinkwrap(1)"
-Lock down dependency versions
-.
-.SH "npm\-star(1)"
-Mark your favorite packages
-.
-.SH "npm\-stars(1)"
-View packages marked as favorites
-.
-.SH "npm\-start(1)"
-Start a package
-.
-.SH "npm\-stop(1)"
-Stop a package
-.
-.SH "npm\-submodule(1)"
-Add a package as a git submodule
-.
-.SH "npm\-tag(1)"
-Tag a published version
-.
-.SH "npm\-test(1)"
-Test a package
-.
-.SH "npm\-uninstall(1)"
-Remove a package
-.
-.SH "npm\-unpublish(1)"
-Remove a package from the registry
-.
-.SH "npm\-update(1)"
-Update a package
-.
-.SH "npm\-version(1)"
-Bump a package version
-.
-.SH "npm\-view(1)"
-View registry info
-.
-.SH "npm\-whoami(1)"
-Display npm username
-.
-.SH "repo(1)"
-Open package repository page in the browser
-.
-.SH "npm(3)"
-node package manager
-.
-.SH "npm\-bin(3)"
-Display npm bin folder
-.
-.SH "npm\-bugs(3)"
-Bugs for a package in a web browser maybe
-.
-.SH "npm\-commands(3)"
-npm commands
-.
-.SH "npm\-config(3)"
-Manage the npm configuration files
-.
-.SH "npm\-deprecate(3)"
-Deprecate a version of a package
-.
-.SH "npm\-docs(3)"
-Docs for a package in a web browser maybe
-.
-.SH "npm\-edit(3)"
-Edit an installed package
-.
-.SH "npm\-explore(3)"
-Browse an installed package
-.
-.SH "npm\-help\-search(3)"
-Search the help pages
-.
-.SH "npm\-init(3)"
-Interactively create a package\.json file
-.
-.SH "npm\-install(3)"
-install a package programmatically
-.
-.SH "npm\-link(3)"
-Symlink a package folder
-.
-.SH "npm\-load(3)"
-Load config settings
-.
-.SH "npm\-ls(3)"
-List installed packages
-.
-.SH "npm\-outdated(3)"
-Check for outdated packages
-.
-.SH "npm\-owner(3)"
-Manage package owners
-.
-.SH "npm\-pack(3)"
-Create a tarball from a package
-.
-.SH "npm\-prefix(3)"
-Display prefix
-.
-.SH "npm\-prune(3)"
-Remove extraneous packages
-.
-.SH "npm\-publish(3)"
-Publish a package
-.
-.SH "npm\-rebuild(3)"
-Rebuild a package
-.
-.SH "npm\-restart(3)"
-Start a package
-.
-.SH "npm\-root(3)"
-Display npm root
-.
-.SH "npm\-run\-script(3)"
-Run arbitrary package scripts
-.
-.SH "npm\-search(3)"
-Search for packages
-.
-.SH "npm\-shrinkwrap(3)"
-programmatically generate package shrinkwrap file
-.
-.SH "npm\-start(3)"
-Start a package
-.
-.SH "npm\-stop(3)"
-Stop a package
-.
-.SH "npm\-submodule(3)"
-Add a package as a git submodule
-.
-.SH "npm\-tag(3)"
-Tag a published version
-.
-.SH "npm\-test(3)"
-Test a package
-.
-.SH "npm\-uninstall(3)"
-uninstall a package programmatically
-.
-.SH "npm\-unpublish(3)"
-Remove a package from the registry
-.
-.SH "npm\-update(3)"
-Update a package
-.
-.SH "npm\-version(3)"
-Bump a package version
-.
-.SH "npm\-view(3)"
-View registry info
-.
-.SH "npm\-whoami(3)"
-Display npm username
-.
-.SH "repo(3)"
-Open package repository page in the browser
-.
-.SH "npm\-folders(5)"
-Folder Structures Used by npm
-.
-.SH "npmrc(5)"
-The npm config files
-.
-.SH "package\.json(5)"
-Specifics of npm\'s package\.json handling
-.
-.SH "npm\-coding\-style(7)"
-npm\'s "funny" coding style
-.
-.SH "npm\-config(7)"
-More than you probably want to know about npm configuration
-.
-.SH "npm\-developers(7)"
-Developer Guide
-.
-.SH "npm\-disputes(7)"
-Handling Module Name Disputes
-.
-.SH "npm\-faq(7)"
-Frequently Asked Questions
-.
-.SH "npm\-index(7)"
-Index of all npm documentation
-.
-.SH "npm\-registry(7)"
-The JavaScript Package Registry
-.
-.SH "npm\-scripts(7)"
-How npm handles the "scripts" field
-.
-.SH "removing\-npm(7)"
-Cleaning the Slate
-.
-.SH "semver(7)"
-The semantic versioner for npm
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-registry.7	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,82 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-REGISTRY" "7" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-registry\fR \-\- The JavaScript Package Registry
-.
-.SH "DESCRIPTION"
-To resolve packages by name and version, npm talks to a registry website
-that implements the CommonJS Package Registry specification for reading
-package info\.
-.
-.P
-Additionally, npm\'s package registry implementation supports several
-write APIs as well, to allow for publishing packages and managing user
-account information\.
-.
-.P
-The official public npm registry is at \fIhttp://registry\.npmjs\.org/\fR\|\.  It
-is powered by a CouchDB database at \fIhttp://isaacs\.iriscouch\.com/registry\fR\|\.  The code for the couchapp is
-available at \fIhttp://github\.com/isaacs/npmjs\.org\fR\|\.  npm user accounts
-are CouchDB users, stored in the \fIhttp://isaacs\.iriscouch\.com/_users\fR
-database\.
-.
-.P
-The registry URL is supplied by the \fBregistry\fR config parameter\.  See
-\fBnpm help config\fR and \fBnpm help npmrc\fR for more on managing
-npm\'s configuration\.
-.
-.SH "Can I run my own private registry?"
-Yes!
-.
-.P
-The easiest way is to replicate the couch database, and use the same (or
-similar) design doc to implement the APIs\.
-.
-.P
-If you set up continuous replication from the official CouchDB, and then
-set your internal CouchDB as the registry config, then you\'ll be able
-to read any published packages, in addition to your private ones, and by
-default will only publish internally\.  If you then want to publish a
-package for the whole world to see, you can simply override the \fB\-\-registry\fR config for that command\.
-.
-.SH "I don&#39;t want my package published in the official registry\. It&#39;s private\."
-Set \fB"private": true\fR in your package\.json to prevent it from being
-published at all, or \fB"publishConfig":{"registry":"http://my\-internal\-registry\.local"}\fR
-to force it to be published only to your internal registry\.
-.
-.P
-See \fBnpm help package\.json\fR for more info on what goes in the package\.json file\.
-.
-.SH "Will you replicate from my registry into the public one?"
-No\.  If you want things to be public, then publish them into the public
-registry using npm\.  What little security there is would be for nought
-otherwise\.
-.
-.SH "Do I have to use couchdb to build a registry that npm can talk to?"
-No, but it\'s way easier\.  Basically, yes, you do, or you have to
-effectively implement the entire CouchDB API anyway\.
-.
-.SH "Is there a website or something to see package docs and such?"
-Yes, head over to \fIhttps://npmjs\.org/\fR
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help config
-.
-.IP "\(bu" 4
-npm help  config
-.
-.IP "\(bu" 4
-npm help  npmrc
-.
-.IP "\(bu" 4
-npm help  developers
-.
-.IP "\(bu" 4
-npm help  disputes
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/npm-scripts.7	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,354 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-SCRIPTS" "7" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-scripts\fR \-\- How npm handles the "scripts" field
-.
-.SH "DESCRIPTION"
-npm supports the "scripts" member of the package\.json file, for the
-following scripts:
-.
-.IP "\(bu" 4
-prepublish:
-Run BEFORE the package is published\.  (Also run on local \fBnpm
-install\fR without any arguments\.)
-.
-.IP "\(bu" 4
-publish, postpublish:
-Run AFTER the package is published\.
-.
-.IP "\(bu" 4
-preinstall:
-Run BEFORE the package is installed
-.
-.IP "\(bu" 4
-install, postinstall:
-Run AFTER the package is installed\.
-.
-.IP "\(bu" 4
-preuninstall, uninstall:
-Run BEFORE the package is uninstalled\.
-.
-.IP "\(bu" 4
-postuninstall:
-Run AFTER the package is uninstalled\.
-.
-.IP "\(bu" 4
-preupdate:
-Run BEFORE the package is updated with the update command\.
-.
-.IP "\(bu" 4
-update, postupdate:
-Run AFTER the package is updated with the update command\.
-.
-.IP "\(bu" 4
-pretest, test, posttest:
-Run by the \fBnpm test\fR command\.
-.
-.IP "\(bu" 4
-prestop, stop, poststop:
-Run by the \fBnpm stop\fR command\.
-.
-.IP "\(bu" 4
-prestart, start, poststart:
-Run by the \fBnpm start\fR command\.
-.
-.IP "\(bu" 4
-prerestart, restart, postrestart:
-Run by the \fBnpm restart\fR command\. Note: \fBnpm restart\fR will run the
-stop and start scripts if no \fBrestart\fR script is provided\.
-.
-.IP "" 0
-.
-.P
-Additionally, arbitrary scripts can be run by doing \fBnpm run\-script <stage> <pkg>\fR\|\.
-.
-.SH "NOTE: INSTALL SCRIPTS ARE AN ANTIPATTERN"
-\fBtl;dr\fR Don\'t use \fBinstall\fR\|\.  Use a \fB\|\.gyp\fR file for compilation, and \fBprepublish\fR for anything else\.
-.
-.P
-You should almost never have to explicitly set a \fBpreinstall\fR or \fBinstall\fR script\.  If you are doing this, please consider if there is
-another option\.
-.
-.P
-The only valid use of \fBinstall\fR or \fBpreinstall\fR scripts is for
-compilation which must be done on the target architecture\.  In early
-versions of node, this was often done using the \fBnode\-waf\fR scripts, or
-a standalone \fBMakefile\fR, and early versions of npm required that it be
-explicitly set in package\.json\.  This was not portable, and harder to
-do properly\.
-.
-.P
-In the current version of node, the standard way to do this is using a \fB\|\.gyp\fR file\.  If you have a file with a \fB\|\.gyp\fR extension in the root
-of your package, then npm will run the appropriate \fBnode\-gyp\fR commands
-automatically at install time\.  This is the only officially supported
-method for compiling binary addons, and does not require that you add
-anything to your package\.json file\.
-.
-.P
-If you have to do other things before your package is used, in a way
-that is not dependent on the operating system or architecture of the
-target system, then use a \fBprepublish\fR script instead\.  This includes
-tasks such as:
-.
-.IP "\(bu" 4
-Compile CoffeeScript source code into JavaScript\.
-.
-.IP "\(bu" 4
-Create minified versions of JavaScript source code\.
-.
-.IP "\(bu" 4
-Fetch remote resources that your package will use\.
-.
-.IP "" 0
-.
-.P
-The advantage of doing these things at \fBprepublish\fR time instead of \fBpreinstall\fR or \fBinstall\fR time is that they can be done once, in a
-single place, and thus greatly reduce complexity and variability\.
-Additionally, this means that:
-.
-.IP "\(bu" 4
-You can depend on \fBcoffee\-script\fR as a \fBdevDependency\fR, and thus
-your users don\'t need to have it installed\.
-.
-.IP "\(bu" 4
-You don\'t need to include the minifiers in your package, reducing
-the size for your users\.
-.
-.IP "\(bu" 4
-You don\'t need to rely on your users having \fBcurl\fR or \fBwget\fR or
-other system tools on the target machines\.
-.
-.IP "" 0
-.
-.SH "DEFAULT VALUES"
-npm will default some script values based on package contents\.
-.
-.IP "\(bu" 4
-\fB"start": "node server\.js"\fR:
-.
-.IP
-If there is a \fBserver\.js\fR file in the root of your package, then npm
-will default the \fBstart\fR command to \fBnode server\.js\fR\|\.
-.
-.IP "\(bu" 4
-\fB"preinstall": "node\-waf clean || true; node\-waf configure build"\fR:
-.
-.IP
-If there is a \fBwscript\fR file in the root of your package, npm will
-default the \fBpreinstall\fR command to compile using node\-waf\.
-.
-.IP "" 0
-.
-.SH "USER"
-If npm was invoked with root privileges, then it will change the uid
-to the user account or uid specified by the \fBuser\fR config, which
-defaults to \fBnobody\fR\|\.  Set the \fBunsafe\-perm\fR flag to run scripts with
-root privileges\.
-.
-.SH "ENVIRONMENT"
-Package scripts run in an environment where many pieces of information
-are made available regarding the setup of npm and the current state of
-the process\.
-.
-.SS "path"
-If you depend on modules that define executable scripts, like test
-suites, then those executables will be added to the \fBPATH\fR for
-executing the scripts\.  So, if your package\.json has this:
-.
-.IP "" 4
-.
-.nf
-{ "name" : "foo"
-, "dependencies" : { "bar" : "0\.1\.x" }
-, "scripts": { "start" : "bar \./test" } }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-then you could run \fBnpm start\fR to execute the \fBbar\fR script, which is
-exported into the \fBnode_modules/\.bin\fR directory on \fBnpm install\fR\|\.
-.
-.SS "package\.json vars"
-The package\.json fields are tacked onto the \fBnpm_package_\fR prefix\. So,
-for instance, if you had \fB{"name":"foo", "version":"1\.2\.5"}\fR in your
-package\.json file, then your package scripts would have the \fBnpm_package_name\fR environment variable set to "foo", and the \fBnpm_package_version\fR set to "1\.2\.5"\|\.
-.
-.SS "configuration"
-Configuration parameters are put in the environment with the \fBnpm_config_\fR prefix\. For instance, you can view the effective \fBroot\fR
-config by checking the \fBnpm_config_root\fR environment variable\.
-.
-.SS "Special: package\.json &quot;config&quot; hash"
-The package\.json "config" keys are overwritten in the environment if
-there is a config param of \fB<name>[@<version>]:<key>\fR\|\.  For example,
-if the package\.json has this:
-.
-.IP "" 4
-.
-.nf
-{ "name" : "foo"
-, "config" : { "port" : "8080" }
-, "scripts" : { "start" : "node server\.js" } }
-.
-.fi
-.
-.IP "" 0
-.
-.P
-and the server\.js is this:
-.
-.IP "" 4
-.
-.nf
-http\.createServer(\.\.\.)\.listen(process\.env\.npm_package_config_port)
-.
-.fi
-.
-.IP "" 0
-.
-.P
-then the user could change the behavior by doing:
-.
-.IP "" 4
-.
-.nf
-npm config set foo:port 80
-.
-.fi
-.
-.IP "" 0
-.
-.SS "current lifecycle event"
-Lastly, the \fBnpm_lifecycle_event\fR environment variable is set to
-whichever stage of the cycle is being executed\. So, you could have a
-single script used for different parts of the process which switches
-based on what\'s currently happening\.
-.
-.P
-Objects are flattened following this format, so if you had \fB{"scripts":{"install":"foo\.js"}}\fR in your package\.json, then you\'d
-see this in the script:
-.
-.IP "" 4
-.
-.nf
-process\.env\.npm_package_scripts_install === "foo\.js"
-.
-.fi
-.
-.IP "" 0
-.
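The flattening rule described above can be sketched in a few lines. This is not npm's internal implementation, just an illustration of the same naming scheme; the `flatten` helper is hypothetical.

```javascript
// Illustrative sketch of how package.json fields map to npm_package_*
// environment variable names.  Nested objects join their keys with
// underscores; flatten() is a hypothetical helper, not npm's code.
function flatten(prefix, obj, env) {
  env = env || {}
  Object.keys(obj).forEach(function (k) {
    // non-alphanumeric characters become underscores in env names
    var key = prefix + '_' + k.replace(/[^a-zA-Z0-9]/g, '_')
    if (obj[k] && typeof obj[k] === 'object') {
      flatten(key, obj[k], env)
    } else {
      env[key] = String(obj[k])
    }
  })
  return env
}

var env = flatten('npm_package', {
  name: 'foo',
  version: '1.2.5',
  scripts: { install: 'foo.js' }
})
console.log(env.npm_package_scripts_install) // "foo.js"
```

Running this prints the same value a real install script would see in `process.env.npm_package_scripts_install`.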
-.SH "EXAMPLES"
-For example, if your package\.json contains this:
-.
-.IP "" 4
-.
-.nf
-{ "scripts" :
-  { "install" : "scripts/install\.js"
-  , "postinstall" : "scripts/install\.js"
-  , "uninstall" : "scripts/uninstall\.js"
-  }
-}
-.
-.fi
-.
-.IP "" 0
-.
-.P
-then \fBscripts/install\.js\fR will be called for the install and
-post\-install stages of the lifecycle, and \fBscripts/uninstall\.js\fR
-would be called when the package is uninstalled\.  Since \fBscripts/install\.js\fR is running for two different phases, it would
-be wise in this case to look at the \fBnpm_lifecycle_event\fR environment
-variable\.
-.
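A script shared across several stages can branch on `npm_lifecycle_event` as suggested above. A minimal sketch of such a `scripts/install.js`; the stage names are real lifecycle events, but the actions returned are illustrative placeholders.

```javascript
// Hypothetical scripts/install.js shared between lifecycle stages.
// npm sets npm_lifecycle_event to the stage currently running.
function dispatch(stage) {
  switch (stage) {
    case 'install':
      return 'compiling'
    case 'postinstall':
      return 'verifying'
    case 'uninstall':
      return 'cleaning up'
    default:
      return 'nothing to do for ' + stage
  }
}

console.log(dispatch(process.env.npm_lifecycle_event || 'install'))
```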
-.P
-If you want to run a make command, you can do so\.  This works just
-fine:
-.
-.IP "" 4
-.
-.nf
-{ "scripts" :
-  { "preinstall" : "\./configure"
-  , "install" : "make && make install"
-  , "test" : "make test"
-  }
-}
-.
-.fi
-.
-.IP "" 0
-.
-.SH "EXITING"
-Scripts are run by passing the line as a script argument to \fBsh\fR\|\.
-.
-.P
-If the script exits with a code other than 0, then this will abort the
-process\.
-.
-.P
-Note that these script files don\'t have to be Node\.js or even
-JavaScript programs\. They just have to be some kind of executable
-file\.
-.
-.SH "HOOK SCRIPTS"
-If you want to run a specific script at a specific lifecycle event for
-ALL packages, then you can use a hook script\.
-.
-.P
-Place an executable file at \fBnode_modules/\.hooks/{eventname}\fR, and
-it\'ll get run for all packages when they are going through that point
-in the package lifecycle for any packages installed in that root\.
-.
-.P
-Hook scripts are run exactly the same way as package\.json scripts\.
-That is, they are in a separate child process, with the env described
-above\.
-.
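A minimal sketch of creating such a hook, assuming a POSIX shell and an install root that contains `node_modules`; the logged message is illustrative.

```shell
# Create a postinstall hook that runs for every package installed
# under this root.  Hook scripts get the same npm_* environment
# as package.json scripts.
mkdir -p node_modules/.hooks
cat > node_modules/.hooks/postinstall <<'EOF'
#!/bin/sh
echo "installed: $npm_package_name@$npm_package_version"
EOF
chmod +x node_modules/.hooks/postinstall
```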
-.SH "BEST PRACTICES"
-.
-.IP "\(bu" 4
-Don\'t exit with a non\-zero error code unless you \fIreally\fR mean it\.
-Except for uninstall scripts, this will cause the npm action to
-fail, and potentially be rolled back\.  If the failure is minor or
-only will prevent some optional features, then it\'s better to just
-print a warning and exit successfully\.
-.
-.IP "\(bu" 4
-Try not to use scripts to do what npm can do for you\.  Read through
-\fBnpm help package\.json\fR to see all the things that you can specify and enable
-by simply describing your package appropriately\.  In general, this
-will lead to a more robust and consistent state\.
-.
-.IP "\(bu" 4
-Inspect the env to determine where to put things\.  For instance, if
-the \fBnpm_config_binroot\fR environ is set to \fB/home/user/bin\fR, then
-don\'t try to install executables into \fB/usr/local/bin\fR\|\.  The user
-probably set it up that way for a reason\.
-.
-.IP "\(bu" 4
-Don\'t prefix your script commands with "sudo"\.  If root permissions
-are required for some reason, then it\'ll fail with that error, and
-the user will sudo the npm command in question\.
-.
-.IP "" 0
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-npm help run\-script
-.
-.IP "\(bu" 4
-npm help  package\.json
-.
-.IP "\(bu" 4
-npm help  developers
-.
-.IP "\(bu" 4
-npm help install
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/removing-npm.7	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,107 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "NPM\-REMOVAL" "1" "November 2013" "" ""
-.
-.SH "NAME"
-\fBnpm-removal\fR \-\- Cleaning the Slate
-.
-.SH "SYNOPSIS"
-So sad to see you go\.
-.
-.IP "" 4
-.
-.nf
-sudo npm uninstall npm \-g
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Or, if that fails, get the npm source code, and do:
-.
-.IP "" 4
-.
-.nf
-sudo make uninstall
-.
-.fi
-.
-.IP "" 0
-.
-.SH "More Severe Uninstalling"
-Usually, the above instructions are sufficient\.  That will remove
-npm, but leave behind anything you\'ve installed\.
-.
-.P
-If that doesn\'t work, or if you require more drastic measures,
-continue reading\.
-.
-.P
-Note that this is only necessary for globally\-installed packages\.  Local
-installs are completely contained within a project\'s \fBnode_modules\fR
-folder\.  Delete that folder, and everything is gone (unless a package\'s
-install script is particularly ill\-behaved)\.
-.
-.P
-This assumes that you installed node and npm in the default place\.  If
-you configured node with a different \fB\-\-prefix\fR, or installed npm with a
-different prefix setting, then adjust the paths accordingly, replacing \fB/usr/local\fR with your install prefix\.
-.
-.P
-To remove everything npm\-related manually:
-.
-.IP "" 4
-.
-.nf
-rm \-rf /usr/local/{lib/node{,/\.npm,_modules},bin,share/man}/npm*
-.
-.fi
-.
-.IP "" 0
-.
-.P
-If you installed things \fIwith\fR npm, then your best bet is to uninstall
-them with npm first, and then install them again once you have a
-proper install\.  This can help find any symlinks that are lying
-around:
-.
-.IP "" 4
-.
-.nf
-ls \-laF /usr/local/{lib/node{,/\.npm},bin,share/man} | grep npm
-.
-.fi
-.
-.IP "" 0
-.
-.P
-Prior to version 0\.3, npm used shim files for executables and node
-modules\.  To track those down, you can do the following:
-.
-.IP "" 4
-.
-.nf
-find /usr/local/{lib/node,bin} \-exec grep \-l npm \\{\\} \\; ;
-.
-.fi
-.
-.IP "" 0
-.
-.P
-(This is also in the README file\.)
-.
-.SH "SEE ALSO"
-.
-.IP "\(bu" 4
-README
-.
-.IP "\(bu" 4
-npm help rm
-.
-.IP "\(bu" 4
-npm help prune
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/man/man7/semver.7	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,172 +0,0 @@
-.\" Generated with Ronnjs 0.3.8
-.\" http://github.com/kapouer/ronnjs/
-.
-.TH "SEMVER" "7" "November 2013" "" ""
-.
-.SH "NAME"
-\fBsemver\fR \-\- The semantic versioner for npm
-.
-.SH "Usage"
-.
-.nf
-$ npm install semver
-semver\.valid(\'1\.2\.3\') // \'1\.2\.3\'
-semver\.valid(\'a\.b\.c\') // null
-semver\.clean(\'  =v1\.2\.3   \') // \'1\.2\.3\'
-semver\.satisfies(\'1\.2\.3\', \'1\.x || >=2\.5\.0 || 5\.0\.0 \- 7\.2\.3\') // true
-semver\.gt(\'1\.2\.3\', \'9\.8\.7\') // false
-semver\.lt(\'1\.2\.3\', \'9\.8\.7\') // true
-.
-.fi
-.
-.P
-As a command\-line utility:
-.
-.IP "" 4
-.
-.nf
-$ semver \-h
-Usage: semver <version> [<version> [\.\.\.]] [\-r <range> | \-i <inc> | \-d <dec>]
-Test if version(s) satisfy the supplied range(s), and sort them\.
-Multiple versions or ranges may be supplied, unless increment
-or decrement options are specified\.  In that case, only a single
-version may be used, and it is incremented by the specified level
-Program exits successfully if any valid version satisfies
-all supplied ranges, and prints all satisfying versions\.
-If no versions are valid, or ranges are not satisfied,
-then exits failure\.
-Versions are printed in ascending order, so supplying
-multiple versions to the utility will just sort them\.
-.
-.fi
-.
-.IP "" 0
-.
-.SH "Versions"
-A "version" is described by the v2\.0\.0 specification found at \fIhttp://semver\.org/\fR\|\.
-.
-.P
-A leading \fB"="\fR or \fB"v"\fR character is stripped off and ignored\.
-.
-.SH "Ranges"
-The following range styles are supported:
-.
-.IP "\(bu" 4
-\fB1\.2\.3\fR A specific version\.  When nothing else will do\.  Note that
-build metadata is still ignored, so \fB1\.2\.3+build2012\fR will satisfy
-this range\.
-.
-.IP "\(bu" 4
-\fB>1\.2\.3\fR Greater than a specific version\.
-.
-.IP "\(bu" 4
-\fB<1\.2\.3\fR Less than a specific version\.  If there is no prerelease
-tag on the version range, then no prerelease version will be allowed
-either, even though these are technically "less than"\.
-.
-.IP "\(bu" 4
-\fB>=1\.2\.3\fR Greater than or equal to\.  Note that prerelease versions
-are NOT equal to their "normal" equivalents, so \fB1\.2\.3\-beta\fR will
-not satisfy this range, but \fB2\.3\.0\-beta\fR will\.
-.
-.IP "\(bu" 4
-\fB<=1\.2\.3\fR Less than or equal to\.  In this case, prerelease versions
-ARE allowed, so \fB1\.2\.3\-beta\fR would satisfy\.
-.
-.IP "\(bu" 4
-\fB1\.2\.3 \- 2\.3\.4\fR := \fB>=1\.2\.3 <=2\.3\.4\fR
-.
-.IP "\(bu" 4
-\fB~1\.2\.3\fR := \fB>=1\.2\.3\-0 <1\.3\.0\-0\fR  "Reasonably close to 1\.2\.3"\.  When
-using tilde operators, prerelease versions are supported as well,
-but a prerelease of the next significant digit will NOT be
-satisfactory, so \fB1\.3\.0\-beta\fR will not satisfy \fB~1\.2\.3\fR\|\.
-.
-.IP "\(bu" 4
-\fB~1\.2\fR := \fB>=1\.2\.0\-0 <1\.3\.0\-0\fR "Any version starting with 1\.2"
-.
-.IP "\(bu" 4
-\fB1\.2\.x\fR := \fB>=1\.2\.0\-0 <1\.3\.0\-0\fR "Any version starting with 1\.2"
-.
-.IP "\(bu" 4
-\fB~1\fR := \fB>=1\.0\.0\-0 <2\.0\.0\-0\fR "Any version starting with 1"
-.
-.IP "\(bu" 4
-\fB1\.x\fR := \fB>=1\.0\.0\-0 <2\.0\.0\-0\fR "Any version starting with 1"
-.
-.IP "" 0
-.
-.P
-Ranges can be joined with either a space (which implies "and") or a \fB||\fR (which implies "or")\.
-.
-.SH "Functions"
-All methods and classes take a final \fBloose\fR boolean argument that, if
-true, will be more forgiving about not\-quite\-valid semver strings\.
-The resulting output will always be 100% strict, of course\.
-.
-.P
-Strict\-mode Comparators and Ranges will be strict about the SemVer
-strings that they parse\.
-.
-.IP "\(bu" 4
-valid(v): Return the parsed version, or null if it\'s not valid\.
-.
-.IP "\(bu" 4
-inc(v, release): Return the version incremented by the release type
-(major, minor, patch, or prerelease), or null if it\'s not valid\.
-.
-.IP "" 0
-.
-.SS "Comparison"
-.
-.IP "\(bu" 4
-gt(v1, v2): \fBv1 > v2\fR
-.
-.IP "\(bu" 4
-gte(v1, v2): \fBv1 >= v2\fR
-.
-.IP "\(bu" 4
-lt(v1, v2): \fBv1 < v2\fR
-.
-.IP "\(bu" 4
-lte(v1, v2): \fBv1 <= v2\fR
-.
-.IP "\(bu" 4
-eq(v1, v2): \fBv1 == v2\fR This is true if they\'re logically equivalent,
-even if they\'re not the exact same string\.  You already know how to
-compare strings\.
-.
-.IP "\(bu" 4
-neq(v1, v2): \fBv1 != v2\fR The opposite of eq\.
-.
-.IP "\(bu" 4
-cmp(v1, comparator, v2): Pass in a comparison string, and it\'ll call
-the corresponding function above\.  \fB"==="\fR and \fB"!=="\fR do simple
-string comparison, but are included for completeness\.  Throws if an
-invalid comparison string is provided\.
-.
-.IP "\(bu" 4
-compare(v1, v2): Return 0 if v1 == v2, or 1 if v1 is greater, or \-1 if
-v2 is greater\.  Sorts in ascending order if passed to Array\.sort()\.
-.
-.IP "\(bu" 4
-rcompare(v1, v2): The reverse of compare\.  Sorts an array of versions
-in descending order when passed to Array\.sort()\.
-.
-.IP "" 0
-.
-.SS "Ranges"
-.
-.IP "\(bu" 4
-validRange(range): Return the valid range or null if it\'s not valid
-.
-.IP "\(bu" 4
-satisfies(version, range): Return true if the version satisfies the
-range\.
-.
-.IP "\(bu" 4
-maxSatisfying(versions, range): Return the highest version in the list
-that satisfies the range, or null if none of them do\.
-.
-.IP "" 0
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/abbrev/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-Copyright 2009, 2010, 2011 Isaac Z. Schlueter.
-All rights reserved.
-
-Permission is hereby granted, free of charge, to any person
-obtaining a copy of this software and associated documentation
-files (the "Software"), to deal in the Software without
-restriction, including without limitation the rights to use,
-copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the
-Software is furnished to do so, subject to the following
-conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
-NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
-HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
-WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
-OTHER DEALINGS IN THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/abbrev/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-# abbrev-js
-
-Just like [ruby's Abbrev](http://apidock.com/ruby/Abbrev).
-
-Usage:
-
-    var abbrev = require("abbrev");
-    abbrev("foo", "fool", "folding", "flop");
-    
-    // returns:
-    { fl: 'flop'
-    , flo: 'flop'
-    , flop: 'flop'
-    , fol: 'folding'
-    , fold: 'folding'
-    , foldi: 'folding'
-    , foldin: 'folding'
-    , folding: 'folding'
-    , foo: 'foo'
-    , fool: 'fool'
-    }
-
-This is handy for command-line scripts, or other cases where you want to be able to accept shorthands.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/abbrev/lib/abbrev.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,111 +0,0 @@
-
-module.exports = exports = abbrev.abbrev = abbrev
-
-abbrev.monkeyPatch = monkeyPatch
-
-function monkeyPatch () {
-  Object.defineProperty(Array.prototype, 'abbrev', {
-    value: function () { return abbrev(this) },
-    enumerable: false, configurable: true, writable: true
-  })
-
-  Object.defineProperty(Object.prototype, 'abbrev', {
-    value: function () { return abbrev(Object.keys(this)) },
-    enumerable: false, configurable: true, writable: true
-  })
-}
-
-function abbrev (list) {
-  if (arguments.length !== 1 || !Array.isArray(list)) {
-    list = Array.prototype.slice.call(arguments, 0)
-  }
-  for (var i = 0, l = list.length, args = [] ; i < l ; i ++) {
-    args[i] = typeof list[i] === "string" ? list[i] : String(list[i])
-  }
-
-  // sort them lexicographically, so that they're next to their nearest kin
-  args = args.sort(lexSort)
-
-  // walk through each, seeing how much it has in common with the next and previous
-  var abbrevs = {}
-    , prev = ""
-  for (var i = 0, l = args.length ; i < l ; i ++) {
-    var current = args[i]
-      , next = args[i + 1] || ""
-      , nextMatches = true
-      , prevMatches = true
-    if (current === next) continue
-    for (var j = 0, cl = current.length ; j < cl ; j ++) {
-      var curChar = current.charAt(j)
-      nextMatches = nextMatches && curChar === next.charAt(j)
-      prevMatches = prevMatches && curChar === prev.charAt(j)
-      if (!nextMatches && !prevMatches) {
-        j ++
-        break
-      }
-    }
-    prev = current
-    if (j === cl) {
-      abbrevs[current] = current
-      continue
-    }
-    for (var a = current.substr(0, j) ; j <= cl ; j ++) {
-      abbrevs[a] = current
-      a += current.charAt(j)
-    }
-  }
-  return abbrevs
-}
-
-function lexSort (a, b) {
-  return a === b ? 0 : a > b ? 1 : -1
-}
-
-
-// tests
-if (module === require.main) {
-
-var assert = require("assert")
-var util = require("util")
-
-console.log("running tests")
-function test (list, expect) {
-  var actual = abbrev(list)
-  assert.deepEqual(actual, expect,
-    "abbrev("+util.inspect(list)+") === " + util.inspect(expect) + "\n"+
-    "actual: "+util.inspect(actual))
-  actual = abbrev.apply(exports, list)
-  assert.deepEqual(abbrev.apply(exports, list), expect,
-    "abbrev("+list.map(JSON.stringify).join(",")+") === " + util.inspect(expect) + "\n"+
-    "actual: "+util.inspect(actual))
-}
-
-test([ "ruby", "ruby", "rules", "rules", "rules" ],
-{ rub: 'ruby'
-, ruby: 'ruby'
-, rul: 'rules'
-, rule: 'rules'
-, rules: 'rules'
-})
-test(["fool", "foom", "pool", "pope"],
-{ fool: 'fool'
-, foom: 'foom'
-, poo: 'pool'
-, pool: 'pool'
-, pop: 'pope'
-, pope: 'pope'
-})
-test(["a", "ab", "abc", "abcd", "abcde", "acde"],
-{ a: 'a'
-, ab: 'ab'
-, abc: 'abc'
-, abcd: 'abcd'
-, abcde: 'abcde'
-, ac: 'acde'
-, acd: 'acde'
-, acde: 'acde'
-})
-
-console.log("pass")
-
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/abbrev/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-{
-  "name": "abbrev",
-  "version": "1.0.4",
-  "description": "Like ruby's abbrev module, but in js",
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me"
-  },
-  "main": "./lib/abbrev.js",
-  "scripts": {
-    "test": "node lib/abbrev.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "http://github.com/isaacs/abbrev-js"
-  },
-  "license": {
-    "type": "MIT",
-    "url": "https://github.com/isaacs/abbrev-js/raw/master/LICENSE"
-  },
-  "readme": "# abbrev-js\n\nJust like [ruby's Abbrev](http://apidock.com/ruby/Abbrev).\n\nUsage:\n\n    var abbrev = require(\"abbrev\");\n    abbrev(\"foo\", \"fool\", \"folding\", \"flop\");\n    \n    // returns:\n    { fl: 'flop'\n    , flo: 'flop'\n    , flop: 'flop'\n    , fol: 'folding'\n    , fold: 'folding'\n    , foldi: 'folding'\n    , foldin: 'folding'\n    , folding: 'folding'\n    , foo: 'foo'\n    , fool: 'fool'\n    }\n\nThis is handy for command-line scripts, or other cases where you want to be able to accept shorthands.\n",
-  "readmeFilename": "README.md",
-  "_id": "abbrev@1.0.4",
-  "_from": "abbrev@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-node_modules
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,91 +0,0 @@
-ansi.js
-=========
-### Advanced ANSI formatting tool for Node.js
-
-`ansi.js` is a module for Node.js that provides an easy-to-use API for
-writing ANSI escape codes to `Stream` instances. ANSI escape codes are used to do
-fancy things in a terminal window, like render text in colors, delete characters,
-lines, the entire window, or hide and show the cursor, and lots more!
-
-The code for the example in the screenshot above can be found in the
-`examples/imgcat` directory.
-
-#### Features:
-
- * 256 color support for the terminal!
- * Make a beep sound from your terminal!
- * Works with *any* writable `Stream` instance.
- * Allows you to move the cursor anywhere on the terminal window.
- * Allows you to delete existing contents from the terminal window.
- * Allows you to hide and show the cursor.
- * Converts CSS color codes and RGB values into ANSI escape codes.
- * Low-level; you are in control of when escape codes are used, it's not abstracted.
-
-
-Installation
-------------
-
-Install with `npm`:
-
-``` bash
-$ npm install ansi
-```
-
-
-Example
--------
-
-``` js
-var ansi = require('ansi')
-  , cursor = ansi(process.stdout)
-
-// You can chain your calls forever:
-cursor
-  .red()                 // Set font color to red
-  .bg.grey()             // Set background color to grey
-  .write('Hello World!') // Write 'Hello World!' to stdout
-  .bg.reset()            // Reset the bgcolor before writing the trailing \n,
-                         //      to avoid Terminal glitches
-  .write('\n')           // And a final \n to wrap things up
-
-// Rendering modes are persistent:
-cursor.hex('#660000').bold().underline()
-
-// You can use the regular logging functions, text will be green
-console.log('This is blood red, bold text')
-
-// To reset just the foreground color:
-cursor.fg.reset()
-
-console.log('This will still be bold')
-
-// Clean up after yourself!
-cursor.reset()
-```
-
-
-License
--------
-
-(The MIT License)
-
-Copyright (c) 2012 Nathan Rajlich &lt;nathan@tootallnate.net&gt;
-
-Permission is hereby granted, free of charge, to any person obtaining
-a copy of this software and associated documentation files (the
-'Software'), to deal in the Software without restriction, including
-without limitation the rights to use, copy, modify, merge, publish,
-distribute, sublicense, and/or sell copies of the Software, and to
-permit persons to whom the Software is furnished to do so, subject to
-the following conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
-IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
-CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
-TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
-SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/color-spaces.pl	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,67 +0,0 @@
-#!/usr/bin/perl
-# Author: Todd Larason <jtl@molehill.org>
-# $XFree86: xc/programs/xterm/vttests/256colors2.pl,v 1.1 1999/07/11 08:49:54 dawes Exp $
-
-print "256 color mode\n\n";
-
-# display back ground colors
-
-for ($fgbg = 38; $fgbg <= 48; $fgbg +=10) {
-
-# first the system ones:
-print "System colors:\n";
-for ($color = 0; $color < 8; $color++) {
-    print "\x1b[${fgbg};5;${color}m::";
-}
-print "\x1b[0m\n";
-for ($color = 8; $color < 16; $color++) {
-    print "\x1b[${fgbg};5;${color}m::";
-}
-print "\x1b[0m\n\n";
-
-# now the color cube
-print "Color cube, 6x6x6:\n";
-for ($green = 0; $green < 6; $green++) {
-    for ($red = 0; $red < 6; $red++) {
-	for ($blue = 0; $blue < 6; $blue++) {
-	    $color = 16 + ($red * 36) + ($green * 6) + $blue;
-	    print "\x1b[${fgbg};5;${color}m::";
-	}
-	print "\x1b[0m ";
-    }
-    print "\n";
-}
-
-# now the grayscale ramp
-print "Grayscale ramp:\n";
-for ($color = 232; $color < 256; $color++) {
-    print "\x1b[${fgbg};5;${color}m::";
-}
-print "\x1b[0m\n\n";
-
-}
-
-print "Examples for the 3-byte color mode\n\n";
-
-for ($fgbg = 38; $fgbg <= 48; $fgbg +=10) {
-
-# now the color cube
-print "Color cube\n";
-for ($green = 0; $green < 256; $green+=51) {
-    for ($red = 0; $red < 256; $red+=51) {
-	for ($blue = 0; $blue < 256; $blue+=51) {
-            print "\x1b[${fgbg};2;${red};${green};${blue}m::";
-	}
-	print "\x1b[0m ";
-    }
-    print "\n";
-}
-
-# now the grayscale ramp
-print "Grayscale ramp:\n";
-for ($gray = 8; $gray < 256; $gray+=10) {
-    print "\x1b[${fgbg};2;${gray};${gray};${gray}m::";
-}
-print "\x1b[0m\n\n";
-
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/examples/beep/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
-#!/usr/bin/env node
-
-/**
- * Invokes the terminal "beep" sound once per second on every exact second.
- */
-
-process.title = 'beep'
-
-var cursor = require('../../')(process.stdout)
-
-function beep () {
-  cursor.beep()
-  setTimeout(beep, 1000 - (new Date()).getMilliseconds())
-}
-
-setTimeout(beep, 1000 - (new Date()).getMilliseconds())
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/examples/clear/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,15 +0,0 @@
-#!/usr/bin/env node
-
-/**
- * Like GNU ncurses "clear" command.
- * https://github.com/mscdex/node-ncurses/blob/master/deps/ncurses/progs/clear.c
- */
-
-process.title = 'clear'
-
-function lf () { return '\n' }
-
-require('../../')(process.stdout)
-  .write(Array.apply(null, Array(process.stdout.getWindowSize()[1])).map(lf).join(''))
-  .eraseData(2)
-  .goto(1, 1)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/examples/cursorPosition.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,32 +0,0 @@
-#!/usr/bin/env node
-
-var tty = require('tty')
-var cursor = require('../')(process.stdout)
-
-// listen for the queryPosition report on stdin
-process.stdin.resume()
-raw(true)
-
-process.stdin.once('data', function (b) {
-  var match = /\[(\d+)\;(\d+)R$/.exec(b.toString())
-  if (match) {
-    var xy = match.slice(1, 3).reverse().map(Number)
-    console.error(xy)
-  }
-
-  // cleanup and close stdin
-  raw(false)
-  process.stdin.pause()
-})
-
-
-// send the query position request code to stdout
-cursor.queryPosition()
-
-function raw (mode) {
-  if (process.stdin.setRawMode) {
-    process.stdin.setRawMode(mode)
-  } else {
-    tty.setRawMode(mode)
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/examples/progress/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,87 +0,0 @@
-#!/usr/bin/env node
-
-var assert = require('assert')
-  , ansi = require('../../')
-
-function Progress (stream, width) {
-  this.cursor = ansi(stream)
-  this.delta = this.cursor.newlines
-  this.width = width | 0 || 10
-  this.open = '['
-  this.close = ']'
-  this.complete = '█'
-  this.incomplete = '_'
-
-  // initial render
-  this.progress = 0
-}
-
-Object.defineProperty(Progress.prototype, 'progress', {
-    get: get
-  , set: set
-  , configurable: true
-  , enumerable: true
-})
-
-function get () {
-  return this._progress
-}
-
-function set (v) {
-  this._progress = Math.max(0, Math.min(v, 100))
-
-  var w = this.width - this.complete.length - this.incomplete.length
-    , n = w * (this._progress / 100) | 0
-    , i = w - n
-    , com = c(this.complete, n)
-    , inc = c(this.incomplete, i)
-    , delta = this.cursor.newlines - this.delta
-
-  assert.equal(com.length + inc.length, w)
-
-  if (delta > 0) {
-    this.cursor.up(delta)
-    this.delta = this.cursor.newlines
-  }
-
-  this.cursor
-    .horizontalAbsolute(0)
-    .eraseLine(2)
-    .fg.white()
-    .write(this.open)
-    .fg.grey()
-    .bold()
-    .write(com)
-    .resetBold()
-    .write(inc)
-    .fg.white()
-    .write(this.close)
-    .fg.reset()
-    .write('\n')
-}
-
-function c (char, length) {
-  return Array.apply(null, Array(length)).map(function () {
-    return char
-  }).join('')
-}
-
-
-
-
-// Usage
-var width = parseInt(process.argv[2], 10) || process.stdout.getWindowSize()[0] / 2
-  , p = new Progress(process.stdout, width)
-
-;(function tick () {
-  p.progress += Math.random() * 5
-  p.cursor
-    .eraseLine(2)
-    .write('Progress: ')
-    .bold().write(p.progress.toFixed(2))
-    .write('%')
-    .resetBold()
-    .write('\n')
-  if (p.progress < 100)
-    setTimeout(tick, 100)
-})()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/examples/starwars.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,46 +0,0 @@
-#!/usr/bin/env node
-
-/**
- * A little script to play the ASCII Star Wars, but with a hidden cursor,
- * since over `telnet(1)` the cursor remains visible which is annoying.
- */
-
-process.title = 'starwars'
-
-var net = require('net')
-  , cursor = require('../')(process.stdout)
-  , color = process.argv[2]
-
-// enable "raw mode" so that keystrokes aren't visible
-process.stdin.resume()
-if (process.stdin.setRawMode) {
-  process.stdin.setRawMode(true)
-} else {
-  require('tty').setRawMode(true)
-}
-
-// connect to the ASCII Star Wars server
-var socket = net.connect(23, 'towel.blinkenlights.nl')
-
-socket.on('connect', function () {
-  if (color in cursor.fg) {
-    cursor.fg[color]()
-  }
-  cursor.hide()
-  socket.pipe(process.stdout)
-})
-
-process.stdin.on('data', function (data) {
-  if (data.toString() === '\u0003') {
-    // Ctrl+C; a.k.a SIGINT
-    socket.destroy()
-    process.stdin.pause()
-  }
-})
-
-process.on('exit', function () {
-  cursor
-    .show()
-    .fg.reset()
-    .write('\n')
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/lib/ansi.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,405 +0,0 @@
-
-/**
- * References:
- *
- *   - http://en.wikipedia.org/wiki/ANSI_escape_code
- *   - http://www.termsys.demon.co.uk/vtansi.htm
- *
- */
-
-/**
- * Module dependencies.
- */
-
-var emitNewlineEvents = require('./newlines')
-  , prefix = '\x1b[' // For all escape codes
-  , suffix = 'm'     // Only for color codes
-
-/**
- * The ANSI escape sequences.
- */
-
-var codes = {
-    up: 'A'
-  , down: 'B'
-  , forward: 'C'
-  , back: 'D'
-  , nextLine: 'E'
-  , previousLine: 'F'
-  , horizontalAbsolute: 'G'
-  , eraseData: 'J'
-  , eraseLine: 'K'
-  , scrollUp: 'S'
-  , scrollDown: 'T'
-  , savePosition: 's'
-  , restorePosition: 'u'
-  , queryPosition: '6n'
-  , hide: '?25l'
-  , show: '?25h'
-}
-
-/**
- * Rendering ANSI codes.
- */
-
-var styles = {
-    bold: 1
-  , italic: 3
-  , underline: 4
-  , inverse: 7
-}
-
-/**
- * The negating ANSI code for the rendering modes.
- */
-
-var reset = {
-    bold: 22
-  , italic: 23
-  , underline: 24
-  , inverse: 27
-}
-
-/**
- * The standard, styleable ANSI colors.
- */
-
-var colors = {
-    white: 37
-  , black: 30
-  , blue: 34
-  , cyan: 36
-  , green: 32
-  , magenta: 35
-  , red: 31
-  , yellow: 33
-  , grey: 90
-  , brightBlack: 90
-  , brightRed: 91
-  , brightGreen: 92
-  , brightYellow: 93
-  , brightBlue: 94
-  , brightMagenta: 95
-  , brightCyan: 96
-  , brightWhite: 97
-}
-
-
-/**
- * Creates a Cursor instance based off the given `writable stream` instance.
- */
-
-function ansi (stream, options) {
-  if (stream._ansicursor) {
-    return stream._ansicursor
-  } else {
-    return stream._ansicursor = new Cursor(stream, options)
-  }
-}
-module.exports = exports = ansi
-
-/**
- * The `Cursor` class.
- */
-
-function Cursor (stream, options) {
-  if (!(this instanceof Cursor)) {
-    return new Cursor(stream, options)
-  }
-  if (typeof stream != 'object' || typeof stream.write != 'function') {
-    throw new Error('a valid Stream instance must be passed in')
-  }
-
-  // the stream to use
-  this.stream = stream
-
-  // when 'enabled' is false then all the functions are no-ops except for write()
-  this.enabled = options && options.enabled
-  if (typeof this.enabled === 'undefined') {
-    this.enabled = stream.isTTY
-  }
-  this.enabled = !!this.enabled
-
-  // then `buffering` is true, then `write()` calls are buffered in
-  // memory until `flush()` is invoked
-  this.buffering = !!(options && options.buffering)
-  this._buffer = []
-
-  // controls the foreground and background colors
-  this.fg = this.foreground = new Colorer(this, 0)
-  this.bg = this.background = new Colorer(this, 10)
-
-  // defaults
-  this.Bold = false
-  this.Italic = false
-  this.Underline = false
-  this.Inverse = false
-
-  // keep track of the number of "newlines" that get encountered
-  this.newlines = 0
-  emitNewlineEvents(stream)
-  stream.on('newline', function () {
-    this.newlines++
-  }.bind(this))
-}
-exports.Cursor = Cursor
-
-/**
- * Helper function that calls `write()` on the underlying Stream.
- * Returns `this` instead of the write() return value to keep
- * the chaining going.
- */
-
-Cursor.prototype.write = function (data) {
-  if (this.buffering) {
-    this._buffer.push(arguments)
-  } else {
-    this.stream.write.apply(this.stream, arguments)
-  }
-  return this
-}
-
-/**
- * Buffer `write()` calls into memory.
- *
- * @api public
- */
-
-Cursor.prototype.buffer = function () {
-  this.buffering = true
-  return this
-}
-
-/**
- * Write out the in-memory buffer.
- *
- * @api public
- */
-
-Cursor.prototype.flush = function () {
-  this.buffering = false
-  var str = this._buffer.map(function (args) {
-    if (args.length != 1) throw new Error('unexpected args length! ' + args.length);
-    return args[0];
-  }).join('');
-  this._buffer.splice(0); // empty
-  this.write(str);
-  return this
-}
-
-
-/**
- * The `Colorer` class manages both the background and foreground colors.
- */
-
-function Colorer (cursor, base) {
-  this.current = null
-  this.cursor = cursor
-  this.base = base
-}
-exports.Colorer = Colorer
-
-/**
- * Write an ANSI color code, ensuring that the same code doesn't get rewritten.
- */
-
-Colorer.prototype._setColorCode = function setColorCode (code) {
-  var c = String(code)
-  if (this.current === c) return
-  this.cursor.enabled && this.cursor.write(prefix + c + suffix)
-  this.current = c
-  return this
-}
-
-
-/**
- * Set up the positional ANSI codes.
- */
-
-Object.keys(codes).forEach(function (name) {
-  var code = String(codes[name])
-  Cursor.prototype[name] = function () {
-    var c = code
-    if (arguments.length > 0) {
-      c = toArray(arguments).map(Math.round).join(';') + code
-    }
-    this.enabled && this.write(prefix + c)
-    return this
-  }
-})
-
-/**
- * Set up the functions for the rendering ANSI codes.
- */
-
-Object.keys(styles).forEach(function (style) {
-  var name = style[0].toUpperCase() + style.substring(1)
-    , c = styles[style]
-    , r = reset[style]
-
-  Cursor.prototype[style] = function () {
-    if (this[name]) return
-    this.enabled && this.write(prefix + c + suffix)
-    this[name] = true
-    return this
-  }
-
-  Cursor.prototype['reset' + name] = function () {
-    if (!this[name]) return
-    this.enabled && this.write(prefix + r + suffix)
-    this[name] = false
-    return this
-  }
-})
-
-/**
- * Setup the functions for the standard colors.
- */
-
-Object.keys(colors).forEach(function (color) {
-  var code = colors[color]
-
-  Colorer.prototype[color] = function () {
-    this._setColorCode(this.base + code)
-    return this.cursor
-  }
-
-  Cursor.prototype[color] = function () {
-    return this.foreground[color]()
-  }
-})
-
-/**
- * Makes a beep sound!
- */
-
-Cursor.prototype.beep = function () {
-  this.enabled && this.write('\x07')
-  return this
-}
-
-/**
- * Moves cursor to specific position
- */
-
-Cursor.prototype.goto = function (x, y) {
-  x = x | 0
-  y = y | 0
-  this.enabled && this.write(prefix + y + ';' + x + 'H')
-  return this
-}
-
-/**
- * Resets the color.
- */
-
-Colorer.prototype.reset = function () {
-  this._setColorCode(this.base + 39)
-  return this.cursor
-}
-
-/**
- * Resets all ANSI formatting on the stream.
- */
-
-Cursor.prototype.reset = function () {
-  this.enabled && this.write(prefix + '0' + suffix)
-  this.Bold = false
-  this.Italic = false
-  this.Underline = false
-  this.Inverse = false
-  this.foreground.current = null
-  this.background.current = null
-  return this
-}
-
-/**
- * Sets the foreground color with the given RGB values.
- * The closest match out of the 216 colors is picked.
- */
-
-Colorer.prototype.rgb = function (r, g, b) {
-  var base = this.base + 38
-    , code = rgb(r, g, b)
-  this._setColorCode(base + ';5;' + code)
-  return this.cursor
-}
-
-/**
- * Same as `cursor.fg.rgb(r, g, b)`.
- */
-
-Cursor.prototype.rgb = function (r, g, b) {
-  return this.foreground.rgb(r, g, b)
-}
-
-/**
- * Accepts CSS color codes for use with ANSI escape codes.
- * For example: `#FF000` would be bright red.
- */
-
-Colorer.prototype.hex = function (color) {
-  return this.rgb.apply(this, hex(color))
-}
-
-/**
- * Same as `cursor.fg.hex(color)`.
- */
-
-Cursor.prototype.hex = function (color) {
-  return this.foreground.hex(color)
-}
-
-
-// UTIL FUNCTIONS //
-
-/**
- * Translates a 255 RGB value to a 0-5 ANSI RGV value,
- * then returns the single ANSI color code to use.
- */
-
-function rgb (r, g, b) {
-  var red = r / 255 * 5
-    , green = g / 255 * 5
-    , blue = b / 255 * 5
-  return rgb5(red, green, blue)
-}
-
-/**
- * Turns rgb 0-5 values into a single ANSI color code to use.
- */
-
-function rgb5 (r, g, b) {
-  var red = Math.round(r)
-    , green = Math.round(g)
-    , blue = Math.round(b)
-  return 16 + (red*36) + (green*6) + blue
-}
-
-/**
- * Accepts a hex CSS color code string (# is optional) and
- * translates it into an Array of 3 RGB 0-255 values, which
- * can then be used with rgb().
- */
-
-function hex (color) {
-  var c = color[0] === '#' ? color.substring(1) : color
-    , r = c.substring(0, 2)
-    , g = c.substring(2, 4)
-    , b = c.substring(4, 6)
-  return [parseInt(r, 16), parseInt(g, 16), parseInt(b, 16)]
-}
-
-/**
- * Turns an array-like object into a real array.
- */
-
-function toArray (a) {
-  var i = 0
-    , l = a.length
-    , rtn = []
-  for (; i<l; i++) {
-    rtn.push(a[i])
-  }
-  return rtn
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/lib/newlines.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,71 +0,0 @@
-
-/**
- * Accepts any node Stream instance and hijacks its "write()" function,
- * so that it can count any newlines that get written to the output.
- *
- * When a '\n' byte is encountered, then a "newline" event will be emitted
- * on the stream, with no arguments. It is up to the listeners to determine
- * any necessary deltas required for their use-case.
- *
- * Ex:
- *
- *   var cursor = ansi(process.stdout)
- *     , ln = 0
- *   process.stdout.on('newline', function () {
- *    ln++
- *   })
- */
-
-/**
- * Module dependencies.
- */
-
-var assert = require('assert')
-var NEWLINE = '\n'.charCodeAt(0)
-
-function emitNewlineEvents (stream) {
-  if (stream._emittingNewlines) {
-    // already emitting newline events
-    return
-  }
-
-  var write = stream.write
-
-  stream.write = function (data) {
-    // first write the data
-    var rtn = write.apply(stream, arguments)
-
-    if (stream.listeners('newline').length > 0) {
-      var len = data.length
-        , i = 0
-      // now try to calculate any deltas
-      if (typeof data == 'string') {
-        for (; i<len; i++) {
-          processByte(stream, data.charCodeAt(i))
-        }
-      } else {
-        // buffer
-        for (; i<len; i++) {
-          processByte(stream, data[i])
-        }
-      }
-    }
-
-    return rtn
-  }
-
-  stream._emittingNewlines = true
-}
-module.exports = emitNewlineEvents
-
-
-/**
- * Processes an individual byte being written to a stream
- */
-
-function processByte (stream, b) {
-  assert.equal(typeof b, 'number')
-  if (b === NEWLINE) {
-    stream.emit('newline')
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ansi/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,50 +0,0 @@
-{
-  "name": "ansi",
-  "description": "Advanced ANSI formatting tool for Node.js",
-  "keywords": [
-    "ansi",
-    "formatting",
-    "cursor",
-    "color",
-    "terminal",
-    "rgb",
-    "256",
-    "stream"
-  ],
-  "version": "0.2.1",
-  "author": {
-    "name": "Nathan Rajlich",
-    "email": "nathan@tootallnate.net",
-    "url": "http://tootallnate.net"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/TooTallNate/ansi.js.git"
-  },
-  "main": "./lib/ansi.js",
-  "bin": {
-    "beep": "./examples/beep/index.js",
-    "clear": "./examples/clear/index.js",
-    "starwars": "./examples/starwars.js"
-  },
-  "scripts": {
-    "test": "mocha --reporter spec"
-  },
-  "devDependencies": {
-    "mocha": "*"
-  },
-  "engines": {
-    "node": "*"
-  },
-  "readme": "ansi.js\n=========\n### Advanced ANSI formatting tool for Node.js\n\n`ansi.js` is a module for Node.js that provides an easy-to-use API for\nwriting ANSI escape codes to `Stream` instances. ANSI escape codes are used to do\nfancy things in a terminal window, like render text in colors, delete characters,\nlines, the entire window, or hide and show the cursor, and lots more!\n\nThe code for the example in the screenshot above can be found in the\n`examples/imgcat` directory.\n\n#### Features:\n\n * 256 color support for the terminal!\n * Make a beep sound from your terminal!\n * Works with *any* writable `Stream` instance.\n * Allows you to move the cursor anywhere on the terminal window.\n * Allows you to delete existing contents from the terminal window.\n * Allows you to hide and show the cursor.\n * Converts CSS color codes and RGB values into ANSI escape codes.\n * Low-level; you are in control of when escape codes are used, it's not abstracted.\n\n\nInstallation\n------------\n\nInstall with `npm`:\n\n``` bash\n$ npm install ansi\n```\n\n\nExample\n-------\n\n``` js\nvar ansi = require('ansi')\n  , cursor = ansi(process.stdout)\n\n// You can chain your calls forever:\ncursor\n  .red()                 // Set font color to red\n  .bg.grey()             // Set background color to grey\n  .write('Hello World!') // Write 'Hello World!' 
to stdout\n  .bg.reset()            // Reset the bgcolor before writing the trailing \\n,\n                         //      to avoid Terminal glitches\n  .write('\\n')           // And a final \\n to wrap things up\n\n// Rendering modes are persistent:\ncursor.hex('#660000').bold().underline()\n\n// You can use the regular logging functions, text will be green\nconsole.log('This is blood red, bold text')\n\n// To reset just the foreground color:\ncursor.fg.reset()\n\nconsole.log('This will still be bold')\n\n// Clean up after yourself!\ncursor.reset()\n```\n\n\nLicense\n-------\n\n(The MIT License)\n\nCopyright (c) 2012 Nathan Rajlich &lt;nathan@tootallnate.net&gt;\n\nPermission is hereby granted, free of charge, to any person obtaining\na copy of this software and associated documentation files (the\n'Software'), to deal in the Software without restriction, including\nwithout limitation the rights to use, copy, modify, merge, publish,\ndistribute, sublicense, and/or sell copies of the Software, and to\npermit persons to whom the Software is furnished to do so, subject to\nthe following conditions:\n\nThe above copyright notice and this permission notice shall be\nincluded in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,\nEXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\nMERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.\nIN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY\nCLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,\nTORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\nSOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/TooTallNate/ansi.js/issues"
-  },
-  "_id": "ansi@0.2.1",
-  "dist": {
-    "shasum": "76961682ac06d5ea0729af53295ea8f953a0cb21"
-  },
-  "_from": "ansi@latest",
-  "_resolved": "https://registry.npmjs.org/ansi/-/ansi-0.2.1.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/archy/README.markdown	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,92 +0,0 @@
-archy
-=====
-
-Render nested hierarchies `npm ls` style with unicode pipes.
-
-[![build status](https://secure.travis-ci.org/substack/node-archy.png)](http://travis-ci.org/substack/node-archy)
-
-example
-=======
-
-``` js
-var archy = require('archy');
-var s = archy({
-  label : 'beep',
-  nodes : [
-    'ity',
-    {
-      label : 'boop',
-      nodes : [
-        {
-          label : 'o_O',
-          nodes : [
-            {
-              label : 'oh',
-              nodes : [ 'hello', 'puny' ]
-            },
-            'human'
-          ]
-        },
-        'party\ntime!'
-      ]
-    }
-  ]
-});
-console.log(s);
-```
-
-output
-
-```
-beep
-├── ity
-└─┬ boop
-  ├─┬ o_O
-  │ ├─┬ oh
-  │ │ ├── hello
-  │ │ └── puny
-  │ └── human
-  └── party
-      time!
-```
-
-methods
-=======
-
-var archy = require('archy')
-
-archy(obj, prefix='', opts={})
-------------------------------
-
-Return a string representation of `obj` with unicode pipe characters like how
-`npm ls` looks.
-
-`obj` should be a tree of nested objects with `'label'` and `'nodes'` fields.
-`'label'` is a string of text to display at a node level and `'nodes'` is an
-array of the descendents of the current node.
-
-If a node is a string, that string will be used as the `'label'` and an empty
-array of `'nodes'` will be used.
-
-`prefix` gets prepended to all the lines and is used by the algorithm to
-recursively update.
-
-If `'label'` has newlines they will be indented at the present indentation level
-with the current prefix.
-
-To disable unicode results in favor of all-ansi output set `opts.unicode` to
-`false`.
-
-install
-=======
-
-With [npm](http://npmjs.org) do:
-
-```
-npm install archy
-```
-
-license
-=======
-
-MIT/X11
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/archy/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-module.exports = function archy (obj, prefix, opts) {
-    if (prefix === undefined) prefix = '';
-    if (!opts) opts = {};
-    var chr = function (s) {
-        var chars = {
-            '│' : '|',
-            '└' : '`',
-            '├' : '+',
-            '─' : '-',
-            '┬' : '-'
-        };
-        return opts.unicode === false ? chars[s] : s;
-    };
-    
-    if (typeof obj === 'string') obj = { label : obj };
-    
-    var nodes = obj.nodes || [];
-    var lines = (obj.label || '').split('\n');
-    var splitter = '\n' + prefix + (nodes.length ? chr('│') : ' ') + ' ';
-    
-    return prefix
-        + lines.join(splitter) + '\n'
-        + nodes.map(function (node, ix) {
-            var last = ix === nodes.length - 1;
-            var more = node.nodes && node.nodes.length;
-            var prefix_ = prefix + (last ? ' ' : chr('│')) + ' ';
-            
-            return prefix
-                + (last ? chr('└') : chr('├')) + chr('─')
-                + (more ? chr('┬') : chr('─')) + ' '
-                + archy(node, prefix_, opts).slice(prefix.length + 2)
-            ;
-        }).join('')
-    ;
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/archy/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,52 +0,0 @@
-{
-  "name": "archy",
-  "version": "0.0.2",
-  "description": "render nested hierarchies `npm ls` style with unicode pipes",
-  "main": "index.js",
-  "directories": {
-    "lib": ".",
-    "example": "example",
-    "test": "test"
-  },
-  "devDependencies": {
-    "tap": "~0.2.3"
-  },
-  "scripts": {
-    "test": "tap test"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/substack/node-archy.git"
-  },
-  "keywords": [
-    "hierarchy",
-    "npm ls",
-    "unicode",
-    "pretty",
-    "print"
-  ],
-  "author": {
-    "name": "James Halliday",
-    "email": "mail@substack.net",
-    "url": "http://substack.net"
-  },
-  "license": "MIT/X11",
-  "engine": {
-    "node": ">=0.4"
-  },
-  "_npmUser": {
-    "name": "isaacs",
-    "email": "i@izs.me"
-  },
-  "_id": "archy@0.0.2",
-  "dependencies": {},
-  "optionalDependencies": {},
-  "engines": {
-    "node": "*"
-  },
-  "_engineSupported": true,
-  "_npmVersion": "1.1.13",
-  "_nodeVersion": "v0.7.7-pre",
-  "_defaultsLoaded": true,
-  "_from": "archy@0.0.2"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/LICENCE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-Copyright (c) Isaac Z. Schlueter
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE NETBSD FOUNDATION, INC. AND CONTRIBUTORS
-``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED
-TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE FOUNDATION OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
-INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
-CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
-ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
-POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,14 +0,0 @@
-# block-stream
-
-A stream of blocks.
-
-Write data into it, and it'll output data in buffer blocks the size you
-specify, padding with zeroes if necessary.
-
-```javascript
-var block = new BlockStream(512)
-fs.createReadStream("some-file").pipe(block)
-block.pipe(fs.createWriteStream("block-file"))
-```
-
-When `.end()` or `.flush()` is called, it'll pad the block with zeroes.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/bench/block-stream-pause.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,70 +0,0 @@
-var BlockStream = require("../block-stream.js")
-
-var blockSizes = [16, 25, 1024]
-  , writeSizes = [4, 8, 15, 16, 17, 64, 100]
-  , writeCounts = [1, 10, 100]
-  , tap = require("tap")
-
-writeCounts.forEach(function (writeCount) {
-blockSizes.forEach(function (blockSize) {
-writeSizes.forEach(function (writeSize) {
-  tap.test("writeSize=" + writeSize +
-           " blockSize="+blockSize +
-           " writeCount="+writeCount, function (t) {
-    var f = new BlockStream(blockSize, {nopad: true })
-
-    var actualChunks = 0
-    var actualBytes = 0
-    var timeouts = 0
-
-    f.on("data", function (c) {
-      timeouts ++
-
-      actualChunks ++
-      actualBytes += c.length
-
-      // make sure that no data gets corrupted, and basic sanity
-      var before = c.toString()
-      // simulate a slow write operation
-      f.pause()
-      setTimeout(function () {
-        timeouts --
-
-        var after = c.toString()
-        t.equal(after, before, "should not change data")
-
-        // now corrupt it, to find leaks.
-        for (var i = 0; i < c.length; i ++) {
-          c[i] = "x".charCodeAt(0)
-        }
-        f.resume()
-      }, 100)
-    })
-
-    f.on("end", function () {
-      // round up to the nearest block size
-      var expectChunks = Math.ceil(writeSize * writeCount  * 2 / blockSize)
-      var expectBytes = writeSize * writeCount * 2
-      t.equal(actualBytes, expectBytes,
-              "bytes=" + expectBytes + " writeSize=" + writeSize)
-      t.equal(actualChunks, expectChunks,
-              "chunks=" + expectChunks + " writeSize=" + writeSize)
-
-      // wait for all the timeout checks to finish, then end the test
-      setTimeout(function WAIT () {
-        if (timeouts > 0) return setTimeout(WAIT)
-        t.end()
-      }, 100)
-    })
-
-    for (var i = 0; i < writeCount; i ++) {
-      var a = new Buffer(writeSize);
-      for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0)
-      var b = new Buffer(writeSize);
-      for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0)
-      f.write(a)
-      f.write(b)
-    }
-    f.end()
-  })
-}) }) })
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/bench/block-stream.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,68 +0,0 @@
-var BlockStream = require("../block-stream.js")
-
-var blockSizes = [16, 25, 1024]
-  , writeSizes = [4, 8, 15, 16, 17, 64, 100]
-  , writeCounts = [1, 10, 100]
-  , tap = require("tap")
-
-writeCounts.forEach(function (writeCount) {
-blockSizes.forEach(function (blockSize) {
-writeSizes.forEach(function (writeSize) {
-  tap.test("writeSize=" + writeSize +
-           " blockSize="+blockSize +
-           " writeCount="+writeCount, function (t) {
-    var f = new BlockStream(blockSize, {nopad: true })
-
-    var actualChunks = 0
-    var actualBytes = 0
-    var timeouts = 0
-
-    f.on("data", function (c) {
-      timeouts ++
-
-      actualChunks ++
-      actualBytes += c.length
-
-      // make sure that no data gets corrupted, and basic sanity
-      var before = c.toString()
-      // simulate a slow write operation
-      setTimeout(function () {
-        timeouts --
-
-        var after = c.toString()
-        t.equal(after, before, "should not change data")
-
-        // now corrupt it, to find leaks.
-        for (var i = 0; i < c.length; i ++) {
-          c[i] = "x".charCodeAt(0)
-        }
-      }, 100)
-    })
-
-    f.on("end", function () {
-      // round up to the nearest block size
-      var expectChunks = Math.ceil(writeSize * writeCount  * 2 / blockSize)
-      var expectBytes = writeSize * writeCount * 2
-      t.equal(actualBytes, expectBytes,
-              "bytes=" + expectBytes + " writeSize=" + writeSize)
-      t.equal(actualChunks, expectChunks,
-              "chunks=" + expectChunks + " writeSize=" + writeSize)
-
-      // wait for all the timeout checks to finish, then end the test
-      setTimeout(function WAIT () {
-        if (timeouts > 0) return setTimeout(WAIT)
-        t.end()
-      }, 100)
-    })
-
-    for (var i = 0; i < writeCount; i ++) {
-      var a = new Buffer(writeSize);
-      for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0)
-      var b = new Buffer(writeSize);
-      for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0)
-      f.write(a)
-      f.write(b)
-    }
-    f.end()
-  })
-}) }) })
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/bench/dropper-pause.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,70 +0,0 @@
-var BlockStream = require("dropper")
-
-var blockSizes = [16, 25, 1024]
-  , writeSizes = [4, 8, 15, 16, 17, 64, 100]
-  , writeCounts = [1, 10, 100]
-  , tap = require("tap")
-
-writeCounts.forEach(function (writeCount) {
-blockSizes.forEach(function (blockSize) {
-writeSizes.forEach(function (writeSize) {
-  tap.test("writeSize=" + writeSize +
-           " blockSize="+blockSize +
-           " writeCount="+writeCount, function (t) {
-    var f = new BlockStream(blockSize, {nopad: true })
-
-    var actualChunks = 0
-    var actualBytes = 0
-    var timeouts = 0
-
-    f.on("data", function (c) {
-      timeouts ++
-
-      actualChunks ++
-      actualBytes += c.length
-
-      // make sure that no data gets corrupted, and basic sanity
-      var before = c.toString()
-      // simulate a slow write operation
-      f.pause()
-      setTimeout(function () {
-        timeouts --
-
-        var after = c.toString()
-        t.equal(after, before, "should not change data")
-
-        // now corrupt it, to find leaks.
-        for (var i = 0; i < c.length; i ++) {
-          c[i] = "x".charCodeAt(0)
-        }
-        f.resume()
-      }, 100)
-    })
-
-    f.on("end", function () {
-      // round up to the nearest block size
-      var expectChunks = Math.ceil(writeSize * writeCount  * 2 / blockSize)
-      var expectBytes = writeSize * writeCount * 2
-      t.equal(actualBytes, expectBytes,
-              "bytes=" + expectBytes + " writeSize=" + writeSize)
-      t.equal(actualChunks, expectChunks,
-              "chunks=" + expectChunks + " writeSize=" + writeSize)
-
-      // wait for all the timeout checks to finish, then end the test
-      setTimeout(function WAIT () {
-        if (timeouts > 0) return setTimeout(WAIT)
-        t.end()
-      }, 100)
-    })
-
-    for (var i = 0; i < writeCount; i ++) {
-      var a = new Buffer(writeSize);
-      for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0)
-      var b = new Buffer(writeSize);
-      for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0)
-      f.write(a)
-      f.write(b)
-    }
-    f.end()
-  })
-}) }) })
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/bench/dropper.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,68 +0,0 @@
-var BlockStream = require("dropper")
-
-var blockSizes = [16, 25, 1024]
-  , writeSizes = [4, 8, 15, 16, 17, 64, 100]
-  , writeCounts = [1, 10, 100]
-  , tap = require("tap")
-
-writeCounts.forEach(function (writeCount) {
-blockSizes.forEach(function (blockSize) {
-writeSizes.forEach(function (writeSize) {
-  tap.test("writeSize=" + writeSize +
-           " blockSize="+blockSize +
-           " writeCount="+writeCount, function (t) {
-    var f = new BlockStream(blockSize, {nopad: true })
-
-    var actualChunks = 0
-    var actualBytes = 0
-    var timeouts = 0
-
-    f.on("data", function (c) {
-      timeouts ++
-
-      actualChunks ++
-      actualBytes += c.length
-
-      // make sure that no data gets corrupted, and basic sanity
-      var before = c.toString()
-      // simulate a slow write operation
-      setTimeout(function () {
-        timeouts --
-
-        var after = c.toString()
-        t.equal(after, before, "should not change data")
-
-        // now corrupt it, to find leaks.
-        for (var i = 0; i < c.length; i ++) {
-          c[i] = "x".charCodeAt(0)
-        }
-      }, 100)
-    })
-
-    f.on("end", function () {
-      // round up to the nearest block size
-      var expectChunks = Math.ceil(writeSize * writeCount  * 2 / blockSize)
-      var expectBytes = writeSize * writeCount * 2
-      t.equal(actualBytes, expectBytes,
-              "bytes=" + expectBytes + " writeSize=" + writeSize)
-      t.equal(actualChunks, expectChunks,
-              "chunks=" + expectChunks + " writeSize=" + writeSize)
-
-      // wait for all the timeout checks to finish, then end the test
-      setTimeout(function WAIT () {
-        if (timeouts > 0) return setTimeout(WAIT)
-        t.end()
-      }, 100)
-    })
-
-    for (var i = 0; i < writeCount; i ++) {
-      var a = new Buffer(writeSize);
-      for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0)
-      var b = new Buffer(writeSize);
-      for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0)
-      f.write(a)
-      f.write(b)
-    }
-    f.end()
-  })
-}) }) })
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/block-stream.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,209 +0,0 @@
-// write data to it, and it'll emit data in 512 byte blocks.
-// if you .end() or .flush(), it'll emit whatever it's got,
-// padded with nulls to 512 bytes.
-
-module.exports = BlockStream
-
-var Stream = require("stream").Stream
-  , inherits = require("inherits")
-  , assert = require("assert").ok
-  , debug = process.env.DEBUG ? console.error : function () {}
-
-function BlockStream (size, opt) {
-  this.writable = this.readable = true
-  this._opt = opt || {}
-  this._chunkSize = size || 512
-  this._offset = 0
-  this._buffer = []
-  this._bufferLength = 0
-  if (this._opt.nopad) this._zeroes = false
-  else {
-    this._zeroes = new Buffer(this._chunkSize)
-    for (var i = 0; i < this._chunkSize; i ++) {
-      this._zeroes[i] = 0
-    }
-  }
-}
-
-inherits(BlockStream, Stream)
-
-BlockStream.prototype.write = function (c) {
-  // debug("   BS write", c)
-  if (this._ended) throw new Error("BlockStream: write after end")
-  if (c && !Buffer.isBuffer(c)) c = new Buffer(c + "")
-  if (c.length) {
-    this._buffer.push(c)
-    this._bufferLength += c.length
-  }
-  // debug("pushed onto buffer", this._bufferLength)
-  if (this._bufferLength >= this._chunkSize) {
-    if (this._paused) {
-      // debug("   BS paused, return false, need drain")
-      this._needDrain = true
-      return false
-    }
-    this._emitChunk()
-  }
-  return true
-}
-
-BlockStream.prototype.pause = function () {
-  // debug("   BS pausing")
-  this._paused = true
-}
-
-BlockStream.prototype.resume = function () {
-  // debug("   BS resume")
-  this._paused = false
-  return this._emitChunk()
-}
-
-BlockStream.prototype.end = function (chunk) {
-  // debug("end", chunk)
-  if (typeof chunk === "function") cb = chunk, chunk = null
-  if (chunk) this.write(chunk)
-  this._ended = true
-  this.flush()
-}
-
-BlockStream.prototype.flush = function () {
-  this._emitChunk(true)
-}
-
-BlockStream.prototype._emitChunk = function (flush) {
-  // debug("emitChunk flush=%j emitting=%j paused=%j", flush, this._emitting, this._paused)
-
-  // emit a <chunkSize> chunk
-  if (flush && this._zeroes) {
-    // debug("    BS push zeroes", this._bufferLength)
-    // push a chunk of zeroes
-    var padBytes = (this._bufferLength % this._chunkSize)
-    if (padBytes !== 0) padBytes = this._chunkSize - padBytes
-    if (padBytes > 0) {
-      // debug("padBytes", padBytes, this._zeroes.slice(0, padBytes))
-      this._buffer.push(this._zeroes.slice(0, padBytes))
-      this._bufferLength += padBytes
-      // debug(this._buffer[this._buffer.length - 1].length, this._bufferLength)
-    }
-  }
-
-  if (this._emitting || this._paused) return
-  this._emitting = true
-
-  // debug("    BS entering loops")
-  var bufferIndex = 0
-  while (this._bufferLength >= this._chunkSize &&
-         (flush || !this._paused)) {
-    // debug("     BS data emission loop", this._bufferLength)
-
-    var out
-      , outOffset = 0
-      , outHas = this._chunkSize
-
-    while (outHas > 0 && (flush || !this._paused) ) {
-      // debug("    BS data inner emit loop", this._bufferLength)
-      var cur = this._buffer[bufferIndex]
-        , curHas = cur.length - this._offset
-      // debug("cur=", cur)
-      // debug("curHas=%j", curHas)
-      // If it's not big enough to fill the whole thing, then we'll need
-      // to copy multiple buffers into one.  However, if it is big enough,
-      // then just slice out the part we want, to save unnecessary copying.
-      // Also, need to copy if we've already done some copying, since buffers
-      // can't be joined like cons strings.
-      if (out || curHas < outHas) {
-        out = out || new Buffer(this._chunkSize)
-        cur.copy(out, outOffset,
-                 this._offset, this._offset + Math.min(curHas, outHas))
-      } else if (cur.length === outHas && this._offset === 0) {
-        // shortcut -- cur is exactly long enough, and no offset.
-        out = cur
-      } else {
-        // slice out the piece of cur that we need.
-        out = cur.slice(this._offset, this._offset + outHas)
-      }
-
-      if (curHas > outHas) {
-        // means that the current buffer couldn't be completely output
-        // update this._offset to reflect how much WAS written
-        this._offset += outHas
-        outHas = 0
-      } else {
-        // output the entire current chunk.
-        // toss it away
-        outHas -= curHas
-        outOffset += curHas
-        bufferIndex ++
-        this._offset = 0
-      }
-    }
-
-    this._bufferLength -= this._chunkSize
-    assert(out.length === this._chunkSize)
-    // debug("emitting data", out)
-    // debug("   BS emitting, paused=%j", this._paused, this._bufferLength)
-    this.emit("data", out)
-    out = null
-  }
-  // debug("    BS out of loops", this._bufferLength)
-
-  // whatever is left, it's not enough to fill up a block, or we're paused
-  this._buffer = this._buffer.slice(bufferIndex)
-  if (this._paused) {
-    // debug("    BS paused, leaving", this._bufferLength)
-    this._needsDrain = true
-    this._emitting = false
-    return
-  }
-
-  // if flushing, and not using null-padding, then need to emit the last
-  // chunk(s) sitting in the queue.  We know that it's not enough to
-  // fill up a whole block, because otherwise it would have been emitted
-  // above, but there may be some offset.
-  var l = this._buffer.length
-  if (flush && !this._zeroes && l) {
-    if (l === 1) {
-      if (this._offset) {
-        this.emit("data", this._buffer[0].slice(this._offset))
-      } else {
-        this.emit("data", this._buffer[0])
-      }
-    } else {
-      var outHas = this._bufferLength
-        , out = new Buffer(outHas)
-        , outOffset = 0
-      for (var i = 0; i < l; i ++) {
-        var cur = this._buffer[i]
-          , curHas = cur.length - this._offset
-        cur.copy(out, outOffset, this._offset)
-        this._offset = 0
-        outOffset += curHas
-        this._bufferLength -= curHas
-      }
-      this.emit("data", out)
-    }
-    // truncate
-    this._buffer.length = 0
-    this._bufferLength = 0
-    this._offset = 0
-  }
-
-  // now either drained or ended
-  // debug("either draining, or ended", this._bufferLength, this._ended)
-  // means that we've flushed out all that we can so far.
-  if (this._needDrain) {
-    // debug("emitting drain", this._bufferLength)
-    this._needDrain = false
-    this.emit("drain")
-  }
-
-  if ((this._bufferLength === 0) && this._ended && !this._endEmitted) {
-    // debug("emitting end", this._bufferLength)
-    this._endEmitted = true
-    this.emit("end")
-  }
-
-  this._emitting = false
-
-  // debug("    BS no longer emitting", flush, this._paused, this._emitting, this._bufferLength, this._chunkSize)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/block-stream/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "block-stream",
-  "description": "a stream of blocks",
-  "version": "0.0.7",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/block-stream.git"
-  },
-  "engines": {
-    "node": "0.4 || >=0.5.8"
-  },
-  "main": "block-stream.js",
-  "dependencies": {
-    "inherits": "~2.0.0"
-  },
-  "devDependencies": {
-    "tap": "0.x"
-  },
-  "scripts": {
-    "test": "tap test/"
-  },
-  "license": "BSD",
-  "readme": "# block-stream\n\nA stream of blocks.\n\nWrite data into it, and it'll output data in buffer blocks the size you\nspecify, padding with zeroes if necessary.\n\n```javascript\nvar block = new BlockStream(512)\nfs.createReadStream(\"some-file\").pipe(block)\nblock.pipe(fs.createWriteStream(\"block-file\"))\n```\n\nWhen `.end()` or `.flush()` is called, it'll pad the block with zeroes.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/block-stream/issues"
-  },
-  "_id": "block-stream@0.0.7",
-  "_from": "block-stream@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/child-process-close/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,45 +0,0 @@
-
-# child-process-close
-
-This module makes child process objects (created with `spawn`, `fork`, `exec`
-or `execFile`) emit the `close` event in node v0.6 like they do in node v0.8.
-This makes it easier to write code that works correctly on both versions of
-node.
-
-
-## Usage
-
-Just make sure to `require('child-process-close')` anywhere. It will patch the `child_process` module.
-
-```js
-require('child-process-close');
-var spawn = require('child_process').spawn;
-
-var cp = spawn('foo');
-cp.on('close', function(exitCode, signal) {
-  // This now works on all node versions.
-});
-```
-
-
-## License
-
-Copyright (C) 2012 Bert Belder
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/child-process-close/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,48 +0,0 @@
-
-var child_process = require('child_process');
-
-// Re-export the child_process module.
-module.exports = child_process;
-
-// Only node versions up to v0.7.6 need this hook.
-if (!/^v0\.([0-6]\.|7\.[0-6](\D|$))/.test(process.version))
-  return;
-
-// Do not add the hook if already hooked.
-if (child_process.hasOwnProperty('_exit_hook'))
-  return;
-
-// Version the hook in case there is ever the need to release a 0.2.0.
-child_process._exit_hook = 1;
-
-
-function hook(name) {
-  var orig = child_process[name];
-
-  // Older node versions may not have all functions, e.g. fork().
-  if (!orig)
-    return;
-
-  // Store the unhooked version.
-  child_process['_original_' + name] = orig;
-
-  // Do the actual hooking.
-  child_process[name] = function() {
-    var child = orig.apply(this, arguments);
-
-    child.once('exit', function(code, signal) {
-      process.nextTick(function() {
-        child.emit('close', code, signal);
-      });
-    });
-
-    return child;
-  }
-}
-
-hook('spawn');
-hook('fork');
-hook('execFile');
-
-// Don't hook 'exec': it calls `exports.execFile` internally, so hooking it
-// would trigger the close event twice.
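The version guard near the top of the file is easy to misread. A standalone sketch (not part of the module) showing which node versions the regex actually matches — everything up to and including v0.7.6, the releases that lack a native `close` event:

```javascript
// Sketch: the version test used by child-process-close. It matches
// v0.0.x through v0.6.x, plus v0.7.0 through v0.7.6; anything newer
// already emits 'close' natively, so the hook is skipped.
var needsHook = /^v0\.([0-6]\.|7\.[0-6](\D|$))/

console.log(needsHook.test("v0.6.21"))  // true  (hook installed)
console.log(needsHook.test("v0.7.6"))   // true  (last version hooked)
console.log(needsHook.test("v0.7.7"))   // false (native 'close')
console.log(needsHook.test("v0.10.22")) // false
```

Note the `(\D|$)` tail: it keeps `v0.7.6` in range without also matching `v0.7.60`.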
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/child-process-close/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,37 +0,0 @@
-{
-  "name": "child-process-close",
-  "version": "0.1.1",
-  "description": "Make child_process objects emit 'close' events in node v0.6 like they do in v0.8. This makes it easier to write code that works correctly on both versions of node.",
-  "main": "index.js",
-  "scripts": {
-    "test": "node test/test.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/piscisaureus/child-process-close.git"
-  },
-  "keywords": [
-    "child_process",
-    "spawn",
-    "fork",
-    "exec",
-    "execFile",
-    "close",
-    "exit"
-  ],
-  "author": {
-    "name": "Bert Belder"
-  },
-  "license": "MIT",
-  "readme": "\n# child-process-close\n\nThis module makes child process objects (created with `spawn`, `fork`, `exec`\nor `execFile`) emit the `close` event in node v0.6 like they do in node v0.8.\nThis makes it easier to write code that works correctly on both versions of\nnode.\n\n\n## Usage\n\nJust make sure to `require('child-process-close')` anywhere. It will patch the `child_process` module.\n\n```js\nrequire('child-process-close');\nvar spawn = require('child_process').spawn;\n\nvar cp = spawn('foo');\ncp.on('close', function(exitCode, signal) {\n  // This now works on all node versions.\n});\n```\n\n\n## License\n\nCopyright (C) 2012 Bert Belder\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\nTHE SOFTWARE.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/piscisaureus/child-process-close/issues"
-  },
-  "_id": "child-process-close@0.1.1",
-  "dist": {
-    "shasum": "c1909c6c3bbcea623e3bd74493ddb1c94c47c500"
-  },
-  "_from": "child-process-close@",
-  "_resolved": "https://registry.npmjs.org/child-process-close/-/child-process-close-0.1.1.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chmodr/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chmodr/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,3 +0,0 @@
-Like `chmod -R`.
-
-Takes the same arguments as `fs.chmod()`
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chmodr/chmodr.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-module.exports = chmodr
-chmodr.sync = chmodrSync
-
-var fs = require("fs")
-, path = require("path")
-
-function chmodr (p, mode, cb) {
-  fs.readdir(p, function (er, children) {
-    // any error other than ENOTDIR means it's not readable, or
-    // doesn't exist.  give up.
-    if (er && er.code !== "ENOTDIR")
-      return cb(er)
-    var isDir = !er
-    var m = isDir ? dirMode(mode) : mode
-    if (er || !children.length)
-      return fs.chmod(p, m, cb)
-
-    var len = children.length
-    var errState = null
-    children.forEach(function (child) {
-      chmodr(path.resolve(p, child), mode, then)
-    })
-    function then (er) {
-      if (errState) return
-      if (er) return cb(errState = er)
-      if (-- len === 0) return fs.chmod(p, dirMode(mode), cb)
-    }
-  })
-}
-
-function chmodrSync (p, mode) {
-  var children
-  try {
-    children = fs.readdirSync(p)
-  } catch (er) {
-    if (er && er.code === "ENOTDIR") return fs.chmodSync(p, mode)
-    throw er
-  }
-  if (!children.length) return fs.chmodSync(p, dirMode(mode))
-
-  children.forEach(function (child) {
-    chmodrSync(path.resolve(p, child), mode)
-  })
-  return fs.chmodSync(p, dirMode(mode))
-}
-
-// If a party has r, add x
-// so that dirs are listable
-function dirMode(mode) {
-  if (mode & 0400) mode |= 0100
-  if (mode & 040) mode |= 010
-  if (mode & 04) mode |= 01
-  return mode
-}
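The `dirMode` helper above copies each read bit onto the matching execute bit, so any party that can read a directory can also traverse it. A standalone sketch of the same logic (not part of chmodr; rewritten with modern `0o` octal literals, since the legacy `0400` form is rejected in strict mode):

```javascript
// Sketch of chmodr's dirMode: for each of owner/group/other, a set
// read bit also sets the execute bit, keeping directories listable.
function dirMode(mode) {
  if (mode & 0o400) mode |= 0o100 // owner r -> owner x
  if (mode & 0o040) mode |= 0o010 // group r -> group x
  if (mode & 0o004) mode |= 0o001 // other r -> other x
  return mode
}

console.log(dirMode(0o644).toString(8)) // "755"
console.log(dirMode(0o640).toString(8)) // "750"
console.log(dirMode(0o444).toString(8)) // "555"
```

This is why `chmodr('dir', 0o644, cb)` leaves files at 644 but directories at 755.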
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chmodr/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "chmodr",
-  "description": "like `chmod -R`",
-  "version": "0.1.0",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/chmodr.git"
-  },
-  "main": "chmodr.js",
-  "devDependencies": {
-    "tap": "0.2",
-    "mkdirp": "0.3",
-    "rimraf": ""
-  },
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "license": "BSD",
-  "readme": "Like `chmod -R`.\n\nTakes the same arguments as `fs.chmod()`\n",
-  "readmeFilename": "README.md",
-  "_id": "chmodr@0.1.0",
-  "_from": "chmodr@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chownr/LICENCE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-Copyright (c) Isaac Z. Schlueter
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE NETBSD FOUNDATION, INC. AND CONTRIBUTORS
-``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED
-TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE FOUNDATION OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
-INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
-CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
-ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
-POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chownr/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,3 +0,0 @@
-Like `chown -R`.
-
-Takes the same arguments as `fs.chown()`
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chownr/chownr.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,41 +0,0 @@
-module.exports = chownr
-chownr.sync = chownrSync
-
-var fs = require("fs")
-, path = require("path")
-
-function chownr (p, uid, gid, cb) {
-  fs.readdir(p, function (er, children) {
-    // any error other than ENOTDIR means it's not readable, or
-    // doesn't exist.  give up.
-    if (er && er.code !== "ENOTDIR") return cb(er)
-    if (er || !children.length) return fs.chown(p, uid, gid, cb)
-
-    var len = children.length
-    , errState = null
-    children.forEach(function (child) {
-      chownr(path.resolve(p, child), uid, gid, then)
-    })
-    function then (er) {
-      if (errState) return
-      if (er) return cb(errState = er)
-      if (-- len === 0) return fs.chown(p, uid, gid, cb)
-    }
-  })
-}
-
-function chownrSync (p, uid, gid) {
-  var children
-  try {
-    children = fs.readdirSync(p)
-  } catch (er) {
-    if (er && er.code === "ENOTDIR") return fs.chownSync(p, uid, gid)
-    throw er
-  }
-  if (!children.length) return fs.chownSync(p, uid, gid)
-
-  children.forEach(function (child) {
-    chownrSync(path.resolve(p, child), uid, gid)
-  })
-  return fs.chownSync(p, uid, gid)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/chownr/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "chownr",
-  "description": "like `chown -R`",
-  "version": "0.0.1",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/chownr.git"
-  },
-  "main": "chownr.js",
-  "devDependencies": {
-    "tap": "0.2",
-    "mkdirp": "0.3",
-    "rimraf": ""
-  },
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "license": "BSD",
-  "_npmUser": {
-    "name": "isaacs",
-    "email": "i@izs.me"
-  },
-  "_id": "chownr@0.0.1",
-  "dependencies": {},
-  "optionalDependencies": {},
-  "engines": {
-    "node": "*"
-  },
-  "_engineSupported": true,
-  "_npmVersion": "1.1.23",
-  "_nodeVersion": "v0.7.10-pre",
-  "_defaultsLoaded": true,
-  "dist": {
-    "shasum": "51d18189d9092d5f8afd623f3288bfd1c6bf1a62"
-  },
-  "_from": "../chownr"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/cmd-shim/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
-lib-cov
-*.seed
-*.log
-*.csv
-*.dat
-*.out
-*.pid
-*.gz
-
-pids
-logs
-results
-
-npm-debug.log
-
-node_modules
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/cmd-shim/.travis.yml	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-language: node_js
-node_js:
-  - "0.10"
-  - "0.8"
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/cmd-shim/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/cmd-shim/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-# cmd-shim
-
-The cmd-shim used in npm to create executable scripts on Windows,
-since symlinks are not suitable for this purpose there.
-
-On Unix systems, you should use a symbolic link instead.
-
-[![Build Status](https://travis-ci.org/ForbesLindesay/cmd-shim.png?branch=master)](https://travis-ci.org/ForbesLindesay/cmd-shim) [![Dependency Status](https://gemnasium.com/ForbesLindesay/cmd-shim.png)](https://gemnasium.com/ForbesLindesay/cmd-shim)
-
-## Installation
-
-```
-npm install cmd-shim
-```
-
-## API
-
-### cmdShim(from, to, cb)
-
-Create a cmd shim at `to` for the command line program at `from`.
-e.g.
-
-```javascript
-var cmdShim = require('cmd-shim');
-cmdShim(__dirname + '/cli.js', '/usr/bin/command-name', function (err) {
-  if (err) throw err;
-});
-```
-
-### cmdShim.ifExists(from, to, cb)
-
-The same as above, but will just continue if the file does not exist.
-Source:
-
-```javascript
-function cmdShimIfExists (from, to, cb) {
-  fs.stat(from, function (er) {
-    if (er) return cb()
-    cmdShim(from, to, cb)
-  })
-}
-```
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/cmd-shim/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,180 +0,0 @@
-// On windows, create a .cmd file.
-// Read the #! in the file to see what it uses.  The vast majority
-// of the time, this will be either:
-// "#!/usr/bin/env <prog> <args...>"
-// or:
-// "#!<prog> <args...>"
-//
-// Write a binroot/pkg.bin + ".cmd" file that has this line in it:
-// @<prog> <args...> %~dp0<target> %*
-
-module.exports = cmdShim
-cmdShim.ifExists = cmdShimIfExists
-
-try {
-  var fs = require("graceful-fs")
-} catch (e) {
-  var fs = require("fs")
-}
-
-var mkdir = require("mkdirp")
-  , path = require("path")
-  , shebangExpr = /^#\!\s*(?:\/usr\/bin\/env)?\s*([^ \t]+)(.*)$/
-
-function cmdShimIfExists (from, to, cb) {
-  fs.stat(from, function (er) {
-    if (er) return cb()
-    cmdShim(from, to, cb)
-  })
-}
-
-// Try to unlink, but ignore errors.
-// Any problems will surface later.
-function rm (path, cb) {
-  fs.unlink(path, function(er) {
-    cb()
-  })
-}
-
-function cmdShim (from, to, cb) {
-  fs.stat(from, function (er, stat) {
-    if (er)
-      return cb(er)
-
-    cmdShim_(from, to, cb)
-  })
-}
-
-function cmdShim_ (from, to, cb) {
-  var then = times(2, next, cb)
-  rm(to, then)
-  rm(to + ".cmd", then)
-
-  function next(er) {
-    writeShim(from, to, cb)
-  }
-}
-
-function writeShim (from, to, cb) {
-  // make a cmd file and a sh script
-  // First, check if the bin is a #! of some sort.
-  // If not, then assume it's something that'll be compiled, or some other
-  // sort of script, and just call it directly.
-  mkdir(path.dirname(to), function (er) {
-    if (er)
-      return cb(er)
-    fs.readFile(from, "utf8", function (er, data) {
-      if (er) return writeShim_(from, to, null, null, cb)
-      var firstLine = data.trim().split(/\r*\n/)[0]
-        , shebang = firstLine.match(shebangExpr)
-      if (!shebang) return writeShim_(from, to, null, null, cb)
-      var prog = shebang[1]
-        , args = shebang[2] || ""
-      return writeShim_(from, to, prog, args, cb)
-    })
-  })
-}
-
-function writeShim_ (from, to, prog, args, cb) {
-  var shTarget = path.relative(path.dirname(to), from)
-    , target = shTarget.split("/").join("\\")
-    , longProg
-    , shProg = prog && prog.split("\\").join("/")
-    , shLongProg
-  shTarget = shTarget.split("\\").join("/")
-  args = args || ""
-  if (!prog) {
-    prog = "\"%~dp0\\" + target + "\""
-    shProg = "\"$basedir/" + shTarget + "\""
-    args = ""
-    target = ""
-    shTarget = ""
-  } else {
-    longProg = "\"%~dp0\\" + prog + ".exe\""
-    shLongProg = "\"$basedir/" + prog + "\""
-    target = "\"%~dp0\\" + target + "\""
-    shTarget = "\"$basedir/" + shTarget + "\""
-  }
-
-  // @IF EXIST "%~dp0\node.exe" (
-  //   "%~dp0\node.exe" "%~dp0\.\node_modules\npm\bin\npm-cli.js" %*
-  // ) ELSE (
-  //   node "%~dp0\.\node_modules\npm\bin\npm-cli.js" %*
-  // )
-  var cmd
-  if (longProg) {
-    cmd = "@IF EXIST " + longProg + " (\r\n"
-        + "  " + longProg + " " + args + " " + target + " %*\r\n"
-        + ") ELSE (\r\n"
-        + "  " + prog + " " + args + " " + target + " %*\r\n"
-        + ")"
-  } else {
-    cmd = prog + " " + args + " " + target + " %*\r\n"
-  }
-
-  // #!/bin/sh
-  // basedir=`dirname "$0"`
-  //
-  // case `uname` in
-  //     *CYGWIN*) basedir=`cygpath -w "$basedir"`;;
-  // esac
-  //
-  // if [ -x "$basedir/node.exe" ]; then
-  //   "$basedir/node.exe" "$basedir/node_modules/npm/bin/npm-cli.js" "$@"
-  //   ret=$?
-  // else
-  //   node "$basedir/node_modules/npm/bin/npm-cli.js" "$@"
-  //   ret=$?
-  // fi
-  // exit $ret
-
-  var sh = "#!/bin/sh\n"
-
-  if (shLongProg) {
-    sh = sh
-        + "basedir=`dirname \"$0\"`\n"
-        + "\n"
-        + "case `uname` in\n"
-        + "    *CYGWIN*) basedir=`cygpath -w \"$basedir\"`;;\n"
-        + "esac\n"
-        + "\n"
-
-    sh = sh
-       + "if [ -x "+shLongProg+" ]; then\n"
-       + "  " + shLongProg + " " + args + " " + shTarget + " \"$@\"\n"
-       + "  ret=$?\n"
-       + "else \n"
-       + "  " + shProg + " " + args + " " + shTarget + " \"$@\"\n"
-       + "  ret=$?\n"
-       + "fi\n"
-       + "exit $ret\n"
-  } else {
-    sh = shProg + " " + args + " " + shTarget + " \"$@\"\n"
-       + "exit $?\n"
-  }
-
-  var then = times(2, next, cb)
-  fs.writeFile(to + ".cmd", cmd, "utf8", then)
-  fs.writeFile(to, sh, "utf8", then)
-  function next () {
-    chmodShim(to, cb)
-  }
-}
-
-function chmodShim (to, cb) {
-  var then = times(2, cb, cb)
-  fs.chmod(to, 0755, then)
-  fs.chmod(to + ".cmd", 0755, then)
-}
-
-function times(n, ok, cb) {
-  var errState = null
-  return function(er) {
-    if (!errState) {
-      if (er)
-        cb(errState = er)
-      else if (--n === 0)
-        ok()
-    }
-  }
-}
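The `shebangExpr` regular expression drives the whole shim: it strips an optional `/usr/bin/env` prefix, then captures the interpreter and any trailing arguments. A quick sketch of what it extracts (not part of cmd-shim):

```javascript
// Sketch: how cmd-shim's shebangExpr splits a #! line into the
// interpreter (capture 1) and its arguments (capture 2).
var shebangExpr = /^#\!\s*(?:\/usr\/bin\/env)?\s*([^ \t]+)(.*)$/

var m = "#!/usr/bin/env node --harmony".match(shebangExpr)
console.log(m[1]) // "node"
console.log(m[2]) // " --harmony"

m = "#!/bin/sh -e".match(shebangExpr)
console.log(m[1]) // "/bin/sh"
console.log(m[2]) // " -e"

// No shebang at all: no match, so writeShim calls the target directly.
console.log("var x = 1".match(shebangExpr)) // null
```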
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/cmd-shim/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-{
-  "name": "cmd-shim",
-  "version": "1.1.1",
-  "description": "Used in npm for command line application support",
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "https://github.com/ForbesLindesay/cmd-shim.git"
-  },
-  "license": "BSD",
-  "optionalDependencies": {
-    "graceful-fs": "2"
-  },
-  "dependencies": {
-    "mkdirp": "~0.3.3",
-    "graceful-fs": "2"
-  },
-  "devDependencies": {
-    "tap": "~0.4.1",
-    "rimraf": "~2.1.4"
-  },
-  "readme": "# cmd-shim\r\n\r\nThe cmd-shim used in npm to create executable scripts on Windows,\r\nsince symlinks are not suitable for this purpose there.\r\n\r\nOn Unix systems, you should use a symbolic link instead.\r\n\r\n[![Build Status](https://travis-ci.org/ForbesLindesay/cmd-shim.png?branch=master)](https://travis-ci.org/ForbesLindesay/cmd-shim) [![Dependency Status](https://gemnasium.com/ForbesLindesay/cmd-shim.png)](https://gemnasium.com/ForbesLindesay/cmd-shim)\r\n\r\n## Installation\r\n\r\n```\r\nnpm install cmd-shim\r\n```\r\n\r\n## API\r\n\r\n### cmdShim(from, to, cb)\r\n\r\nCreate a cmd shim at `to` for the command line program at `from`.\r\ne.g.\r\n\r\n```javascript\r\nvar cmdShim = require('cmd-shim');\r\ncmdShim(__dirname + '/cli.js', '/usr/bin/command-name', function (err) {\r\n  if (err) throw err;\r\n});\r\n```\r\n\r\n### cmdShim.ifExists(from, to, cb)\r\n\r\nThe same as above, but will just continue if the file does not exist.\r\nSource:\r\n\r\n```javascript\r\nfunction cmdShimIfExists (from, to, cb) {\r\n  fs.stat(from, function (er) {\r\n    if (er) return cb()\r\n    cmdShim(from, to, cb)\r\n  })\r\n}\r\n```\r\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/ForbesLindesay/cmd-shim/issues"
-  },
-  "_id": "cmd-shim@1.1.1",
-  "dist": {
-    "shasum": "87741e2a8b6307ea1ea8bf1f65287cb4a9ca977a"
-  },
-  "_from": "cmd-shim@latest",
-  "_resolved": "https://registry.npmjs.org/cmd-shim/-/cmd-shim-1.1.1.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/editor/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-This software is released under the MIT license:
-
-Permission is hereby granted, free of charge, to any person obtaining a copy of
-this software and associated documentation files (the "Software"), to deal in
-the Software without restriction, including without limitation the rights to
-use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
-the Software, and to permit persons to whom the Software is furnished to do so,
-subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
-FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
-COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
-IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
-CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/editor/README.markdown	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-editor
-======
-
-Launch $EDITOR in your program.
-
-example
-=======
-
-``` js
-var editor = require('editor');
-editor('beep.json', function (code, sig) {
-    console.log('finished editing with code ' + code);
-});
-```
-
-***
-
-```
-$ node edit.js
-```
-
-![editor](http://substack.net/images/screenshots/editor.png)
-
-```
-finished editing with code 0
-```
-
-methods
-=======
-
-``` js
-var editor = require('editor')
-```
-
-editor(file, opts={}, cb)
--------------------------
-
-Launch the `$EDITOR` (or `opts.editor`) for `file`.
-
-When the editor exits, `cb(code, sig)` fires.
-
-install
-=======
-
-With [npm](http://npmjs.org) do:
-
-```
-npm install editor
-```
-
-license
-=======
-
-MIT
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/editor/example/beep.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-{
-  "a" : 3,
-  "b" : 4,
-  "c" : 5
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/editor/example/edit.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-var editor = require('../');
-editor(__dirname + '/beep.json', function (code, sig) {
-    console.log('finished editing with code ' + code);
-});
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/editor/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,26 +0,0 @@
-var spawn = require('child_process').spawn;
-
-module.exports = function (file, opts, cb) {
-    if (typeof opts === 'function') {
-        cb = opts;
-        opts = {};
-    }
-    if (!opts) opts = {};
-    
-    var ed = /^win/.test(process.platform) ? 'notepad' : 'vim';
-    var editor = opts.editor || process.env.VISUAL || process.env.EDITOR || ed;
-    
-    setRaw(true);
-    var ps = spawn(editor, [ file ], { customFds : [ 0, 1, 2 ] });
-    
-    ps.on('exit', function (code, sig) {
-        setRaw(false);
-        process.stdin.pause();
-        if (typeof cb === 'function') cb(code, sig)
-    });
-};
-
-var tty = require('tty');
-function setRaw (mode) {
-    process.stdin.setRawMode ? process.stdin.setRawMode(mode) : tty.setRawMode(mode);
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/editor/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,43 +0,0 @@
-{
-  "name": "editor",
-  "version": "0.0.5",
-  "description": "launch $EDITOR in your program",
-  "main": "index.js",
-  "directories": {
-    "example": "example",
-    "test": "test"
-  },
-  "dependencies": {},
-  "devDependencies": {
-    "tap": "~0.4.4"
-  },
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/substack/node-editor.git"
-  },
-  "homepage": "https://github.com/substack/node-editor",
-  "keywords": [
-    "text",
-    "edit",
-    "shell"
-  ],
-  "author": {
-    "name": "James Halliday",
-    "email": "mail@substack.net",
-    "url": "http://substack.net"
-  },
-  "license": "MIT",
-  "engine": {
-    "node": ">=0.6"
-  },
-  "readme": "editor\n======\n\nLaunch $EDITOR in your program.\n\nexample\n=======\n\n``` js\nvar editor = require('editor');\neditor('beep.json', function (code, sig) {\n    console.log('finished editing with code ' + code);\n});\n```\n\n***\n\n```\n$ node edit.js\n```\n\n![editor](http://substack.net/images/screenshots/editor.png)\n\n```\nfinished editing with code 0\n```\n\nmethods\n=======\n\n``` js\nvar editor = require('editor')\n```\n\neditor(file, opts={}, cb)\n-------------------------\n\nLaunch the `$EDITOR` (or `opts.editor`) for `file`.\n\nWhen the editor exits, `cb(code, sig)` fires.\n\ninstall\n=======\n\nWith [npm](http://npmjs.org) do:\n\n```\nnpm install editor\n```\n\nlicense\n=======\n\nMIT\n",
-  "readmeFilename": "README.markdown",
-  "bugs": {
-    "url": "https://github.com/substack/node-editor/issues"
-  },
-  "_id": "editor@0.0.5",
-  "_from": "editor@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2 +0,0 @@
-# ignore the output junk from the example scripts
-example/output
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/LICENCE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-Copyright (c) Isaac Z. Schlueter
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE NETBSD FOUNDATION, INC. AND CONTRIBUTORS
-``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED
-TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE FOUNDATION OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
-INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
-CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
-ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
-POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-# fstream-npm
-
-This is an fstream DirReader class that will read a directory and filter
-things according to the semantics of what goes in an npm package.
-
-For example:
-
-```javascript
-// This will print out all the files that would be included
-// by 'npm publish' or 'npm install' of this directory.
-
-var FN = require("fstream-npm")
-FN({ path: "./" })
-  .on("child", function (e) {
-    console.error(e.path.substr(e.root.path.length + 1))
-  })
-```
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/example/bundle.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-// this example will bundle every dependency
-var P = require("../")
-P({ path: "./" })
-  .on("package", bundleIt)
-  .on("entry", function (e) {
-    console.error(e.constructor.name, e.path.substr(e.root.dirname.length + 1))
-    e.on("package", bundleIt)
-  })
-
-function bundleIt (p) {
-  p.bundleDependencies = Object.keys(p.dependencies || {})
-}
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/example/dir-tar.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-// this will show what ends up in the fstream-npm package
-var P = require("fstream").DirReader
-var tar = require("tar")
-function f (entry) {
-  return entry.basename !== ".git"
-}
-
-new P({ path: "./", type: "Directory", Directory: true, filter: f })
-  .on("package", function (p) {
-    console.error("package", p)
-  })
-  .on("ignoreFile", function (e) {
-    console.error("ignoreFile", e)
-  })
-  .on("entry", function (e) {
-    console.error(e.constructor.name, e.path.substr(e.root.path.length + 1))
-  })
-  .pipe(tar.Pack())
-  .pipe(process.stdout)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/example/dir.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-// this will show what ends up in the fstream-npm package
-var P = require("../")
-var DW = require("fstream").DirWriter
-
-var target = new DW({ Directory: true, type: "Directory",
-                      path: __dirname + "/output"})
-
-function f (entry) {
-  return entry.basename !== ".git"
-}
-
-P({ path: "./", type: "Directory", isDirectory: true, filter: f })
-  .on("package", function (p) {
-    console.error("package", p)
-  })
-  .on("ignoreFile", function (e) {
-    console.error("ignoreFile", e)
-  })
-  .on("entry", function (e) {
-    console.error(e.constructor.name, e.path)
-  })
-  .pipe(target)
-  .on("end", function () {
-    console.error("ended")
-  })
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/example/example.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,12 +0,0 @@
-// this will show what ends up in the fstream-npm package
-var P = require("../")
-P({ path: "./" })
-  .on("package", function (p) {
-    console.error("package", p)
-  })
-  .on("ignoreFile", function (e) {
-    console.error("ignoreFile", e)
-  })
-  .on("entry", function (e) {
-    console.error(e.constructor.name, e.path.substr(e.root.dirname.length + 1))
-  })
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/example/ig-tar.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-// this will show what ends up in the fstream-npm package
-var P = require("fstream-ignore")
-var tar = require("tar")
-function f (entry) {
-  return entry.basename !== ".git"
-}
-
-new P({ path: "./", type: "Directory", Directory: true, filter: f })
-  .on("package", function (p) {
-    console.error("package", p)
-  })
-  .on("ignoreFile", function (e) {
-    console.error("ignoreFile", e)
-  })
-  .on("entry", function (e) {
-    console.error(e.constructor.name, e.path.substr(e.root.path.length + 1))
-  })
-  .pipe(tar.Pack())
-  .pipe(process.stdout)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/example/tar.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-// this will show what ends up in the fstream-npm package
-var P = require("../")
-var tar = require("tar")
-function f () {
-  return true
-}
-// function f (entry) {
-//   return entry.basename !== ".git"
-// }
-
-new P({ path: "./", type: "Directory", isDirectory: true, filter: f })
-  .on("package", function (p) {
-    console.error("package", p)
-  })
-  .on("ignoreFile", function (e) {
-    console.error("ignoreFile", e)
-  })
-  .on("entry", function (e) {
-    console.error(e.constructor.name, e.path)
-  })
-  .on("end", function () {
-    console.error("ended")
-  })
-  .pipe(tar.Pack())
-  .pipe(process.stdout)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/fstream-npm.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,323 +0,0 @@
-var Ignore = require("fstream-ignore")
-, inherits = require("inherits")
-, path = require("path")
-, fs = require("fs")
-
-module.exports = Packer
-
-inherits(Packer, Ignore)
-
-function Packer (props) {
-  if (!(this instanceof Packer)) {
-    return new Packer(props)
-  }
-
-  if (typeof props === "string") {
-    props = { path: props }
-  }
-
-  props.ignoreFiles = props.ignoreFiles || [ ".npmignore",
-                                             ".gitignore",
-                                             "package.json" ]
-
-  Ignore.call(this, props)
-
-  this.bundled = props.bundled
-  this.bundleLinks = props.bundleLinks
-  this.package = props.package
-
-  // only do the magic bundling stuff for the node_modules folder that
-  // lives right next to a package.json file.
-  this.bundleMagic = this.parent &&
-                     this.parent.packageRoot &&
-                     this.basename === "node_modules"
-
-  // in a node_modules folder, resolve symbolic links to
-  // bundled dependencies when creating the package.
-  props.follow = this.follow = this.bundleMagic
-  // console.error("follow?", this.path, props.follow)
-
-  if (this === this.root ||
-      this.parent &&
-      this.parent.bundleMagic &&
-      this.basename.charAt(0) !== ".") {
-    this.readBundledLinks()
-  }
-
-
-  this.on("entryStat", function (entry, props) {
-    // files should *always* get into tarballs
-    // in a user-writable state, even if they're
-    // being installed from some wacky vm-mounted
-    // read-only filesystem.
-    entry.mode = props.mode = props.mode | 0200
-  })
-}
-
-Packer.prototype.readBundledLinks = function () {
-  if (this._paused) {
-    this.once("resume", this.addIgnoreFiles)
-    return
-  }
-
-  this.pause()
-  fs.readdir(this.path + "/node_modules", function (er, list) {
-    // no harm if there's no bundle
-    var l = list && list.length
-    if (er || l === 0) return this.resume()
-
-    var errState = null
-    , then = function then (er) {
-      if (errState) return
-      if (er) return errState = er, this.resume()
-      if (-- l === 0) return this.resume()
-    }.bind(this)
-
-    list.forEach(function (pkg) {
-      if (pkg.charAt(0) === ".") return then()
-      var pd = this.path + "/node_modules/" + pkg
-      fs.realpath(pd, function (er, rp) {
-        if (er) return then()
-        this.bundleLinks = this.bundleLinks || {}
-        this.bundleLinks[pkg] = rp
-        then()
-      }.bind(this))
-    }, this)
-  }.bind(this))
-}
-
-Packer.prototype.applyIgnores = function (entry, partial, entryObj) {
-  // package.json files can never be ignored.
-  if (entry === "package.json") return true
-
-  // readme files should never be ignored.
-  if (entry.match(/^readme(\.[^\.]*)?$/i)) return true
-
-  // special rules.  see below.
-  if (entry === "node_modules" && this.packageRoot) return true
-
-  // some files are *never* allowed under any circumstances
-  if (entry === ".git" ||
-      entry === ".lock-wscript" ||
-      entry.match(/^\.wafpickle-[0-9]+$/) ||
-      entry === "CVS" ||
-      entry === ".svn" ||
-      entry === ".hg" ||
-      entry.match(/^\..*\.swp$/) ||
-      entry === ".DS_Store" ||
-      entry.match(/^\._/) ||
-      entry === "npm-debug.log"
-    ) {
-    return false
-  }
-
-  // in a node_modules folder, we only include bundled dependencies
-  // also, prevent packages in node_modules from being affected
-  // by rules set in the containing package, so that
-  // bundles don't get busted.
-  // Also, once in a bundle, everything is installed as-is
-  // To prevent infinite cycles in the case of cyclic deps that are
-  // linked with npm link, even in a bundle, deps are only bundled
-  // if they're not already present at a higher level.
-  if (this.bundleMagic) {
-    // bubbling up.  stop here and allow anything the bundled pkg allows
-    if (entry.indexOf("/") !== -1) return true
-
-    // never include the .bin.  It's typically full of platform-specific
-    // stuff like symlinks and .cmd files anyway.
-    if (entry === ".bin") return false
-
-    var shouldBundle = false
-    // the package root.
-    var p = this.parent
-    // the package before this one.
-    var pp = p && p.parent
-
-    // if this entry has already been bundled, and is a symlink,
-    // and it is the *same* symlink as this one, then exclude it.
-    if (pp && pp.bundleLinks && this.bundleLinks &&
-        pp.bundleLinks[entry] &&
-        pp.bundleLinks[entry] === this.bundleLinks[entry]) {
-      return false
-    }
-
-    // since it's *not* a symbolic link, if we're *already* in a bundle,
-    // then we should include everything.
-    if (pp && pp.package && pp.basename === "node_modules") {
-      return true
-    }
-
-    // only include it at this point if it's a bundleDependency
-    var bd = this.package && this.package.bundleDependencies
-    var shouldBundle = bd && bd.indexOf(entry) !== -1
-    // if we're not going to bundle it, then it doesn't count as a bundleLink
-    // if (this.bundleLinks && !shouldBundle) delete this.bundleLinks[entry]
-    return shouldBundle
-  }
-  // if (this.bundled) return true
-
-  return Ignore.prototype.applyIgnores.call(this, entry, partial, entryObj)
-}
-
-Packer.prototype.addIgnoreFiles = function () {
-  var entries = this.entries
-  // if there's a .npmignore, then we do *not* want to
-  // read the .gitignore.
-  if (-1 !== entries.indexOf(".npmignore")) {
-    var i = entries.indexOf(".gitignore")
-    if (i !== -1) {
-      entries.splice(i, 1)
-    }
-  }
-
-  this.entries = entries
-
-  Ignore.prototype.addIgnoreFiles.call(this)
-}
-
-
-Packer.prototype.readRules = function (buf, e) {
-  if (e !== "package.json") {
-    return Ignore.prototype.readRules.call(this, buf, e)
-  }
-
-  buf = buf.toString().trim()
-
-  if (buf.length === 0) return []
-
-  try {
-    var p = this.package = JSON.parse(buf)
-  } catch (er) {
-    // just pretend it's a normal old file, not magic at all.
-    return []
-  }
-
-  if (this === this.root) {
-    this.bundleLinks = this.bundleLinks || {}
-    this.bundleLinks[p.name] = this._path
-  }
-
-  this.packageRoot = true
-  this.emit("package", p)
-
-  // make bundle deps predictable
-  if (p.bundledDependencies && !p.bundleDependencies) {
-    p.bundleDependencies = p.bundledDependencies
-    delete p.bundledDependencies
-  }
-
-  if (!p.files || !Array.isArray(p.files)) return []
-
-  // ignore everything except what's in the files array.
-  return ["*"].concat(p.files.map(function (f) {
-    return "!" + f
-  })).concat(p.files.map(function (f) {
-    return "!" + f.replace(/\/+$/, "") + "/**"
-  }))
-}
-
-Packer.prototype.getChildProps = function (stat) {
-  var props = Ignore.prototype.getChildProps.call(this, stat)
-
-  props.package = this.package
-
-  props.bundled = this.bundled && this.bundled.slice(0)
-  props.bundleLinks = this.bundleLinks &&
-    Object.create(this.bundleLinks)
-
-  // Directories have to be read as Packers
-  // otherwise fstream.Reader will create a DirReader instead.
-  if (stat.isDirectory()) {
-    props.type = this.constructor
-  }
-
-  // only follow symbolic links directly in the node_modules folder.
-  props.follow = false
-  return props
-}
-
-
-var order =
-  [ "package.json"
-  , ".npmignore"
-  , ".gitignore"
-  , /^README(\.md)?$/
-  , "LICENCE"
-  , "LICENSE"
-  , /\.js$/ ]
-
-Packer.prototype.sort = function (a, b) {
-  for (var i = 0, l = order.length; i < l; i ++) {
-    var o = order[i]
-    if (typeof o === "string") {
-      if (a === o) return -1
-      if (b === o) return 1
-    } else {
-      if (a.match(o)) return -1
-      if (b.match(o)) return 1
-    }
-  }
-
-  // deps go in the back
-  if (a === "node_modules") return 1
-  if (b === "node_modules") return -1
-
-  return Ignore.prototype.sort.call(this, a, b)
-}
-
-
-
-Packer.prototype.emitEntry = function (entry) {
-  if (this._paused) {
-    this.once("resume", this.emitEntry.bind(this, entry))
-    return
-  }
-
-  // if there is a .gitignore, then we're going to
-  // rename it to .npmignore in the output.
-  if (entry.basename === ".gitignore") {
-    entry.basename = ".npmignore"
-    entry.path = path.resolve(entry.dirname, entry.basename)
-  }
-
-  // all *.gyp files are renamed to binding.gyp for node-gyp
-  // but only when they are in the same folder as a package.json file.
-  if (entry.basename.match(/\.gyp$/) &&
-      this.entries.indexOf("package.json") !== -1) {
-    entry.basename = "binding.gyp"
-    entry.path = path.resolve(entry.dirname, entry.basename)
-  }
-
-  // skip over symbolic links
-  if (entry.type === "SymbolicLink") {
-    entry.abort()
-    return
-  }
-
-  if (entry.type !== "Directory") {
-    // make it so that the folder in the tarball is named "package"
-    var h = path.dirname((entry.root || entry).path)
-    , t = entry.path.substr(h.length + 1).replace(/^[^\/\\]+/, "package")
-    , p = h + "/" + t
-
-    entry.path = p
-    entry.dirname = path.dirname(p)
-    return Ignore.prototype.emitEntry.call(this, entry)
-  }
-
-  // we don't want empty directories to show up in package
-  // tarballs.
-  // don't emit entry events for dirs, but still walk through
-  // and read them.  This means that we need to proxy up their
-  // entry events so that those entries won't be missed, since
-  // .pipe() doesn't do anything special with "child" events, only
-  // with "entry" events.
-  var me = this
-  entry.on("entry", function (e) {
-    if (e.parent === entry) {
-      e.parent = me
-      me.emit("entry", e)
-    }
-  })
-  entry.on("package", this.emit.bind(this, "package"))
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-test/fixtures
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-# fstream-ignore
-
-An fstream DirReader that filters out files that match globs in `.ignore`
-files throughout the tree, like how git ignores files based on a
-`.gitignore` file.
-
-Here's an example:
-
-```javascript
-var Ignore = require("fstream-ignore")
-Ignore({ path: __dirname
-       , ignoreFiles: [".ignore", ".gitignore"]
-       })
-  .on("child", function (c) {
-    console.error(c.path.substr(c.root.path.length + 1))
-  })
-  .pipe(tar.Pack())
-  .pipe(fs.createWriteStream("foo.tar"))
-```
-
-This will tar up the files in __dirname into `foo.tar`, ignoring
-anything matched by the globs in any .ignore or .gitignore file.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/example/basic.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-var Ignore = require("../")
-Ignore({ path: __dirname
-       , ignoreFiles: [".ignore", ".gitignore"]
-       })
-  .on("child", function (c) {
-    console.error(c.path.substr(c.root.path.length + 1))
-    c.on("ignoreFile", onIgnoreFile)
-  })
-  .on("ignoreFile", onIgnoreFile)
-
-function onIgnoreFile (e) {
-  console.error("adding ignore file", e.path)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/ignore.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,275 +0,0 @@
-// Essentially, this is a fstream.DirReader class, but with a
-// bit of special logic to read the specified sort of ignore files,
-// and a filter that prevents it from picking up anything excluded
-// by those files.
-
-var Minimatch = require("minimatch").Minimatch
-, fstream = require("fstream")
-, DirReader = fstream.DirReader
-, inherits = require("inherits")
-, path = require("path")
-, fs = require("fs")
-
-module.exports = IgnoreReader
-
-inherits(IgnoreReader, DirReader)
-
-function IgnoreReader (props) {
-  if (!(this instanceof IgnoreReader)) {
-    return new IgnoreReader(props)
-  }
-
-  // must be a Directory type
-  if (typeof props === "string") {
-    props = { path: path.resolve(props) }
-  }
-
-  props.type = "Directory"
-  props.Directory = true
-
-  if (!props.ignoreFiles) props.ignoreFiles = [".ignore"]
-  this.ignoreFiles = props.ignoreFiles
-
-  this.ignoreRules = null
-
-  // ensure that .ignore files always show up at the top of the list
-  // that way, they can be read before proceeding to handle other
-  // entries in that same folder
-  if (props.sort) {
-    this._sort = props.sort === "alpha" ? alphasort : props.sort
-    props.sort = null
-  }
-
-  this.on("entries", function () {
-    // if there are any ignore files in the list, then
-    // pause and add them.
-    // then, filter the list based on our ignoreRules
-
-    var hasIg = this.entries.some(this.isIgnoreFile, this)
-
-    if (!hasIg) return this.filterEntries()
-
-    this.addIgnoreFiles()
-  })
-
-  // we filter entries before we know what they are.
-  // however, directories have to be re-tested against
-  // rules with a "/" appended, because "a/b/" will only
-  // match if "a/b" is a dir, and not otherwise.
-  this.on("_entryStat", function (entry, props) {
-    var t = entry.basename
-    if (!this.applyIgnores(entry.basename,
-                           entry.type === "Directory",
-                           entry)) {
-      entry.abort()
-    }
-  }.bind(this))
-
-  DirReader.call(this, props)
-}
-
-
-IgnoreReader.prototype.addIgnoreFiles = function () {
-  if (this._paused) {
-    this.once("resume", this.addIgnoreFiles)
-    return
-  }
-  if (this._ignoreFilesAdded) return
-  this._ignoreFilesAdded = true
-
-  var newIg = this.entries.filter(this.isIgnoreFile, this)
-  , count = newIg.length
-  , errState = null
-
-  if (!count) return
-
-  this.pause()
-
-  var then = function then (er) {
-    if (errState) return
-    if (er) return this.emit("error", errState = er)
-    if (-- count === 0) {
-      this.filterEntries()
-      this.resume()
-    }
-  }.bind(this)
-
-  newIg.forEach(function (ig) {
-    this.addIgnoreFile(ig, then)
-  }, this)
-}
-
-
-IgnoreReader.prototype.isIgnoreFile = function (e) {
-  return e !== "." &&
-         e !== ".." &&
-         -1 !== this.ignoreFiles.indexOf(e)
-}
-
-
-IgnoreReader.prototype.getChildProps = function (stat) {
-  var props = DirReader.prototype.getChildProps.call(this, stat)
-  props.ignoreFiles = this.ignoreFiles
-
-  // Directories have to be read as IgnoreReaders
-  // otherwise fstream.Reader will create a DirReader instead.
-  if (stat.isDirectory()) {
-    props.type = this.constructor
-  }
-  return props
-}
-
-
-IgnoreReader.prototype.addIgnoreFile = function (e, cb) {
-  // read the file, and then call addIgnoreRules
-  // if there's an error, then tell the cb about it.
-
-  var ig = path.resolve(this.path, e)
-  fs.readFile(ig, function (er, data) {
-    if (er) return cb(er)
-
-    this.emit("ignoreFile", e, data)
-    var rules = this.readRules(data, e)
-    this.addIgnoreRules(rules, e)
-    cb()
-  }.bind(this))
-}
-
-
-IgnoreReader.prototype.readRules = function (buf, e) {
-  return buf.toString().split(/\r?\n/)
-}
-
-
-// Override this to do fancier things, like read the
-// "files" array from a package.json file or something.
-IgnoreReader.prototype.addIgnoreRules = function (set, e) {
-  // filter out anything obvious
-  set = set.filter(function (s) {
-    s = s.trim()
-    return s && !s.match(/^#/)
-  })
-
-  // no rules to add!
-  if (!set.length) return
-
-  // now get a minimatch object for each one of these.
-  // Note that we need to allow dot files by default, and
-  // not switch the meaning of their exclusion
-  var mmopt = { matchBase: true, dot: true, flipNegate: true }
-  , mm = set.map(function (s) {
-    var m = new Minimatch(s, mmopt)
-    m.ignoreFile = e
-    return m
-  })
-
-  if (!this.ignoreRules) this.ignoreRules = []
-  this.ignoreRules.push.apply(this.ignoreRules, mm)
-}
-
-
-IgnoreReader.prototype.filterEntries = function () {
-  // this exclusion is at the point where we know the list of
-  // entries in the dir, but don't know what they are.  since
-  // some of them *might* be directories, we have to run the
-  // match in dir-mode as well, so that we'll pick up partials
-  // of files that will be included later.  Anything included
-  // at this point will be checked again later once we know
-  // what it is.
-  this.entries = this.entries.filter(function (entry) {
-    // at this point, we don't know if it's a dir or not.
-    return this.applyIgnores(entry) || this.applyIgnores(entry, true)
-  }, this)
-}
-
-
-IgnoreReader.prototype.applyIgnores = function (entry, partial, obj) {
-  var included = true
-
-  // this = /a/b/c
-  // entry = d
-  // parent /a/b sees c/d
-  if (this.parent && this.parent.applyIgnores) {
-    var pt = this.basename + "/" + entry
-    included = this.parent.applyIgnores(pt, partial)
-  }
-
-  // Negated Rules
-  // Since we're *ignoring* things here, negating means that a file
-  // is re-included, if it would have been excluded by a previous
-  // rule.  So, negated rules are only relevant if the file
-  // has been excluded.
-  //
-  // Similarly, if a file has been excluded, then there's no point
-  // trying it against rules that have already been applied
-  //
-  // We're using the "flipnegate" flag here, which tells minimatch
-  // to set the "negate" for our information, but still report
-  // whether the core pattern was a hit or a miss.
-
-  if (!this.ignoreRules) {
-    return included
-  }
-
-  this.ignoreRules.forEach(function (rule) {
-    // negation means inclusion
-    if (rule.negate && included ||
-        !rule.negate && !included) {
-      // unnecessary
-      return
-    }
-
-    // first, match against /foo/bar
-    var match = rule.match("/" + entry)
-
-    if (!match) {
-      // try with the leading / trimmed off the test
-      // eg: foo/bar instead of /foo/bar
-      match = rule.match(entry)
-    }
-
-    // if the entry is a directory, then it will match
-    // with a trailing slash. eg: /foo/bar/ or foo/bar/
-    if (!match && partial) {
-      match = rule.match("/" + entry + "/") ||
-              rule.match(entry + "/")
-    }
-
-    // When including a file with a negated rule, it's
-    // relevant if a directory partially matches, since
-    // it may then match a file within it.
-    // Eg, if you ignore /a, but !/a/b/c
-    if (!match && rule.negate && partial) {
-      match = rule.match("/" + entry, true) ||
-              rule.match(entry, true)
-    }
-
-    if (match) {
-      included = rule.negate
-    }
-  }, this)
-
-  return included
-}
-
-
-IgnoreReader.prototype.sort = function (a, b) {
-  var aig = this.ignoreFiles.indexOf(a) !== -1
-  , big = this.ignoreFiles.indexOf(b) !== -1
-
-  if (aig && !big) return -1
-  if (big && !aig) return 1
-  return this._sort(a, b)
-}
-
-IgnoreReader.prototype._sort = function (a, b) {
-  return 0
-}
-
-function alphasort (a, b) {
-  return a === b ? 0
-       : a.toLowerCase() > b.toLowerCase() ? 1
-       : a.toLowerCase() < b.toLowerCase() ? -1
-       : a > b ? 1
-       : -1
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,40 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "fstream-ignore",
-  "description": "A thing for ignoring files based on globs",
-  "version": "0.0.7",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/fstream-ignore.git"
-  },
-  "main": "ignore.js",
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "dependencies": {
-    "minimatch": "~0.2.0",
-    "fstream": "~0.1.17",
-    "inherits": "2"
-  },
-  "devDependencies": {
-    "tap": "",
-    "rimraf": "",
-    "mkdirp": ""
-  },
-  "license": "BSD",
-  "readme": "# fstream-ignore\n\nA fstream DirReader that filters out files that match globs in `.ignore`\nfiles throughout the tree, like how git ignores files based on a\n`.gitignore` file.\n\nHere's an example:\n\n```javascript\nvar Ignore = require(\"fstream-ignore\")\nIgnore({ path: __dirname\n       , ignoreFiles: [\".ignore\", \".gitignore\"]\n       })\n  .on(\"child\", function (c) {\n    console.error(c.path.substr(c.root.path.length + 1))\n  })\n  .pipe(tar.Pack())\n  .pipe(fs.createWriteStream(\"foo.tar\"))\n```\n\nThis will tar up the files in __dirname into `foo.tar`, ignoring\nanything matched by the globs in any .iginore or .gitignore file.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/fstream-ignore/issues"
-  },
-  "_id": "fstream-ignore@0.0.7",
-  "dist": {
-    "shasum": "eea3033f0c3728139de7b57ab1b0d6d89c353c63"
-  },
-  "_from": "fstream-ignore@~0.0.5",
-  "_resolved": "https://registry.npmjs.org/fstream-ignore/-/fstream-ignore-0.0.7.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream-npm/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "fstream-npm",
-  "description": "fstream class for creating npm packages",
-  "version": "0.1.6",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/fstream-npm.git"
-  },
-  "main": "./fstream-npm.js",
-  "dependencies": {
-    "fstream-ignore": "~0.0.5",
-    "inherits": "2"
-  },
-  "license": "BSD",
-  "readme": "# fstream-npm\n\nThis is an fstream DirReader class that will read a directory and filter\nthings according to the semantics of what goes in an npm package.\n\nFor example:\n\n```javascript\n// This will print out all the files that would be included\n// by 'npm publish' or 'npm install' of this directory.\n\nvar FN = require(\"fstream-npm\")\nFN({ path: \"./\" })\n  .on(\"child\", function (e) {\n    console.error(e.path.substr(e.root.path.length + 1))\n  })\n```\n\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/fstream-npm/issues"
-  },
-  "_id": "fstream-npm@0.1.6",
-  "_from": "fstream-npm@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-.*.swp
-node_modules/
-examples/deep-copy/
-examples/path/
-examples/filter-copy/
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/.travis.yml	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,3 +0,0 @@
-language: node_js
-node_js:
-  - 0.6
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,76 +0,0 @@
-Like FS streams, but with stat on them, and supporting directories and
-symbolic links, as well as normal files.  Also, you can use this to set
-the stats on a file, even if you don't change its contents, or to create
-a symlink, etc.
-
-So, for example, you can "write" a directory, and it'll call `mkdir`.  You
-can specify a uid and gid, and it'll call `chown`.  You can specify a
-`mtime` and `atime`, and it'll call `utimes`.  You can call it a symlink
-and provide a `linkpath` and it'll call `symlink`.
-
-Note that it won't automatically resolve symbolic links.  So, if you
-call `fstream.Reader('/some/symlink')` then you'll get an object
-that stats and then ends immediately (since it has no data).  To follow
-symbolic links, do this: `fstream.Reader({path:'/some/symlink', follow:
-true })`.
-
-There are various checks to make sure that the bytes emitted are the
-same as the intended size, if the size is set.
-
-## Examples
-
-```javascript
-fstream
-  .Writer({ path: "path/to/file"
-          , mode: 0755
-          , size: 6
-          })
-  .write("hello\n")
-  .end()
-```
-
-This will create the directories if they're missing, and then write
-`hello\n` into the file, chmod it to 0755, and assert that 6 bytes have
-been written when it's done.
-
-```javascript
-fstream
-  .Writer({ path: "path/to/file"
-          , mode: 0755
-          , size: 6
-          , flags: "a"
-          })
-  .write("hello\n")
-  .end()
-```
-
-You can pass flags in, if you want to append to a file.
-
-```javascript
-fstream
-  .Writer({ path: "path/to/symlink"
-          , linkpath: "./file"
-          , SymbolicLink: true
-          , mode: "0755" // octal strings supported
-          })
-  .end()
-```
-
-If isSymbolicLink is a function, it'll be called, and if it returns
-true, then it'll treat it as a symlink.  If it's not a function, then
-any truthy value will make a symlink, or you can set `type:
-'SymbolicLink'`, which does the same thing.
-
-Note that the linkpath is relative to the symbolic link location, not
-the parent dir or cwd.
-
-```javascript
-fstream
-  .Reader("path/to/dir")
-  .pipe(fstream.Writer("path/to/other/dir"))
-```
-
-This works like `cp -Rp path/to/dir path/to/other/dir`.  If the other
-dir exists and isn't a directory, an error is emitted.  It also sets
-the uid, gid, mode, etc. to be identical, so the result is closer to
-`rsync -a` than to a simple copy.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/examples/filter-pipe.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,131 +0,0 @@
-var fstream = require("../fstream.js")
-var path = require("path")
-
-var r = fstream.Reader({ path: path.dirname(__dirname)
-                       , filter: function () {
-                           return !this.basename.match(/^\./) &&
-                                  !this.basename.match(/^node_modules$/) &&
-                                  !this.basename.match(/^deep-copy$/) &&
-                                  !this.basename.match(/^filter-copy$/)
-                         }
-                       })
-
-// this writer will only write directories
-var w = fstream.Writer({ path: path.resolve(__dirname, "filter-copy")
-                       , type: "Directory"
-                       , filter: function () {
-                           return this.type === "Directory"
-                         }
-                       })
-
-var indent = ""
-var escape = {}
-
-r.on("entry", appears)
-r.on("ready", function () {
-  console.error("ready to begin!", r.path)
-})
-
-function appears (entry) {
-  console.error(indent + "a %s appears!", entry.type, entry.basename, typeof entry.basename)
-  if (foggy) {
-    console.error("FOGGY!")
-    var p = entry
-    do {
-      console.error(p.depth, p.path, p._paused)
-    } while (p = p.parent)
-
-    throw new Error("\033[mshould not have entries while foggy")
-  }
-  indent += "\t"
-  entry.on("data", missile(entry))
-  entry.on("end", runaway(entry))
-  entry.on("entry", appears)
-}
-
-var foggy
-function missile (entry) {
-  if (entry.type === "Directory") {
-    var ended = false
-    entry.once("end", function () { ended = true })
-    return function (c) {
-      // throw in some pathological pause()/resume() behavior
-      // just for extra fun.
-      process.nextTick(function () {
-        if (!foggy && !ended) { // && Math.random() < 0.3) {
-          console.error(indent +"%s casts a spell", entry.basename)
-          console.error("\na slowing fog comes over the battlefield...\n\033[32m")
-          entry.pause()
-          entry.once("resume", liftFog)
-          foggy = setTimeout(liftFog, 1000)
-
-          function liftFog (who) {
-            if (!foggy) return
-            if (who) {
-              console.error("%s breaks the spell!", who && who.path)
-            } else {
-              console.error("the spell expires!")
-            }
-            console.error("\033[mthe fog lifts!\n")
-            clearTimeout(foggy)
-            foggy = null
-            if (entry._paused) entry.resume()
-          }
-
-        }
-      })
-    }
-  }
-
-  return function (c) {
-    var e = Math.random() < 0.5
-    console.error(indent + "%s %s for %d damage!",
-                entry.basename,
-                e ? "is struck" : "fires a chunk",
-                c.length)
-  }
-}
-
-function runaway (entry) { return function () {
-  var e = Math.random() < 0.5
-  console.error(indent + "%s %s",
-                entry.basename,
-                e ? "turns to flee" : "is vanquished!")
-  indent = indent.slice(0, -1)
-}}
-
-
-w.on("entry", attacks)
-//w.on("ready", function () { attacks(w) })
-function attacks (entry) {
-  console.error(indent + "%s %s!", entry.basename,
-              entry.type === "Directory" ? "calls for backup" : "attacks")
-  entry.on("entry", attacks)
-}
-
-var ended = false
-var i = 1
-r.on("end", function () {
-  if (foggy) clearTimeout(foggy)
-  console.error("\033[mIT'S OVER!!")
-  console.error("A WINNAR IS YOU!")
-
-  console.log("ok " + (i ++) + " A WINNAR IS YOU")
-  ended = true
-  // now go through and verify that everything in there is a dir.
-  var p = path.resolve(__dirname, "filter-copy")
-  var checker = fstream.Reader({ path: p })
-  checker.checker = true
-  checker.on("child", function (e) {
-    var ok = e.type === "Directory"
-    console.log((ok ? "" : "not ") + "ok " + (i ++) +
-                " should be a dir: " +
-                e.path.substr(checker.path.length + 1))
-  })
-})
-
-process.on("exit", function () {
-  console.log((ended ? "" : "not ") + "ok " + (i ++) + " ended")
-})
-
-r.pipe(w)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/examples/pipe.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,115 +0,0 @@
-var fstream = require("../fstream.js")
-var path = require("path")
-
-var r = fstream.Reader({ path: path.dirname(__dirname)
-                       , filter: function () {
-                           return !this.basename.match(/^\./) &&
-                                  !this.basename.match(/^node_modules$/) &&
-                                  !this.basename.match(/^deep-copy$/)
-                         }
-                       })
-
-var w = fstream.Writer({ path: path.resolve(__dirname, "deep-copy")
-                       , type: "Directory"
-                       })
-
-var indent = ""
-var escape = {}
-
-r.on("entry", appears)
-r.on("ready", function () {
-  console.error("ready to begin!", r.path)
-})
-
-function appears (entry) {
-  console.error(indent + "a %s appears!", entry.type, entry.basename, typeof entry.basename, entry)
-  if (foggy) {
-    console.error("FOGGY!")
-    var p = entry
-    do {
-      console.error(p.depth, p.path, p._paused)
-    } while (p = p.parent)
-
-    throw new Error("\033[mshould not have entries while foggy")
-  }
-  indent += "\t"
-  entry.on("data", missile(entry))
-  entry.on("end", runaway(entry))
-  entry.on("entry", appears)
-}
-
-var foggy
-function missile (entry) {
-  if (entry.type === "Directory") {
-    var ended = false
-    entry.once("end", function () { ended = true })
-    return function (c) {
-      // throw in some pathological pause()/resume() behavior
-      // just for extra fun.
-      process.nextTick(function () {
-        if (!foggy && !ended) { // && Math.random() < 0.3) {
-          console.error(indent +"%s casts a spell", entry.basename)
-          console.error("\na slowing fog comes over the battlefield...\n\033[32m")
-          entry.pause()
-          entry.once("resume", liftFog)
-          foggy = setTimeout(liftFog, 10)
-
-          function liftFog (who) {
-            if (!foggy) return
-            if (who) {
-              console.error("%s breaks the spell!", who && who.path)
-            } else {
-              console.error("the spell expires!")
-            }
-            console.error("\033[mthe fog lifts!\n")
-            clearTimeout(foggy)
-            foggy = null
-            if (entry._paused) entry.resume()
-          }
-
-        }
-      })
-    }
-  }
-
-  return function (c) {
-    var e = Math.random() < 0.5
-    console.error(indent + "%s %s for %d damage!",
-                entry.basename,
-                e ? "is struck" : "fires a chunk",
-                c.length)
-  }
-}
-
-function runaway (entry) { return function () {
-  var e = Math.random() < 0.5
-  console.error(indent + "%s %s",
-                entry.basename,
-                e ? "turns to flee" : "is vanquished!")
-  indent = indent.slice(0, -1)
-}}
-
-
-w.on("entry", attacks)
-//w.on("ready", function () { attacks(w) })
-function attacks (entry) {
-  console.error(indent + "%s %s!", entry.basename,
-              entry.type === "Directory" ? "calls for backup" : "attacks")
-  entry.on("entry", attacks)
-}
-
-var ended = false
-r.on("end", function () {
-  if (foggy) clearTimeout(foggy)
-  console.error("\033[mIT'S OVER!!")
-  console.error("A WINNAR IS YOU!")
-
-  console.log("ok 1 A WINNAR IS YOU")
-  ended = true
-})
-
-process.on("exit", function () {
-  console.log((ended ? "" : "not ") + "ok 2 ended")
-})
-
-r.pipe(w)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/examples/reader.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-var fstream = require("../fstream.js")
-var tap = require("tap")
-var fs = require("fs")
-var path = require("path")
-var children = -1
-var dir = path.dirname(__dirname)
-
-var gotReady = false
-var ended = false
-
-tap.test("reader test", function (t) {
-
-  var r = fstream.Reader({ path: dir
-                         , filter: function () {
-                             // return this.parent === r
-                             return this.parent === r || this === r
-                           }
-                         })
-
-  r.on("ready", function () {
-    gotReady = true
-    children = fs.readdirSync(dir).length
-    console.error("Setting expected children to "+children)
-    t.equal(r.type, "Directory", "should be a directory")
-  })
-
-  r.on("entry", function (entry) {
-    children --
-    if (!gotReady) {
-      t.fail("children before ready!")
-    }
-    t.equal(entry.dirname, r.path, "dirname is parent dir")
-  })
-
-  r.on("error", function (er) {
-    t.fail(er)
-    t.end()
-    process.exit(1)
-  })
-
-  r.on("end", function () {
-    t.equal(children, 0, "should have seen all children")
-    ended = true
-  })
-
-  var closed = false
-  r.on("close", function () {
-    t.ok(ended, "saw end before close")
-    t.notOk(closed, "close should only happen once")
-    closed = true
-    t.end()
-  })
-
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/examples/symlink-write.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-var fstream = require("../fstream.js")
-  , closed = false
-
-fstream
-  .Writer({ path: "path/to/symlink"
-          , linkpath: "./file"
-          , isSymbolicLink: true
-          , mode: "0755" // octal strings supported
-          })
-  .on("close", function () {
-    closed = true
-    var fs = require("fs")
-    var s = fs.lstatSync("path/to/symlink")
-    var isSym = s.isSymbolicLink()
-    console.log((isSym?"":"not ") +"ok 1 should be symlink")
-    var t = fs.readlinkSync("path/to/symlink")
-    var isTarget = t === "./file"
-    console.log((isTarget?"":"not ") +"ok 2 should link to ./file")
-  })
-  .end()
-
-process.on("exit", function () {
-  console.log((closed?"":"not ")+"ok 3 should be closed")
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/fstream.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,31 +0,0 @@
-exports.Abstract = require("./lib/abstract.js")
-exports.Reader = require("./lib/reader.js")
-exports.Writer = require("./lib/writer.js")
-
-exports.File =
-  { Reader: require("./lib/file-reader.js")
-  , Writer: require("./lib/file-writer.js") }
-
-exports.Dir = 
-  { Reader : require("./lib/dir-reader.js")
-  , Writer : require("./lib/dir-writer.js") }
-
-exports.Link =
-  { Reader : require("./lib/link-reader.js")
-  , Writer : require("./lib/link-writer.js") }
-
-exports.Proxy =
-  { Reader : require("./lib/proxy-reader.js")
-  , Writer : require("./lib/proxy-writer.js") }
-
-exports.Reader.Dir = exports.DirReader = exports.Dir.Reader
-exports.Reader.File = exports.FileReader = exports.File.Reader
-exports.Reader.Link = exports.LinkReader = exports.Link.Reader
-exports.Reader.Proxy = exports.ProxyReader = exports.Proxy.Reader
-
-exports.Writer.Dir = exports.DirWriter = exports.Dir.Writer
-exports.Writer.File = exports.FileWriter = exports.File.Writer
-exports.Writer.Link = exports.LinkWriter = exports.Link.Writer
-exports.Writer.Proxy = exports.ProxyWriter = exports.Proxy.Writer
-
-exports.collect = require("./lib/collect.js")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/abstract.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,85 +0,0 @@
-// the parent class for all fstreams.
-
-module.exports = Abstract
-
-var Stream = require("stream").Stream
-  , inherits = require("inherits")
-
-function Abstract () {
-  Stream.call(this)
-}
-
-inherits(Abstract, Stream)
-
-Abstract.prototype.on = function (ev, fn) {
-  if (ev === "ready" && this.ready) {
-    process.nextTick(fn.bind(this))
-  } else {
-    Stream.prototype.on.call(this, ev, fn)
-  }
-  return this
-}
-
-Abstract.prototype.abort = function () {
-  this._aborted = true
-  this.emit("abort")
-}
-
-Abstract.prototype.destroy = function () {}
-
-Abstract.prototype.warn = function (msg, code) {
-  var me = this
-    , er = decorate(msg, code, me)
-  if (!me.listeners("warn").length) {
-    console.error("%s %s\n" +
-                  "path = %s\n" +
-                  "syscall = %s\n" +
-                  "fstream_type = %s\n" +
-                  "fstream_path = %s\n" +
-                  "fstream_unc_path = %s\n" +
-                  "fstream_class = %s\n" +
-                  "fstream_stack =\n%s\n",
-                  code || "UNKNOWN",
-                  er.stack,
-                  er.path,
-                  er.syscall,
-                  er.fstream_type,
-                  er.fstream_path,
-                  er.fstream_unc_path,
-                  er.fstream_class,
-                  er.fstream_stack.join("\n"))
-  } else {
-    me.emit("warn", er)
-  }
-}
-
-Abstract.prototype.info = function (msg, code) {
-  this.emit("info", msg, code)
-}
-
-Abstract.prototype.error = function (msg, code, th) {
-  var er = decorate(msg, code, this)
-  if (th) throw er
-  else this.emit("error", er)
-}
-
-function decorate (er, code, me) {
-  if (!(er instanceof Error)) er = new Error(er)
-  er.code = er.code || code
-  er.path = er.path || me.path
-  er.fstream_type = er.fstream_type || me.type
-  er.fstream_path = er.fstream_path || me.path
-  if (me._path !== me.path) {
-    er.fstream_unc_path = er.fstream_unc_path || me._path
-  }
-  if (me.linkpath) {
-    er.fstream_linkpath = er.fstream_linkpath || me.linkpath
-  }
-  er.fstream_class = er.fstream_class || me.constructor.name
-  er.fstream_stack = er.fstream_stack ||
-    new Error().stack.split(/\n/).slice(3).map(function (s) {
-      return s.replace(/^    at /, "")
-    })
-
-  return er
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/collect.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,67 +0,0 @@
-module.exports = collect
-
-function collect (stream) {
-  if (stream._collected) return
-
-  stream._collected = true
-  stream.pause()
-
-  stream.on("data", save)
-  stream.on("end", save)
-  var buf = []
-  function save (b) {
-    if (typeof b === "string") b = new Buffer(b)
-    if (Buffer.isBuffer(b) && !b.length) return
-    buf.push(b)
-  }
-
-  stream.on("entry", saveEntry)
-  var entryBuffer = []
-  function saveEntry (e) {
-    collect(e)
-    entryBuffer.push(e)
-  }
-
-  stream.on("proxy", proxyPause)
-  function proxyPause (p) {
-    p.pause()
-  }
-
-
-  // replace the pipe method with a new version that will
-  // unlock the buffered stuff.  if you just call .pipe()
-  // without a destination, then it'll re-play the events.
-  stream.pipe = (function (orig) { return function (dest) {
-    // console.error(" === open the pipes", dest && dest.path)
-
-    // let the entries flow through one at a time.
-    // Once they're all done, then we can resume completely.
-    var e = 0
-    ;(function unblockEntry () {
-      var entry = entryBuffer[e++]
-      // console.error(" ==== unblock entry", entry && entry.path)
-      if (!entry) return resume()
-      entry.on("end", unblockEntry)
-      if (dest) dest.add(entry)
-      else stream.emit("entry", entry)
-    })()
-
-    function resume () {
-      stream.removeListener("entry", saveEntry)
-      stream.removeListener("data", save)
-      stream.removeListener("end", save)
-
-      stream.pipe = orig
-      if (dest) stream.pipe(dest)
-
-      buf.forEach(function (b) {
-        if (b) stream.emit("data", b)
-        else stream.emit("end")
-      })
-
-      stream.resume()
-    }
-
-    return dest
-  }})(stream.pipe)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/dir-reader.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,250 +0,0 @@
-// A thing that emits "entry" events with Reader objects
-// Pausing it causes it to stop emitting entry events, and also
-// pauses the current entry if there is one.
-
-module.exports = DirReader
-
-var fs = require("graceful-fs")
-  , fstream = require("../fstream.js")
-  , Reader = fstream.Reader
-  , inherits = require("inherits")
-  , mkdir = require("mkdirp")
-  , path = require("path")
-  , Reader = require("./reader.js")
-  , assert = require("assert").ok
-
-inherits(DirReader, Reader)
-
-function DirReader (props) {
-  var me = this
-  if (!(me instanceof DirReader)) throw new Error(
-    "DirReader must be called as constructor.")
-
-  // should already be established as a Directory type
-  if (props.type !== "Directory" || !props.Directory) {
-    throw new Error("Non-directory type "+ props.type)
-  }
-
-  me.entries = null
-  me._index = -1
-  me._paused = false
-  me._length = -1
-
-  if (props.sort) {
-    this.sort = props.sort
-  }
-
-  Reader.call(this, props)
-}
-
-DirReader.prototype._getEntries = function () {
-  var me = this
-
-  // race condition.  might pause() before calling _getEntries,
-  // and then resume, and try to get them a second time.
-  if (me._gotEntries) return
-  me._gotEntries = true
-
-  fs.readdir(me._path, function (er, entries) {
-    if (er) return me.error(er)
-
-    me.entries = entries
-
-    me.emit("entries", entries)
-    if (me._paused) me.once("resume", processEntries)
-    else processEntries()
-
-    function processEntries () {
-      me._length = me.entries.length
-      if (typeof me.sort === "function") {
-        me.entries = me.entries.sort(me.sort.bind(me))
-      }
-      me._read()
-    }
-  })
-}
-
-// start walking the dir, and emit an "entry" event for each one.
-DirReader.prototype._read = function () {
-  var me = this
-
-  if (!me.entries) return me._getEntries()
-
-  if (me._paused || me._currentEntry || me._aborted) {
-    // console.error("DR paused=%j, current=%j, aborted=%j", me._paused, !!me._currentEntry, me._aborted)
-    return
-  }
-
-  me._index ++
-  if (me._index >= me.entries.length) {
-    if (!me._ended) {
-      me._ended = true
-      me.emit("end")
-      me.emit("close")
-    }
-    return
-  }
-
-  // ok, handle this one, then.
-
-  // save creating a proxy, by stat'ing the thing now.
-  var p = path.resolve(me._path, me.entries[me._index])
-  assert(p !== me._path)
-  assert(me.entries[me._index])
-
-  // set this to prevent trying to _read() again in the stat time.
-  me._currentEntry = p
-  fs[ me.props.follow ? "stat" : "lstat" ](p, function (er, stat) {
-    if (er) return me.error(er)
-
-    var who = me._proxy || me
-
-    stat.path = p
-    stat.basename = path.basename(p)
-    stat.dirname = path.dirname(p)
-    var childProps = me.getChildProps.call(who, stat)
-    childProps.path = p
-    childProps.basename = path.basename(p)
-    childProps.dirname = path.dirname(p)
-
-    var entry = Reader(childProps, stat)
-
-    // console.error("DR Entry", p, stat.size)
-
-    me._currentEntry = entry
-
-    // "entry" events are for direct entries in a specific dir.
-    // "child" events are for any and all children at all levels.
-    // This nomenclature is not completely final.
-
-    entry.on("pause", function (who) {
-      if (!me._paused && !entry._disowned) {
-        me.pause(who)
-      }
-    })
-
-    entry.on("resume", function (who) {
-      if (me._paused && !entry._disowned) {
-        me.resume(who)
-      }
-    })
-
-    entry.on("stat", function (props) {
-      me.emit("_entryStat", entry, props)
-      if (entry._aborted) return
-      if (entry._paused) entry.once("resume", function () {
-        me.emit("entryStat", entry, props)
-      })
-      else me.emit("entryStat", entry, props)
-    })
-
-    entry.on("ready", function EMITCHILD () {
-      // console.error("DR emit child", entry._path)
-      if (me._paused) {
-        // console.error("  DR emit child - try again later")
-        // pause the child, and emit the "entry" event once we drain.
-        // console.error("DR pausing child entry")
-        entry.pause(me)
-        return me.once("resume", EMITCHILD)
-      }
-
-      // skip over sockets.  they can't be piped around properly,
-      // so there's really no sense even acknowledging them.
-      // if someone really wants to see them, they can listen to
-      // the "socket" events.
-      if (entry.type === "Socket") {
-        me.emit("socket", entry)
-      } else {
-        me.emitEntry(entry)
-      }
-    })
-
-    var ended = false
-    entry.on("close", onend)
-    entry.on("disown", onend)
-    function onend () {
-      if (ended) return
-      ended = true
-      me.emit("childEnd", entry)
-      me.emit("entryEnd", entry)
-      me._currentEntry = null
-      if (!me._paused) {
-        me._read()
-      }
-    }
-
-    // XXX Remove this.  Works in node as of 0.6.2 or so.
-    // Long filenames should not break stuff.
-    entry.on("error", function (er) {
-      if (entry._swallowErrors) {
-        me.warn(er)
-        entry.emit("end")
-        entry.emit("close")
-      } else {
-        me.emit("error", er)
-      }
-    })
-
-    // proxy up some events.
-    ; [ "child"
-      , "childEnd"
-      , "warn"
-      ].forEach(function (ev) {
-        entry.on(ev, me.emit.bind(me, ev))
-      })
-  })
-}
-
-DirReader.prototype.disown = function (entry) {
-  entry.emit("beforeDisown")
-  entry._disowned = true
-  entry.parent = entry.root = null
-  if (entry === this._currentEntry) {
-    this._currentEntry = null
-  }
-  entry.emit("disown")
-}
-
-DirReader.prototype.getChildProps = function (stat) {
-  return { depth: this.depth + 1
-         , root: this.root || this
-         , parent: this
-         , follow: this.follow
-         , filter: this.filter
-         , sort: this.props.sort
-         }
-}
-
-DirReader.prototype.pause = function (who) {
-  var me = this
-  if (me._paused) return
-  who = who || me
-  me._paused = true
-  if (me._currentEntry && me._currentEntry.pause) {
-    me._currentEntry.pause(who)
-  }
-  me.emit("pause", who)
-}
-
-DirReader.prototype.resume = function (who) {
-  var me = this
-  if (!me._paused) return
-  who = who || me
-
-  me._paused = false
-  // console.error("DR Emit Resume", me._path)
-  me.emit("resume", who)
-  if (me._paused) {
-    // console.error("DR Re-paused", me._path)
-    return
-  }
-
-  if (me._currentEntry) {
-    if (me._currentEntry.resume) me._currentEntry.resume(who)
-  } else me._read()
-}
-
-DirReader.prototype.emitEntry = function (entry) {
-  this.emit("entry", entry)
-  this.emit("child", entry)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/dir-writer.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,171 +0,0 @@
-// It is expected that, when .add() returns false, the consumer
-// of the DirWriter will pause until a "drain" event occurs. Note
-// that this is *almost always going to be the case*, unless the
-// thing being written is some sort of unsupported type, and thus
-// skipped over.
-
-module.exports = DirWriter
-
-var fs = require("graceful-fs")
-  , fstream = require("../fstream.js")
-  , Writer = require("./writer.js")
-  , inherits = require("inherits")
-  , mkdir = require("mkdirp")
-  , path = require("path")
-  , collect = require("./collect.js")
-
-inherits(DirWriter, Writer)
-
-function DirWriter (props) {
-  var me = this
-  if (!(me instanceof DirWriter)) me.error(
-    "DirWriter must be called as constructor.", null, true)
-
-  // should already be established as a Directory type
-  if (props.type !== "Directory" || !props.Directory) {
-    me.error("Non-directory type "+ props.type + " " +
-                    JSON.stringify(props), null, true)
-  }
-
-  Writer.call(this, props)
-}
-
-DirWriter.prototype._create = function () {
-  var me = this
-  mkdir(me._path, Writer.dirmode, function (er) {
-    if (er) return me.error(er)
-    // ready to start getting entries!
-    me.ready = true
-    me.emit("ready")
-    me._process()
-  })
-}
-
-// a DirWriter has an add(entry) method, but its .write() doesn't
-// do anything.  Why a no-op rather than a throw?  Because this
-// leaves open the door for writing directory metadata for
-// gnu/solaris style dumpdirs.
-DirWriter.prototype.write = function () {
-  return true
-}
-
-DirWriter.prototype.end = function () {
-  this._ended = true
-  this._process()
-}
-
-DirWriter.prototype.add = function (entry) {
-  var me = this
-
-  // console.error("\tadd", entry._path, "->", me._path)
-  collect(entry)
-  if (!me.ready || me._currentEntry) {
-    me._buffer.push(entry)
-    return false
-  }
-
-  // create a new writer, and pipe the incoming entry into it.
-  if (me._ended) {
-    return me.error("add after end")
-  }
-
-  me._buffer.push(entry)
-  me._process()
-
-  return 0 === this._buffer.length
-}
-
-DirWriter.prototype._process = function () {
-  var me = this
-
-  // console.error("DW Process p=%j", me._processing, me.basename)
-
-  if (me._processing) return
-
-  var entry = me._buffer.shift()
-  if (!entry) {
-    // console.error("DW Drain")
-    me.emit("drain")
-    if (me._ended) me._finish()
-    return
-  }
-
-  me._processing = true
-  // console.error("DW Entry", entry._path)
-
-  me.emit("entry", entry)
-
-  // ok, add this entry
-  //
-  // don't allow recursive copying
-  var p = entry
-  do {
-    var pp = p._path || p.path
-    if (pp === me.root._path || pp === me._path ||
-        (pp && pp.indexOf(me._path) === 0)) {
-      // console.error("DW Exit (recursive)", entry.basename, me._path)
-      me._processing = false
-      if (entry._collected) entry.pipe()
-      return me._process()
-    }
-  } while (p = p.parent)
-
-  // console.error("DW not recursive")
-
-  // chop off the entry's root dir, replace with ours
-  var props = { parent: me
-              , root: me.root || me
-              , type: entry.type
-              , depth: me.depth + 1 }
-
-  var p = entry._path || entry.path || entry.props.path
-  if (entry.parent) {
-    p = p.substr(entry.parent._path.length + 1)
-  }
-  // get rid of any ../../ shenanigans
-  props.path = path.join(me.path, path.join("/", p))
-
-  // if i have a filter, the child should inherit it.
-  props.filter = me.filter
-
-  // all the rest of the stuff, copy over from the source.
-  Object.keys(entry.props).forEach(function (k) {
-    if (!props.hasOwnProperty(k)) {
-      props[k] = entry.props[k]
-    }
-  })
-
-  // not sure at this point what kind of writer this is.
-  var child = me._currentChild = new Writer(props)
-  child.on("ready", function () {
-    // console.error("DW Child Ready", child.type, child._path)
-    // console.error("  resuming", entry._path)
-    entry.pipe(child)
-    entry.resume()
-  })
-
-  // XXX Make this work in node.
-  // Long filenames should not break stuff.
-  child.on("error", function (er) {
-    if (child._swallowErrors) {
-      me.warn(er)
-      child.emit("end")
-      child.emit("close")
-    } else {
-      me.emit("error", er)
-    }
-  })
-
-  // we fire _end internally *after* end, so that we don't move on
-  // until any "end" listeners have had their chance to do stuff.
-  child.on("close", onend)
-  var ended = false
-  function onend () {
-    if (ended) return
-    ended = true
-    // console.error("* DW Child end", child.basename)
-    me._currentChild = null
-    me._processing = false
-    me._process()
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/file-reader.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,147 +0,0 @@
-// Basically just a wrapper around an fs.ReadStream
-
-module.exports = FileReader
-
-var fs = require("graceful-fs")
-  , fstream = require("../fstream.js")
-  , Reader = fstream.Reader
-  , inherits = require("inherits")
-  , mkdir = require("mkdirp")
-  , Reader = require("./reader.js")
-  , EOF = {EOF: true}
-  , CLOSE = {CLOSE: true}
-
-inherits(FileReader, Reader)
-
-function FileReader (props) {
-  // console.error("    FR create", props.path, props.size, new Error().stack)
-  var me = this
-  if (!(me instanceof FileReader)) throw new Error(
-    "FileReader must be called as constructor.")
-
-  // should already be established as a File type
-  // XXX Todo: preserve hardlinks by tracking dev+inode+nlink,
-  // with a HardLinkReader class.
-  if (!((props.type === "Link" && props.Link) ||
-        (props.type === "File" && props.File))) {
-    throw new Error("Non-file type "+ props.type)
-  }
-
-  me._buffer = []
-  me._bytesEmitted = 0
-  Reader.call(me, props)
-}
-
-FileReader.prototype._getStream = function () {
-  var me = this
-    , stream = me._stream = fs.createReadStream(me._path, me.props)
-
-  if (me.props.blksize) {
-    stream.bufferSize = me.props.blksize
-  }
-
-  stream.on("open", me.emit.bind(me, "open"))
-
-  stream.on("data", function (c) {
-    // console.error("\t\t%d %s", c.length, me.basename)
-    me._bytesEmitted += c.length
-    // no point saving empty chunks
-    if (!c.length) return
-    else if (me._paused || me._buffer.length) {
-      me._buffer.push(c)
-      me._read()
-    } else me.emit("data", c)
-  })
-
-  stream.on("end", function () {
-    if (me._paused || me._buffer.length) {
-      // console.error("FR Buffering End", me._path)
-      me._buffer.push(EOF)
-      me._read()
-    } else {
-      me.emit("end")
-    }
-
-    if (me._bytesEmitted !== me.props.size) {
-      me.error("Didn't get expected byte count\n"+
-               "expect: "+me.props.size + "\n" +
-               "actual: "+me._bytesEmitted)
-    }
-  })
-
-  stream.on("close", function () {
-    if (me._paused || me._buffer.length) {
-      // console.error("FR Buffering Close", me._path)
-      me._buffer.push(CLOSE)
-      me._read()
-    } else {
-      // console.error("FR close 1", me._path)
-      me.emit("close")
-    }
-  })
-
-  me._read()
-}
-
-FileReader.prototype._read = function () {
-  var me = this
-  // console.error("FR _read", me._path)
-  if (me._paused) {
-    // console.error("FR _read paused", me._path)
-    return
-  }
-
-  if (!me._stream) {
-    // console.error("FR _getStream calling", me._path)
-    return me._getStream()
-  }
-
-  // clear out the buffer, if there is one.
-  if (me._buffer.length) {
-    // console.error("FR _read has buffer", me._buffer.length, me._path)
-    var buf = me._buffer
-    for (var i = 0, l = buf.length; i < l; i ++) {
-      var c = buf[i]
-      if (c === EOF) {
-        // console.error("FR Read emitting buffered end", me._path)
-        me.emit("end")
-      } else if (c === CLOSE) {
-        // console.error("FR Read emitting buffered close", me._path)
-        me.emit("close")
-      } else {
-        // console.error("FR Read emitting buffered data", me._path)
-        me.emit("data", c)
-      }
-
-      if (me._paused) {
-        // console.error("FR Read Re-pausing at "+i, me._path)
-        me._buffer = buf.slice(i)
-        return
-      }
-    }
-    me._buffer.length = 0
-  }
-  // console.error("FR _read done")
-  // that's about all there is to it.
-}
-
-FileReader.prototype.pause = function (who) {
-  var me = this
-  // console.error("FR Pause", me._path)
-  if (me._paused) return
-  who = who || me
-  me._paused = true
-  if (me._stream) me._stream.pause()
-  me.emit("pause", who)
-}
-
-FileReader.prototype.resume = function (who) {
-  var me = this
-  // console.error("FR Resume", me._path)
-  if (!me._paused) return
-  who = who || me
-  me.emit("resume", who)
-  me._paused = false
-  if (me._stream) me._stream.resume()
-  me._read()
-}
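The FileReader deleted above buffers data chunks plus EOF/CLOSE sentinel objects while paused, then replays them in order on resume, re-pausing mid-drain if a consumer pushes back. A reduced sketch of that sentinel-buffer technique (names are illustrative, not the fstream API):

```javascript
// Reduced sketch of the paused-buffer replay used by the deleted
// FileReader: events arriving while paused are queued, with sentinel
// objects standing in for "end" and "close", and drained in order
// when reading resumes.
var EOF = {}, CLOSE = {}

function Replayer (emit) {
  this.emit = emit      // callback taking (event, data)
  this.buffer = []
  this.paused = false
}

Replayer.prototype.push = function (c) {
  if (this.paused) return this.buffer.push(c)
  this.emitOne(c)
}

Replayer.prototype.emitOne = function (c) {
  if (c === EOF) this.emit("end")
  else if (c === CLOSE) this.emit("close")
  else this.emit("data", c)
}

Replayer.prototype.resume = function () {
  this.paused = false
  var buf = this.buffer
  this.buffer = []
  for (var i = 0; i < buf.length; i++) {
    this.emitOne(buf[i])
    if (this.paused) {            // re-paused mid-drain: keep the rest
      this.buffer = buf.slice(i + 1)
      return
    }
  }
}

var seen = []
var r = new Replayer(function (ev, d) { seen.push(d === undefined ? ev : d) })
r.paused = true
r.push("a"); r.push(EOF); r.push(CLOSE)
r.resume()
console.log(seen.join(",")) // "a,end,close"
```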
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/file-writer.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,100 +0,0 @@
-module.exports = FileWriter
-
-var fs = require("graceful-fs")
-  , mkdir = require("mkdirp")
-  , Writer = require("./writer.js")
-  , inherits = require("inherits")
-  , EOF = {}
-
-inherits(FileWriter, Writer)
-
-function FileWriter (props) {
-  var me = this
-  if (!(me instanceof FileWriter)) throw new Error(
-    "FileWriter must be called as constructor.")
-
-  // should already be established as a File type
-  if (props.type !== "File" || !props.File) {
-    throw new Error("Non-file type "+ props.type)
-  }
-
-  me._buffer = []
-  me._bytesWritten = 0
-
-  Writer.call(this, props)
-}
-
-FileWriter.prototype._create = function () {
-  var me = this
-  if (me._stream) return
-
-  var so = {}
-  if (me.props.flags) so.flags = me.props.flags
-  so.mode = Writer.filemode
-  if (me._old && me._old.blksize) so.bufferSize = me._old.blksize
-
-  me._stream = fs.createWriteStream(me._path, so)
-
-  me._stream.on("open", function (fd) {
-    // console.error("FW open", me._buffer, me._path)
-    me.ready = true
-    me._buffer.forEach(function (c) {
-      if (c === EOF) me._stream.end()
-      else me._stream.write(c)
-    })
-    me.emit("ready")
-    // give this a kick just in case it needs it.
-    me.emit("drain")
-  })
-
-  me._stream.on("drain", function () { me.emit("drain") })
-
-  me._stream.on("close", function () {
-    // console.error("\n\nFW Stream Close", me._path, me.size)
-    me._finish()
-  })
-}
-
-FileWriter.prototype.write = function (c) {
-  var me = this
-
-  me._bytesWritten += c.length
-
-  if (!me.ready) {
-    if (!Buffer.isBuffer(c) && typeof c !== 'string')
-      throw new Error('invalid write data')
-    me._buffer.push(c)
-    return false
-  }
-
-  var ret = me._stream.write(c)
-  // console.error("\t-- fw wrote, _stream says", ret, me._stream._queue.length)
-
-  // allow 2 buffered writes, because otherwise there's just too
-  // much stop and go bs.
-  return ret || (me._stream._queue && me._stream._queue.length <= 2)
-}
-
-FileWriter.prototype.end = function (c) {
-  var me = this
-
-  if (c) me.write(c)
-
-  if (!me.ready) {
-    me._buffer.push(EOF)
-    return false
-  }
-
-  return me._stream.end()
-}
-
-FileWriter.prototype._finish = function () {
-  var me = this
-  if (typeof me.size === "number" && me._bytesWritten !== me.size) {
-    me.error(
-      "Did not get expected byte count.\n" +
-      "expect: " + me.size + "\n" +
-      "actual: " + me._bytesWritten)
-  }
-  Writer.prototype._finish.call(me)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/get-type.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,32 +0,0 @@
-module.exports = getType
-
-function getType (st) {
-  var types =
-      [ "Directory"
-      , "File"
-      , "SymbolicLink"
-      , "Link" // special for hardlinks from tarballs
-      , "BlockDevice"
-      , "CharacterDevice"
-      , "FIFO"
-      , "Socket" ]
-    , type
-
-  if (st.type && -1 !== types.indexOf(st.type)) {
-    st[st.type] = true
-    return st.type
-  }
-
-  for (var i = 0, l = types.length; i < l; i ++) {
-    type = types[i]
-    var is = st[type] || st["is" + type]
-    if (typeof is === "function") is = is.call(st)
-    if (is) {
-      st[type] = true
-      st.type = type
-      return type
-    }
-  }
-
-  return null
-}
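The deleted get-type.js maps an fs.Stats-like object (or an fstream props object) to a type name by probing boolean flags or `isX()` methods. A minimal sketch of that lookup, exercised with a hypothetical stand-in object rather than a real fs.Stats instance:

```javascript
// Sketch of the get-type.js lookup removed above: given a stats-like
// object, find the first matching type, cache it as a boolean flag,
// and record it on st.type. Names mirror the deleted code.
var types = [
  "Directory", "File", "SymbolicLink", "Link",
  "BlockDevice", "CharacterDevice", "FIFO", "Socket"
]

function getType (st) {
  // trust an explicit, known type first
  if (st.type && types.indexOf(st.type) !== -1) {
    st[st.type] = true
    return st.type
  }
  // otherwise probe boolean flags or isX() methods (as on fs.Stats)
  for (var i = 0; i < types.length; i++) {
    var is = st[types[i]] || st["is" + types[i]]
    if (typeof is === "function") is = is.call(st)
    if (is) {
      st[types[i]] = true
      st.type = types[i]
      return types[i]
    }
  }
  return null
}

// real fs.Stats objects expose isDirectory(), isFile(), etc.
var fakeStat = { isDirectory: function () { return false },
                 isFile: function () { return true } }
console.log(getType(fakeStat)) // "File"
```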
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/link-reader.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-// Basically just a wrapper around an fs.readlink
-//
-// XXX: Enhance this to support the Link type, by keeping
-// a lookup table of {<dev+inode>:<path>}, so that hardlinks
-// can be preserved in tarballs.
-
-module.exports = LinkReader
-
-var fs = require("graceful-fs")
-  , fstream = require("../fstream.js")
-  , inherits = require("inherits")
-  , mkdir = require("mkdirp")
-  , Reader = require("./reader.js")
-
-inherits(LinkReader, Reader)
-
-function LinkReader (props) {
-  var me = this
-  if (!(me instanceof LinkReader)) throw new Error(
-    "LinkReader must be called as constructor.")
-
-  if (!((props.type === "Link" && props.Link) ||
-        (props.type === "SymbolicLink" && props.SymbolicLink))) {
-    throw new Error("Non-link type "+ props.type)
-  }
-
-  Reader.call(me, props)
-}
-
-// When piping a LinkReader into a LinkWriter, we have to
-// already have the linkpath property set, so that has to
-// happen *before* the "ready" event, which means we need to
-// override the _stat method.
-LinkReader.prototype._stat = function (currentStat) {
-  var me = this
-  fs.readlink(me._path, function (er, linkpath) {
-    if (er) return me.error(er)
-    me.linkpath = me.props.linkpath = linkpath
-    me.emit("linkpath", linkpath)
-    Reader.prototype._stat.call(me, currentStat)
-  })
-}
-
-LinkReader.prototype._read = function () {
-  var me = this
-  if (me._paused) return
-  // basically just a no-op, since we got all the info we need
-  // from the _stat method
-  if (!me._ended) {
-    me.emit("end")
-    me.emit("close")
-    me._ended = true
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/link-writer.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,95 +0,0 @@
-
-module.exports = LinkWriter
-
-var fs = require("graceful-fs")
-  , Writer = require("./writer.js")
-  , inherits = require("inherits")
-  , path = require("path")
-  , rimraf = require("rimraf")
-
-inherits(LinkWriter, Writer)
-
-function LinkWriter (props) {
-  var me = this
-  if (!(me instanceof LinkWriter)) throw new Error(
-    "LinkWriter must be called as constructor.")
-
-  // should already be established as a Link type
-  if (!((props.type === "Link" && props.Link) ||
-        (props.type === "SymbolicLink" && props.SymbolicLink))) {
-    throw new Error("Non-link type "+ props.type)
-  }
-
-  if (props.linkpath === "") props.linkpath = "."
-  if (!props.linkpath) {
-    me.error("Need linkpath property to create " + props.type)
-  }
-
-  Writer.call(this, props)
-}
-
-LinkWriter.prototype._create = function () {
-  // console.error(" LW _create")
-  var me = this
-    , hard = me.type === "Link" || process.platform === "win32"
-    , link = hard ? "link" : "symlink"
-    , lp = hard ? path.resolve(me.dirname, me.linkpath) : me.linkpath
-
-  // can only change the link path by clobbering
-  // For hard links, let's just assume that's always the case, since
-  // there's no good way to read them if we don't already know.
-  if (hard) return clobber(me, lp, link)
-
-  fs.readlink(me._path, function (er, p) {
-    // only skip creation if it's exactly the same link
-    if (p && p === lp) return finish(me)
-    clobber(me, lp, link)
-  })
-}
-
-function clobber (me, lp, link) {
-  rimraf(me._path, function (er) {
-    if (er) return me.error(er)
-    create(me, lp, link)
-  })
-}
-
-function create (me, lp, link) {
-  fs[link](lp, me._path, function (er) {
-    // if this is a hard link, and we're in the process of writing out a
-    // directory, it's very possible that the thing we're linking to
-    // doesn't exist yet (especially if it was intended as a symlink),
-    // so swallow ENOENT errors here and just soldier on.
-    // Additionally, an EPERM or EACCES can happen on win32 if it's trying
-    // to make a link to a directory.  Again, just skip it.
-    // A better solution would be to have fs.symlink be supported on
-    // windows in some nice fashion.
-    if (er) {
-      if ((er.code === "ENOENT" ||
-           er.code === "EACCES" ||
-           er.code === "EPERM" ) && process.platform === "win32") {
-        me.ready = true
-        me.emit("ready")
-        me.emit("end")
-        me.emit("close")
-        me.end = me._finish = function () {}
-      } else return me.error(er)
-    }
-    finish(me)
-  })
-}
-
-function finish (me) {
-  me.ready = true
-  me.emit("ready")
-  if (me._ended && !me._finished) me._finish()
-}
-
-LinkWriter.prototype.end = function () {
-  // console.error("LW finish in end")
-  this._ended = true
-  if (this.ready) {
-    this._finished = true
-    this._finish()
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/proxy-reader.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,93 +0,0 @@
-// A reader for when we don't yet know what kind of thing
-// the thing is.
-
-module.exports = ProxyReader
-
-var Reader = require("./reader.js")
-  , getType = require("./get-type.js")
-  , inherits = require("inherits")
-  , fs = require("graceful-fs")
-
-inherits(ProxyReader, Reader)
-
-function ProxyReader (props) {
-  var me = this
-  if (!(me instanceof ProxyReader)) throw new Error(
-    "ProxyReader must be called as constructor.")
-
-  me.props = props
-  me._buffer = []
-  me.ready = false
-
-  Reader.call(me, props)
-}
-
-ProxyReader.prototype._stat = function () {
-  var me = this
-    , props = me.props
-    // stat the thing to see what the proxy should be.
-    , stat = props.follow ? "stat" : "lstat"
-
-  fs[stat](props.path, function (er, current) {
-    var type
-    if (er || !current) {
-      type = "File"
-    } else {
-      type = getType(current)
-    }
-
-    props[type] = true
-    props.type = me.type = type
-
-    me._old = current
-    me._addProxy(Reader(props, current))
-  })
-}
-
-ProxyReader.prototype._addProxy = function (proxy) {
-  var me = this
-  if (me._proxyTarget) {
-    return me.error("proxy already set")
-  }
-
-  me._proxyTarget = proxy
-  proxy._proxy = me
-
-  ; [ "error"
-    , "data"
-    , "end"
-    , "close"
-    , "linkpath"
-    , "entry"
-    , "entryEnd"
-    , "child"
-    , "childEnd"
-    , "warn"
-    , "stat"
-    ].forEach(function (ev) {
-      // console.error("~~ proxy event", ev, me.path)
-      proxy.on(ev, me.emit.bind(me, ev))
-    })
-
-  me.emit("proxy", proxy)
-
-  proxy.on("ready", function () {
-    // console.error("~~ proxy is ready!", me.path)
-    me.ready = true
-    me.emit("ready")
-  })
-
-  var calls = me._buffer
-  me._buffer.length = 0
-  calls.forEach(function (c) {
-    proxy[c[0]].apply(proxy, c[1])
-  })
-}
-
-ProxyReader.prototype.pause = function () {
-  return this._proxyTarget ? this._proxyTarget.pause() : false
-}
-
-ProxyReader.prototype.resume = function () {
-  return this._proxyTarget ? this._proxyTarget.resume() : false
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/proxy-writer.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,109 +0,0 @@
-// A writer for when we don't know what kind of thing
-// the thing is.  That is, it's not explicitly set,
-// so we're going to make it whatever the thing already
-// is, or "File"
-//
-// Until then, collect all events.
-
-module.exports = ProxyWriter
-
-var Writer = require("./writer.js")
-  , getType = require("./get-type.js")
-  , inherits = require("inherits")
-  , collect = require("./collect.js")
-  , fs = require("fs")
-
-inherits(ProxyWriter, Writer)
-
-function ProxyWriter (props) {
-  var me = this
-  if (!(me instanceof ProxyWriter)) throw new Error(
-    "ProxyWriter must be called as constructor.")
-
-  me.props = props
-  me._needDrain = false
-
-  Writer.call(me, props)
-}
-
-ProxyWriter.prototype._stat = function () {
-  var me = this
-    , props = me.props
-    // stat the thing to see what the proxy should be.
-    , stat = props.follow ? "stat" : "lstat"
-
-  fs[stat](props.path, function (er, current) {
-    var type
-    if (er || !current) {
-      type = "File"
-    } else {
-      type = getType(current)
-    }
-
-    props[type] = true
-    props.type = me.type = type
-
-    me._old = current
-    me._addProxy(Writer(props, current))
-  })
-}
-
-ProxyWriter.prototype._addProxy = function (proxy) {
-  // console.error("~~ set proxy", this.path)
-  var me = this
-  if (me._proxy) {
-    return me.error("proxy already set")
-  }
-
-  me._proxy = proxy
-  ; [ "ready"
-    , "error"
-    , "close"
-    , "pipe"
-    , "drain"
-    , "warn"
-    ].forEach(function (ev) {
-      proxy.on(ev, me.emit.bind(me, ev))
-    })
-
-  me.emit("proxy", proxy)
-
-  var calls = me._buffer
-  calls.forEach(function (c) {
-    // console.error("~~ ~~ proxy buffered call", c[0], c[1])
-    proxy[c[0]].apply(proxy, c[1])
-  })
-  me._buffer.length = 0
-  if (me._needDrain) me.emit("drain")
-}
-
-ProxyWriter.prototype.add = function (entry) {
-  // console.error("~~ proxy add")
-  collect(entry)
-
-  if (!this._proxy) {
-    this._buffer.push(["add", [entry]])
-    this._needDrain = true
-    return false
-  }
-  return this._proxy.add(entry)
-}
-
-ProxyWriter.prototype.write = function (c) {
-  // console.error("~~ proxy write")
-  if (!this._proxy) {
-    this._buffer.push(["write", [c]])
-    this._needDrain = true
-    return false
-  }
-  return this._proxy.write(c)
-}
-
-ProxyWriter.prototype.end = function (c) {
-  // console.error("~~ proxy end")
-  if (!this._proxy) {
-    this._buffer.push(["end", [c]])
-    return false
-  }
-  return this._proxy.end(c)
-}
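ProxyWriter queues method calls as `[name, args]` pairs until the real writer exists, then replays them with `proxy[c[0]].apply(proxy, c[1])`. A sketch of that call-buffering pattern, with a hypothetical backend object standing in for the real Writer:

```javascript
// Buffer method calls as [name, args] pairs until a real backend
// exists, then replay them in order -- the pattern behind the
// deleted ProxyWriter.add/write/end.
function CallBuffer () {
  this._proxy = null
  this._buffer = []
}

CallBuffer.prototype._call = function (name, args) {
  if (!this._proxy) {
    this._buffer.push([name, args])
    return false               // signal "not ready" to the caller
  }
  return this._proxy[name].apply(this._proxy, args)
}

CallBuffer.prototype.write = function (c) { return this._call("write", [c]) }
CallBuffer.prototype.end = function () { return this._call("end", []) }

CallBuffer.prototype.setProxy = function (proxy) {
  this._proxy = proxy
  var calls = this._buffer
  this._buffer = []
  calls.forEach(function (c) {
    proxy[c[0]].apply(proxy, c[1])
  })
}

var log = []
var cb = new CallBuffer()
cb.write("early")                       // buffered, returns false
cb.setProxy({
  write: function (c) { log.push("write:" + c); return true },
  end: function () { log.push("end") }
})
cb.write("late")
cb.end()
console.log(log.join(",")) // "write:early,write:late,end"
```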
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/reader.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,259 +0,0 @@
-
-module.exports = Reader
-
-var fs = require("graceful-fs")
-  , Stream = require("stream").Stream
-  , inherits = require("inherits")
-  , path = require("path")
-  , getType = require("./get-type.js")
-  , hardLinks = Reader.hardLinks = {}
-  , Abstract = require("./abstract.js")
-
-// Must do this *before* loading the child classes
-inherits(Reader, Abstract)
-
-var DirReader = require("./dir-reader.js")
-  , FileReader = require("./file-reader.js")
-  , LinkReader = require("./link-reader.js")
-  , SocketReader = require("./socket-reader.js")
-  , ProxyReader = require("./proxy-reader.js")
-
-function Reader (props, currentStat) {
-  var me = this
-  if (!(me instanceof Reader)) return new Reader(props, currentStat)
-
-  if (typeof props === "string") {
-    props = { path: props }
-  }
-
-  if (!props.path) {
-    me.error("Must provide a path", null, true)
-  }
-
-  // polymorphism.
-  // call fstream.Reader(dir) to get a DirReader object, etc.
-  // Note that, unlike in the Writer case, ProxyReader is going
-  // to be the *normal* state of affairs, since we rarely know
-  // the type of a file prior to reading it.
-
-
-  var type
-    , ClassType
-
-  if (props.type && typeof props.type === "function") {
-    type = props.type
-    ClassType = type
-  } else {
-    type = getType(props)
-    ClassType = Reader
-  }
-
-  if (currentStat && !type) {
-    type = getType(currentStat)
-    props[type] = true
-    props.type = type
-  }
-
-  switch (type) {
-    case "Directory":
-      ClassType = DirReader
-      break
-
-    case "Link":
-      // XXX hard links are just files.
-      // However, it would be good to keep track of files' dev+inode
-      // and nlink values, and create a HardLinkReader that emits
-      // a linkpath value of the original copy, so that the tar
-      // writer can preserve them.
-      // ClassType = HardLinkReader
-      // break
-
-    case "File":
-      ClassType = FileReader
-      break
-
-    case "SymbolicLink":
-      ClassType = LinkReader
-      break
-
-    case "Socket":
-      ClassType = SocketReader
-      break
-
-    case null:
-      ClassType = ProxyReader
-      break
-  }
-
-  if (!(me instanceof ClassType)) {
-    return new ClassType(props)
-  }
-
-  Abstract.call(me)
-
-  me.readable = true
-  me.writable = false
-
-  me.type = type
-  me.props = props
-  me.depth = props.depth = props.depth || 0
-  me.parent = props.parent || null
-  me.root = props.root || (props.parent && props.parent.root) || me
-
-  me._path = me.path = path.resolve(props.path)
-  if (process.platform === "win32") {
-    me.path = me._path = me.path.replace(/\?/g, "_")
-    if (me._path.length >= 260) {
-      // how DOES one create files on the moon?
-      // if the path has spaces in it, then UNC will fail.
-      me._swallowErrors = true
-      //if (me._path.indexOf(" ") === -1) {
-        me._path = "\\\\?\\" + me.path.replace(/\//g, "\\")
-      //}
-    }
-  }
-  me.basename = props.basename = path.basename(me.path)
-  me.dirname = props.dirname = path.dirname(me.path)
-
-  // these have served their purpose, and are now just noisy clutter
-  props.parent = props.root = null
-
-  // console.error("\n\n\n%s setting size to", props.path, props.size)
-  me.size = props.size
-  me.filter = typeof props.filter === "function" ? props.filter : null
-  if (props.sort === "alpha") props.sort = alphasort
-
-  // start the ball rolling.
-  // this will stat the thing, and then call me._read()
-  // to start reading whatever it is.
-  // console.error("calling stat", props.path, currentStat)
-  me._stat(currentStat)
-}
-
-function alphasort (a, b) {
-  return a === b ? 0
-       : a.toLowerCase() > b.toLowerCase() ? 1
-       : a.toLowerCase() < b.toLowerCase() ? -1
-       : a > b ? 1
-       : -1
-}
-
-Reader.prototype._stat = function (currentStat) {
-  var me = this
-    , props = me.props
-    , stat = props.follow ? "stat" : "lstat"
-
-  // console.error("Reader._stat", me._path, currentStat)
-  if (currentStat) process.nextTick(statCb.bind(null, null, currentStat))
-  else fs[stat](me._path, statCb)
-
-
-  function statCb (er, props_) {
-    // console.error("Reader._stat, statCb", me._path, props_, props_.nlink)
-    if (er) return me.error(er)
-
-    Object.keys(props_).forEach(function (k) {
-      props[k] = props_[k]
-    })
-
-    // if it's not the expected size, then abort here.
-    if (undefined !== me.size && props.size !== me.size) {
-      return me.error("incorrect size")
-    }
-    me.size = props.size
-
-    var type = getType(props)
-    // special little thing for handling hardlinks.
-    if (type !== "Directory" && props.nlink && props.nlink > 1) {
-      var k = props.dev + ":" + props.ino
-      // console.error("Reader has nlink", me._path, k)
-      if (hardLinks[k] === me._path || !hardLinks[k]) hardLinks[k] = me._path
-      else {
-        // switch into hardlink mode.
-        type = me.type = me.props.type = "Link"
-        me.Link = me.props.Link = true
-        me.linkpath = me.props.linkpath = hardLinks[k]
-        // console.error("Hardlink detected, switching mode", me._path, me.linkpath)
-        // Setting __proto__ would arguably be the "correct"
-        // approach here, but that just seems too wrong.
-        me._stat = me._read = LinkReader.prototype._read
-      }
-    }
-
-    if (me.type && me.type !== type) {
-      me.error("Unexpected type: " + type)
-    }
-
-    // if the filter doesn't pass, then just skip over this one.
-    // still have to emit end so that dir-walking can move on.
-    if (me.filter) {
-      var who = me._proxy || me
-      // special handling for ProxyReaders
-      if (!me.filter.call(who, who, props)) {
-        if (!me._disowned) {
-          me.abort()
-          me.emit("end")
-          me.emit("close")
-        }
-        return
-      }
-    }
-
-    // last chance to abort or disown before the flow starts!
-    var events = ["_stat", "stat", "ready"]
-    var e = 0
-    ;(function go () {
-      if (me._aborted) {
-        me.emit("end")
-        me.emit("close")
-        return
-      }
-
-      if (me._paused) {
-        me.once("resume", go)
-        return
-      }
-
-      var ev = events[e ++]
-      if (!ev) return me._read()
-      me.emit(ev, props)
-      go()
-    })()
-  }
-}
-
-Reader.prototype.pipe = function (dest, opts) {
-  var me = this
-  if (typeof dest.add === "function") {
-    // piping to a multi-compatible, and we've got directory entries.
-    me.on("entry", function (entry) {
-      var ret = dest.add(entry)
-      if (false === ret) {
-        me.pause()
-      }
-    })
-  }
-
-  // console.error("R Pipe apply Stream Pipe")
-  return Stream.prototype.pipe.apply(this, arguments)
-}
-
-Reader.prototype.pause = function (who) {
-  this._paused = true
-  who = who || this
-  this.emit("pause", who)
-  if (this._stream) this._stream.pause(who)
-}
-
-Reader.prototype.resume = function (who) {
-  this._paused = false
-  who = who || this
-  this.emit("resume", who)
-  if (this._stream) this._stream.resume(who)
-  this._read()
-}
-
-Reader.prototype._read = function () {
-  this.error("Cannot read unknown type: "+this.type)
-}
-
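Reader._stat above detects hardlinks by keying a shared table on `dev + ":" + ino`: the first path seen for a key is read as a regular file, and later paths with the same key are demoted to Links pointing back at it, so tar writers can preserve the link. A sketch of that dedup table (stat values are hypothetical):

```javascript
// Hardlink detection as in the deleted Reader._stat: entries with
// nlink > 1 are keyed by device+inode; the first occurrence is kept
// as a regular file, later ones become Links back to it.
var hardLinks = {}

function classify (path, stat) {
  if (stat.nlink > 1) {
    var k = stat.dev + ":" + stat.ino
    if (!hardLinks[k]) {
      hardLinks[k] = path            // first copy: read it normally
    } else if (hardLinks[k] !== path) {
      return { type: "Link", linkpath: hardLinks[k] }
    }
  }
  return { type: "File" }
}

// two paths sharing one inode (made-up stat values)
var st = { dev: 1, ino: 42, nlink: 2 }
console.log(classify("/a", st).type)      // "File"
console.log(classify("/b", st).linkpath)  // "/a"
```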
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/socket-reader.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,38 +0,0 @@
-// Just get the stats, and then don't do anything.
-// You can't really "read" from a socket.  You "connect" to it.
-// Mostly, this is here so that reading a dir with a socket in it
-// doesn't blow up.
-
-module.exports = SocketReader
-
-var fs = require("graceful-fs")
-  , fstream = require("../fstream.js")
-  , inherits = require("inherits")
-  , mkdir = require("mkdirp")
-  , Reader = require("./reader.js")
-
-inherits(SocketReader, Reader)
-
-function SocketReader (props) {
-  var me = this
-  if (!(me instanceof SocketReader)) throw new Error(
-    "SocketReader must be called as constructor.")
-
-  if (!(props.type === "Socket" && props.Socket)) {
-    throw new Error("Non-socket type "+ props.type)
-  }
-
-  Reader.call(me, props)
-}
-
-SocketReader.prototype._read = function () {
-  var me = this
-  if (me._paused) return
-  // basically just a no-op, since we got all the info we have
-  // from the _stat method
-  if (!me._ended) {
-    me.emit("end")
-    me.emit("close")
-    me._ended = true
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/lib/writer.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,389 +0,0 @@
-
-module.exports = Writer
-
-var fs = require("graceful-fs")
-  , inherits = require("inherits")
-  , rimraf = require("rimraf")
-  , mkdir = require("mkdirp")
-  , path = require("path")
-  , umask = process.platform === "win32" ? 0 : process.umask()
-  , getType = require("./get-type.js")
-  , Abstract = require("./abstract.js")
-
-// Must do this *before* loading the child classes
-inherits(Writer, Abstract)
-
-Writer.dirmode = 0777 & (~umask)
-Writer.filemode = 0666 & (~umask)
-
-var DirWriter = require("./dir-writer.js")
-  , LinkWriter = require("./link-writer.js")
-  , FileWriter = require("./file-writer.js")
-  , ProxyWriter = require("./proxy-writer.js")
-
-// props is the desired state.  current is optionally the current stat,
-// provided here so that subclasses can avoid statting the target
-// more than necessary.
-function Writer (props, current) {
-  var me = this
-
-  if (typeof props === "string") {
-    props = { path: props }
-  }
-
-  if (!props.path) me.error("Must provide a path", null, true)
-
-  // polymorphism.
-  // call fstream.Writer(dir) to get a DirWriter object, etc.
-  var type = getType(props)
-    , ClassType = Writer
-
-  switch (type) {
-    case "Directory":
-      ClassType = DirWriter
-      break
-    case "File":
-      ClassType = FileWriter
-      break
-    case "Link":
-    case "SymbolicLink":
-      ClassType = LinkWriter
-      break
-    case null:
-      // Don't know yet what type to create, so we wrap in a proxy.
-      ClassType = ProxyWriter
-      break
-  }
-
-  if (!(me instanceof ClassType)) return new ClassType(props)
-
-  // now get down to business.
-
-  Abstract.call(me)
-
-  // props is what we want to set.
-  // set some convenience properties as well.
-  me.type = props.type
-  me.props = props
-  me.depth = props.depth || 0
-  me.clobber = false === props.clobber ? props.clobber : true
-  me.parent = props.parent || null
-  me.root = props.root || (props.parent && props.parent.root) || me
-
-  me._path = me.path = path.resolve(props.path)
-  if (process.platform === "win32") {
-    me.path = me._path = me.path.replace(/\?/g, "_")
-    if (me._path.length >= 260) {
-      me._swallowErrors = true
-      me._path = "\\\\?\\" + me.path.replace(/\//g, "\\")
-    }
-  }
-  me.basename = path.basename(props.path)
-  me.dirname = path.dirname(props.path)
-  me.linkpath = props.linkpath || null
-
-  props.parent = props.root = null
-
-  // console.error("\n\n\n%s setting size to", props.path, props.size)
-  me.size = props.size
-
-  if (typeof props.mode === "string") {
-    props.mode = parseInt(props.mode, 8)
-  }
-
-  me.readable = false
-  me.writable = true
-
-  // buffer until ready, or while handling another entry
-  me._buffer = []
-  me.ready = false
-
-  me.filter = typeof props.filter === "function" ? props.filter: null
-
-  // start the ball rolling.
-  // this checks what's there already, and then calls
-  // me._create() to call the impl-specific creation stuff.
-  me._stat(current)
-}
-
-// Calling this means that it's something we can't create.
-// Just assert that it's already there, otherwise raise a warning.
-Writer.prototype._create = function () {
-  var me = this
-  fs[me.props.follow ? "stat" : "lstat"](me._path, function (er, current) {
-    if (er) {
-      return me.warn("Cannot create " + me._path + "\n" +
-                     "Unsupported type: "+me.type, "ENOTSUP")
-    }
-    me._finish()
-  })
-}
-
-Writer.prototype._stat = function (current) {
-  var me = this
-    , props = me.props
-    , stat = props.follow ? "stat" : "lstat"
-    , who = me._proxy || me
-
-  if (current) statCb(null, current)
-  else fs[stat](me._path, statCb)
-
-  function statCb (er, current) {
-    if (me.filter && !me.filter.call(who, who, current)) {
-      me._aborted = true
-      me.emit("end")
-      me.emit("close")
-      return
-    }
-
-    // if it's not there, great.  We'll just create it.
-    // if it is there, then we'll need to change whatever differs
-    if (er || !current) {
-      return create(me)
-    }
-
-    me._old = current
-    var currentType = getType(current)
-
-    // if it's a type change, then we need to clobber or error.
-    // if it's not a type change, then let the impl take care of it.
-    if (currentType !== me.type) {
-      return rimraf(me._path, function (er) {
-        if (er) return me.error(er)
-        me._old = null
-        create(me)
-      })
-    }
-
-    // otherwise, just handle in the app-specific way
-    // this creates a fs.WriteStream, or mkdir's, or whatever
-    create(me)
-  }
-}
-
-function create (me) {
-  // console.error("W create", me._path, Writer.dirmode)
-
-  // XXX Need to clobber non-dirs that are in the way,
-  // unless { clobber: false } in the props.
-  mkdir(path.dirname(me._path), Writer.dirmode, function (er, made) {
-    // console.error("W created", path.dirname(me._path), er)
-    if (er) return me.error(er)
-
-    // later on, we have to set the mode and owner for these
-    me._madeDir = made
-    return me._create()
-  })
-}
-
-function endChmod (me, want, current, path, cb) {
-    var wantMode = want.mode
-      , chmod = want.follow || me.type !== "SymbolicLink"
-              ? "chmod" : "lchmod"
-
-  if (!fs[chmod]) return cb()
-  if (typeof wantMode !== "number") return cb()
-
-  var curMode = current.mode & 0777
-  wantMode = wantMode & 0777
-  if (wantMode === curMode) return cb()
-
-  fs[chmod](path, wantMode, cb)
-}
-
-
-function endChown (me, want, current, path, cb) {
-  // Don't even try it unless root.  Too easy to EPERM.
-  if (process.platform === "win32") return cb()
-  if (!process.getuid || process.getuid() !== 0) return cb()
-  if (typeof want.uid !== "number" &&
-      typeof want.gid !== "number" ) return cb()
-
-  if (current.uid === want.uid &&
-      current.gid === want.gid) return cb()
-
-  var chown = (me.props.follow || me.type !== "SymbolicLink")
-            ? "chown" : "lchown"
-  if (!fs[chown]) return cb()
-
-  if (typeof want.uid !== "number") want.uid = current.uid
-  if (typeof want.gid !== "number") want.gid = current.gid
-
-  fs[chown](path, want.uid, want.gid, cb)
-}
-
-function endUtimes (me, want, current, path, cb) {
-  if (!fs.utimes || process.platform === "win32") return cb()
-
-  var utimes = (want.follow || me.type !== "SymbolicLink")
-             ? "utimes" : "lutimes"
-
-  if (utimes === "lutimes" && !fs[utimes]) {
-    utimes = "utimes"
-  }
-
-  if (!fs[utimes]) return cb()
-
-  var curA = current.atime
-    , curM = current.mtime
-    , meA = want.atime
-    , meM = want.mtime
-
-  if (meA === undefined) meA = curA
-  if (meM === undefined) meM = curM
-
-  if (!isDate(meA)) meA = new Date(meA)
-  if (!isDate(meM)) meM = new Date(meM)
-
-  if (meA.getTime() === curA.getTime() &&
-      meM.getTime() === curM.getTime()) return cb()
-
-  fs[utimes](path, meA, meM, cb)
-}
-
-
-// XXX This function is beastly.  Break it up!
-Writer.prototype._finish = function () {
-  var me = this
-
-  // console.error(" W Finish", me._path, me.size)
-
-  // set up all the things.
-  // At this point, we're already done writing whatever we've gotta write,
-  // adding files to the dir, etc.
-  var todo = 0
-  var errState = null
-  var done = false
-
-  if (me._old) {
-    // the times will almost *certainly* have changed.
-    // adds the utimes syscall, but remove another stat.
-    me._old.atime = new Date(0)
-    me._old.mtime = new Date(0)
-    // console.error(" W Finish Stale Stat", me._path, me.size)
-    setProps(me._old)
-  } else {
-    var stat = me.props.follow ? "stat" : "lstat"
-    // console.error(" W Finish Stating", me._path, me.size)
-    fs[stat](me._path, function (er, current) {
-      // console.error(" W Finish Stated", me._path, me.size, current)
-      if (er) {
-        // if we're in the process of writing out a
-        // directory, it's very possible that the thing we're linking to
-        // doesn't exist yet (especially if it was intended as a symlink),
-        // so swallow ENOENT errors here and just soldier on.
-        if (er.code === "ENOENT" &&
-            (me.type === "Link" || me.type === "SymbolicLink") &&
-            process.platform === "win32") {
-          me.ready = true
-          me.emit("ready")
-          me.emit("end")
-          me.emit("close")
-          me.end = me._finish = function () {}
-          return
-        } else return me.error(er)
-      }
-      setProps(me._old = current)
-    })
-  }
-
-  return
-
-  function setProps (current) {
-    todo += 3
-    endChmod(me, me.props, current, me._path, next("chmod"))
-    endChown(me, me.props, current, me._path, next("chown"))
-    endUtimes(me, me.props, current, me._path, next("utimes"))
-  }
-
-  function next (what) {
-    return function (er) {
-      // console.error("   W Finish", what, todo)
-      if (errState) return
-      if (er) {
-        er.fstream_finish_call = what
-        return me.error(errState = er)
-      }
-      if (--todo > 0) return
-      if (done) return
-      done = true
-
-      // we may still need to set the mode/etc. on some parent dirs
-      // that were created previously.  delay end/close until then.
-      if (!me._madeDir) return end()
-      else endMadeDir(me, me._path, end)
-
-      function end (er) {
-        if (er) {
-          er.fstream_finish_call = "setupMadeDir"
-          return me.error(er)
-        }
-        // all the props have been set, so we're completely done.
-        me.emit("end")
-        me.emit("close")
-      }
-    }
-  }
-}
-
-function endMadeDir (me, p, cb) {
-  var made = me._madeDir
-  // everything *between* made and path.dirname(me._path)
-  // needs to be set up.  Note that this may just be one dir.
-  var d = path.dirname(p)
-
-  endMadeDir_(me, d, function (er) {
-    if (er) return cb(er)
-    if (d === made) {
-      return cb()
-    }
-    endMadeDir(me, d, cb)
-  })
-}
-
-function endMadeDir_ (me, p, cb) {
-  var dirProps = {}
-  Object.keys(me.props).forEach(function (k) {
-    dirProps[k] = me.props[k]
-
-    // only make non-readable dirs if explicitly requested.
-    if (k === "mode" && me.type !== "Directory") {
-      dirProps[k] = dirProps[k] | 0111
-    }
-  })
-
-  var todo = 3
-  , errState = null
-  fs.stat(p, function (er, current) {
-    if (er) return cb(errState = er)
-    endChmod(me, dirProps, current, p, next)
-    endChown(me, dirProps, current, p, next)
-    endUtimes(me, dirProps, current, p, next)
-  })
-
-  function next (er) {
-    if (errState) return
-    if (er) return cb(errState = er)
-    if (-- todo === 0) return cb()
-  }
-}
-
-Writer.prototype.pipe = function () {
-  this.error("Can't pipe from writable stream")
-}
-
-Writer.prototype.add = function () {
-  this.error("Cannot add to non-Directory type")
-}
-
-Writer.prototype.write = function () {
-  return true
-}
-
-function objectToString (d) {
-  return Object.prototype.toString.call(d)
-}
-
-function isDate(d) {
-  return typeof d === 'object' && objectToString(d) === '[object Date]';
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/fstream/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,38 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "fstream",
-  "description": "Advanced file system stream things",
-  "version": "0.1.24",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/fstream.git"
-  },
-  "main": "fstream.js",
-  "engines": {
-    "node": ">=0.6"
-  },
-  "dependencies": {
-    "rimraf": "2",
-    "mkdirp": "0.3",
-    "graceful-fs": "~2.0.0",
-    "inherits": "~2.0.0"
-  },
-  "devDependencies": {
-    "tap": ""
-  },
-  "scripts": {
-    "test": "tap examples/*.js"
-  },
-  "license": "BSD",
-  "readme": "Like FS streams, but with stat on them, and supporting directories and\nsymbolic links, as well as normal files.  Also, you can use this to set\nthe stats on a file, even if you don't change its contents, or to create\na symlink, etc.\n\nSo, for example, you can \"write\" a directory, and it'll call `mkdir`.  You\ncan specify a uid and gid, and it'll call `chown`.  You can specify a\n`mtime` and `atime`, and it'll call `utimes`.  You can call it a symlink\nand provide a `linkpath` and it'll call `symlink`.\n\nNote that it won't automatically resolve symbolic links.  So, if you\ncall `fstream.Reader('/some/symlink')` then you'll get an object\nthat stats and then ends immediately (since it has no data).  To follow\nsymbolic links, do this: `fstream.Reader({path:'/some/symlink', follow:\ntrue })`.\n\nThere are various checks to make sure that the bytes emitted are the\nsame as the intended size, if the size is set.\n\n## Examples\n\n```javascript\nfstream\n  .Writer({ path: \"path/to/file\"\n          , mode: 0755\n          , size: 6\n          })\n  .write(\"hello\\n\")\n  .end()\n```\n\nThis will create the directories if they're missing, and then write\n`hello\\n` into the file, chmod it to 0755, and assert that 6 bytes have\nbeen written when it's done.\n\n```javascript\nfstream\n  .Writer({ path: \"path/to/file\"\n          , mode: 0755\n          , size: 6\n          , flags: \"a\"\n          })\n  .write(\"hello\\n\")\n  .end()\n```\n\nYou can pass flags in, if you want to append to a file.\n\n```javascript\nfstream\n  .Writer({ path: \"path/to/symlink\"\n          , linkpath: \"./file\"\n          , SymbolicLink: true\n          , mode: \"0755\" // octal strings supported\n          })\n  .end()\n```\n\nIf isSymbolicLink is a function, it'll be called, and if it returns\ntrue, then it'll treat it as a symlink.  If it's not a function, then\nany truish value will make a symlink, or you can set `type:\n'SymbolicLink'`, which does the same thing.\n\nNote that the linkpath is relative to the symbolic link location, not\nthe parent dir or cwd.\n\n```javascript\nfstream\n  .Reader(\"path/to/dir\")\n  .pipe(fstream.Writer(\"path/to/other/dir\"))\n```\n\nThis will do like `cp -Rp path/to/dir path/to/other/dir`.  If the other\ndir exists and isn't a directory, then it'll emit an error.  It'll also\nset the uid, gid, mode, etc. to be identical.  In this way, it's more\nlike `rsync -a` than simply a copy.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/fstream/issues"
-  },
-  "_id": "fstream@0.1.24",
-  "_from": "fstream@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-node_modules
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/History.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,10 +0,0 @@
-
-1.1.1 / 2013-04-23 
-==================
-
-  * package.json: Move test stuff to devDeps
-
-1.1.0 / 2013-04-19 
-==================
-
-  * Add support for gist urls
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/Makefile	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-
-test:
-	@./node_modules/.bin/mocha test.js --reporter spec --require should
-
-.PHONY: test
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/Readme.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,41 +0,0 @@
-
-# github-url-from-git
-
-```js
-describe('parse(url)', function(){
-  it('should support git://*', function(){
-    var url = 'git://github.com/jamesor/mongoose-versioner';
-    parse(url).should.equal('https://github.com/jamesor/mongoose-versioner');
-  })
-
-  it('should support git://*.git', function(){
-    var url = 'git://github.com/treygriffith/cellar.git';
-    parse(url).should.equal('https://github.com/treygriffith/cellar');
-  })
-
-  it('should support https://*', function(){
-    var url = 'https://github.com/Empeeric/i18n-node';
-    parse(url).should.equal('https://github.com/Empeeric/i18n-node');
-  })
-
-  it('should support https://*.git', function(){
-    var url = 'https://jpillora@github.com/banchee/tranquil.git';
-    parse(url).should.equal('https://github.com/banchee/tranquil');
-  })
-
-  it('should return undefined on failure', function(){
-    var url = 'git://github.com/justgord/.git';
-    assert(null == parse(url));
-  })
-
-  it('should parse git@gist urls', function() {
-    var url = 'git@gist.github.com:3135914.git';
-    parse(url).should.equal('https://gist.github.com/3135914')
-  })
-
-  it('should parse https://gist urls', function() {
-    var url = 'https://gist.github.com/3135914.git';
-    parse(url).should.equal('https://gist.github.com/3135914')
-  })
-})
-```
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,12 +0,0 @@
-var re = /^(?:https?:\/\/|git:\/\/)?(?:[^@]+@)?(gist.github.com|github.com)[:\/]([^\/]+\/[^\/]+?|[0-9]+)$/
-
-module.exports = function(url){
-  try {
-    var m = re.exec(url.replace(/\.git$/, ''));
-    var host = m[1];
-    var path = m[2];
-    return 'https://' + host + '/' + path;
-  } catch (err) {
-    // ignore
-  }
-};
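The deleted `index.js` above boils down to a single regular expression plus a try/catch. As a quick standalone check of the behaviour its tests describe, the function can be reproduced inline (copied from the deleted file, so the example runs without the package installed):

```javascript
// Reproduction of the deleted github-url-from-git/index.js, inlined so
// the example is self-contained.  For non-matching input, re.exec
// returns null, m[1] throws, and the catch swallows it, so the
// function yields undefined.
var re = /^(?:https?:\/\/|git:\/\/)?(?:[^@]+@)?(gist.github.com|github.com)[:\/]([^\/]+\/[^\/]+?|[0-9]+)$/

function parse (url) {
  try {
    var m = re.exec(url.replace(/\.git$/, ''))
    return 'https://' + m[1] + '/' + m[2]
  } catch (err) {
    // fall through to undefined
  }
}

console.log(parse('git://github.com/treygriffith/cellar.git'))
// https://github.com/treygriffith/cellar
console.log(parse('git@gist.github.com:3135914.git'))
// https://gist.github.com/3135914
console.log(parse('git://github.com/justgord/.git'))
// undefined
```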
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,31 +0,0 @@
-{
-  "name": "github-url-from-git",
-  "version": "1.1.1",
-  "description": "Parse a github git url and return the github repo url",
-  "main": "index.js",
-  "scripts": {
-    "test": "mocha test.js --reporter spec --require should"
-  },
-  "repository": "",
-  "keywords": [
-    "github",
-    "git",
-    "url",
-    "parser"
-  ],
-  "author": "",
-  "license": "MIT",
-  "devDependencies": {
-    "better-assert": "~1.0.0",
-    "mocha": "~1.9.0",
-    "should": "~1.2.2"
-  },
-  "readme": "\n# github-url-from-git\n\n```js\ndescribe('parse(url)', function(){\n  it('should support git://*', function(){\n    var url = 'git://github.com/jamesor/mongoose-versioner';\n    parse(url).should.equal('https://github.com/jamesor/mongoose-versioner');\n  })\n\n  it('should support git://*.git', function(){\n    var url = 'git://github.com/treygriffith/cellar.git';\n    parse(url).should.equal('https://github.com/treygriffith/cellar');\n  })\n\n  it('should support https://*', function(){\n    var url = 'https://github.com/Empeeric/i18n-node';\n    parse(url).should.equal('https://github.com/Empeeric/i18n-node');\n  })\n\n  it('should support https://*.git', function(){\n    var url = 'https://jpillora@github.com/banchee/tranquil.git';\n    parse(url).should.equal('https://github.com/banchee/tranquil');\n  })\n\n  it('should return undefined on failure', function(){\n    var url = 'git://github.com/justgord/.git';\n    assert(null == parse(url));\n  })\n\n  it('should parse git@gist urls', function() {\n    var url = 'git@gist.github.com:3135914.git';\n    parse(url).should.equal('https://gist.github.com/3135914')\n  })\n\n  it('should parse https://gist urls', function() {\n    var url = 'https://gist.github.com/3135914.git';\n    parse(url).should.equal('https://gist.github.com/3135914')\n  })\n})\n```\n",
-  "readmeFilename": "Readme.md",
-  "_id": "github-url-from-git@1.1.1",
-  "dist": {
-    "shasum": "a14903bccbd30c91ea41765ae68ba1b27a53c4d1"
-  },
-  "_from": "github-url-from-git@1.1.1",
-  "_resolved": "https://registry.npmjs.org/github-url-from-git/-/github-url-from-git-1.1.1.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-git/test.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,40 +0,0 @@
-
-var parse = require('./');
-var assert = require('better-assert');
-
-describe('parse(url)', function(){
-  it('should support git://*', function(){
-    var url = 'git://github.com/jamesor/mongoose-versioner';
-    parse(url).should.equal('https://github.com/jamesor/mongoose-versioner');
-  })
-
-  it('should support git://*.git', function(){
-    var url = 'git://github.com/treygriffith/cellar.git';
-    parse(url).should.equal('https://github.com/treygriffith/cellar');
-  })
-
-  it('should support https://*', function(){
-    var url = 'https://github.com/Empeeric/i18n-node';
-    parse(url).should.equal('https://github.com/Empeeric/i18n-node');
-  })
-
-  it('should support https://*.git', function(){
-    var url = 'https://jpillora@github.com/banchee/tranquil.git';
-    parse(url).should.equal('https://github.com/banchee/tranquil');
-  })
-
-  it('should return undefined on failure', function(){
-    var url = 'git://github.com/justgord/.git';
-    assert(null == parse(url));
-  })
-
-  it('should parse git@gist urls', function() {
-    var url = 'git@gist.github.com:3135914.git';
-    parse(url).should.equal('https://gist.github.com/3135914')
-  })
-
-  it('should parse https://gist urls', function() {
-    var url = 'https://gist.github.com/3135914.git';
-    parse(url).should.equal('https://gist.github.com/3135914')
-  })
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-username-repo/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-*.swp
-.*.swp
-
-.DS_Store
-*~
-.project
-.settings
-npm-debug.log
-coverage.html
-.idea
-lib-cov
-
-node_modules
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-username-repo/.travis.yml	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-language: node_js
-node_js:
-  - "0.8"
-  - "0.10"
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-username-repo/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Robert Kowalski ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-username-repo/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,14 +0,0 @@
-[![Build Status](https://travis-ci.org/robertkowalski/github-url-from-username-repo.png?branch=master)](https://travis-ci.org/robertkowalski/github-url-from-username-repo)
-[![Dependency Status](https://gemnasium.com/robertkowalski/github-url-from-username-repo.png)](https://gemnasium.com/robertkowalski/github-url-from-username-repo)
-
-
-# github-url-from-username-repo
-
-## Usage
-
-```javascript
-
-var getUrl = require("github-url-from-username-repo")
-getUrl("visionmedia/express") // git://github.com/visionmedia/express
-
-```
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-username-repo/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,9 +0,0 @@
-module.exports = getUrl
-
-function getUrl (r) {
-  if (!r) return
-  if (/^[\w-]+\/[\w-]+$/.test(r))
-    return "git://github.com/" + r
-  else
-    return null
-}
\ No newline at end of file
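For contrast with the git-url parser above, this entire package is the nine-line `getUrl`. Inlined here as a self-contained sketch of the shorthand expansion its README demonstrates:

```javascript
// Reproduction of the deleted github-url-from-username-repo/index.js:
// a bare "user/repo" shorthand expands to a git:// url; anything else
// returns null (or undefined for empty input).
function getUrl (r) {
  if (!r) return
  if (/^[\w-]+\/[\w-]+$/.test(r))
    return "git://github.com/" + r
  else
    return null
}

console.log(getUrl("visionmedia/express"))
// git://github.com/visionmedia/express
console.log(getUrl("https://github.com/visionmedia/express"))
// null
```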
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/github-url-from-username-repo/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,33 +0,0 @@
-{
-  "name": "github-url-from-username-repo",
-  "version": "0.0.2",
-  "description": "Create urls from username/repo",
-  "main": "index.js",
-  "scripts": {
-    "test": "mocha -R spec"
-  },
-  "devDependencies": {
-    "mocha": "~1.13.0"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git@github.com:robertkowalski/github-url-from-username-repo.git"
-  },
-  "author": {
-    "name": "Robert Kowalski",
-    "email": "rok@kowalski.gd"
-  },
-  "license": "BSD-2-Clause",
-  "bugs": {
-    "url": "https://github.com/robertkowalski/github-url-from-username-repo/issues"
-  },
-  "keywords": [
-    "git",
-    "github",
-    "repo"
-  ],
-  "readme": "[![Build Status](https://travis-ci.org/robertkowalski/github-url-from-username-repo.png?branch=master)](https://travis-ci.org/robertkowalski/github-url-from-username-repo)\n[![Dependency Status](https://gemnasium.com/robertkowalski/github-url-from-username-repo.png)](https://gemnasium.com/robertkowalski/github-url-from-username-repo)\n\n\n# github-url-from-username-repo\n\n## Usage\n\n```javascript\n\nvar getUrl = require(\"github-url-from-username-repo\")\ngetUrl(\"visionmedia/express\") // git://github.com/visionmedia/express\n\n```",
-  "readmeFilename": "README.md",
-  "_id": "github-url-from-username-repo@0.0.2",
-  "_from": "github-url-from-username-repo@"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2 +0,0 @@
-.*.swp
-test/a/
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/.travis.yml	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,3 +0,0 @@
-language: node_js
-node_js:
-  - 0.8
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,250 +0,0 @@
-# Glob
-
-Match files using the patterns the shell uses, like stars and stuff.
-
-This is a glob implementation in JavaScript.  It uses the `minimatch`
-library to do its matching.
-
-## Attention: node-glob users!
-
-The API has changed dramatically between 2.x and 3.x. This library is
-now 100% JavaScript, and the integer flags have been replaced with an
-options object.
-
-Also, there's an event emitter class, proper tests, and all the other
-things you've come to expect from node modules.
-
-And best of all, no compilation!
-
-## Usage
-
-```javascript
-var glob = require("glob")
-
-// options is optional
-glob("**/*.js", options, function (er, files) {
-  // files is an array of filenames.
-  // If the `nonull` option is set, and nothing
-  // was found, then files is ["**/*.js"]
-  // er is an error object or null.
-})
-```
-
-## Features
-
-Please see the [minimatch
-documentation](https://github.com/isaacs/minimatch) for more details.
-
-Supports these glob features:
-
-* Brace Expansion
-* Extended glob matching
-* "Globstar" `**` matching
-
-See:
-
-* `man sh`
-* `man bash`
-* `man 3 fnmatch`
-* `man 5 gitignore`
-* [minimatch documentation](https://github.com/isaacs/minimatch)
-
-## glob(pattern, [options], cb)
-
-* `pattern` {String} Pattern to be matched
-* `options` {Object}
-* `cb` {Function}
-  * `err` {Error | null}
-  * `matches` {Array<String>} filenames found matching the pattern
-
-Perform an asynchronous glob search.
-
-## glob.sync(pattern, [options])
-
-* `pattern` {String} Pattern to be matched
-* `options` {Object}
-* return: {Array<String>} filenames found matching the pattern
-
-Perform a synchronous glob search.
-
-## Class: glob.Glob
-
-Create a Glob object by instantiating the `glob.Glob` class.
-
-```javascript
-var Glob = require("glob").Glob
-var mg = new Glob(pattern, options, cb)
-```
-
-It's an EventEmitter, and starts walking the filesystem to find matches
-immediately.
-
-### new glob.Glob(pattern, [options], [cb])
-
-* `pattern` {String} pattern to search for
-* `options` {Object}
-* `cb` {Function} Called when an error occurs, or matches are found
-  * `err` {Error | null}
-  * `matches` {Array<String>} filenames found matching the pattern
-
-Note that if the `sync` flag is set in the options, then matches will
-be immediately available on the `g.found` member.
-
-### Properties
-
-* `minimatch` The minimatch object that the glob uses.
-* `options` The options object passed in.
-* `error` The error encountered.  When an error is encountered, the
-  glob object is in an undefined state, and should be discarded.
-* `aborted` Boolean which is set to true when calling `abort()`.  There
-  is no way at this time to continue a glob search after aborting, but
-  you can re-use the statCache to avoid having to duplicate syscalls.
-* `statCache` Collection of all the stat results the glob search
-  performed.
-* `cache` Convenience object.  Each field has the following possible
-  values:
-  * `false` - Path does not exist
-  * `true` - Path exists
-  * `1` - Path exists, and is not a directory
-  * `2` - Path exists, and is a directory
-  * `[file, entries, ...]` - Path exists, is a directory, and the
-    array value is the results of `fs.readdir`
-
-### Events
-
-* `end` When the matching is finished, this is emitted with all the
-  matches found.  If the `nonull` option is set, and no match was found,
-  then the `matches` list contains the original pattern.  The matches
-  are sorted, unless the `nosort` flag is set.
-* `match` Every time a match is found, this is emitted with the matched filename.
-* `error` Emitted when an unexpected error is encountered, or whenever
-  any fs error occurs if `options.strict` is set.
-* `abort` When `abort()` is called, this event is raised.
-
-### Methods
-
-* `abort` Stop the search.
-
-### Options
-
-All the options that can be passed to Minimatch can also be passed to
-Glob to change pattern matching behavior.  Also, some have been added,
-or have glob-specific ramifications.
-
-All options are false by default, unless otherwise noted.
-
-All options are added to the glob object, as well.
-
-* `cwd` The current working directory in which to search.  Defaults
-  to `process.cwd()`.
-* `root` The place where patterns starting with `/` will be mounted
-  onto.  Defaults to `path.resolve(options.cwd, "/")` (`/` on Unix
-  systems, and `C:\` or some such on Windows.)
-* `dot` Include `.dot` files in normal matches and `globstar` matches.
-  Note that an explicit dot in a portion of the pattern will always
-  match dot files.
-* `nomount` By default, a pattern starting with a forward-slash will be
-  "mounted" onto the root setting, so that a valid filesystem path is
-  returned.  Set this flag to disable that behavior.
-* `mark` Add a `/` character to directory matches.  Note that this
-  requires additional stat calls.
-* `nosort` Don't sort the results.
-* `stat` Set to true to stat *all* results.  This reduces performance
-  somewhat, and is completely unnecessary, unless `readdir` is presumed
-  to be an untrustworthy indicator of file existence.  It will cause
-  ELOOP to be triggered one level sooner in the case of cyclical
-  symbolic links.
-* `silent` When an unusual error is encountered
-  when attempting to read a directory, a warning will be printed to
-  stderr.  Set the `silent` option to true to suppress these warnings.
-* `strict` When an unusual error is encountered
-  when attempting to read a directory, the process will just continue on
-  in search of other matches.  Set the `strict` option to raise an error
-  in these cases.
-* `cache` See `cache` property above.  Pass in a previously generated
-  cache object to save some fs calls.
-* `statCache` A cache of results of filesystem information, to prevent
-  unnecessary stat calls.  While it should not normally be necessary to
-  set this, you may pass the statCache from one glob() call to the
-  options object of another, if you know that the filesystem will not
-  change between calls.  (See "Race Conditions" below.)
-* `sync` Perform a synchronous glob search.
-* `nounique` In some cases, brace-expanded patterns can result in the
-  same file showing up multiple times in the result set.  By default,
-  this implementation prevents duplicates in the result set.
-  Set this flag to disable that behavior.
-* `nonull` Set to never return an empty set, instead returning a set
-  containing the pattern itself.  This is the default in glob(3).
-* `nocase` Perform a case-insensitive match.  Note that case-insensitive
-  filesystems will sometimes result in glob returning results that are
-  case-insensitively matched anyway, since readdir and stat will not
-  raise an error.
-* `debug` Set to enable debug logging in minimatch and glob.
-* `globDebug` Set to enable debug logging in glob, but not minimatch.
-
-## Comparisons to other fnmatch/glob implementations
-
-While strict compliance with the existing standards is a worthwhile
-goal, some discrepancies exist between node-glob and other
-implementations, and are intentional.
-
-If the pattern starts with a `!` character, then it is negated.  Set the
-`nonegate` flag to suppress this behavior, and treat leading `!`
-characters normally.  This is perhaps relevant if you wish to start the
-pattern with a negative extglob pattern like `!(a|B)`.  Multiple `!`
-characters at the start of a pattern will negate the pattern multiple
-times.
-
-If a pattern starts with `#`, then it is treated as a comment, and
-will not match anything.  Use `\#` to match a literal `#` at the
-start of a line, or set the `nocomment` flag to suppress this behavior.
-
-The double-star character `**` is supported by default, unless the
-`noglobstar` flag is set.  This is supported in the manner of bsdglob
-and bash 4.1, where `**` only has special significance if it is the only
-thing in a path part.  That is, `a/**/b` will match `a/x/y/b`, but
-`a/**b` will not.
-
-If an escaped pattern has no matches, and the `nonull` flag is set,
-then glob returns the pattern as-provided, rather than
-interpreting the character escapes.  For example,
-`glob.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than
-`"*a?"`.  This is akin to setting the `nullglob` option in bash, except
-that it does not resolve escaped pattern characters.
-
-If brace expansion is not disabled, then it is performed before any
-other interpretation of the glob pattern.  Thus, a pattern like
-`+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded
-**first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are
-checked for validity.  Since those two are valid, matching proceeds.
-
-## Windows
-
-**Please only use forward-slashes in glob expressions.**
-
-Though windows uses either `/` or `\` as its path separator, only `/`
-characters are used by this glob implementation.  You must use
-forward-slashes **only** in glob expressions.  Back-slashes will always
-be interpreted as escape characters, not path separators.
-
-Results from absolute patterns such as `/foo/*` are mounted onto the
-root setting using `path.join`.  On windows, this will by default result
-in `/foo/*` matching `C:\foo\bar.txt`.
-
-## Race Conditions
-
-Glob searching, by its very nature, is susceptible to race conditions,
-since it relies on directory walking and such.
-
-As a result, it is possible that a file that exists when glob looks for
-it may have been deleted or modified by the time it returns the result.
-
-As part of its internal implementation, this program caches all stat
-and readdir calls that it makes, in order to cut down on system
-overhead.  However, this also makes it even more susceptible to races,
-especially if the cache or statCache objects are reused between glob
-calls.
-
-Users are thus advised not to use a glob result as a guarantee of
-filesystem state in the face of rapid changes.  For the vast majority
-of operations, this is never a problem.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/examples/g.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,9 +0,0 @@
-var Glob = require("../").Glob
-
-var pattern = "test/a/**/[cg]/../[cg]"
-console.log(pattern)
-
-var mg = new Glob(pattern, {mark: true, sync:true}, function (er, matches) {
-  console.log("matches", matches)
-})
-console.log("after")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/examples/usr-local.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,9 +0,0 @@
-var Glob = require("../").Glob
-
-var pattern = "{./*/*,/*,/usr/local/*}"
-console.log(pattern)
-
-var mg = new Glob(pattern, {mark: true}, function (er, matches) {
-  console.log("matches", matches)
-})
-console.log("after")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/glob.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,675 +0,0 @@
-// Approach:
-//
-// 1. Get the minimatch set
-// 2. For each pattern in the set, PROCESS(pattern)
-// 3. Store matches per-set, then uniq them
-//
-// PROCESS(pattern)
-// Get the first [n] items from pattern that are all strings
-// Join these together.  This is PREFIX.
-//   If there is no more remaining, then stat(PREFIX) and
-//   add to matches if it succeeds.  END.
-// readdir(PREFIX) as ENTRIES
-//   If fails, END
-//   If pattern[n] is GLOBSTAR
-//     // handle the case where the globstar match is empty
-//     // by pruning it out, and testing the resulting pattern
-//     PROCESS(pattern[0..n] + pattern[n+1 .. $])
-//     // handle other cases.
-//     for ENTRY in ENTRIES (not dotfiles)
-//       // attach globstar + tail onto the entry
-//       PROCESS(pattern[0..n] + ENTRY + pattern[n .. $])
-//
-//   else // not globstar
-//     for ENTRY in ENTRIES (not dotfiles, unless pattern[n] is dot)
-//       Test ENTRY against pattern[n]
-//       If fails, continue
-//       If passes, PROCESS(pattern[0..n] + item + pattern[n+1 .. $])
-//
-// Caveat:
-//   Cache all stats and readdirs results to minimize syscall.  Since all
-//   we ever care about is existence and directory-ness, we can just keep
-//   `true` for files, and [children,...] for directories, or `false` for
-//   things that don't exist.
-
-
-
-module.exports = glob
-
-var fs = require("fs")
-, minimatch = require("minimatch")
-, Minimatch = minimatch.Minimatch
-, inherits = require("inherits")
-, EE = require("events").EventEmitter
-, path = require("path")
-, isDir = {}
-, assert = require("assert").ok
-
-function glob (pattern, options, cb) {
-  if (typeof options === "function") cb = options, options = {}
-  if (!options) options = {}
-
-  if (typeof options === "number") {
-    deprecated()
-    return
-  }
-
-  var g = new Glob(pattern, options, cb)
-  return g.sync ? g.found : g
-}
-
-glob.fnmatch = deprecated
-
-function deprecated () {
-  throw new Error("glob's interface has changed. Please see the docs.")
-}
-
-glob.sync = globSync
-function globSync (pattern, options) {
-  if (typeof options === "number") {
-    deprecated()
-    return
-  }
-
-  options = options || {}
-  options.sync = true
-  return glob(pattern, options)
-}
-
-
-glob.Glob = Glob
-inherits(Glob, EE)
-function Glob (pattern, options, cb) {
-  if (!(this instanceof Glob)) {
-    return new Glob(pattern, options, cb)
-  }
-
-  if (typeof cb === "function") {
-    this.on("error", cb)
-    this.on("end", function (matches) {
-      cb(null, matches)
-    })
-  }
-
-  options = options || {}
-
-  this.EOF = {}
-  this._emitQueue = []
-
-  this.maxDepth = options.maxDepth || 1000
-  this.maxLength = options.maxLength || Infinity
-  this.cache = options.cache || {}
-  this.statCache = options.statCache || {}
-
-  this.changedCwd = false
-  var cwd = process.cwd()
-  if (!options.hasOwnProperty("cwd")) this.cwd = cwd
-  else {
-    this.cwd = options.cwd
-    this.changedCwd = path.resolve(options.cwd) !== cwd
-  }
-
-  this.root = options.root || path.resolve(this.cwd, "/")
-  this.root = path.resolve(this.root)
-  if (process.platform === "win32")
-    this.root = this.root.replace(/\\/g, "/")
-
-  this.nomount = !!options.nomount
-
-  if (!pattern) {
-    throw new Error("must provide pattern")
-  }
-
-  // base-matching: just use globstar for that.
-  if (options.matchBase && -1 === pattern.indexOf("/")) {
-    if (options.noglobstar) {
-      throw new Error("base matching requires globstar")
-    }
-    pattern = "**/" + pattern
-  }
-
-  this.strict = options.strict !== false
-  this.dot = !!options.dot
-  this.mark = !!options.mark
-  this.sync = !!options.sync
-  this.nounique = !!options.nounique
-  this.nonull = !!options.nonull
-  this.nosort = !!options.nosort
-  this.nocase = !!options.nocase
-  this.stat = !!options.stat
-
-  this.debug = !!options.debug || !!options.globDebug
-  if (this.debug)
-    this.log = console.error
-
-  this.silent = !!options.silent
-
-  var mm = this.minimatch = new Minimatch(pattern, options)
-  this.options = mm.options
-  pattern = this.pattern = mm.pattern
-
-  this.error = null
-  this.aborted = false
-
-  // list of all the patterns that ** has resolved to, so
-  // we can avoid visiting multiple times.
-  this._globstars = {}
-
-  EE.call(this)
-
-  // process each pattern in the minimatch set
-  var n = this.minimatch.set.length
-
-  // The matches are stored as {<filename>: true,...} so that
-  // duplicates are automagically pruned.
-  // Later, we do an Object.keys() on these.
-  // Keep them as a list so we can fill in when nonull is set.
-  this.matches = new Array(n)
-
-  this.minimatch.set.forEach(iterator.bind(this))
-  function iterator (pattern, i, set) {
-    this._process(pattern, 0, i, function (er) {
-      if (er) this.emit("error", er)
-      if (-- n <= 0) this._finish()
-    })
-  }
-}
-
-Glob.prototype.log = function () {}
-
-Glob.prototype._finish = function () {
-  assert(this instanceof Glob)
-
-  var nou = this.nounique
-  , all = nou ? [] : {}
-
-  for (var i = 0, l = this.matches.length; i < l; i ++) {
-    var matches = this.matches[i]
-    this.log("matches[%d] =", i, matches)
-    // do like the shell, and spit out the literal glob
-    if (!matches) {
-      if (this.nonull) {
-        var literal = this.minimatch.globSet[i]
-        if (nou) all.push(literal)
-        else all[literal] = true
-      }
-    } else {
-      // had matches
-      var m = Object.keys(matches)
-      if (nou) all.push.apply(all, m)
-      else m.forEach(function (m) {
-        all[m] = true
-      })
-    }
-  }
-
-  if (!nou) all = Object.keys(all)
-
-  if (!this.nosort) {
-    all = all.sort(this.nocase ? alphasorti : alphasort)
-  }
-
-  if (this.mark) {
-    // at *some* point we statted all of these
-    all = all.map(function (m) {
-      var sc = this.cache[m]
-      if (!sc)
-        return m
-      var isDir = (Array.isArray(sc) || sc === 2)
-      if (isDir && m.slice(-1) !== "/") {
-        return m + "/"
-      }
-      if (!isDir && m.slice(-1) === "/") {
-        return m.replace(/\/+$/, "")
-      }
-      return m
-    }, this)
-  }
-
-  this.log("emitting end", all)
-
-  this.EOF = this.found = all
-  this.emitMatch(this.EOF)
-}
-
-function alphasorti (a, b) {
-  a = a.toLowerCase()
-  b = b.toLowerCase()
-  return alphasort(a, b)
-}
-
-function alphasort (a, b) {
-  return a > b ? 1 : a < b ? -1 : 0
-}
-
-Glob.prototype.abort = function () {
-  this.aborted = true
-  this.emit("abort")
-}
-
-Glob.prototype.pause = function () {
-  if (this.paused) return
-  if (this.sync)
-    this.emit("error", new Error("Can't pause/resume sync glob"))
-  this.paused = true
-  this.emit("pause")
-}
-
-Glob.prototype.resume = function () {
-  if (!this.paused) return
-  if (this.sync)
-    this.emit("error", new Error("Can't pause/resume sync glob"))
-  this.paused = false
-  this.emit("resume")
-  this._processEmitQueue()
-  //process.nextTick(this.emit.bind(this, "resume"))
-}
-
-Glob.prototype.emitMatch = function (m) {
-  if (!this.stat || this.statCache[m] || m === this.EOF) {
-    this._emitQueue.push(m)
-    this._processEmitQueue()
-  } else {
-    this._stat(m, function(exists, isDir) {
-      if (exists) {
-        this._emitQueue.push(m)
-        this._processEmitQueue()
-      }
-    })
-  }
-}
-
-Glob.prototype._processEmitQueue = function (m) {
-  while (!this._processingEmitQueue &&
-         !this.paused) {
-    this._processingEmitQueue = true
-    var m = this._emitQueue.shift()
-    if (!m) {
-      this._processingEmitQueue = false
-      break
-    }
-
-    this.log('emit!', m === this.EOF ? "end" : "match")
-
-    this.emit(m === this.EOF ? "end" : "match", m)
-    this._processingEmitQueue = false
-  }
-}
-
-Glob.prototype._process = function (pattern, depth, index, cb_) {
-  assert(this instanceof Glob)
-
-  var cb = function cb (er, res) {
-    assert(this instanceof Glob)
-    if (this.paused) {
-      if (!this._processQueue) {
-        this._processQueue = []
-        this.once("resume", function () {
-          var q = this._processQueue
-          this._processQueue = null
-          q.forEach(function (cb) { cb() })
-        })
-      }
-      this._processQueue.push(cb_.bind(this, er, res))
-    } else {
-      cb_.call(this, er, res)
-    }
-  }.bind(this)
-
-  if (this.aborted) return cb()
-
-  if (depth > this.maxDepth) return cb()
-
-  // Get the first [n] parts of pattern that are all strings.
-  var n = 0
-  while (typeof pattern[n] === "string") {
-    n ++
-  }
-  // now n is the index of the first one that is *not* a string.
-
-  // see if there's anything else
-  var prefix
-  switch (n) {
-    // if not, then this is rather simple
-    case pattern.length:
-      prefix = pattern.join("/")
-      this._stat(prefix, function (exists, isDir) {
-        // either it's there, or it isn't.
-        // nothing more to do, either way.
-        if (exists) {
-          if (prefix && isAbsolute(prefix) && !this.nomount) {
-            if (prefix.charAt(0) === "/") {
-              prefix = path.join(this.root, prefix)
-            } else {
-              prefix = path.resolve(this.root, prefix)
-            }
-          }
-
-          if (process.platform === "win32")
-            prefix = prefix.replace(/\\/g, "/")
-
-          this.matches[index] = this.matches[index] || {}
-          this.matches[index][prefix] = true
-          this.emitMatch(prefix)
-        }
-        return cb()
-      })
-      return
-
-    case 0:
-      // pattern *starts* with some non-trivial item.
-      // going to readdir(cwd), but not include the prefix in matches.
-      prefix = null
-      break
-
-    default:
-      // pattern has some string bits in the front.
-      // whatever it starts with, whether that's "absolute" like /foo/bar,
-      // or "relative" like "../baz"
-      prefix = pattern.slice(0, n)
-      prefix = prefix.join("/")
-      break
-  }
-
-  // get the list of entries.
-  var read
-  if (prefix === null) read = "."
-  else if (isAbsolute(prefix) || isAbsolute(pattern.join("/"))) {
-    if (!prefix || !isAbsolute(prefix)) {
-      prefix = path.join("/", prefix)
-    }
-    read = prefix = path.resolve(prefix)
-
-    // if (process.platform === "win32")
-    //   read = prefix = prefix.replace(/^[a-zA-Z]:|\\/g, "/")
-
-    this.log('absolute: ', prefix, this.root, pattern, read)
-  } else {
-    read = prefix
-  }
-
-  this.log('readdir(%j)', read, this.cwd, this.root)
-
-  return this._readdir(read, function (er, entries) {
-    if (er) {
-      // not a directory!
-      // this means that, whatever else comes after this, it can never match
-      return cb()
-    }
-
-    // globstar is special
-    if (pattern[n] === minimatch.GLOBSTAR) {
-      // test without the globstar, and with every child both below
-      // and replacing the globstar.
-      var s = [ pattern.slice(0, n).concat(pattern.slice(n + 1)) ]
-      entries.forEach(function (e) {
-        if (e.charAt(0) === "." && !this.dot) return
-        // instead of the globstar
-        s.push(pattern.slice(0, n).concat(e).concat(pattern.slice(n + 1)))
-        // below the globstar
-        s.push(pattern.slice(0, n).concat(e).concat(pattern.slice(n)))
-      }, this)
-
-      s = s.filter(function (pattern) {
-        var key = gsKey(pattern)
-        var seen = !this._globstars[key]
-        this._globstars[key] = true
-        return seen
-      }, this)
-
-      if (!s.length)
-        return cb()
-
-      // now asyncForEach over this
-      var l = s.length
-      , errState = null
-      s.forEach(function (gsPattern) {
-        this._process(gsPattern, depth + 1, index, function (er) {
-          if (errState) return
-          if (er) return cb(errState = er)
-          if (--l <= 0) return cb()
-        })
-      }, this)
-
-      return
-    }
-
-    // not a globstar
-    // It will only match dot entries if it starts with a dot, or if
-    // dot is set.  Stuff like @(.foo|.bar) isn't allowed.
-    var pn = pattern[n]
-    var rawGlob = pattern[n]._glob
-    , dotOk = this.dot || rawGlob.charAt(0) === "."
-
-    entries = entries.filter(function (e) {
-      return (e.charAt(0) !== "." || dotOk) &&
-             e.match(pattern[n])
-    })
-
-    // If n === pattern.length - 1, then there's no need for the extra stat
-    // *unless* the user has specified "mark" or "stat" explicitly.
-    // We know that they exist, since the readdir returned them.
-    if (n === pattern.length - 1 &&
-        !this.mark &&
-        !this.stat) {
-      entries.forEach(function (e) {
-        if (prefix) {
-          if (prefix !== "/") e = prefix + "/" + e
-          else e = prefix + e
-        }
-        if (e.charAt(0) === "/" && !this.nomount) {
-          e = path.join(this.root, e)
-        }
-
-        if (process.platform === "win32")
-          e = e.replace(/\\/g, "/")
-
-        this.matches[index] = this.matches[index] || {}
-        this.matches[index][e] = true
-        this.emitMatch(e)
-      }, this)
-      return cb.call(this)
-    }
-
-
-    // now test all the remaining entries as stand-ins for that part
-    // of the pattern.
-    var l = entries.length
-    , errState = null
-    if (l === 0) return cb() // no matches possible
-    entries.forEach(function (e) {
-      var p = pattern.slice(0, n).concat(e).concat(pattern.slice(n + 1))
-      this._process(p, depth + 1, index, function (er) {
-        if (errState) return
-        if (er) return cb(errState = er)
-        if (--l === 0) return cb.call(this)
-      })
-    }, this)
-  })
-
-}
-
-function gsKey (pattern) {
-  return '**' + pattern.map(function (p) {
-    return (p === minimatch.GLOBSTAR) ? '**' : (''+p)
-  }).join('/')
-}
-
-Glob.prototype._stat = function (f, cb) {
-  assert(this instanceof Glob)
-  var abs = f
-  if (f.charAt(0) === "/") {
-    abs = path.join(this.root, f)
-  } else if (this.changedCwd) {
-    abs = path.resolve(this.cwd, f)
-  }
-
-  if (f.length > this.maxLength) {
-    var er = new Error("Path name too long")
-    er.code = "ENAMETOOLONG"
-    er.path = f
-    return this._afterStat(f, abs, cb, er)
-  }
-
-  this.log('stat', [this.cwd, f, '=', abs])
-
-  if (!this.stat && this.cache.hasOwnProperty(f)) {
-    var exists = this.cache[f]
-    , isDir = exists && (Array.isArray(exists) || exists === 2)
-    if (this.sync) return cb.call(this, !!exists, isDir)
-    return process.nextTick(cb.bind(this, !!exists, isDir))
-  }
-
-  var stat = this.statCache[abs]
-  if (this.sync || stat) {
-    var er
-    try {
-      stat = fs.statSync(abs)
-    } catch (e) {
-      er = e
-    }
-    this._afterStat(f, abs, cb, er, stat)
-  } else {
-    fs.stat(abs, this._afterStat.bind(this, f, abs, cb))
-  }
-}
-
-Glob.prototype._afterStat = function (f, abs, cb, er, stat) {
-  var exists
-  assert(this instanceof Glob)
-
-  if (abs.slice(-1) === "/" && stat && !stat.isDirectory()) {
-    this.log("should be ENOTDIR, fake it")
-
-    er = new Error("ENOTDIR, not a directory '" + abs + "'")
-    er.path = abs
-    er.code = "ENOTDIR"
-    stat = null
-  }
-
-  var emit = !this.statCache[abs]
-  this.statCache[abs] = stat
-
-  if (er || !stat) {
-    exists = false
-  } else {
-    exists = stat.isDirectory() ? 2 : 1
-    if (emit)
-      this.emit('stat', f, stat)
-  }
-  this.cache[f] = this.cache[f] || exists
-  cb.call(this, !!exists, exists === 2)
-}
-
-Glob.prototype._readdir = function (f, cb) {
-  assert(this instanceof Glob)
-  var abs = f
-  if (f.charAt(0) === "/") {
-    abs = path.join(this.root, f)
-  } else if (isAbsolute(f)) {
-    abs = f
-  } else if (this.changedCwd) {
-    abs = path.resolve(this.cwd, f)
-  }
-
-  if (f.length > this.maxLength) {
-    var er = new Error("Path name too long")
-    er.code = "ENAMETOOLONG"
-    er.path = f
-    return this._afterReaddir(f, abs, cb, er)
-  }
-
-  this.log('readdir', [this.cwd, f, abs])
-  if (this.cache.hasOwnProperty(f)) {
-    var c = this.cache[f]
-    if (Array.isArray(c)) {
-      if (this.sync) return cb.call(this, null, c)
-      return process.nextTick(cb.bind(this, null, c))
-    }
-
-    if (!c || c === 1) {
-      // either ENOENT or ENOTDIR
-      var code = c ? "ENOTDIR" : "ENOENT"
-      , er = new Error((c ? "Not a directory" : "Not found") + ": " + f)
-      er.path = f
-      er.code = code
-      this.log(f, er)
-      if (this.sync) return cb.call(this, er)
-      return process.nextTick(cb.bind(this, er))
-    }
-
-    // at this point, c === 2, meaning it's a dir, but we haven't
-    // had to read it yet, or c === true, meaning it's *something*
-    // but we don't have any idea what.  Need to read it, either way.
-  }
-
-  if (this.sync) {
-    var er, entries
-    try {
-      entries = fs.readdirSync(abs)
-    } catch (e) {
-      er = e
-    }
-    return this._afterReaddir(f, abs, cb, er, entries)
-  }
-
-  fs.readdir(abs, this._afterReaddir.bind(this, f, abs, cb))
-}
-
-Glob.prototype._afterReaddir = function (f, abs, cb, er, entries) {
-  assert(this instanceof Glob)
-  if (entries && !er) {
-    this.cache[f] = entries
-    // if we haven't asked to stat everything for suresies, then just
-    // assume that everything in there exists, so we can avoid
-    // having to stat it a second time.  This also gets us one step
-    // further into ELOOP territory.
-    if (!this.mark && !this.stat) {
-      entries.forEach(function (e) {
-        if (f === "/") e = f + e
-        else e = f + "/" + e
-        this.cache[e] = true
-      }, this)
-    }
-
-    return cb.call(this, er, entries)
-  }
-
-  // now handle errors, and cache the information
-  if (er) switch (er.code) {
-    case "ENOTDIR": // totally normal. means it *does* exist.
-      this.cache[f] = 1
-      return cb.call(this, er)
-    case "ENOENT": // not terribly unusual
-    case "ELOOP":
-    case "ENAMETOOLONG":
-    case "UNKNOWN":
-      this.cache[f] = false
-      return cb.call(this, er)
-    default: // some unusual error.  Treat as failure.
-      this.cache[f] = false
-      if (this.strict) this.emit("error", er)
-      if (!this.silent) console.error("glob error", er)
-      return cb.call(this, er)
-  }
-}
-
-var isAbsolute = process.platform === "win32" ? absWin : absUnix
-
-function absWin (p) {
-  if (absUnix(p)) return true
-  // pull off the device/UNC bit from a windows path.
-  // from node's lib/path.js
-  var splitDeviceRe =
-      /^([a-zA-Z]:|[\\\/]{2}[^\\\/]+[\\\/]+[^\\\/]+)?([\\\/])?([\s\S]*?)$/
-    , result = splitDeviceRe.exec(p)
-    , device = result[1] || ''
-    , isUnc = device && device.charAt(1) !== ':'
-    , isAbsolute = !!result[2] || isUnc // UNC paths are always absolute
-
-  return isAbsolute
-}
-
-function absUnix (p) {
-  return p.charAt(0) === "/" || p === ""
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/glob/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,38 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "glob",
-  "description": "a little globber",
-  "version": "3.2.6",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/node-glob.git"
-  },
-  "main": "glob.js",
-  "engines": {
-    "node": "*"
-  },
-  "dependencies": {
-    "minimatch": "~0.2.11",
-    "inherits": "2"
-  },
-  "devDependencies": {
-    "tap": "~0.4.0",
-    "mkdirp": "0",
-    "rimraf": "1"
-  },
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "license": "BSD",
-  "readme": "# Glob\n\nMatch files using the patterns the shell uses, like stars and stuff.\n\nThis is a glob implementation in JavaScript.  It uses the `minimatch`\nlibrary to do its matching.\n\n## Attention: node-glob users!\n\nThe API has changed dramatically between 2.x and 3.x. This library is\nnow 100% JavaScript, and the integer flags have been replaced with an\noptions object.\n\nAlso, there's an event emitter class, proper tests, and all the other\nthings you've come to expect from node modules.\n\nAnd best of all, no compilation!\n\n## Usage\n\n```javascript\nvar glob = require(\"glob\")\n\n// options is optional\nglob(\"**/*.js\", options, function (er, files) {\n  // files is an array of filenames.\n  // If the `nonull` option is set, and nothing\n  // was found, then files is [\"**/*.js\"]\n  // er is an error object or null.\n})\n```\n\n## Features\n\nPlease see the [minimatch\ndocumentation](https://github.com/isaacs/minimatch) for more details.\n\nSupports these glob features:\n\n* Brace Expansion\n* Extended glob matching\n* \"Globstar\" `**` matching\n\nSee:\n\n* `man sh`\n* `man bash`\n* `man 3 fnmatch`\n* `man 5 gitignore`\n* [minimatch documentation](https://github.com/isaacs/minimatch)\n\n## glob(pattern, [options], cb)\n\n* `pattern` {String} Pattern to be matched\n* `options` {Object}\n* `cb` {Function}\n  * `err` {Error | null}\n  * `matches` {Array<String>} filenames found matching the pattern\n\nPerform an asynchronous glob search.\n\n## glob.sync(pattern, [options])\n\n* `pattern` {String} Pattern to be matched\n* `options` {Object}\n* return: {Array<String>} filenames found matching the pattern\n\nPerform a synchronous glob search.\n\n## Class: glob.Glob\n\nCreate a Glob object by instanting the `glob.Glob` class.\n\n```javascript\nvar Glob = require(\"glob\").Glob\nvar mg = new Glob(pattern, options, cb)\n```\n\nIt's an EventEmitter, and starts walking the filesystem to find matches\nimmediately.\n\n### new glob.Glob(pattern, [options], [cb])\n\n* `pattern` {String} pattern to search for\n* `options` {Object}\n* `cb` {Function} Called when an error occurs, or matches are found\n  * `err` {Error | null}\n  * `matches` {Array<String>} filenames found matching the pattern\n\nNote that if the `sync` flag is set in the options, then matches will\nbe immediately available on the `g.found` member.\n\n### Properties\n\n* `minimatch` The minimatch object that the glob uses.\n* `options` The options object passed in.\n* `error` The error encountered.  When an error is encountered, the\n  glob object is in an undefined state, and should be discarded.\n* `aborted` Boolean which is set to true when calling `abort()`.  There\n  is no way at this time to continue a glob search after aborting, but\n  you can re-use the statCache to avoid having to duplicate syscalls.\n* `statCache` Collection of all the stat results the glob search\n  performed.\n* `cache` Convenience object.  Each field has the following possible\n  values:\n  * `false` - Path does not exist\n  * `true` - Path exists\n  * `1` - Path exists, and is not a directory\n  * `2` - Path exists, and is a directory\n  * `[file, entries, ...]` - Path exists, is a directory, and the\n    array value is the results of `fs.readdir`\n\n### Events\n\n* `end` When the matching is finished, this is emitted with all the\n  matches found.  If the `nonull` option is set, and no match was found,\n  then the `matches` list contains the original pattern.  The matches\n  are sorted, unless the `nosort` flag is set.\n* `match` Every time a match is found, this is emitted with the matched.\n* `error` Emitted when an unexpected error is encountered, or whenever\n  any fs error occurs if `options.strict` is set.\n* `abort` When `abort()` is called, this event is raised.\n\n### Methods\n\n* `abort` Stop the search.\n\n### Options\n\nAll the options that can be passed to Minimatch can also be passed to\nGlob to change pattern matching behavior.  Also, some have been added,\nor have glob-specific ramifications.\n\nAll options are false by default, unless otherwise noted.\n\nAll options are added to the glob object, as well.\n\n* `cwd` The current working directory in which to search.  Defaults\n  to `process.cwd()`.\n* `root` The place where patterns starting with `/` will be mounted\n  onto.  Defaults to `path.resolve(options.cwd, \"/\")` (`/` on Unix\n  systems, and `C:\\` or some such on Windows.)\n* `dot` Include `.dot` files in normal matches and `globstar` matches.\n  Note that an explicit dot in a portion of the pattern will always\n  match dot files.\n* `nomount` By default, a pattern starting with a forward-slash will be\n  \"mounted\" onto the root setting, so that a valid filesystem path is\n  returned.  Set this flag to disable that behavior.\n* `mark` Add a `/` character to directory matches.  Note that this\n  requires additional stat calls.\n* `nosort` Don't sort the results.\n* `stat` Set to true to stat *all* results.  This reduces performance\n  somewhat, and is completely unnecessary, unless `readdir` is presumed\n  to be an untrustworthy indicator of file existence.  It will cause\n  ELOOP to be triggered one level sooner in the case of cyclical\n  symbolic links.\n* `silent` When an unusual error is encountered\n  when attempting to read a directory, a warning will be printed to\n  stderr.  Set the `silent` option to true to suppress these warnings.\n* `strict` When an unusual error is encountered\n  when attempting to read a directory, the process will just continue on\n  in search of other matches.  Set the `strict` option to raise an error\n  in these cases.\n* `cache` See `cache` property above.  Pass in a previously generated\n  cache object to save some fs calls.\n* `statCache` A cache of results of filesystem information, to prevent\n  unnecessary stat calls.  While it should not normally be necessary to\n  set this, you may pass the statCache from one glob() call to the\n  options object of another, if you know that the filesystem will not\n  change between calls.  (See \"Race Conditions\" below.)\n* `sync` Perform a synchronous glob search.\n* `nounique` In some cases, brace-expanded patterns can result in the\n  same file showing up multiple times in the result set.  By default,\n  this implementation prevents duplicates in the result set.\n  Set this flag to disable that behavior.\n* `nonull` Set to never return an empty set, instead returning a set\n  containing the pattern itself.  This is the default in glob(3).\n* `nocase` Perform a case-insensitive match.  Note that case-insensitive\n  filesystems will sometimes result in glob returning results that are\n  case-insensitively matched anyway, since readdir and stat will not\n  raise an error.\n* `debug` Set to enable debug logging in minimatch and glob.\n* `globDebug` Set to enable debug logging in glob, but not minimatch.\n\n## Comparisons to other fnmatch/glob implementations\n\nWhile strict compliance with the existing standards is a worthwhile\ngoal, some discrepancies exist between node-glob and other\nimplementations, and are intentional.\n\nIf the pattern starts with a `!` character, then it is negated.  Set the\n`nonegate` flag to suppress this behavior, and treat leading `!`\ncharacters normally.  This is perhaps relevant if you wish to start the\npattern with a negative extglob pattern like `!(a|B)`.  Multiple `!`\ncharacters at the start of a pattern will negate the pattern multiple\ntimes.\n\nIf a pattern starts with `#`, then it is treated as a comment, and\nwill not match anything.  Use `\\#` to match a literal `#` at the\nstart of a line, or set the `nocomment` flag to suppress this behavior.\n\nThe double-star character `**` is supported by default, unless the\n`noglobstar` flag is set.  This is supported in the manner of bsdglob\nand bash 4.1, where `**` only has special significance if it is the only\nthing in a path part.  That is, `a/**/b` will match `a/x/y/b`, but\n`a/**b` will not.\n\nIf an escaped pattern has no matches, and the `nonull` flag is set,\nthen glob returns the pattern as-provided, rather than\ninterpreting the character escapes.  For example,\n`glob.match([], \"\\\\*a\\\\?\")` will return `\"\\\\*a\\\\?\"` rather than\n`\"*a?\"`.  This is akin to setting the `nullglob` option in bash, except\nthat it does not resolve escaped pattern characters.\n\nIf brace expansion is not disabled, then it is performed before any\nother interpretation of the glob pattern.  Thus, a pattern like\n`+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded\n**first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are\nchecked for validity.  Since those two are valid, matching proceeds.\n\n## Windows\n\n**Please only use forward-slashes in glob expressions.**\n\nThough windows uses either `/` or `\\` as its path separator, only `/`\ncharacters are used by this glob implementation.  You must use\nforward-slashes **only** in glob expressions.  Back-slashes will always\nbe interpreted as escape characters, not path separators.\n\nResults from absolute patterns such as `/foo/*` are mounted onto the\nroot setting using `path.join`.  On windows, this will by default result\nin `/foo/*` matching `C:\\foo\\bar.txt`.\n\n## Race Conditions\n\nGlob searching, by its very nature, is susceptible to race conditions,\nsince it relies on directory walking and such.\n\nAs a result, it is possible that a file that exists when glob looks for\nit may have been deleted or modified by the time it returns the result.\n\nAs part of its internal implementation, this program caches all stat\nand readdir calls that it makes, in order to cut down on system\noverhead.  However, this also makes it even more susceptible to races,\nespecially if the cache or statCache objects are reused between glob\ncalls.\n\nUsers are thus advised not to use a glob result as a guarantee of\nfilesystem state in the face of rapid changes.  For the vast majority\nof operations, this is never a problem.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/node-glob/issues"
-  },
-  "_id": "glob@3.2.6",
-  "_from": "glob@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/graceful-fs/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-node_modules/
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/graceful-fs/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/graceful-fs/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,26 +0,0 @@
-# graceful-fs
-
-graceful-fs functions as a drop-in replacement for the fs module,
-making various improvements.
-
-The improvements are meant to normalize behavior across different
-platforms and environments, and to make filesystem access more
-resilient to errors.
-
-## Improvements over fs module
-
-graceful-fs:
-
-* Queues up `open` and `readdir` calls, and retries them once
-  something closes if there is an EMFILE error from too many file
-  descriptors.
-* fixes `lchmod` for Node versions prior to 0.6.2.
-* implements `fs.lutimes` if possible. Otherwise it becomes a noop.
-* ignores `EINVAL` and `EPERM` errors in `chown`, `fchown` or
-  `lchown` if the user isn't root.
-* makes `lchmod` and `lchown` become noops, if not available.
-* retries reading a file if `read` results in EAGAIN error.
-
-On Windows, it retries renaming a file for up to one second if an `EACCES`
-or `EPERM` error occurs, likely because antivirus software has locked
-the directory.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/graceful-fs/graceful-fs.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,159 +0,0 @@
-// Monkey-patching the fs module.
-// It's ugly, but there is simply no other way to do this.
-var fs = module.exports = require('fs')
-
-var assert = require('assert')
-
-// fix up some busted stuff, mostly on windows and old nodes
-require('./polyfills.js')
-
-// The EMFILE enqueuing stuff
-
-var util = require('util')
-
-function noop () {}
-
-var debug = noop
-var util = require('util')
-if (util.debuglog)
-  debug = util.debuglog('gfs')
-else if (/\bgfs\b/i.test(process.env.NODE_DEBUG || ''))
-  debug = function() {
-    var m = util.format.apply(util, arguments)
-    m = 'GFS: ' + m.split(/\n/).join('\nGFS: ')
-    console.error(m)
-  }
-
-if (/\bgfs\b/i.test(process.env.NODE_DEBUG || '')) {
-  process.on('exit', function() {
-    debug('fds', fds)
-    debug(queue)
-    assert.equal(queue.length, 0)
-  })
-}
-
-
-var originalOpen = fs.open
-fs.open = open
-
-function open(path, flags, mode, cb) {
-  if (typeof mode === "function") cb = mode, mode = null
-  if (typeof cb !== "function") cb = noop
-  new OpenReq(path, flags, mode, cb)
-}
-
-function OpenReq(path, flags, mode, cb) {
-  this.path = path
-  this.flags = flags
-  this.mode = mode
-  this.cb = cb
-  Req.call(this)
-}
-
-util.inherits(OpenReq, Req)
-
-OpenReq.prototype.process = function() {
-  originalOpen.call(fs, this.path, this.flags, this.mode, this.done)
-}
-
-var fds = {}
-OpenReq.prototype.done = function(er, fd) {
-  debug('open done', er, fd)
-  if (fd)
-    fds['fd' + fd] = this.path
-  Req.prototype.done.call(this, er, fd)
-}
-
-
-var originalReaddir = fs.readdir
-fs.readdir = readdir
-
-function readdir(path, cb) {
-  if (typeof cb !== "function") cb = noop
-  new ReaddirReq(path, cb)
-}
-
-function ReaddirReq(path, cb) {
-  this.path = path
-  this.cb = cb
-  Req.call(this)
-}
-
-util.inherits(ReaddirReq, Req)
-
-ReaddirReq.prototype.process = function() {
-  originalReaddir.call(fs, this.path, this.done)
-}
-
-ReaddirReq.prototype.done = function(er, files) {
-  Req.prototype.done.call(this, er, files)
-  onclose()
-}
-
-
-var originalClose = fs.close
-fs.close = close
-
-function close (fd, cb) {
-  debug('close', fd)
-  if (typeof cb !== "function") cb = noop
-  delete fds['fd' + fd]
-  originalClose.call(fs, fd, function(er) {
-    onclose()
-    cb(er)
-  })
-}
-
-
-var originalCloseSync = fs.closeSync
-fs.closeSync = closeSync
-
-function closeSync (fd) {
-  try {
-    return originalCloseSync(fd)
-  } finally {
-    onclose()
-  }
-}
-
-
-// Req class
-function Req () {
-  // start processing
-  this.done = this.done.bind(this)
-  this.failures = 0
-  this.process()
-}
-
-Req.prototype.done = function (er, result) {
-  var tryAgain = false
-  if (er) {
-    var code = er.code
-    var tryAgain = code === "EMFILE"
-    if (process.platform === "win32")
-      tryAgain = tryAgain || code === "OK"
-  }
-
-  if (tryAgain) {
-    this.failures ++
-    enqueue(this)
-  } else {
-    var cb = this.cb
-    cb(er, result)
-  }
-}
-
-var queue = []
-
-function enqueue(req) {
-  queue.push(req)
-  debug('enqueue %d %s', queue.length, req.constructor.name, req)
-}
-
-function onclose() {
-  var req = queue.shift()
-  if (req) {
-    debug('process', req.constructor.name, req)
-    req.process()
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/graceful-fs/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,48 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me"
-  },
-  "name": "graceful-fs",
-  "description": "A drop-in replacement for fs, making various improvements.",
-  "version": "2.0.1",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/node-graceful-fs.git"
-  },
-  "main": "graceful-fs.js",
-  "engines": {
-    "node": ">=0.4.0"
-  },
-  "directories": {
-    "test": "test"
-  },
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "keywords": [
-    "fs",
-    "module",
-    "reading",
-    "retry",
-    "retries",
-    "queue",
-    "error",
-    "errors",
-    "handling",
-    "EMFILE",
-    "EAGAIN",
-    "EINVAL",
-    "EPERM",
-    "EACCESS"
-  ],
-  "license": "BSD",
-  "readme": "# graceful-fs\n\ngraceful-fs functions as a drop-in replacement for the fs module,\nmaking various improvements.\n\nThe improvements are meant to normalize behavior across different\nplatforms and environments, and to make filesystem access more\nresilient to errors.\n\n## Improvements over fs module\n\ngraceful-fs:\n\n* Queues up `open` and `readdir` calls, and retries them once\n  something closes if there is an EMFILE error from too many file\n  descriptors.\n* fixes `lchmod` for Node versions prior to 0.6.2.\n* implements `fs.lutimes` if possible. Otherwise it becomes a noop.\n* ignores `EINVAL` and `EPERM` errors in `chown`, `fchown` or\n  `lchown` if the user isn't root.\n* makes `lchmod` and `lchown` become noops, if not available.\n* retries reading a file if `read` results in EAGAIN error.\n\nOn Windows, it retries renaming a file for up to one second if `EACCESS`\nor `EPERM` error occurs, likely because antivirus software has locked\nthe directory.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/node-graceful-fs/issues"
-  },
-  "_id": "graceful-fs@2.0.1",
-  "_from": "graceful-fs@~2.0.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/graceful-fs/polyfills.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,228 +0,0 @@
-var fs = require('fs')
-var constants = require('constants')
-
-var origCwd = process.cwd
-var cwd = null
-process.cwd = function() {
-  if (!cwd)
-    cwd = origCwd.call(process)
-  return cwd
-}
-var chdir = process.chdir
-process.chdir = function(d) {
-  cwd = null
-  chdir.call(process, d)
-}
-
-// (re-)implement some things that are known busted or missing.
-
-// lchmod, broken prior to 0.6.2
-// back-port the fix here.
-if (constants.hasOwnProperty('O_SYMLINK') &&
-    process.version.match(/^v0\.6\.[0-2]|^v0\.5\./)) {
-  fs.lchmod = function (path, mode, callback) {
-    callback = callback || noop
-    fs.open( path
-           , constants.O_WRONLY | constants.O_SYMLINK
-           , mode
-           , function (err, fd) {
-      if (err) {
-        callback(err)
-        return
-      }
-      // prefer to return the chmod error, if one occurs,
-      // but still try to close, and report closing errors if they occur.
-      fs.fchmod(fd, mode, function (err) {
-        fs.close(fd, function(err2) {
-          callback(err || err2)
-        })
-      })
-    })
-  }
-
-  fs.lchmodSync = function (path, mode) {
-    var fd = fs.openSync(path, constants.O_WRONLY | constants.O_SYMLINK, mode)
-
-    // prefer to return the chmod error, if one occurs,
-    // but still try to close, and report closing errors if they occur.
-    var err, err2
-    try {
-      var ret = fs.fchmodSync(fd, mode)
-    } catch (er) {
-      err = er
-    }
-    try {
-      fs.closeSync(fd)
-    } catch (er) {
-      err2 = er
-    }
-    if (err || err2) throw (err || err2)
-    return ret
-  }
-}
-
-
-// lutimes implementation, or no-op
-if (!fs.lutimes) {
-  if (constants.hasOwnProperty("O_SYMLINK")) {
-    fs.lutimes = function (path, at, mt, cb) {
-      fs.open(path, constants.O_SYMLINK, function (er, fd) {
-        cb = cb || noop
-        if (er) return cb(er)
-        fs.futimes(fd, at, mt, function (er) {
-          fs.close(fd, function (er2) {
-            return cb(er || er2)
-          })
-        })
-      })
-    }
-
-    fs.lutimesSync = function (path, at, mt) {
-      var fd = fs.openSync(path, constants.O_SYMLINK)
-        , err
-        , err2
-        , ret
-
-      try {
-        var ret = fs.futimesSync(fd, at, mt)
-      } catch (er) {
-        err = er
-      }
-      try {
-        fs.closeSync(fd)
-      } catch (er) {
-        err2 = er
-      }
-      if (err || err2) throw (err || err2)
-      return ret
-    }
-
-  } else if (fs.utimensat && constants.hasOwnProperty("AT_SYMLINK_NOFOLLOW")) {
-    // maybe utimensat will be bound soonish?
-    fs.lutimes = function (path, at, mt, cb) {
-      fs.utimensat(path, at, mt, constants.AT_SYMLINK_NOFOLLOW, cb)
-    }
-
-    fs.lutimesSync = function (path, at, mt) {
-      return fs.utimensatSync(path, at, mt, constants.AT_SYMLINK_NOFOLLOW)
-    }
-
-  } else {
-    fs.lutimes = function (_a, _b, _c, cb) { process.nextTick(cb) }
-    fs.lutimesSync = function () {}
-  }
-}
-
-
-// https://github.com/isaacs/node-graceful-fs/issues/4
-// Chown should not fail on einval or eperm if non-root.
-
-fs.chown = chownFix(fs.chown)
-fs.fchown = chownFix(fs.fchown)
-fs.lchown = chownFix(fs.lchown)
-
-fs.chownSync = chownFixSync(fs.chownSync)
-fs.fchownSync = chownFixSync(fs.fchownSync)
-fs.lchownSync = chownFixSync(fs.lchownSync)
-
-function chownFix (orig) {
-  if (!orig) return orig
-  return function (target, uid, gid, cb) {
-    return orig.call(fs, target, uid, gid, function (er, res) {
-      if (chownErOk(er)) er = null
-      cb(er, res)
-    })
-  }
-}
-
-function chownFixSync (orig) {
-  if (!orig) return orig
-  return function (target, uid, gid) {
-    try {
-      return orig.call(fs, target, uid, gid)
-    } catch (er) {
-      if (!chownErOk(er)) throw er
-    }
-  }
-}
-
-function chownErOk (er) {
-  // if there's no getuid, or if getuid() is something other than 0,
-  // and the error is EINVAL or EPERM, then just ignore it.
-  // This specific case is a silent failure in cp, install, tar,
-  // and most other unix tools that manage permissions.
-  // When running as root, or if other types of errors are encountered,
-  // then it's strict.
-  if (!er || (!process.getuid || process.getuid() !== 0)
-      && (er.code === "EINVAL" || er.code === "EPERM")) return true
-}
-
-
-// if lchmod/lchown do not exist, then make them no-ops
-if (!fs.lchmod) {
-  fs.lchmod = function (path, mode, cb) {
-    process.nextTick(cb)
-  }
-  fs.lchmodSync = function () {}
-}
-if (!fs.lchown) {
-  fs.lchown = function (path, uid, gid, cb) {
-    process.nextTick(cb)
-  }
-  fs.lchownSync = function () {}
-}
-
-
-
-// on Windows, A/V software can lock the directory, causing this
-// to fail with an EACCES or EPERM if the directory contains newly
-// created files.  Try again on failure, for up to 1 second.
-if (process.platform === "win32") {
-  var rename_ = fs.rename
-  fs.rename = function rename (from, to, cb) {
-    var start = Date.now()
-    rename_(from, to, function CB (er) {
-      if (er
-          && (er.code === "EACCES" || er.code === "EPERM")
-          && Date.now() - start < 1000) {
-        return rename_(from, to, CB)
-      }
-      cb(er)
-    })
-  }
-}
-
-
-// if read() returns EAGAIN, then just try it again.
-var read = fs.read
-fs.read = function (fd, buffer, offset, length, position, callback_) {
-  var callback
-  if (callback_ && typeof callback_ === 'function') {
-    var eagCounter = 0
-    callback = function (er, _, __) {
-      if (er && er.code === 'EAGAIN' && eagCounter < 10) {
-        eagCounter ++
-        return read.call(fs, fd, buffer, offset, length, position, callback)
-      }
-      callback_.apply(this, arguments)
-    }
-  }
-  return read.call(fs, fd, buffer, offset, length, position, callback)
-}
-
-var readSync = fs.readSync
-fs.readSync = function (fd, buffer, offset, length, position) {
-  var eagCounter = 0
-  while (true) {
-    try {
-      return readSync.call(fs, fd, buffer, offset, length, position)
-    } catch (er) {
-      if (er.code === 'EAGAIN' && eagCounter < 10) {
-        eagCounter ++
-        continue
-      }
-      throw er
-    }
-  }
-}
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/inherits/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
-The ISC License
-
-Copyright (c) Isaac Z. Schlueter
-
-Permission to use, copy, modify, and/or distribute this software for any
-purpose with or without fee is hereby granted, provided that the above
-copyright notice and this permission notice appear in all copies.
-
-THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
-REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
-FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
-INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
-LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
-OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
-PERFORMANCE OF THIS SOFTWARE.
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/inherits/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-Browser-friendly inheritance fully compatible with standard node.js
-[inherits](http://nodejs.org/api/util.html#util_util_inherits_constructor_superconstructor).
-
-This package exports the standard `inherits` from the node.js `util`
-module in a node environment, but also provides an alternative,
-browser-friendly implementation through the [browser
-field](https://gist.github.com/shtylman/4339901). The alternative
-implementation is a literal copy of the standard one, placed in a
-standalone module to avoid requiring `util`. It also has a shim for old
-browsers with no `Object.create` support.
-
-While ensuring that you use the standard `inherits` implementation
-in a node.js environment, it allows bundlers such as
-[browserify](https://github.com/substack/node-browserify) to avoid
-including the full `util` package in your client code if all you need
-is the `inherits` function. This is worthwhile because the browser shim
-for the `util` package is large, and `inherits` is often the only
-function you need from it.
-
-It's recommended to use this package instead of
-`require('util').inherits` for any code that may run in the
-browser as well as in node.js.
-
-## usage
-
-```js
-var inherits = require('inherits');
-// then use exactly as the standard one
-```
-
-## note on version ~1.0
-
-Version ~1.0 had a completely different motivation and is compatible
-neither with 2.0 nor with the standard node.js `inherits`.
-
-If you are using version ~1.0 and planning to switch to ~2.0, be
-careful:
-
-* the new version uses `super_` instead of `super` for referencing the
-  superclass
-* the new version overwrites the current prototype, while the old one
-  preserves any existing fields on it
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/inherits/inherits.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require('util').inherits
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/inherits/inherits_browser.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-if (typeof Object.create === 'function') {
-  // implementation from standard node.js 'util' module
-  module.exports = function inherits(ctor, superCtor) {
-    ctor.super_ = superCtor
-    ctor.prototype = Object.create(superCtor.prototype, {
-      constructor: {
-        value: ctor,
-        enumerable: false,
-        writable: true,
-        configurable: true
-      }
-    });
-  };
-} else {
-  // old school shim for old browsers
-  module.exports = function inherits(ctor, superCtor) {
-    ctor.super_ = superCtor
-    var TempCtor = function () {}
-    TempCtor.prototype = superCtor.prototype
-    ctor.prototype = new TempCtor()
-    ctor.prototype.constructor = ctor
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/inherits/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,32 +0,0 @@
-{
-  "name": "inherits",
-  "description": "Browser-friendly inheritance fully compatible with standard node.js inherits()",
-  "version": "2.0.1",
-  "keywords": [
-    "inheritance",
-    "class",
-    "klass",
-    "oop",
-    "object-oriented",
-    "inherits",
-    "browser",
-    "browserify"
-  ],
-  "main": "./inherits.js",
-  "browser": "./inherits_browser.js",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/inherits"
-  },
-  "license": "ISC",
-  "scripts": {
-    "test": "node test"
-  },
-  "readme": "Browser-friendly inheritance fully compatible with standard node.js\n[inherits](http://nodejs.org/api/util.html#util_util_inherits_constructor_superconstructor).\n\nThis package exports standard `inherits` from node.js `util` module in\nnode environment, but also provides alternative browser-friendly\nimplementation through [browser\nfield](https://gist.github.com/shtylman/4339901). Alternative\nimplementation is a literal copy of standard one located in standalone\nmodule to avoid requiring of `util`. It also has a shim for old\nbrowsers with no `Object.create` support.\n\nWhile keeping you sure you are using standard `inherits`\nimplementation in node.js environment, it allows bundlers such as\n[browserify](https://github.com/substack/node-browserify) to not\ninclude full `util` package to your client code if all you need is\njust `inherits` function. It worth, because browser shim for `util`\npackage is large and `inherits` is often the single function you need\nfrom it.\n\nIt's recommended to use this package instead of\n`require('util').inherits` for any code that has chances to be used\nnot only in node.js but in browser too.\n\n## usage\n\n```js\nvar inherits = require('inherits');\n// then use exactly as the standard one\n```\n\n## note on version ~1.0\n\nVersion ~1.0 had completely different motivation and is not compatible\nneither with 2.0 nor with standard node.js `inherits`.\n\nIf you are using version ~1.0 and planning to switch to ~2.0, be\ncareful:\n\n* new version uses `super_` instead of `super` for referencing\n  superclass\n* new version overwrites current prototype while old one preserves any\n  existing fields on it\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/inherits/issues"
-  },
-  "_id": "inherits@2.0.1",
-  "_from": "inherits@"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/inherits/test.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-var inherits = require('./inherits.js')
-var assert = require('assert')
-
-function test(c) {
-  assert(c.constructor === Child)
-  assert(c.constructor.super_ === Parent)
-  assert(Object.getPrototypeOf(c) === Child.prototype)
-  assert(Object.getPrototypeOf(Object.getPrototypeOf(c)) === Parent.prototype)
-  assert(c instanceof Child)
-  assert(c instanceof Parent)
-}
-
-function Child() {
-  Parent.call(this)
-  test(this)
-}
-
-function Parent() {}
-
-inherits(Child, Parent)
-
-var c = new Child
-test(c)
-
-console.log('ok')
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ini/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-Copyright 2009, 2010, 2011 Isaac Z. Schlueter.
-All rights reserved.
-
-Permission is hereby granted, free of charge, to any person
-obtaining a copy of this software and associated documentation
-files (the "Software"), to deal in the Software without
-restriction, including without limitation the rights to use,
-copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the
-Software is furnished to do so, subject to the following
-conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
-NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
-HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
-WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
-OTHER DEALINGS IN THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ini/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,79 +0,0 @@
-An ini format parser and serializer for node.
-
-Sections are treated as nested objects.  Items before the first heading
-are saved on the object directly.
-
-## Usage
-
-Consider an ini-file `config.ini` that looks like this:
-
-    ; this comment is being ignored
-    scope = global
-
-    [database]
-    user = dbuser
-    password = dbpassword
-    database = use_this_database
-
-    [paths.default]
-    datadir = /var/lib/data
-    array[] = first value
-    array[] = second value
-    array[] = third value
-
-You can read, manipulate and write the ini-file like so:
-
-    var fs = require('fs')
-      , ini = require('ini')
-
-    var config = ini.parse(fs.readFileSync('./config.ini', 'utf-8'))
-
-    config.scope = 'local'
-    config.database.database = 'use_another_database'
-    config.paths.default.tmpdir = '/tmp'
-    delete config.paths.default.datadir
-    config.paths.default.array.push('fourth value')
-
-    fs.writeFileSync('./config_modified.ini', ini.stringify(config, 'section'))
-
-This will result in a file called `config_modified.ini` being written to the filesystem with the following content:
-
-    [section]
-    scope = local
-    [section.database]
-    user = dbuser
-    password = dbpassword
-    database = use_another_database
-    [section.paths.default]
-    tmpdir = /tmp
-    array[] = first value
-    array[] = second value
-    array[] = third value
-    array[] = fourth value
-
-
-## API
-
-### decode(inistring)
-Decode the ini-style formatted `inistring` into a nested object.
-
-### parse(inistring)
-Alias for `decode(inistring)`
-
-### encode(object, [section])
-Encode the object `object` into an ini-style formatted string. If the optional parameter `section` is given, then all top-level properties of the object are put into this section and the `section`-string is prepended to all sub-sections, see the usage example above.
-
-### stringify(object, [section])
-Alias for `encode(object, [section])`
-
-### safe(val)
-Escapes the string `val` such that it is safe to be used as a key or value in an ini-file. Basically escapes quotes. For example
-
-    ini.safe('"unsafe string"')
-
-would result in
-
-    "\"unsafe string\""
-
-### unsafe(val)
-Unescapes the string `val`
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ini/ini.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,166 +0,0 @@
-
-exports.parse = exports.decode = decode
-exports.stringify = exports.encode = encode
-
-exports.safe = safe
-exports.unsafe = unsafe
-
-var eol = process.platform === "win32" ? "\r\n" : "\n"
-
-function encode (obj, section) {
-  var children = []
-    , out = ""
-
-  Object.keys(obj).forEach(function (k, _, __) {
-    var val = obj[k]
-    if (val && Array.isArray(val)) {
-        val.forEach(function(item) {
-            out += safe(k + "[]") + " = " + safe(item) + "\n"
-        })
-    }
-    else if (val && typeof val === "object") {
-      children.push(k)
-    } else {
-      out += safe(k) + " = " + safe(val) + eol
-    }
-  })
-
-  if (section && out.length) {
-    out = "[" + safe(section) + "]" + eol + out
-  }
-
-  children.forEach(function (k, _, __) {
-    var nk = dotSplit(k).join('\\.')
-    var child = encode(obj[k], (section ? section + "." : "") + nk)
-    if (out.length && child.length) {
-      out += eol
-    }
-    out += child
-  })
-
-  return out
-}
-
-function dotSplit (str) {
-  return str.replace(/\1/g, '\2LITERAL\\1LITERAL\2')
-         .replace(/\\\./g, '\1')
-         .split(/\./).map(function (part) {
-           return part.replace(/\1/g, '\\.')
-                  .replace(/\2LITERAL\\1LITERAL\2/g, '\1')
-         })
-}
-
-function decode (str) {
-  var out = {}
-    , p = out
-    , section = null
-    , state = "START"
-           // section     |key = value
-    , re = /^\[([^\]]*)\]$|^([^=]+)(=(.*))?$/i
-    , lines = str.split(/[\r\n]+/g)
-    , section = null
-
-  lines.forEach(function (line, _, __) {
-    if (!line || line.match(/^\s*;/)) return
-    var match = line.match(re)
-    if (!match) return
-    if (match[1] !== undefined) {
-      section = unsafe(match[1])
-      p = out[section] = out[section] || {}
-      return
-    }
-    var key = unsafe(match[2])
-      , value = match[3] ? unsafe((match[4] || "")) : true
-    switch (value) {
-      case 'true':
-      case 'false':
-      case 'null': value = JSON.parse(value)
-    }
-
-    // Convert keys with '[]' suffix to an array
-    if (key.length > 2 && key.slice(-2) === "[]") {
-        key = key.substring(0, key.length - 2)
-        if (!p[key]) {
-          p[key] = []
-        }
-        else if (!Array.isArray(p[key])) {
-          p[key] = [p[key]]
-        }
-    }
-
-    // safeguard against resetting a previously defined
-    // array by accidentally forgetting the brackets
-    if (Array.isArray(p[key])) {
-      p[key].push(value)
-    }
-    else {
-      p[key] = value
-    }
-  })
-
-  // {a:{y:1},"a.b":{x:2}} --> {a:{y:1,b:{x:2}}}
-  // use a filter to return the keys that have to be deleted.
-  Object.keys(out).filter(function (k, _, __) {
-    if (!out[k] || typeof out[k] !== "object" || Array.isArray(out[k])) return false
-    // see if the parent section is also an object.
-    // if so, add it to that, and mark this one for deletion
-    var parts = dotSplit(k)
-      , p = out
-      , l = parts.pop()
-      , nl = l.replace(/\\\./g, '.')
-    parts.forEach(function (part, _, __) {
-      if (!p[part] || typeof p[part] !== "object") p[part] = {}
-      p = p[part]
-    })
-    if (p === out && nl === l) return false
-    p[nl] = out[k]
-    return true
-  }).forEach(function (del, _, __) {
-    delete out[del]
-  })
-
-  return out
-}
-
-function safe (val) {
-  return ( typeof val !== "string"
-         || val.match(/[\r\n]/)
-         || val.match(/^\[/)
-         || (val.length > 1
-             && val.charAt(0) === "\""
-             && val.slice(-1) === "\"")
-         || val !== val.trim() )
-         ? JSON.stringify(val)
-         : val.replace(/;/g, '\\;')
-}
-
-function unsafe (val, doUnesc) {
-  val = (val || "").trim()
-  if (val.charAt(0) === "\"" && val.slice(-1) === "\"") {
-    try { val = JSON.parse(val) } catch (_) {}
-  } else {
-    // walk the val to find the first not-escaped ; character
-    var esc = false
-    var unesc = "";
-    for (var i = 0, l = val.length; i < l; i++) {
-      var c = val.charAt(i)
-      if (esc) {
-        if (c === "\\" || c === ";")
-          unesc += c
-        else
-          unesc += "\\" + c
-        esc = false
-      } else if (c === ";") {
-        break
-      } else if (c === "\\") {
-        esc = true
-      } else {
-        unesc += c
-      }
-    }
-    if (esc)
-      unesc += "\\"
-    return unesc
-  }
-  return val
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/ini/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,29 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "ini",
-  "description": "An ini encoder/decoder for node",
-  "version": "1.1.0",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/ini.git"
-  },
-  "main": "ini.js",
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "engines": {
-    "node": "*"
-  },
-  "dependencies": {},
-  "devDependencies": {
-    "tap": "~0.0.9"
-  },
-  "readme": "An ini format parser and serializer for node.\n\nSections are treated as nested objects.  Items before the first heading\nare saved on the object directly.\n\n## Usage\n\nConsider an ini-file `config.ini` that looks like this:\n\n    ; this comment is being ignored\n    scope = global\n\n    [database]\n    user = dbuser\n    password = dbpassword\n    database = use_this_database\n\n    [paths.default]\n    datadir = /var/lib/data\n    array[] = first value\n    array[] = second value\n    array[] = third value\n\nYou can read, manipulate and write the ini-file like so:\n\n    var fs = require('fs')\n      , ini = require('ini')\n\n    var config = ini.parse(fs.readFileSync('./config.ini', 'utf-8'))\n\n    config.scope = 'local'\n    config.database.database = 'use_another_database'\n    config.paths.default.tmpdir = '/tmp'\n    delete config.paths.default.datadir\n    config.paths.default.array.push('fourth value')\n\n    fs.writeFileSync('./config_modified.ini', ini.stringify(config, 'section'))\n\nThis will result in a file called `config_modified.ini` being written to the filesystem with the following content:\n\n    [section]\n    scope = local\n    [section.database]\n    user = dbuser\n    password = dbpassword\n    database = use_another_database\n    [section.paths.default]\n    tmpdir = /tmp\n    array[] = first value\n    array[] = second value\n    array[] = third value\n    array[] = fourth value\n\n\n## API\n\n### decode(inistring)\nDecode the ini-style formatted `inistring` into a nested object.\n\n### parse(inistring)\nAlias for `decode(inistring)`\n\n### encode(object, [section])\nEncode the object `object` into an ini-style formatted string. 
If the optional parameter `section` is given, then all top-level properties of the object are put into this section and the `section`-string is prepended to all sub-sections, see the usage example above.\n\n### stringify(object, [section])\nAlias for `encode(object, [section])`\n\n### safe(val)\nEscapes the string `val` such that it is safe to be used as a key or value in an ini-file. Basically escapes quotes. For example\n\n    ini.safe('\"unsafe string\"')\n\nwould result in\n\n    \"\\\"unsafe string\\\"\"\n\n### unsafe(val)\nUnescapes the string `val`\n",
-  "readmeFilename": "README.md",
-  "_id": "ini@1.1.0",
-  "_from": "ini@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,43 +0,0 @@
-# init-package-json
-
-A node module to get your node module started.
-
-## Usage
-
-```javascript
-var init = require('init-package-json')
-var path = require('path')
-
-// a path to a promzard module.  In the event that this file is
-// not found, one will be provided for you.
-var initFile = path.resolve(process.env.HOME, '.npm-init')
-
-// the dir where we're doing stuff.
-var dir = process.cwd()
-
-// extra stuff that gets put into the PromZard module's context.
-// In npm, this is the resolved config object.  Exposed as 'config'
-// Optional.
-var configData = { some: 'extra stuff' }
-
-// Any existing stuff from the package.json file is also exposed in the
-// PromZard module as the `package` object.  There will also be free
-// vars for:
-// * `filename` path to the package.json file
-// * `basename` the tip of the package dir
-// * `dirname` the parent of the package dir
-
-init(dir, initFile, configData, function (er, data) {
-  // the data's already been written to {dir}/package.json
-  // now you can do stuff with it
-})
-```
-
-Or from the command line:
-
-```
-$ npm-init
-```
-
-See [PromZard](https://github.com/isaacs/promzard) for details about
-what can go in the config file.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/default-input.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,179 +0,0 @@
-var fs = require('fs')
-var path = require('path')
-var glob = require('glob')
-
-// more popular packages should go here, maybe?
-function isTestPkg (p) {
-  return !!p.match(/^(expresso|mocha|tap|coffee-script|coco|streamline)$/)
-}
-
-function niceName (n) {
-  return n.replace(/^node-|[.-]js$/g, '')
-}
-
-function readDeps (test) { return function (cb) {
-  fs.readdir('node_modules', function (er, dir) {
-    if (er) return cb()
-    var deps = {}
-    var n = dir.length
-    if (n === 0) return cb(null, deps)
-    dir.forEach(function (d) {
-      if (d.match(/^\./)) return next()
-      if (test !== isTestPkg(d))
-        return next()
-
-      var dp = path.join(dirname, 'node_modules', d, 'package.json')
-      fs.readFile(dp, 'utf8', function (er, p) {
-        if (er) return next()
-        try { p = JSON.parse(p) }
-        catch (e) { return next() }
-        if (!p.version) return next()
-        deps[d] = '~' + p.version
-        return next()
-      })
-    })
-    function next () {
-      if (--n === 0) return cb(null, deps)
-    }
-  })
-}}
-
-
-exports.name = prompt('name', package.name || basename)
-exports.version = prompt('version', package.version || '0.0.0')
-if (!package.description) {
-  exports.description = prompt('description')
-}
-
-if (!package.main) {
-  exports.main = function (cb) {
-    fs.readdir(dirname, function (er, f) {
-      if (er) f = []
-
-      f = f.filter(function (f) {
-        return f.match(/\.js$/)
-      })
-
-      if (f.indexOf('index.js') !== -1)
-        f = 'index.js'
-      else if (f.indexOf('main.js') !== -1)
-        f = 'main.js'
-      else if (f.indexOf(basename + '.js') !== -1)
-        f = basename + '.js'
-      else
-        f = f[0]
-
-      return cb(null, prompt('entry point', f || 'index.js'))
-    })
-  }
-}
-
-if (!package.bin) {
-  exports.bin = function (cb) {
-    fs.readdir(path.resolve(dirname, 'bin'), function (er, d) {
-      // no bins
-      if (er) return cb()
-      // just take the first js file we find there, or nada
-      return cb(null, d.filter(function (f) {
-        return f.match(/\.js$/)
-      })[0])
-    })
-  }
-}
-
-exports.directories = function (cb) {
-  fs.readdir(dirname, function (er, dirs) {
-    if (er) return cb(er)
-    var res = {}
-    dirs.forEach(function (d) {
-      switch (d) {
-        case 'example': case 'examples': return res.example = d
-        case 'test': case 'tests': return res.test = d
-        case 'doc': case 'docs': return res.doc = d
-        case 'man': return res.man = d
-      }
-    })
-    if (Object.keys(res).length === 0) res = undefined
-    return cb(null, res)
-  })
-}
-
-if (!package.dependencies) {
-  exports.dependencies = readDeps(false)
-}
-
-if (!package.devDependencies) {
-  exports.devDependencies = readDeps(true)
-}
-
-// MUST have a test script!
-var s = package.scripts || {}
-var notest = 'echo "Error: no test specified" && exit 1'
-if (!package.scripts) {
-  exports.scripts = function (cb) {
-    fs.readdir(path.join(dirname, 'node_modules'), function (er, d) {
-      setupScripts(d || [], cb)
-    })
-  }
-}
-function setupScripts (d, cb) {
-  // check to see what framework is in use, if any
-  function tx (test) {
-    return test || notest
-  }
-
-  if (!s.test || s.test === notest) {
-    if (d.indexOf('tap') !== -1)
-      s.test = prompt('test command', 'tap test/*.js', tx)
-    else if (d.indexOf('expresso') !== -1)
-      s.test = prompt('test command', 'expresso test', tx)
-    else if (d.indexOf('mocha') !== -1)
-      s.test = prompt('test command', 'mocha', tx)
-    else
-      s.test = prompt('test command', tx)
-  }
-
-  return cb(null, s)
-}
-
-if (!package.repository) {
-  exports.repository = function (cb) {
-    fs.readFile('.git/config', 'utf8', function (er, gconf) {
-      if (er || !gconf) return cb(null, prompt('git repository'))
-
-      gconf = gconf.split(/\r?\n/)
-      var i = gconf.indexOf('[remote "origin"]')
-      if (i !== -1) {
-        var u = gconf[i + 1]
-        if (!u.match(/^\s*url =/)) u = gconf[i + 2]
-        if (!u.match(/^\s*url =/)) u = null
-        else u = u.replace(/^\s*url = /, '')
-      }
-      if (u && u.match(/^git@github.com:/))
-        u = u.replace(/^git@github.com:/, 'git://github.com/')
-
-      return cb(null, prompt('git repository', u))
-    })
-  }
-}
-
-if (!package.keywords) {
-  exports.keywords = prompt('keywords', function (s) {
-    if (!s) return undefined
-    if (Array.isArray(s)) s = s.join(' ')
-    if (typeof s !== 'string') return s
-    return s.split(/[\s,]+/)
-  })
-}
-
-if (!package.author) {
-  exports.author = config.get('init.author.name')
-  ? {
-      "name" : config.get('init.author.name'),
-      "email" : config.get('init.author.email'),
-      "url" : config.get('init.author.url')
-    }
-  : prompt('author')
-}
-
-exports.license = prompt('license', 'BSD-2-Clause')
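The deleted `default-input.js` above fans out one `fs.readFile` per directory entry and fires its callback when a countdown reaches zero. A minimal standalone sketch of that countdown pattern — task list and names are illustrative, with `setImmediate` standing in for the fs calls:

```javascript
// Countdown-callback pattern from readDeps: start N async tasks,
// call done(null, results) when the last one finishes.
function collect (tasks, done) {
  var results = {}
  var n = tasks.length
  if (n === 0) return done(null, results)
  tasks.forEach(function (t) {
    t(function (er, key, value) {
      if (!er) results[key] = value
      if (--n === 0) done(null, results) // last task to finish reports
    })
  })
}

// Hypothetical tasks standing in for the per-package fs.readFile calls.
collect([
  function (cb) { setImmediate(cb, null, 'glob', '~3.2.0') },
  function (cb) { setImmediate(cb, null, 'read', '~1.0.0') },
  function (cb) { setImmediate(cb, new Error('unreadable package.json')) }
], function (er, deps) {
  console.log(JSON.stringify(deps)) // failed task is silently skipped
})
```

Errors are swallowed per task, matching the original's `if (er) return next()` behavior: a bad `package.json` just drops that dependency from the result.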
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/example.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,14 +0,0 @@
-var init = require('./init-package-json.js')
-var path = require('path')
-var initFile = path.resolve(process.env.HOME, '.npm-init')
-var dir = process.cwd()
-
-var npm = require('npm')
-npm.load(function (er, npm) {
-  if (er) throw er
-  init(dir, initFile, npm.config.get(), function (er, data) {
-    if (er) throw er
-    console.log('written successfully')
-  })
-})
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/init-package-json.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,129 +0,0 @@
-
-module.exports = init
-
-var PZ = require('promzard').PromZard
-var path = require('path')
-var def = require.resolve('./default-input.js')
-
-var fs = require('fs')
-var semver = require('semver')
-var read = require('read')
-
-// to validate the data object at the end as a worthwhile package
-// and assign default values for things.
-// readJson.extras(file, data, cb)
-var readJson = require('read-package-json')
-
-function init (dir, input, config, cb) {
-  if (typeof config === 'function')
-    cb = config, config = {}
-
-  // accept either a plain-jane object, or a config object
-  // with a "get" method.
-  if (typeof config.get !== 'function') {
-    var data = config
-    config = {
-      get: function (k) {
-        return data[k]
-      },
-      toJSON: function () {
-        return data
-      }
-    }
-  }
-
-  var package = path.resolve(dir, 'package.json')
-  input = path.resolve(input)
-  var pkg
-  var ctx = {}
-
-  var es = readJson.extraSet
-  readJson.extraSet = es.filter(function (fn) {
-    return fn.name !== 'authors' && fn.name !== 'mans'
-  })
-  readJson(package, function (er, d) {
-    readJson.extraSet = es
-
-    if (er) pkg = {}
-    else pkg = d
-
-    ctx.filename = package
-    ctx.dirname = path.dirname(package)
-    ctx.basename = path.basename(ctx.dirname)
-    if (!pkg.version || !semver.valid(pkg.version))
-      delete pkg.version
-
-    ctx.package = pkg
-    ctx.config = config || {}
-
-    // make sure that the input is valid.
-    // if not, use the default
-    var pz = new PZ(input, ctx)
-    pz.backupFile = def
-    pz.on('error', cb)
-    pz.on('data', function (data) {
-      Object.keys(data).forEach(function (k) {
-        if (data[k] !== undefined && data[k] !== null) pkg[k] = data[k]
-      })
-
-      // only do a few of these.
-      // no need for mans or contributors if they're in the files
-      var es = readJson.extraSet
-      readJson.extraSet = es.filter(function (fn) {
-        return fn.name !== 'authors' && fn.name !== 'mans'
-      })
-      readJson.extras(package, pkg, function (er, pkg) {
-        readJson.extraSet = es
-        if (er) return cb(er, pkg)
-        pkg = unParsePeople(pkg)
-        // no need for the readme now.
-        delete pkg.readme
-        delete pkg.readmeFilename
-
-        // really don't want to have this lying around in the file
-        delete pkg._id
-
-        // ditto
-        delete pkg.gitHead
-
-        // if the repo is empty, remove it.
-        if (!pkg.repository)
-          delete pkg.repository
-
-        var d = JSON.stringify(pkg, null, 2) + '\n'
-        console.log('About to write to %s:\n\n%s\n', package, d)
-        read({prompt:'Is this ok? ', default: 'yes'}, function (er, ok) {
-          if (!ok || ok.toLowerCase().charAt(0) !== 'y') {
-            console.log('Aborted.')
-          } else {
-            fs.writeFile(package, d, 'utf8', function (er) {
-              return cb(er, pkg)
-            })
-          }
-        })
-      })
-    })
-  })
-
-}
-
-// turn the objects into somewhat more humane strings.
-function unParsePeople (data) {
-  if (data.author) data.author = unParsePerson(data.author)
-  ;["maintainers", "contributors"].forEach(function (set) {
-    if (!Array.isArray(data[set])) return;
-    data[set] = data[set].map(unParsePerson)
-  })
-  return data
-}
-
-function unParsePerson (person) {
-  if (typeof person === "string") return person
-  var name = person.name || ""
-  var u = person.url || person.web
-  var url = u ? (" ("+u+")") : ""
-  var e = person.email || person.mail
-  var email = e ? (" <"+e+">") : ""
-  return name+email+url
-}
-
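The deleted `init-package-json.js` above accepts either a plain object or an npm-style config exposing a `get()` method, and normalizes the former into the latter. That adapter, extracted as a standalone sketch:

```javascript
// Accept either a plain-jane object or a config object with a "get"
// method, and normalize to the {get, toJSON} interface init() expects.
function normalizeConfig (config) {
  if (typeof config.get === 'function') return config // already npm-style
  var data = config
  return {
    get: function (k) { return data[k] },
    toJSON: function () { return data }
  }
}

var c = normalizeConfig({ 'init.author.name': 'Ada' })
console.log(c.get('init.author.name')) // → "Ada"
```

This lets callers pass `npm.config` directly (as the deleted `example.js` does) or a bare object of defaults.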
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-example/npm-init/package.json
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,133 +0,0 @@
-# promzard
-
-A prompting wizard for building files from specialized PromZard modules.
-Used by `npm init`.
-
-A reimplementation of @SubStack's
-[prompter](https://github.com/substack/node-prompter), which does not
-use AST traversal.
-
-From another point of view, it's a reimplementation of
-[@Marak](https://github.com/marak)'s
-[wizard](https://github.com/Marak/wizard) which doesn't use schemas.
-
-The goal is a nice drop-in enhancement for `npm init`.
-
-## Usage
-
-```javascript
-var promzard = require('promzard')
-promzard(inputFile, optionalContextAdditions, function (er, data) {
-  // .. you know what you doing ..
-})
-```
-
-In the `inputFile` you can have something like this:
-
-```javascript
-var fs = require('fs')
-module.exports = {
-  "greeting": prompt("Who shall you greet?", "world", function (who) {
-    return "Hello, " + who
-  }),
-  "filename": __filename,
-  "directory": function (cb) {
-    fs.readdir(__dirname, cb)
-  }
-}
-```
-
-When run, promzard will display the prompts and resolve the async
-functions in order, and then either give you an error, or the resolved
-data, ready to be dropped into a JSON file or some other place.
-
-
-### promzard(inputFile, ctx, callback)
-
-The inputFile is just a node module.  You can require() things, set
-module.exports, etc.  Whatever that module exports is the result, and it
-is walked over to call any functions as described below.
-
-The only caveat is that you must give PromZard the full absolute path
-to the module (you can get this via Node's `require.resolve`.)  Also,
-the `prompt` function is injected into the context object, so watch out.
-
-Whatever you put in that `ctx` will of course also be available in the
-module.  You can get quite fancy with this, passing in existing configs
-and so on.
-
-### Class: promzard.PromZard(file, ctx)
-
-Just like the `promzard` function, but the EventEmitter that makes it
-all happen.  Emits either a `data` event with the data, or a `error`
-event if it blows up.
-
-If `error` is emitted, then `data` never will be.
-
-### prompt(...)
-
-In the promzard input module, you can call the `prompt` function.
-This prompts the user to input some data.  The arguments are interpreted
-based on type:
-
-1. `string`  The first string encountered is the prompt.  The second is
-   the default value.
-2. `function` A transformer function which receives the data and returns
-   something else.  More than meets the eye.
-3. `object` The `prompt` member is the prompt, the `default` member is
-   the default value, and the `transform` is the transformer.
-
-Whatever the final value is, that's what will be put on the resulting
-object.
-
-### Functions
-
-If there are any functions on the promzard input module's exports, then
-promzard will call each of them with a callback.  This way, your module
-can do asynchronous actions if necessary to validate or ascertain
-whatever needs verification.
-
-The functions are called in the context of the ctx object, and are given
-a single argument, which is a callback that should be called with either
-an error, or the result to assign to that spot.
-
-In the async function, you can also call prompt() and return the result
-of the prompt in the callback.
-
-For example, this works fine in a promzard module:
-
-```
-exports.asyncPrompt = function (cb) {
-  fs.stat(someFile, function (er, st) {
-    // if there's an error, no prompt, just error
-    // otherwise prompt and use the actual file size as the default
-    cb(er, prompt('file size', st.size))
-  })
-}
-```
-
-You can also return other async functions in the async function
-callback.  Though that's a bit silly, it could be a handy way to reuse
-functionality in some cases.
-
-### Sync vs Async
-
-The `prompt()` function is not synchronous, though it appears that way.
-It just returns a token that is swapped out when the data object is
-walked over asynchronously later, and returns a token.
-
-For that reason, prompt() calls whose results don't end up on the data
-object are never shown to the user.  For example, this will only prompt
-once:
-
-```
-exports.promptThreeTimes = prompt('prompt me once', 'shame on you')
-exports.promptThreeTimes = prompt('prompt me twice', 'um....')
-exports.promptThreeTimes = prompt('you cant prompt me again')
-```
-
-### Isn't this exactly the sort of 'looks sync' that you said was bad about other libraries?
-
-Yeah, sorta.  I wouldn't use promzard for anything more complicated than
-a wizard that spits out prompts to set up a config file or something.
-Maybe there are other use cases I haven't considered.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,11 +0,0 @@
-var pz = require('../promzard')
-
-var path = require('path')
-var file = path.resolve(__dirname, 'substack-input.js')
-var ctx = { basename: path.basename(path.dirname(file)) }
-
-pz(file, ctx, function (er, res) {
-  if (er)
-    throw er
-  console.error(JSON.stringify(res, null, 2))
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/npm-init/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,8 +0,0 @@
-# npm-init
-
-An initter you init wit, innit?
-
-## More stuff here
-
-Blerp derp herp lerg borgle pop munch efemerate baz foo a gandt synergy
-jorka chatt slurm.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/npm-init/init-input.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,191 +0,0 @@
-var fs = require('fs')
-var path = require('path');
-
-module.exports = {
-  "name" : prompt('name',
-    typeof name === 'undefined'
-    ? basename.replace(/^node-|[.-]js$/g, ''): name),
-  "version" : prompt('version', typeof version !== "undefined"
-                              ? version : '0.0.0'),
-  "description" : (function () {
-      if (typeof description !== 'undefined' && description) {
-        return description
-      }
-      var value;
-      try {
-          var src = fs.readFileSync('README.md', 'utf8');
-          value = src.split('\n').filter(function (line) {
-              return /\s+/.test(line)
-                  && line.trim() !== basename.replace(/^node-/, '')
-                  && !line.trim().match(/^#/)
-              ;
-          })[0]
-              .trim()
-              .replace(/^./, function (c) { return c.toLowerCase() })
-              .replace(/\.$/, '')
-          ;
-      }
-      catch (e) {
-        try {
-          // Wouldn't it be nice if that file mattered?
-          var d = fs.readFileSync('.git/description', 'utf8')
-        } catch (e) {}
-        if (d.trim() && !value) value = d
-      }
-      return prompt('description', value);
-  })(),
-  "main" : (function () {
-    var f
-    try {
-      f = fs.readdirSync(dirname).filter(function (f) {
-        return f.match(/\.js$/)
-      })
-      if (f.indexOf('index.js') !== -1)
-        f = 'index.js'
-      else if (f.indexOf('main.js') !== -1)
-        f = 'main.js'
-      else if (f.indexOf(basename + '.js') !== -1)
-        f = basename + '.js'
-      else
-        f = f[0]
-    } catch (e) {}
-
-    return prompt('entry point', f || 'index.js')
-  })(),
-  "bin" : function (cb) {
-    fs.readdir(dirname + '/bin', function (er, d) {
-      // no bins
-      if (er) return cb()
-      // just take the first js file we find there, or nada
-      return cb(null, d.filter(function (f) {
-        return f.match(/\.js$/)
-      })[0])
-    })
-  },
-  "directories" : function (cb) {
-    fs.readdir('.', function (er, dirs) {
-      if (er) return cb(er)
-      var res = {}
-      dirs.forEach(function (d) {
-        switch (d) {
-          case 'example': case 'examples': return res.example = d
-          case 'test': case 'tests': return res.test = d
-          case 'doc': case 'docs': return res.doc = d
-          case 'man': return res.man = d
-        }
-      })
-      if (Object.keys(res).length === 0) res = undefined
-      return cb(null, res)
-    })
-  },
-  "dependencies" : typeof dependencies !== 'undefined' ? dependencies
-    : function (cb) {
-      fs.readdir('node_modules', function (er, dir) {
-        if (er) return cb()
-        var deps = {}
-        var n = dir.length
-        dir.forEach(function (d) {
-          if (d.match(/^\./)) return next()
-          if (d.match(/^(expresso|mocha|tap|coffee-script|coco|streamline)$/))
-            return next()
-          fs.readFile('node_modules/' + d + '/package.json', function (er, p) {
-            if (er) return next()
-            try { p = JSON.parse(p) } catch (e) { return next() }
-            if (!p.version) return next()
-            deps[d] = '~' + p.version
-            return next()
-          })
-        })
-        function next () {
-          if (--n === 0) return cb(null, deps)
-        }
-      })
-    },
-  "devDependencies" : typeof devDependencies !== 'undefined' ? devDependencies
-    : function (cb) {
-      // same as dependencies but for dev deps
-      fs.readdir('node_modules', function (er, dir) {
-        if (er) return cb()
-        var deps = {}
-        var n = dir.length
-        dir.forEach(function (d) {
-          if (d.match(/^\./)) return next()
-          if (!d.match(/^(expresso|mocha|tap|coffee-script|coco|streamline)$/))
-            return next()
-          fs.readFile('node_modules/' + d + '/package.json', function (er, p) {
-            if (er) return next()
-            try { p = JSON.parse(p) } catch (e) { return next() }
-            if (!p.version) return next()
-            deps[d] = '~' + p.version
-            return next()
-          })
-        })
-        function next () {
-          if (--n === 0) return cb(null, deps)
-        }
-      })
-    },
-  "scripts" : (function () {
-    // check to see what framework is in use, if any
-    try { var d = fs.readdirSync('node_modules') }
-    catch (e) { d = [] }
-    var s = typeof scripts === 'undefined' ? {} : scripts
-
-    if (d.indexOf('coffee-script') !== -1)
-      s.prepublish = prompt('build command',
-                            s.prepublish || 'coffee src/*.coffee -o lib')
-
-    var notest = 'echo "Error: no test specified" && exit 1'
-    function tx (test) {
-      return test || notest
-    }
-
-    if (!s.test || s.test === notest) {
-      if (d.indexOf('tap') !== -1)
-        s.test = prompt('test command', 'tap test/*.js', tx)
-      else if (d.indexOf('expresso') !== -1)
-        s.test = prompt('test command', 'expresso test', tx)
-      else if (d.indexOf('mocha') !== -1)
-        s.test = prompt('test command', 'mocha', tx)
-      else
-        s.test = prompt('test command', tx)
-    }
-
-    return s
-
-  })(),
-
-  "repository" : (function () {
-    try { var gconf = fs.readFileSync('.git/config') }
-    catch (e) { gconf = null }
-    if (gconf) {
-      gconf = gconf.split(/\r?\n/)
-      var i = gconf.indexOf('[remote "origin"]')
-      if (i !== -1) {
-        var u = gconf[i + 1]
-        if (!u.match(/^\s*url =/)) u = gconf[i + 2]
-        if (!u.match(/^\s*url =/)) u = null
-        else u = u.replace(/^\s*url = /, '')
-      }
-      if (u && u.match(/^git@github.com:/))
-        u = u.replace(/^git@github.com:/, 'git://github.com/')
-    }
-
-    return prompt('git repository', u)
-  })(),
-
-  "keywords" : prompt(function (s) {
-    if (!s) return undefined
-    if (Array.isArray(s)) s = s.join(' ')
-    if (typeof s !== 'string') return s
-    return s.split(/[\s,]+/)
-  }),
-  "author" : config['init.author.name']
-    ? {
-        "name" : config['init.author.name'],
-        "email" : config['init.author.email'],
-        "url" : config['init.author.url']
-      }
-    : undefined,
-  "license" : prompt('license', 'BSD')
-}
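The deleted `init-input.js` above derives a default description from the first substantive README line. A lightly adjusted sketch of that heuristic (using `\S` to require non-blank content, where the original tested `/\s+/`):

```javascript
// Take the first README line that has content, is not the bare package
// name, and is not a markdown heading; lowercase its first letter and
// drop a trailing period.
function descriptionFromReadme (src, basename) {
  var line = src.split('\n').filter(function (l) {
    return /\S/.test(l)
        && l.trim() !== basename.replace(/^node-/, '')
        && !l.trim().match(/^#/)
  })[0]
  if (!line) return undefined
  return line.trim()
    .replace(/^./, function (c) { return c.toLowerCase() })
    .replace(/\.$/, '')
}

console.log(descriptionFromReadme('# foo\n\nA Thing that does stuff.\n', 'node-foo'))
// → "a Thing that does stuff"
```

Guarding against an empty result avoids the original's uncaught `.trim()` on `undefined` when the README has no qualifying line.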
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/npm-init/init.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,37 +0,0 @@
-var PZ = require('../../promzard').PromZard
-var path = require('path')
-var input = path.resolve(__dirname, 'init-input.js')
-
-var fs = require('fs')
-var package = path.resolve(__dirname, 'package.json')
-var pkg
-
-fs.readFile(package, 'utf8', function (er, d) {
-  if (er) ctx = {}
-  try { ctx = JSON.parse(d); pkg = JSON.parse(d) }
-  catch (e) { ctx = {} }
-
-  ctx.dirname = path.dirname(package)
-  ctx.basename = path.basename(ctx.dirname)
-  if (!ctx.version) ctx.version = undefined
-
-  // this should be replaced with the npm conf object
-  ctx.config = {}
-
-  console.error('ctx=', ctx)
-
-  var pz = new PZ(input, ctx)
-
-  pz.on('data', function (data) {
-    console.error('pz data', data)
-    if (!pkg) pkg = {}
-    Object.keys(data).forEach(function (k) {
-      if (data[k] !== undefined && data[k] !== null) pkg[k] = data[k]
-    })
-    console.error('package data %s', JSON.stringify(data, null, 2))
-    fs.writeFile(package, JSON.stringify(pkg, null, 2), function (er) {
-      if (er) throw er
-      console.log('ok')
-    })
-  })
-})
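The deleted `init.js` above copies prompt answers onto the existing package data while skipping `undefined` and `null` values. That merge step as a standalone sketch:

```javascript
// Copy only defined, non-null answers from the prompt results onto pkg,
// so skipped prompts never clobber values already in package.json.
function mergeDefined (pkg, data) {
  Object.keys(data).forEach(function (k) {
    if (data[k] !== undefined && data[k] !== null) pkg[k] = data[k]
  })
  return pkg
}

console.log(JSON.stringify(mergeDefined({ name: 'foo' }, { name: undefined, version: '0.0.0' })))
// → {"name":"foo","version":"0.0.0"}
```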
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/npm-init/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,10 +0,0 @@
-{
-  "name": "npm-init",
-  "version": "0.0.0",
-  "description": "an initter you init wit, innit?",
-  "main": "index.js",
-  "scripts": {
-    "test": "asdf"
-  },
-  "license": "BSD"
-}
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/substack-input.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,61 +0,0 @@
-module.exports = {
-    "name" : basename.replace(/^node-/, ''),
-    "version" : "0.0.0",
-    "description" : (function (cb) {
-        var fs = require('fs');
-        var value;
-        try {
-            var src = fs.readFileSync('README.markdown', 'utf8');
-            value = src.split('\n').filter(function (line) {
-                return /\s+/.test(line)
-                    && line.trim() !== basename.replace(/^node-/, '')
-                ;
-            })[0]
-                .trim()
-                .replace(/^./, function (c) { return c.toLowerCase() })
-                .replace(/\.$/, '')
-            ;
-        }
-        catch (e) {}
-        
-        return prompt('description', value);
-    })(),
-    "main" : prompt('entry point', 'index.js'),
-    "bin" : function (cb) {
-        var path = require('path');
-        var fs = require('fs');
-        var exists = fs.exists || path.exists;
-        exists('bin/cmd.js', function (ex) {
-            var bin
-            if (ex) {
-                var bin = {}
-                bin[basename.replace(/^node-/, '')] = 'bin/cmd.js'
-            }
-            cb(null, bin);
-        });
-    },
-    "directories" : {
-        "example" : "example",
-        "test" : "test"
-    },
-    "dependencies" : {},
-    "devDependencies" : {
-        "tap" : "~0.2.5"
-    },
-    "scripts" : {
-        "test" : "tap test/*.js"
-    },
-    "repository" : {
-        "type" : "git",
-        "url" : "git://github.com/substack/" + basename + ".git"
-    },
-    "homepage" : "https://github.com/substack/" + basename,
-    "keywords" : prompt(function (s) { return s.split(/\s+/) }),
-    "author" : {
-        "name" : "James Halliday",
-        "email" : "mail@substack.net",
-        "url" : "http://substack.net"
-    },
-    "license" : "MIT",
-    "engine" : { "node" : ">=0.6" }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,30 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "promzard",
-  "description": "prompting wizardly",
-  "version": "0.2.0",
-  "repository": {
-    "url": "git://github.com/isaacs/promzard"
-  },
-  "dependencies": {
-    "read": "1"
-  },
-  "devDependencies": {
-    "tap": "~0.2.5"
-  },
-  "main": "promzard.js",
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "readme": "# promzard\n\nA prompting wizard for building files from specialized PromZard modules.\nUsed by `npm init`.\n\nA reimplementation of @SubStack's\n[prompter](https://github.com/substack/node-prompter), which does not\nuse AST traversal.\n\nFrom another point of view, it's a reimplementation of\n[@Marak](https://github.com/marak)'s\n[wizard](https://github.com/Marak/wizard) which doesn't use schemas.\n\nThe goal is a nice drop-in enhancement for `npm init`.\n\n## Usage\n\n```javascript\nvar promzard = require('promzard')\npromzard(inputFile, optionalContextAdditions, function (er, data) {\n  // .. you know what you doing ..\n})\n```\n\nIn the `inputFile` you can have something like this:\n\n```javascript\nvar fs = require('fs')\nmodule.exports = {\n  \"greeting\": prompt(\"Who shall you greet?\", \"world\", function (who) {\n    return \"Hello, \" + who\n  }),\n  \"filename\": __filename,\n  \"directory\": function (cb) {\n    fs.readdir(__dirname, cb)\n  }\n}\n```\n\nWhen run, promzard will display the prompts and resolve the async\nfunctions in order, and then either give you an error, or the resolved\ndata, ready to be dropped into a JSON file or some other place.\n\n\n### promzard(inputFile, ctx, callback)\n\nThe inputFile is just a node module.  You can require() things, set\nmodule.exports, etc.  Whatever that module exports is the result, and it\nis walked over to call any functions as described below.\n\nThe only caveat is that you must give PromZard the full absolute path\nto the module (you can get this via Node's `require.resolve`.)  Also,\nthe `prompt` function is injected into the context object, so watch out.\n\nWhatever you put in that `ctx` will of course also be available in the\nmodule.  You can get quite fancy with this, passing in existing configs\nand so on.\n\n### Class: promzard.PromZard(file, ctx)\n\nJust like the `promzard` function, but the EventEmitter that makes it\nall happen.  Emits either a `data` event with the data, or a `error`\nevent if it blows up.\n\nIf `error` is emitted, then `data` never will be.\n\n### prompt(...)\n\nIn the promzard input module, you can call the `prompt` function.\nThis prompts the user to input some data.  The arguments are interpreted\nbased on type:\n\n1. `string`  The first string encountered is the prompt.  The second is\n   the default value.\n2. `function` A transformer function which receives the data and returns\n   something else.  More than meets the eye.\n3. `object` The `prompt` member is the prompt, the `default` member is\n   the default value, and the `transform` is the transformer.\n\nWhatever the final value is, that's what will be put on the resulting\nobject.\n\n### Functions\n\nIf there are any functions on the promzard input module's exports, then\npromzard will call each of them with a callback.  This way, your module\ncan do asynchronous actions if necessary to validate or ascertain\nwhatever needs verification.\n\nThe functions are called in the context of the ctx object, and are given\na single argument, which is a callback that should be called with either\nan error, or the result to assign to that spot.\n\nIn the async function, you can also call prompt() and return the result\nof the prompt in the callback.\n\nFor example, this works fine in a promzard module:\n\n```\nexports.asyncPrompt = function (cb) {\n  fs.stat(someFile, function (er, st) {\n    // if there's an error, no prompt, just error\n    // otherwise prompt and use the actual file size as the default\n    cb(er, prompt('file size', st.size))\n  })\n}\n```\n\nYou can also return other async functions in the async function\ncallback.  Though that's a bit silly, it could be a handy way to reuse\nfunctionality in some cases.\n\n### Sync vs Async\n\nThe `prompt()` function is not synchronous, though it appears that way.\nIt just returns a token that is swapped out when the data object is\nwalked over asynchronously later, and returns a token.\n\nFor that reason, prompt() calls whose results don't end up on the data\nobject are never shown to the user.  For example, this will only prompt\nonce:\n\n```\nexports.promptThreeTimes = prompt('prompt me once', 'shame on you')\nexports.promptThreeTimes = prompt('prompt me twice', 'um....')\nexports.promptThreeTimes = prompt('you cant prompt me again')\n```\n\n### Isn't this exactly the sort of 'looks sync' that you said was bad about other libraries?\n\nYeah, sorta.  I wouldn't use promzard for anything more complicated than\na wizard that spits out prompts to set up a config file or something.\nMaybe there are other use cases I haven't considered.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/promzard/issues"
-  },
-  "_id": "promzard@0.2.0",
-  "_from": "promzard@~0.2.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/promzard.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,216 +0,0 @@
-module.exports = promzard
-promzard.PromZard = PromZard
-
-var fs = require('fs')
-var vm = require('vm')
-var util = require('util')
-var files = {}
-var crypto = require('crypto')
-var EventEmitter = require('events').EventEmitter
-var read = require('read')
-
-var Module = require('module').Module
-var path = require('path')
-
-function promzard (file, ctx, cb) {
-  if (typeof ctx === 'function') cb = ctx, ctx = null;
-  if (!ctx) ctx = {};
-  var pz = new PromZard(file, ctx)
-  pz.on('error', cb)
-  pz.on('data', function (data) {
-    cb(null, data)
-  })
-}
-
-function PromZard (file, ctx) {
-  if (!(this instanceof PromZard))
-    return new PromZard(file, ctx)
-  EventEmitter.call(this)
-  this.file = file
-  this.ctx = ctx
-  this.unique = crypto.randomBytes(8).toString('hex')
-  this.load()
-}
-
-PromZard.prototype = Object.create(
-  EventEmitter.prototype,
-  { constructor: {
-      value: PromZard,
-      readable: true,
-      configurable: true,
-      writable: true,
-      enumerable: false } } )
-
-PromZard.prototype.load = function () {
-  if (files[this.file])
-    return this.loaded()
-
-  fs.readFile(this.file, 'utf8', function (er, d) {
-    if (er && this.backupFile) {
-      this.file = this.backupFile
-      delete this.backupFile
-      return this.load()
-    }
-    if (er)
-      return this.emit('error', this.error = er)
-    files[this.file] = d
-    this.loaded()
-  }.bind(this))
-}
-
-PromZard.prototype.loaded = function () {
-  this.ctx.prompt = this.makePrompt()
-  this.ctx.__filename = this.file
-  this.ctx.__dirname = path.dirname(this.file)
-  this.ctx.__basename = path.basename(this.file)
-  var mod = this.ctx.module = this.makeModule()
-  this.ctx.require = function (path) {
-    return mod.require(path)
-  }
-  this.ctx.require.resolve = function(path) {
-    return Module._resolveFilename(path, mod);
-  }
-  this.ctx.exports = mod.exports
-
-  this.script = this.wrap(files[this.file])
-  var fn = vm.runInThisContext(this.script, this.file)
-  var args = Object.keys(this.ctx).map(function (k) {
-    return this.ctx[k]
-  }.bind(this))
-  try { var res = fn.apply(this.ctx, args) }
-  catch (er) { this.emit('error', er) }
-  if (res &&
-      typeof res === 'object' &&
-      exports === mod.exports &&
-      Object.keys(exports).length === 1) {
-    this.result = res
-  } else {
-    this.result = mod.exports
-  }
-  this.walk()
-}
-
-PromZard.prototype.makeModule = function () {
-  var mod = new Module(this.file, module)
-  mod.loaded = true
-  mod.filename = this.file
-  mod.id = this.file
-  mod.paths = Module._nodeModulePaths(path.dirname(this.file))
-  return mod
-}
-
-PromZard.prototype.wrap = function (body) {
-  var s = '(function( %s ) { %s\n })'
-  var args = Object.keys(this.ctx).join(', ')
-  return util.format(s, args, body)
-}
-
-PromZard.prototype.makePrompt = function () {
-  this.prompts = []
-  return prompt.bind(this)
-  function prompt () {
-    var p, d, t
-    for (var i = 0; i < arguments.length; i++) {
-      var a = arguments[i]
-      if (typeof a === 'string' && p)
-        d = a
-      else if (typeof a === 'string')
-        p = a
-      else if (typeof a === 'function')
-        t = a
-      else if (a && typeof a === 'object') {
-        p = a.prompt || p
-        d = a.default || d
-        t = a.transform || t
-      }
-    }
-
-    try { return this.unique + '-' + this.prompts.length }
-    finally { this.prompts.push([p, d, t]) }
-  }
-}
-
-PromZard.prototype.walk = function (o, cb) {
-  o = o || this.result
-  cb = cb || function (er, res) {
-    if (er)
-      return this.emit('error', this.error = er)
-    this.result = res
-    return this.emit('data', res)
-  }
-  cb = cb.bind(this)
-  var keys = Object.keys(o)
-  var i = 0
-  var len = keys.length
-
-  L.call(this)
-  function L () {
-    if (this.error)
-      return
-    while (i < len) {
-      var k = keys[i]
-      var v = o[k]
-      i++
-
-      if (v && typeof v === 'object') {
-        return this.walk(v, function (er, res) {
-          if (er) return cb(er)
-          o[k] = res
-          L.call(this)
-        }.bind(this))
-      } else if (v &&
-                 typeof v === 'string' &&
-                 v.indexOf(this.unique) === 0) {
-        var n = +v.substr(this.unique.length + 1)
-        var prompt = this.prompts[n]
-        if (isNaN(n) || !prompt)
-          continue
-
-        // default to the key
-        if (undefined === prompt[0])
-          prompt[0] = k
-
-        // default to the ctx value, if there is one
-        if (undefined === prompt[1])
-          prompt[1] = this.ctx[k]
-
-        return this.prompt(prompt, function (er, res) {
-          if (er)
-            return this.emit('error', this.error = er);
-          o[k] = res
-          L.call(this)
-        }.bind(this))
-      } else if (typeof v === 'function') {
-        try { return v.call(this.ctx, function (er, res) {
-          if (er)
-            return this.emit('error', this.error = er)
-          o[k] = res
-          // back up so that we process this one again.
-          // this is because it might return a prompt() call in the cb.
-          i --
-          L.call(this)
-        }.bind(this)) }
-        catch (er) { this.emit('error', er) }
-      }
-    }
-    // made it to the end of the loop, maybe
-    if (i >= len)
-      return cb(null, o)
-  }
-}
-
-PromZard.prototype.prompt = function (pdt, cb) {
-  var prompt = pdt[0]
-  var def = pdt[1]
-  var tx = pdt[2]
-
-  if (tx) {
-    cb = function (cb) { return function (er, data) {
-      try { return cb(er, tx(data)) }
-      catch (er) { this.emit('error', er) }
-    }}(cb).bind(this)
-  }
-
-  read({ prompt: prompt + ':' , default: def }, cb)
-}
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/init-package-json/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,46 +0,0 @@
-{
-  "name": "init-package-json",
-  "version": "0.0.11",
-  "main": "init-package-json.js",
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/init-package-json"
-  },
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "license": "BSD",
-  "description": "A node module to get your node module started",
-  "dependencies": {
-    "promzard": "~0.2.0",
-    "read": "~1.0.1",
-    "read-package-json": "1",
-    "semver": "2.x"
-  },
-  "devDependencies": {
-    "tap": "~0.2.5",
-    "rimraf": "~2.0.2"
-  },
-  "keywords": [
-    "init",
-    "package.json",
-    "package",
-    "helper",
-    "wizard",
-    "wizerd",
-    "prompt",
-    "start"
-  ],
-  "readme": "# init-package-json\n\nA node module to get your node module started.\n\n## Usage\n\n```javascript\nvar init = require('init-package-json')\nvar path = require('path')\n\n// a path to a promzard module.  In the event that this file is\n// not found, one will be provided for you.\nvar initFile = path.resolve(process.env.HOME, '.npm-init')\n\n// the dir where we're doin stuff.\nvar dir = process.cwd()\n\n// extra stuff that gets put into the PromZard module's context.\n// In npm, this is the resolved config object.  Exposed as 'config'\n// Optional.\nvar configData = { some: 'extra stuff' }\n\n// Any existing stuff from the package.json file is also exposed in the\n// PromZard module as the `package` object.  There will also be free\n// vars for:\n// * `filename` path to the package.json file\n// * `basename` the tip of the package dir\n// * `dirname` the parent of the package dir\n\ninit(dir, initFile, configData, function (er, data) {\n  // the data's already been written to {dir}/package.json\n  // now you can do stuff with it\n})\n```\n\nOr from the command line:\n\n```\n$ npm-init\n```\n\nSee [PromZard](https://github.com/isaacs/promzard) for details about\nwhat can go in the config file.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/init-package-json/issues"
-  },
-  "_id": "init-package-json@0.0.11",
-  "_from": "init-package-json@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lockfile/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lockfile/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,81 +0,0 @@
-# lockfile
-
-A very polite lock file utility, which endeavors to not litter, and to
-wait patiently for others.
-
-## Usage
-
-```javascript
-var lockFile = require('lockfile')
-
-// opts is optional, and defaults to {}
-lockFile.lock('some-file.lock', opts, function (er) {
-  // if the er happens, then it failed to acquire a lock.
-  // if there was not an error, then the file was created,
-  // and won't be deleted until we unlock it.
-
-  // do my stuff, free of interruptions
-  // then, some time later, do:
-  lockFile.unlock('some-file.lock', function (er) {
-    // er means that an error happened, and is probably bad.
-  })
-})
-```
-
-## Methods
-
-Sync methods return the value/throw the error, others don't.  Standard
-node fs stuff.
-
-All known locks are removed when the process exits.  Of course, it's
-possible for certain types of failures to cause this to fail, but a best
-effort is made to not be a litterbug.
-
-### lockFile.lock(path, [opts], cb)
-
-Acquire a file lock on the specified path.
-
-### lockFile.lockSync(path, [opts])
-
-Acquire a file lock on the specified path.
-
-### lockFile.unlock(path, cb)
-
-Close and unlink the lockfile.
-
-### lockFile.unlockSync(path)
-
-Close and unlink the lockfile.
-
-### lockFile.check(path, [opts], cb)
-
-Check if the lockfile is locked and not stale.
-
-Callback is called with `cb(error, isLocked)`.
-
-### lockFile.checkSync(path, [opts])
-
-Check if the lockfile is locked and not stale.
-
-Returns a boolean.
-
-## Options
-
-### opts.wait
-
-A number of milliseconds to wait for locks to expire before giving up.
-Only used by lockFile.lock.  Relies on fs.watch.  If the lock is not
-cleared by the time the wait expires, then it returns with the original
-error.
-
-### opts.stale
-
-A number of milliseconds before locks are considered to have expired.
-
-### opts.retries
-
-Used by lock and lockSync.  Retry `n` number of times before giving up.
-
-### opts.retryWait
-
-Used by lock.  Wait `n` milliseconds before retrying.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lockfile/lockfile.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,272 +0,0 @@
-var fs = require('fs')
-
-var wx = 'wx'
-if (process.version.match(/^v0\.[0-6]/)) {
-  var c = require('constants')
-  wx = c.O_TRUNC | c.O_CREAT | c.O_WRONLY | c.O_EXCL
-}
-
-var os = require('os')
-var filetime = 'ctime'
-if (os.platform() == "win32") {
-  filetime = 'mtime'
-}
-
-var debug
-var util = require('util')
-if (util.debuglog)
-  debug = util.debuglog('LOCKFILE')
-else if (/\blockfile\b/i.test(process.env.NODE_DEBUG))
-  debug = function() {
-    var msg = util.format.apply(util, arguments)
-    console.error('LOCKFILE %d %s', process.pid, msg)
-  }
-else
-  debug = function() {}
-
-var locks = {}
-
-function hasOwnProperty (obj, prop) {
-  return Object.prototype.hasOwnProperty.call(obj, prop)
-}
-
-process.on('exit', function () {
-  debug('exit listener')
-  // cleanup
-  Object.keys(locks).forEach(exports.unlockSync)
-})
-
-// XXX https://github.com/joyent/node/issues/3555
-// Remove when node 0.8 is deprecated.
-if (/^v0\.[0-8]\./.test(process.version)) {
-  debug('uncaughtException, version = %s', process.version)
-  process.on('uncaughtException', function H (er) {
-    debug('uncaughtException')
-    var l = process.listeners('uncaughtException').filter(function (h) {
-      return h !== H
-    })
-    if (!l.length) {
-      // cleanup
-      try { Object.keys(locks).forEach(exports.unlockSync) } catch (e) {}
-      process.removeListener('uncaughtException', H)
-      throw er
-    }
-  })
-}
-
-exports.unlock = function (path, cb) {
-  debug('unlock', path)
-  // best-effort.  unlocking an already-unlocked lock is a noop
-  delete locks[path]
-  fs.unlink(path, function (unlinkEr) { cb() })
-}
-
-exports.unlockSync = function (path) {
-  debug('unlockSync', path)
-  // best-effort.  unlocking an already-unlocked lock is a noop
-  try { fs.unlinkSync(path) } catch (er) {}
-  delete locks[path]
-}
-
-
-// if the file can be opened in readonly mode, then it's there.
-// if the error is something other than ENOENT, then it's not.
-exports.check = function (path, opts, cb) {
-  if (typeof opts === 'function') cb = opts, opts = {}
-  debug('check', path, opts)
-  fs.open(path, 'r', function (er, fd) {
-    if (er) {
-      if (er.code !== 'ENOENT') return cb(er)
-      return cb(null, false)
-    }
-
-    if (!opts.stale) {
-      return fs.close(fd, function (er) {
-        return cb(er, true)
-      })
-    }
-
-    fs.fstat(fd, function (er, st) {
-      if (er) return fs.close(fd, function (er2) {
-        return cb(er)
-      })
-
-      fs.close(fd, function (er) {
-        var age = Date.now() - st[filetime].getTime()
-        return cb(er, age <= opts.stale)
-      })
-    })
-  })
-}
-
-exports.checkSync = function (path, opts) {
-  opts = opts || {}
-  debug('checkSync', path, opts)
-  if (opts.wait) {
-    throw new Error('opts.wait not supported sync for obvious reasons')
-  }
-
-  try {
-    var fd = fs.openSync(path, 'r')
-  } catch (er) {
-    if (er.code !== 'ENOENT') throw er
-    return false
-  }
-
-  if (!opts.stale) {
-    try { fs.closeSync(fd) } catch (er) {}
-    return true
-  }
-
-  // file exists.  however, might be stale
-  if (opts.stale) {
-    try {
-      var st = fs.fstatSync(fd)
-    } finally {
-      fs.closeSync(fd)
-    }
-    var age = Date.now() - st[filetime].getTime()
-    return (age <= opts.stale)
-  }
-}
-
-
-
-var req = 0
-exports.lock = function (path, opts, cb) {
-  if (typeof opts === 'function') cb = opts, opts = {}
-  opts.req = opts.req || req++
-  debug('lock', path, opts)
-
-  if (typeof opts.retries === 'number' && opts.retries > 0) {
-    cb = (function (orig) { return function (er, fd) {
-      if (!er) return orig(er, fd)
-      var newRT = opts.retries - 1
-      var opts_ = Object.create(opts, { retries: { value: newRT }})
-      debug('lock retry', path, newRT)
-      if (opts.retryWait) setTimeout(function() {
-        exports.lock(path, opts_, orig)
-      }, opts.retryWait)
-      else exports.lock(path, opts_, orig)
-    }})(cb)
-  }
-
-  // try to engage the lock.
-  // if this succeeds, then we're in business.
-  fs.open(path, wx, function (er, fd) {
-    if (!er) {
-      debug('locked', path, fd)
-      locks[path] = fd
-      return fs.close(fd, function () {
-        return cb()
-      })
-    }
-
-    // something other than "currently locked"
-    // maybe eperm or something.
-    if (er.code !== 'EEXIST') return cb(er)
-
-    // someone's got this one.  see if it's valid.
-    if (opts.stale) fs.stat(path, function (statEr, st) {
-      if (statEr) {
-        if (statEr.code === 'ENOENT') {
-          // expired already!
-          var opts_ = Object.create(opts, { stale: { value: false }})
-          debug('lock stale enoent retry', path, opts_)
-          exports.lock(path, opts_, cb)
-          return
-        }
-        return cb(statEr)
-      }
-
-      var age = Date.now() - st[filetime].getTime()
-      if (age > opts.stale) {
-        debug('lock stale', path, opts)
-        exports.unlock(path, function (er) {
-          if (er) return cb(er)
-          var opts_ = Object.create(opts, { stale: { value: false }})
-          debug('lock stale retry', path, opts_)
-          exports.lock(path, opts_, cb)
-        })
-      } else notStale(er, path, opts, cb)
-    })
-    else notStale(er, path, opts, cb)
-  })
-}
-
-function notStale (er, path, opts, cb) {
-  debug('notStale', path, opts)
-
-  // if we can't wait, then just call it a failure
-  if (typeof opts.wait !== 'number' || opts.wait <= 0)
-    return cb(er)
-
-  // console.error('wait', path, opts.wait)
-  // wait for some ms for the lock to clear
-  var start = Date.now()
-  var end = start + opts.wait
-
-  function retry () {
-    debug('notStale retry', path, opts)
-    var now = Date.now()
-    var newWait = end - now
-    var newOpts = Object.create(opts, { wait: { value: newWait }})
-    exports.lock(path, newOpts, cb)
-  }
-
-  var timer = setTimeout(retry, 100)
-}
-
-exports.lockSync = function (path, opts) {
-  opts = opts || {}
-  opts.req = opts.req || req++
-  debug('lockSync', path, opts)
-  if (opts.wait || opts.retryWait) {
-    throw new Error('opts.wait not supported sync for obvious reasons')
-  }
-
-  try {
-    var fd = fs.openSync(path, wx)
-    locks[path] = fd
-    try { fs.closeSync(fd) } catch (er) {}
-    debug('locked sync!', path, fd)
-    return
-  } catch (er) {
-    if (er.code !== 'EEXIST') return retryThrow(path, opts, er)
-
-    if (opts.stale) {
-      var st = fs.statSync(path)
-      var ct = st[filetime].getTime()
-      if (!(ct % 1000) && (opts.stale % 1000)) {
-        // probably don't have subsecond resolution.
-        // round up the staleness indicator.
-        // Yes, this will be wrong 1/1000 times on platforms
-        // with subsecond stat precision, but that's acceptable
-        // in exchange for not mistakenly removing locks on
-        // most other systems.
-        opts.stale = 1000 * Math.ceil(opts.stale / 1000)
-      }
-      var age = Date.now() - ct
-      if (age > opts.stale) {
-        debug('lockSync stale', path, opts, age)
-        exports.unlockSync(path)
-        return exports.lockSync(path, opts)
-      }
-    }
-
-    // failed to lock!
-    debug('failed to lock', path, opts, er)
-    return retryThrow(path, opts, er)
-  }
-}
-
-function retryThrow (path, opts, er) {
-  if (typeof opts.retries === 'number' && opts.retries > 0) {
-    var newRT = opts.retries - 1
-    debug('retryThrow', path, opts, newRT)
-    var opts_ = Object.create(opts, { retries: { value: newRT }})
-    return exports.lockSync(path, opts_)
-  }
-  throw er
-}
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lockfile/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,45 +0,0 @@
-{
-  "name": "lockfile",
-  "version": "0.4.2",
-  "main": "lockfile.js",
-  "directories": {
-    "test": "test"
-  },
-  "dependencies": {},
-  "devDependencies": {
-    "tap": "~0.2.5",
-    "touch": "0"
-  },
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/lockfile"
-  },
-  "keywords": [
-    "lockfile",
-    "lock",
-    "file",
-    "fs",
-    "O_EXCL"
-  ],
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "license": "BSD",
-  "description": "A very polite lock file utility, which endeavors to not litter, and to wait patiently for others.",
-  "readme": "# lockfile\n\nA very polite lock file utility, which endeavors to not litter, and to\nwait patiently for others.\n\n## Usage\n\n```javascript\nvar lockFile = require('lockfile')\n\n// opts is optional, and defaults to {}\nlockFile.lock('some-file.lock', opts, function (er) {\n  // if the er happens, then it failed to acquire a lock.\n  // if there was not an error, then the file was created,\n  // and won't be deleted until we unlock it.\n\n  // do my stuff, free of interruptions\n  // then, some time later, do:\n  lockFile.unlock('some-file.lock', function (er) {\n    // er means that an error happened, and is probably bad.\n  })\n})\n```\n\n## Methods\n\nSync methods return the value/throw the error, others don't.  Standard\nnode fs stuff.\n\nAll known locks are removed when the process exits.  Of course, it's\npossible for certain types of failures to cause this to fail, but a best\neffort is made to not be a litterbug.\n\n### lockFile.lock(path, [opts], cb)\n\nAcquire a file lock on the specified path\n\n### lockFile.lockSync(path, [opts])\n\nAcquire a file lock on the specified path\n\n### lockFile.unlock(path, cb)\n\nClose and unlink the lockfile.\n\n### lockFile.unlockSync(path)\n\nClose and unlink the lockfile.\n\n### lockFile.check(path, [opts], cb)\n\nCheck if the lockfile is locked and not stale.\n\nReturns boolean.\n\n### lockFile.checkSync(path, [opts], cb)\n\nCheck if the lockfile is locked and not stale.\n\nCallback is called with `cb(error, isLocked)`.\n\n## Options\n\n### opts.wait\n\nA number of milliseconds to wait for locks to expire before giving up.\nOnly used by lockFile.lock.  Relies on fs.watch.  If the lock is not\ncleared by the time the wait expires, then it returns with the original\nerror.\n\n### opts.stale\n\nA number of milliseconds before locks are considered to have expired.\n\n### opts.retries\n\nUsed by lock and lockSync.  Retry `n` number of times before giving up.\n\n### opts.retryWait\n\nUsed by lock.  Wait `n` milliseconds before retrying.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/lockfile/issues"
-  },
-  "_id": "lockfile@0.4.2",
-  "dist": {
-    "shasum": "ab91f5d3745bc005ae4fa34d078910d1f2b9612d"
-  },
-  "_from": "lockfile@0.4.2",
-  "_resolved": "https://registry.npmjs.org/lockfile/-/lockfile-0.4.2.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-/node_modules
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/AUTHORS	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,8 +0,0 @@
-# Authors, sorted by whether or not they are me
-Isaac Z. Schlueter <i@izs.me>
-Carlos Brito Lage <carlos@carloslage.net>
-Marko Mikulicic <marko.mikulicic@isti.cnr.it>
-Trent Mick <trentm@gmail.com>
-Kevin O'Hara <kevinohara80@gmail.com>
-Marco Rogers <marco.rogers@gmail.com>
-Jesse Dailey <jesse.dailey@gmail.com>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-Copyright 2009, 2010, 2011 Isaac Z. Schlueter.
-All rights reserved.
-
-Permission is hereby granted, free of charge, to any person
-obtaining a copy of this software and associated documentation
-files (the "Software"), to deal in the Software without
-restriction, including without limitation the rights to use,
-copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the
-Software is furnished to do so, subject to the following
-conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
-NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
-HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
-WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
-OTHER DEALINGS IN THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,97 +0,0 @@
-# lru cache
-
-A cache object that deletes the least-recently-used items.
-
-## Usage:
-
-```javascript
-var LRU = require("lru-cache")
-  , options = { max: 500
-              , length: function (n) { return n * 2 }
-              , dispose: function (key, n) { n.close() }
-              , maxAge: 1000 * 60 * 60 }
-  , cache = LRU(options)
-  , otherCache = LRU(50) // sets just the max size
-
-cache.set("key", "value")
-cache.get("key") // "value"
-
-cache.reset()    // empty the cache
-```
-
-If you put more stuff in it, then items will fall out.
-
-If you try to put an oversized thing in it, then it'll fall out right
-away.
-
-## Options
-
-* `max` The maximum size of the cache, checked by applying the length
-  function to all values in the cache.  Not setting this is kind of
-  silly, since that's the whole purpose of this lib, but it defaults
-  to `Infinity`.
-* `maxAge` Maximum age in ms.  Items are not pro-actively pruned out
-  as they age, but if you try to get an item that is too old, it'll
-  drop it and return undefined instead of giving it to you.
-* `length` Function that is used to calculate the length of stored
-  items.  If you're storing strings or buffers, then you probably want
-  to do something like `function(n){return n.length}`.  The default is
-  `function(n){return 1}`, which is fine if you want to store `n`
-  like-sized things.
-* `dispose` Function that is called on items when they are dropped
-  from the cache.  This can be handy if you want to close file
-  descriptors or do other cleanup tasks when items are no longer
-  accessible.  Called with `key, value`.  It's called *before*
-  actually removing the item from the internal cache, so if you want
-  to immediately put it back in, you'll have to do that in a
-  `nextTick` or `setTimeout` callback or it won't do anything.
-* `stale` By default, if you set a `maxAge`, it'll only actually pull
-  stale items out of the cache when you `get(key)`.  (That is, it's
-  not pre-emptively doing a `setTimeout` or anything.)  If you set
-  `stale:true`, it'll return the stale value before deleting it.  If
-  you don't set this, then it'll return `undefined` when you try to
-  get a stale entry, as if it had already been deleted.
-
-## API
-
-* `set(key, value)`
-* `get(key) => value`
-
-    Both of these will update the "recently used"-ness of the key.
-    They do what you think.
-
-* `peek(key)`
-
-    Returns the key value (or `undefined` if not found) without
-    updating the "recently used"-ness of the key.
-
-    (If you find yourself using this a lot, you *might* be using the
-    wrong sort of data structure, but there are some use cases where
-    it's handy.)
-
-* `del(key)`
-
-    Deletes a key out of the cache.
-
-* `reset()`
-
-    Clear the cache entirely, throwing away all values.
-
-* `has(key)`
-
-    Check if a key is in the cache, without updating the recent-ness
-    or deleting it for being stale.
-
-* `forEach(function(value,key,cache), [thisp])`
-
-    Just like `Array.prototype.forEach`.  Iterates over all the keys
-    in the cache, in order of recent-ness.  (Ie, more recently used
-    items are iterated over first.)
-
-* `keys()`
-
-    Return an array of the keys in the cache.
-
-* `values()`
-
-    Return an array of the values in the cache.
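The recency bookkeeping the API above describes can be sketched very compactly with a `Map`, whose iteration order is insertion order: re-inserting a key on `get` marks it most recently used, and the first key in iteration order is the eviction candidate. This is an illustration of the idea only (`makeLRU` is a hypothetical name; the deleted lru-cache implementation instead tags each entry with a monotonically increasing use counter, and supports `length`, `maxAge`, and `dispose` on top):

```javascript
// Minimal LRU sketch: Map iteration order doubles as the recency list.
function makeLRU (max) {
  var map = new Map()
  return {
    get: function (key) {
      if (!map.has(key)) return undefined
      var value = map.get(key)
      map.delete(key)        // re-insert to mark as most recently used
      map.set(key, value)
      return value
    },
    set: function (key, value) {
      if (map.has(key)) map.delete(key)
      map.set(key, value)
      if (map.size > max) {
        // evict the least recently used entry: first in iteration order
        map.delete(map.keys().next().value)
      }
    }
  }
}
```

As in lru-cache, `get` updates recency and `set` may evict; `peek`-style reads would simply skip the delete/re-insert step.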
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/bench.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-var LRU = require('lru-cache');
-
-var max = +process.argv[2] || 10240;
-var more = 102400;
-
-var cache = LRU({
-  max: max, maxAge: 86400e3
-});
-
-// fill cache
-for (var i = 0; i < max; ++i) {
-  cache.set(i, {});
-}
-
-var start = process.hrtime();
-
-// adding more items
-for ( ; i < max+more; ++i) {
-  cache.set(i, {});
-}
-
-var end = process.hrtime(start);
-var msecs = end[0] * 1E3 + end[1] / 1E6;
-
-console.log('adding %d items took %d ms', more, msecs.toPrecision(5));
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/lib/lru-cache.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,263 +0,0 @@
-;(function () { // closure for web browsers
-
-if (typeof module === 'object' && module.exports) {
-  module.exports = LRUCache
-} else {
-  // just set the global for non-node platforms.
-  this.LRUCache = LRUCache
-}
-
-function hOP (obj, key) {
-  return Object.prototype.hasOwnProperty.call(obj, key)
-}
-
-function naiveLength () { return 1 }
-
-function LRUCache (options) {
-  if (!(this instanceof LRUCache)) {
-    return new LRUCache(options)
-  }
-
-  var max
-  if (typeof options === 'number') {
-    max = options
-    options = { max: max }
-  }
-
-  if (!options) options = {}
-
-  max = options.max
-
-  var lengthCalculator = options.length || naiveLength
-
-  if (typeof lengthCalculator !== "function") {
-    lengthCalculator = naiveLength
-  }
-
-  if (!max || !(typeof max === "number") || max <= 0 ) {
-    // a little bit silly.  maybe this should throw?
-    max = Infinity
-  }
-
-  var allowStale = options.stale || false
-
-  var maxAge = options.maxAge || null
-
-  var dispose = options.dispose
-
-  var cache = Object.create(null) // hash of items by key
-    , lruList = Object.create(null) // list of items in order of use recency
-    , mru = 0 // most recently used
-    , lru = 0 // least recently used
-    , length = 0 // number of items in the list
-    , itemCount = 0
-
-
-  // resize the cache when the max changes.
-  Object.defineProperty(this, "max",
-    { set : function (mL) {
-        if (!mL || !(typeof mL === "number") || mL <= 0 ) mL = Infinity
-        max = mL
-        // if it gets above double max, trim right away.
-        // otherwise, do it whenever it's convenient.
-        if (length > max) trim()
-      }
-    , get : function () { return max }
-    , enumerable : true
-    })
-
-  // resize the cache when the lengthCalculator changes.
-  Object.defineProperty(this, "lengthCalculator",
-    { set : function (lC) {
-        if (typeof lC !== "function") {
-          lengthCalculator = naiveLength
-          length = itemCount
-          for (var key in cache) {
-            cache[key].length = 1
-          }
-        } else {
-          lengthCalculator = lC
-          length = 0
-          for (var key in cache) {
-            cache[key].length = lengthCalculator(cache[key].value)
-            length += cache[key].length
-          }
-        }
-
-        if (length > max) trim()
-      }
-    , get : function () { return lengthCalculator }
-    , enumerable : true
-    })
-
-  Object.defineProperty(this, "length",
-    { get : function () { return length }
-    , enumerable : true
-    })
-
-
-  Object.defineProperty(this, "itemCount",
-    { get : function () { return itemCount }
-    , enumerable : true
-    })
-
-  this.forEach = function (fn, thisp) {
-    thisp = thisp || this
-    var i = 0;
-    for (var k = mru - 1; k >= 0 && i < itemCount; k--) if (lruList[k]) {
-      i++
-      var hit = lruList[k]
-      if (maxAge && (Date.now() - hit.now > maxAge)) {
-        del(hit)
-        if (!allowStale) hit = undefined
-      }
-      if (hit) {
-        fn.call(thisp, hit.value, hit.key, this)
-      }
-    }
-  }
-
-  this.keys = function () {
-    var keys = new Array(itemCount)
-    var i = 0
-    for (var k = mru - 1; k >= 0 && i < itemCount; k--) if (lruList[k]) {
-      var hit = lruList[k]
-      keys[i++] = hit.key
-    }
-    return keys
-  }
-
-  this.values = function () {
-    var values = new Array(itemCount)
-    var i = 0
-    for (var k = mru - 1; k >= 0 && i < itemCount; k--) if (lruList[k]) {
-      var hit = lruList[k]
-      values[i++] = hit.value
-    }
-    return values
-  }
-
-  this.reset = function () {
-    if (dispose) {
-      for (var k in cache) {
-        dispose(k, cache[k].value)
-      }
-    }
-    cache = {}
-    lruList = {}
-    lru = 0
-    mru = 0
-    length = 0
-    itemCount = 0
-  }
-
-  // Provided for debugging/dev purposes only. No promises whatsoever that
-  // this API stays stable.
-  this.dump = function () {
-    return cache
-  }
-
-  this.dumpLru = function () {
-    return lruList
-  }
-
-  this.set = function (key, value) {
-    if (hOP(cache, key)) {
-      // dispose of the old one before overwriting
-      if (dispose) dispose(key, cache[key].value)
-      if (maxAge) cache[key].now = Date.now()
-      cache[key].value = value
-      this.get(key)
-      return true
-    }
-
-    var len = lengthCalculator(value)
-    var age = maxAge ? Date.now() : 0
-    var hit = new Entry(key, value, mru++, len, age)
-
-    // oversized objects fall out of cache automatically.
-    if (hit.length > max) {
-      if (dispose) dispose(key, value)
-      return false
-    }
-
-    length += hit.length
-    lruList[hit.lu] = cache[key] = hit
-    itemCount ++
-
-    if (length > max) trim()
-    return true
-  }
-
-  this.has = function (key) {
-    if (!hOP(cache, key)) return false
-    var hit = cache[key]
-    if (maxAge && (Date.now() - hit.now > maxAge)) {
-      return false
-    }
-    return true
-  }
-
-  this.get = function (key) {
-    return get(key, true)
-  }
-
-  this.peek = function (key) {
-    return get(key, false)
-  }
-
-  function get (key, doUse) {
-    var hit = cache[key]
-    if (hit) {
-      if (maxAge && (Date.now() - hit.now > maxAge)) {
-        del(hit)
-        if (!allowStale) hit = undefined
-      } else {
-        if (doUse) use(hit)
-      }
-      if (hit) hit = hit.value
-    }
-    return hit
-  }
-
-  function use (hit) {
-    shiftLU(hit)
-    hit.lu = mru ++
-    lruList[hit.lu] = hit
-  }
-
-  this.del = function (key) {
-    del(cache[key])
-  }
-
-  function trim () {
-    while (lru < mru && length > max)
-      del(lruList[lru])
-  }
-
-  function shiftLU(hit) {
-    delete lruList[ hit.lu ]
-    while (lru < mru && !lruList[lru]) lru ++
-  }
-
-  function del(hit) {
-    if (hit) {
-      if (dispose) dispose(hit.key, hit.value)
-      length -= hit.length
-      itemCount --
-      delete cache[ hit.key ]
-      shiftLU(hit)
-    }
-  }
-}
-
-// classy, since V8 prefers predictable objects.
-function Entry (key, value, mru, len, age) {
-  this.key = key
-  this.value = value
-  this.lu = mru
-  this.length = len
-  this.now = age
-}
-
-})()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/lru-cache/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,66 +0,0 @@
-{
-  "name": "lru-cache",
-  "description": "A cache object that deletes the least-recently-used items.",
-  "version": "2.3.1",
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me"
-  },
-  "scripts": {
-    "test": "tap test --gc"
-  },
-  "main": "lib/lru-cache.js",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/node-lru-cache.git"
-  },
-  "devDependencies": {
-    "tap": "",
-    "weak": ""
-  },
-  "license": {
-    "type": "MIT",
-    "url": "http://github.com/isaacs/node-lru-cache/raw/master/LICENSE"
-  },
-  "contributors": [
-    {
-      "name": "Isaac Z. Schlueter",
-      "email": "i@izs.me"
-    },
-    {
-      "name": "Carlos Brito Lage",
-      "email": "carlos@carloslage.net"
-    },
-    {
-      "name": "Marko Mikulicic",
-      "email": "marko.mikulicic@isti.cnr.it"
-    },
-    {
-      "name": "Trent Mick",
-      "email": "trentm@gmail.com"
-    },
-    {
-      "name": "Kevin O'Hara",
-      "email": "kevinohara80@gmail.com"
-    },
-    {
-      "name": "Marco Rogers",
-      "email": "marco.rogers@gmail.com"
-    },
-    {
-      "name": "Jesse Dailey",
-      "email": "jesse.dailey@gmail.com"
-    }
-  ],
-  "readme": "# lru cache\n\nA cache object that deletes the least-recently-used items.\n\n## Usage:\n\n```javascript\nvar LRU = require(\"lru-cache\")\n  , options = { max: 500\n              , length: function (n) { return n * 2 }\n              , dispose: function (key, n) { n.close() }\n              , maxAge: 1000 * 60 * 60 }\n  , cache = LRU(options)\n  , otherCache = LRU(50) // sets just the max size\n\ncache.set(\"key\", \"value\")\ncache.get(\"key\") // \"value\"\n\ncache.reset()    // empty the cache\n```\n\nIf you put more stuff in it, then items will fall out.\n\nIf you try to put an oversized thing in it, then it'll fall out right\naway.\n\n## Options\n\n* `max` The maximum size of the cache, checked by applying the length\n  function to all values in the cache.  Not setting this is kind of\n  silly, since that's the whole purpose of this lib, but it defaults\n  to `Infinity`.\n* `maxAge` Maximum age in ms.  Items are not pro-actively pruned out\n  as they age, but if you try to get an item that is too old, it'll\n  drop it and return undefined instead of giving it to you.\n* `length` Function that is used to calculate the length of stored\n  items.  If you're storing strings or buffers, then you probably want\n  to do something like `function(n){return n.length}`.  The default is\n  `function(n){return 1}`, which is fine if you want to store `n`\n  like-sized things.\n* `dispose` Function that is called on items when they are dropped\n  from the cache.  This can be handy if you want to close file\n  descriptors or do other cleanup tasks when items are no longer\n  accessible.  Called with `key, value`.  It's called *before*\n  actually removing the item from the internal cache, so if you want\n  to immediately put it back in, you'll have to do that in a\n  `nextTick` or `setTimeout` callback or it won't do anything.\n* `stale` By default, if you set a `maxAge`, it'll only actually pull\n  stale items out of the cache when you `get(key)`.  (That is, it's\n  not pre-emptively doing a `setTimeout` or anything.)  If you set\n  `stale:true`, it'll return the stale value before deleting it.  If\n  you don't set this, then it'll return `undefined` when you try to\n  get a stale entry, as if it had already been deleted.\n\n## API\n\n* `set(key, value)`\n* `get(key) => value`\n\n    Both of these will update the \"recently used\"-ness of the key.\n    They do what you think.\n\n* `peek(key)`\n\n    Returns the key value (or `undefined` if not found) without\n    updating the \"recently used\"-ness of the key.\n\n    (If you find yourself using this a lot, you *might* be using the\n    wrong sort of data structure, but there are some use cases where\n    it's handy.)\n\n* `del(key)`\n\n    Deletes a key out of the cache.\n\n* `reset()`\n\n    Clear the cache entirely, throwing away all values.\n\n* `has(key)`\n\n    Check if a key is in the cache, without updating the recent-ness\n    or deleting it for being stale.\n\n* `forEach(function(value,key,cache), [thisp])`\n\n    Just like `Array.prototype.forEach`.  Iterates over all the keys\n    in the cache, in order of recent-ness.  (Ie, more recently used\n    items are iterated over first.)\n\n* `keys()`\n\n    Return an array of the keys in the cache.\n\n* `values()`\n\n    Return an array of the values in the cache.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/node-lru-cache/issues"
-  },
-  "_id": "lru-cache@2.3.1",
-  "dist": {
-    "shasum": "b3adf6b3d856e954e2c390e6cef22081245a53d6"
-  },
-  "_from": "lru-cache@2.3.1",
-  "_resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-2.3.1.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-Copyright 2009, 2010, 2011 Isaac Z. Schlueter.
-All rights reserved.
-
-Permission is hereby granted, free of charge, to any person
-obtaining a copy of this software and associated documentation
-files (the "Software"), to deal in the Software without
-restriction, including without limitation the rights to use,
-copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the
-Software is furnished to do so, subject to the following
-conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
-NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
-HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
-WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
-OTHER DEALINGS IN THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,218 +0,0 @@
-# minimatch
-
-A minimal matching utility.
-
-[![Build Status](https://secure.travis-ci.org/isaacs/minimatch.png)](http://travis-ci.org/isaacs/minimatch)
-
-
-This is the matching library used internally by npm.
-
-Eventually, it will replace the C binding in node-glob.
-
-It works by converting glob expressions into JavaScript `RegExp`
-objects.
-
-## Usage
-
-```javascript
-var minimatch = require("minimatch")
-
-minimatch("bar.foo", "*.foo") // true!
-minimatch("bar.foo", "*.bar") // false!
-```
-
-## Features
-
-Supports these glob features:
-
-* Brace Expansion
-* Extended glob matching
-* "Globstar" `**` matching
-
-See:
-
-* `man sh`
-* `man bash`
-* `man 3 fnmatch`
-* `man 5 gitignore`
-
-### Comparisons to other fnmatch/glob implementations
-
-While strict compliance with the existing standards is a worthwhile
-goal, some discrepancies exist between minimatch and other
-implementations, and are intentional.
-
-If the pattern starts with a `!` character, then it is negated.  Set the
-`nonegate` flag to suppress this behavior, and treat leading `!`
-characters normally.  This is perhaps relevant if you wish to start the
-pattern with a negative extglob pattern like `!(a|B)`.  Multiple `!`
-characters at the start of a pattern will negate the pattern multiple
-times.
-
-If a pattern starts with `#`, then it is treated as a comment, and
-will not match anything.  Use `\#` to match a literal `#` at the
-start of a line, or set the `nocomment` flag to suppress this behavior.
-
-The double-star character `**` is supported by default, unless the
-`noglobstar` flag is set.  This is supported in the manner of bsdglob
-and bash 4.1, where `**` only has special significance if it is the only
-thing in a path part.  That is, `a/**/b` will match `a/x/y/b`, but
-`a/**b` will not.  **Note that this is different from the way that `**` is
-handled by ruby's `Dir` class.**
-
-If an escaped pattern has no matches, and the `nonull` flag is set,
-then minimatch.match returns the pattern as-provided, rather than
-interpreting the character escapes.  For example,
-`minimatch.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than
-`"*a?"`.  This is akin to setting the `nullglob` option in bash, except
-that it does not resolve escaped pattern characters.
-
-If brace expansion is not disabled, then it is performed before any
-other interpretation of the glob pattern.  Thus, a pattern like
-`+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded
-**first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are
-checked for validity.  Since those two are valid, matching proceeds.
-
-
-## Minimatch Class
-
-Create a minimatch object by instantiating the `minimatch.Minimatch` class.
-
-```javascript
-var Minimatch = require("minimatch").Minimatch
-var mm = new Minimatch(pattern, options)
-```
-
-### Properties
-
-* `pattern` The original pattern the minimatch object represents.
-* `options` The options supplied to the constructor.
-* `set` A 2-dimensional array of regexp or string expressions.
-  Each row in the
-  array corresponds to a brace-expanded pattern.  Each item in the row
-  corresponds to a single path-part.  For example, the pattern
-  `{a,b/c}/d` would expand to a set of patterns like:
-
-        [ [ a, d ]
-        , [ b, c, d ] ]
-
-    If a portion of the pattern doesn't have any "magic" in it
-    (that is, it's something like `"foo"` rather than `fo*o?`), then it
-    will be left as a string rather than converted to a regular
-    expression.
-
-* `regexp` Created by the `makeRe` method.  A single regular expression
-  expressing the entire pattern.  This is useful in cases where you wish
-  to use the pattern somewhat like `fnmatch(3)` with `FNM_PATH` enabled.
-* `negate` True if the pattern is negated.
-* `comment` True if the pattern is a comment.
-* `empty` True if the pattern is `""`.
-
-### Methods
-
-* `makeRe` Generate the `regexp` member if necessary, and return it.
-  Will return `false` if the pattern is invalid.
-* `match(fname)` Return true if the filename matches the pattern, or
-  false otherwise.
-* `matchOne(fileArray, patternArray, partial)` Take a `/`-split
-  filename, and match it against a single row in the `regExpSet`.  This
-  method is mainly for internal use, but is exposed so that it can be
-  used by a glob-walker that needs to avoid excessive filesystem calls.
-
-All other methods are internal, and will be called as necessary.
-
-## Functions
-
-The top-level exported function has a `cache` property, which is an LRU
-cache set to store 100 items.  So, calling these methods repeatedly
-with the same pattern and options will use the same Minimatch object,
-saving the cost of parsing it multiple times.
-
-### minimatch(path, pattern, options)
-
-Main export.  Tests a path against the pattern using the options.
-
-```javascript
-var isJS = minimatch(file, "*.js", { matchBase: true })
-```
-
-### minimatch.filter(pattern, options)
-
-Returns a function that tests its
-supplied argument, suitable for use with `Array.filter`.  Example:
-
-```javascript
-var javascripts = fileList.filter(minimatch.filter("*.js", {matchBase: true}))
-```
-
-### minimatch.match(list, pattern, options)
-
-Match against the list of
-files, in the style of fnmatch or glob.  If nothing is matched, and
-options.nonull is set, then return a list containing the pattern itself.
-
-```javascript
-var javascripts = minimatch.match(fileList, "*.js", {matchBase: true})
-```
-
-### minimatch.makeRe(pattern, options)
-
-Make a regular expression object from the pattern.
-
-## Options
-
-All options are `false` by default.
-
-### debug
-
-Dump a ton of stuff to stderr.
-
-### nobrace
-
-Do not expand `{a,b}` and `{1..3}` brace sets.
-
-### noglobstar
-
-Disable `**` matching against multiple folder names.
-
-### dot
-
-Allow patterns to match filenames starting with a period, even if
-the pattern does not explicitly have a period in that spot.
-
-Note that by default, `a/**/b` will **not** match `a/.d/b`, unless `dot`
-is set.
-
-### noext
-
-Disable "extglob" style patterns like `+(a|b)`.
-
-### nocase
-
-Perform a case-insensitive match.
-
-### nonull
-
-When a match is not found by `minimatch.match`, return a list containing
-the pattern itself.  When set, an empty list is returned if there are
-no matches.
-
-### matchBase
-
-If set, then patterns without slashes will be matched
-against the basename of the path if it contains slashes.  For example,
-`a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`.
-
-### nocomment
-
-Suppress the behavior of treating `#` at the start of a pattern as a
-comment.
-
-### nonegate
-
-Suppress the behavior of treating a leading `!` character as negation.
-
-### flipNegate
-
-Returns from negate expressions the same as if they were not negated.
-(Ie, true on a hit, false on a miss.)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/minimatch.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1079 +0,0 @@
-;(function (require, exports, module, platform) {
-
-if (module) module.exports = minimatch
-else exports.minimatch = minimatch
-
-if (!require) {
-  require = function (id) {
-    switch (id) {
-      case "sigmund": return function sigmund (obj) {
-        return JSON.stringify(obj)
-      }
-      case "path": return { basename: function (f) {
-        f = f.split(/[\/\\]/)
-        var e = f.pop()
-        if (!e) e = f.pop()
-        return e
-      }}
-      case "lru-cache": return function LRUCache () {
-        // not quite an LRU, but still space-limited.
-        var cache = {}
-        var cnt = 0
-        this.set = function (k, v) {
-          cnt ++
-          if (cnt >= 100) cache = {}
-          cache[k] = v
-        }
-        this.get = function (k) { return cache[k] }
-      }
-    }
-  }
-}
-
-minimatch.Minimatch = Minimatch
-
-var LRU = require("lru-cache")
-  , cache = minimatch.cache = new LRU({max: 100})
-  , GLOBSTAR = minimatch.GLOBSTAR = Minimatch.GLOBSTAR = {}
-  , sigmund = require("sigmund")
-
-var path = require("path")
-  // any single thing other than /
-  // don't need to escape / when using new RegExp()
-  , qmark = "[^/]"
-
-  // * => any number of characters
-  , star = qmark + "*?"
-
-  // ** when dots are allowed.  Anything goes, except .. and .
-  // not (^ or / followed by one or two dots followed by $ or /),
-  // followed by anything, any number of times.
-  , twoStarDot = "(?:(?!(?:\\\/|^)(?:\\.{1,2})($|\\\/)).)*?"
-
-  // not a ^ or / followed by a dot,
-  // followed by anything, any number of times.
-  , twoStarNoDot = "(?:(?!(?:\\\/|^)\\.).)*?"
-
-  // characters that need to be escaped in RegExp.
-  , reSpecials = charSet("().*{}+?[]^$\\!")
-
-// "abc" -> { a:true, b:true, c:true }
-function charSet (s) {
-  return s.split("").reduce(function (set, c) {
-    set[c] = true
-    return set
-  }, {})
-}
-
-// normalizes slashes.
-var slashSplit = /\/+/
-
-minimatch.monkeyPatch = monkeyPatch
-function monkeyPatch () {
-  var desc = Object.getOwnPropertyDescriptor(String.prototype, "match")
-  var orig = desc.value
-  desc.value = function (p) {
-    if (p instanceof Minimatch) return p.match(this)
-    return orig.call(this, p)
-  }
-  Object.defineProperty(String.prototype, desc)
-}
-
-minimatch.filter = filter
-function filter (pattern, options) {
-  options = options || {}
-  return function (p, i, list) {
-    return minimatch(p, pattern, options)
-  }
-}
-
-function ext (a, b) {
-  a = a || {}
-  b = b || {}
-  var t = {}
-  Object.keys(b).forEach(function (k) {
-    t[k] = b[k]
-  })
-  Object.keys(a).forEach(function (k) {
-    t[k] = a[k]
-  })
-  return t
-}
-
-minimatch.defaults = function (def) {
-  if (!def || !Object.keys(def).length) return minimatch
-
-  var orig = minimatch
-
-  var m = function minimatch (p, pattern, options) {
-    return orig.minimatch(p, pattern, ext(def, options))
-  }
-
-  m.Minimatch = function Minimatch (pattern, options) {
-    return new orig.Minimatch(pattern, ext(def, options))
-  }
-
-  return m
-}
-
-Minimatch.defaults = function (def) {
-  if (!def || !Object.keys(def).length) return Minimatch
-  return minimatch.defaults(def).Minimatch
-}
-
-
-function minimatch (p, pattern, options) {
-  if (typeof pattern !== "string") {
-    throw new TypeError("glob pattern string required")
-  }
-
-  if (!options) options = {}
-
-  // shortcut: comments match nothing.
-  if (!options.nocomment && pattern.charAt(0) === "#") {
-    return false
-  }
-
-  // "" only matches ""
-  if (pattern.trim() === "") return p === ""
-
-  return new Minimatch(pattern, options).match(p)
-}
-
-function Minimatch (pattern, options) {
-  if (!(this instanceof Minimatch)) {
-    return new Minimatch(pattern, options, cache)
-  }
-
-  if (typeof pattern !== "string") {
-    throw new TypeError("glob pattern string required")
-  }
-
-  if (!options) options = {}
-  pattern = pattern.trim()
-
-  // windows: need to use /, not \
-  // On other platforms, \ is a valid (albeit bad) filename char.
-  if (platform === "win32") {
-    pattern = pattern.split("\\").join("/")
-  }
-
-  // lru storage.
-  // these things aren't particularly big, but walking down the string
-  // and turning it into a regexp can get pretty costly.
-  var cacheKey = pattern + "\n" + sigmund(options)
-  var cached = minimatch.cache.get(cacheKey)
-  if (cached) return cached
-  minimatch.cache.set(cacheKey, this)
-
-  this.options = options
-  this.set = []
-  this.pattern = pattern
-  this.regexp = null
-  this.negate = false
-  this.comment = false
-  this.empty = false
-
-  // make the set of regexps etc.
-  this.make()
-}
-
-Minimatch.prototype.make = make
-function make () {
-  // don't do it more than once.
-  if (this._made) return
-
-  var pattern = this.pattern
-  var options = this.options
-
-  // empty patterns and comments match nothing.
-  if (!options.nocomment && pattern.charAt(0) === "#") {
-    this.comment = true
-    return
-  }
-  if (!pattern) {
-    this.empty = true
-    return
-  }
-
-  // step 1: figure out negation, etc.
-  this.parseNegate()
-
-  // step 2: expand braces
-  var set = this.globSet = this.braceExpand()
-
-  if (options.debug) console.error(this.pattern, set)
-
-  // step 3: now we have a set, so turn each one into a series of path-portion
-  // matching patterns.
-  // These will be regexps, except in the case of "**", which is
-  // set to the GLOBSTAR object for globstar behavior,
-  // and will not contain any / characters
-  set = this.globParts = set.map(function (s) {
-    return s.split(slashSplit)
-  })
-
-  if (options.debug) console.error(this.pattern, set)
-
-  // glob --> regexps
-  set = set.map(function (s, si, set) {
-    return s.map(this.parse, this)
-  }, this)
-
-  if (options.debug) console.error(this.pattern, set)
-
-  // filter out everything that didn't compile properly.
-  set = set.filter(function (s) {
-    return -1 === s.indexOf(false)
-  })
-
-  if (options.debug) console.error(this.pattern, set)
-
-  this.set = set
-}
-
-Minimatch.prototype.parseNegate = parseNegate
-function parseNegate () {
-  var pattern = this.pattern
-    , negate = false
-    , options = this.options
-    , negateOffset = 0
-
-  if (options.nonegate) return
-
-  for ( var i = 0, l = pattern.length
-      ; i < l && pattern.charAt(i) === "!"
-      ; i ++) {
-    negate = !negate
-    negateOffset ++
-  }
-
-  if (negateOffset) this.pattern = pattern.substr(negateOffset)
-  this.negate = negate
-}
-
-// Brace expansion:
-// a{b,c}d -> abd acd
-// a{b,}c -> abc ac
-// a{0..3}d -> a0d a1d a2d a3d
-// a{b,c{d,e}f}g -> abg acdfg acefg
-// a{b,c}d{e,f}g -> abdeg abdfg acdeg acdfg
-//
-// Invalid sets are not expanded.
-// a{2..}b -> a{2..}b
-// a{b}c -> a{b}c
-minimatch.braceExpand = function (pattern, options) {
-  return new Minimatch(pattern, options).braceExpand()
-}
-
-Minimatch.prototype.braceExpand = braceExpand
-function braceExpand (pattern, options) {
-  options = options || this.options
-  pattern = typeof pattern === "undefined"
-    ? this.pattern : pattern
-
-  if (typeof pattern === "undefined") {
-    throw new Error("undefined pattern")
-  }
-
-  if (options.nobrace ||
-      !pattern.match(/\{.*\}/)) {
-    // shortcut. no need to expand.
-    return [pattern]
-  }
-
-  var escaping = false
-
-  // examples and comments refer to this crazy pattern:
-  // a{b,c{d,e},{f,g}h}x{y,z}
-  // expected:
-  // abxy
-  // abxz
-  // acdxy
-  // acdxz
-  // acexy
-  // acexz
-  // afhxy
-  // afhxz
-  // aghxy
-  // aghxz
-
-  // everything before the first \{ is just a prefix.
-  // So, we pluck that off, and work with the rest,
-  // and then prepend it to everything we find.
-  if (pattern.charAt(0) !== "{") {
-    // console.error(pattern)
-    var prefix = null
-    for (var i = 0, l = pattern.length; i < l; i ++) {
-      var c = pattern.charAt(i)
-      // console.error(i, c)
-      if (c === "\\") {
-        escaping = !escaping
-      } else if (c === "{" && !escaping) {
-        prefix = pattern.substr(0, i)
-        break
-      }
-    }
-
-    // actually no sets, all { were escaped.
-    if (prefix === null) {
-      // console.error("no sets")
-      return [pattern]
-    }
-
-    var tail = braceExpand(pattern.substr(i), options)
-    return tail.map(function (t) {
-      return prefix + t
-    })
-  }
-
-  // now we have something like:
-  // {b,c{d,e},{f,g}h}x{y,z}
-  // walk through the set, expanding each part, until
-  // the set ends.  then, we'll expand the suffix.
-  // If the set only has a single member, then we'll put the {} back
-
-  // first, handle numeric sets, since they're easier
-  var numset = pattern.match(/^\{(-?[0-9]+)\.\.(-?[0-9]+)\}/)
-  if (numset) {
-    // console.error("numset", numset[1], numset[2])
-    var suf = braceExpand(pattern.substr(numset[0].length), options)
-      , start = +numset[1]
-      , end = +numset[2]
-      , inc = start > end ? -1 : 1
-      , set = []
-    for (var i = start; i != (end + inc); i += inc) {
-      // append all the suffixes
-      for (var ii = 0, ll = suf.length; ii < ll; ii ++) {
-        set.push(i + suf[ii])
-      }
-    }
-    return set
-  }
-
-  // ok, walk through the set
-  // We hope, somewhat optimistically, that there
-  // will be a } at the end.
-  // If the closing brace isn't found, then the pattern is
-  // interpreted as braceExpand("\\" + pattern) so that
-  // the leading \{ will be interpreted literally.
-  var i = 1 // skip the \{
-    , depth = 1
-    , set = []
-    , member = ""
-    , sawEnd = false
-    , escaping = false
-
-  function addMember () {
-    set.push(member)
-    member = ""
-  }
-
-  // console.error("Entering for")
-  FOR: for (i = 1, l = pattern.length; i < l; i ++) {
-    var c = pattern.charAt(i)
-    // console.error("", i, c)
-
-    if (escaping) {
-      escaping = false
-      member += "\\" + c
-    } else {
-      switch (c) {
-        case "\\":
-          escaping = true
-          continue
-
-        case "{":
-          depth ++
-          member += "{"
-          continue
-
-        case "}":
-          depth --
-          // if this closes the actual set, then we're done
-          if (depth === 0) {
-            addMember()
-            // pluck off the close-brace
-            i ++
-            break FOR
-          } else {
-            member += c
-            continue
-          }
-
-        case ",":
-          if (depth === 1) {
-            addMember()
-          } else {
-            member += c
-          }
-          continue
-
-        default:
-          member += c
-          continue
-      } // switch
-    } // else
-  } // for
-
-  // now we've either finished the set, and the suffix is
-  // pattern.substr(i), or we have *not* closed the set,
-  // and need to escape the leading brace
-  if (depth !== 0) {
-    // console.error("didn't close", pattern)
-    return braceExpand("\\" + pattern, options)
-  }
-
-  // x{y,z} -> ["xy", "xz"]
-  // console.error("set", set)
-  // console.error("suffix", pattern.substr(i))
-  var suf = braceExpand(pattern.substr(i), options)
-  // ["b", "c{d,e}","{f,g}h"] ->
-  //   [["b"], ["cd", "ce"], ["fh", "gh"]]
-  var addBraces = set.length === 1
-  // console.error("set pre-expanded", set)
-  set = set.map(function (p) {
-    return braceExpand(p, options)
-  })
-  // console.error("set expanded", set)
-
-
-  // [["b"], ["cd", "ce"], ["fh", "gh"]] ->
-  //   ["b", "cd", "ce", "fh", "gh"]
-  set = set.reduce(function (l, r) {
-    return l.concat(r)
-  })
-
-  if (addBraces) {
-    set = set.map(function (s) {
-      return "{" + s + "}"
-    })
-  }
-
-  // now attach the suffixes.
-  var ret = []
-  for (var i = 0, l = set.length; i < l; i ++) {
-    for (var ii = 0, ll = suf.length; ii < ll; ii ++) {
-      ret.push(set[i] + suf[ii])
-    }
-  }
-  return ret
-}
-
-// parse a component of the expanded set.
-// At this point, no pattern may contain "/" in it
-// so we're going to return a 2d array, where each entry is the full
-// pattern, split on '/', and then turned into a regular expression.
-// A regexp is made at the end which joins each array with an
-// escaped /, and another full one which joins each regexp with |.
-//
-// Following the lead of Bash 4.1, note that "**" only has special meaning
-// when it is the *only* thing in a path portion.  Otherwise, any series
-// of * is equivalent to a single *.  Globstar behavior is enabled by
-// default, and can be disabled by setting options.noglobstar.
-Minimatch.prototype.parse = parse
-var SUBPARSE = {}
-function parse (pattern, isSub) {
-  var options = this.options
-
-  // shortcuts
-  if (!options.noglobstar && pattern === "**") return GLOBSTAR
-  if (pattern === "") return ""
-
-  var re = ""
-    , hasMagic = !!options.nocase
-    , escaping = false
-    // ? => one single character
-    , patternListStack = []
-    , plType
-    , stateChar
-    , inClass = false
-    , reClassStart = -1
-    , classStart = -1
-    // . and .. never match anything that doesn't start with .,
-    // even when options.dot is set.
-    , patternStart = pattern.charAt(0) === "." ? "" // anything
-      // not (start or / followed by . or .. followed by / or end)
-      : options.dot ? "(?!(?:^|\\\/)\\.{1,2}(?:$|\\\/))"
-      : "(?!\\.)"
-
-  function clearStateChar () {
-    if (stateChar) {
-      // we had some state-tracking character
-      // that wasn't consumed by this pass.
-      switch (stateChar) {
-        case "*":
-          re += star
-          hasMagic = true
-          break
-        case "?":
-          re += qmark
-          hasMagic = true
-          break
-        default:
-          re += "\\"+stateChar
-          break
-      }
-      stateChar = false
-    }
-  }
-
-  for ( var i = 0, len = pattern.length, c
-      ; (i < len) && (c = pattern.charAt(i))
-      ; i ++ ) {
-
-    if (options.debug) {
-      console.error("%s\t%s %s %j", pattern, i, re, c)
-    }
-
-    // skip over any that are escaped.
-    if (escaping && reSpecials[c]) {
-      re += "\\" + c
-      escaping = false
-      continue
-    }
-
-    SWITCH: switch (c) {
-      case "/":
-        // completely not allowed, even escaped.
-        // Should already be path-split by now.
-        return false
-
-      case "\\":
-        clearStateChar()
-        escaping = true
-        continue
-
-      // the various stateChar values
-      // for the "extglob" stuff.
-      case "?":
-      case "*":
-      case "+":
-      case "@":
-      case "!":
-        if (options.debug) {
-          console.error("%s\t%s %s %j <-- stateChar", pattern, i, re, c)
-        }
-
-        // all of those are literals inside a class, except that
-        // the glob [!a] means [^a] in regexp
-        if (inClass) {
-          if (c === "!" && i === classStart + 1) c = "^"
-          re += c
-          continue
-        }
-
-        // if we already have a stateChar, then it means
-        // that there was something like ** or +? in there.
-        // Handle the stateChar, then proceed with this one.
-        clearStateChar()
-        stateChar = c
-        // if extglob is disabled, then +(asdf|foo) isn't a thing.
-        // just clear the statechar *now*, rather than even diving into
-        // the patternList stuff.
-        if (options.noext) clearStateChar()
-        continue
-
-      case "(":
-        if (inClass) {
-          re += "("
-          continue
-        }
-
-        if (!stateChar) {
-          re += "\\("
-          continue
-        }
-
-        plType = stateChar
-        patternListStack.push({ type: plType
-                              , start: i - 1
-                              , reStart: re.length })
-        // negation is (?:(?!js)[^/]*)
-        re += stateChar === "!" ? "(?:(?!" : "(?:"
-        stateChar = false
-        continue
-
-      case ")":
-        if (inClass || !patternListStack.length) {
-          re += "\\)"
-          continue
-        }
-
-        hasMagic = true
-        re += ")"
-        plType = patternListStack.pop().type
-        // negation is (?:(?!js)[^/]*)
-        // The others are (?:<pattern>)<type>
-        switch (plType) {
-          case "!":
-            re += "[^/]*?)"
-            break
-          case "?":
-          case "+":
-          case "*": re += plType
-          case "@": break // the default anyway
-        }
-        continue
-
-      case "|":
-        if (inClass || !patternListStack.length || escaping) {
-          re += "\\|"
-          escaping = false
-          continue
-        }
-
-        re += "|"
-        continue
-
-      // these are mostly the same in regexp and glob
-      case "[":
-        // swallow any state-tracking char before the [
-        clearStateChar()
-
-        if (inClass) {
-          re += "\\" + c
-          continue
-        }
-
-        inClass = true
-        classStart = i
-        reClassStart = re.length
-        re += c
-        continue
-
-      case "]":
-        //  a right bracket shall lose its special
-        //  meaning and represent itself in
-        //  a bracket expression if it occurs
-        //  first in the list.  -- POSIX.2 2.8.3.2
-        if (i === classStart + 1 || !inClass) {
-          re += "\\" + c
-          escaping = false
-          continue
-        }
-
-        // finish up the class.
-        hasMagic = true
-        inClass = false
-        re += c
-        continue
-
-      default:
-        // swallow any state char that wasn't consumed
-        clearStateChar()
-
-        if (escaping) {
-          // no need
-          escaping = false
-        } else if (reSpecials[c]
-                   && !(c === "^" && inClass)) {
-          re += "\\"
-        }
-
-        re += c
-
-    } // switch
-  } // for
-
-
-  // handle the case where we left a class open.
-  // "[abc" is valid, equivalent to "\[abc"
-  if (inClass) {
-    // split where the last [ was, and escape it
-    // this is a huge pita.  We now have to re-walk
-    // the contents of the would-be class to re-translate
-    // any characters that were passed through as-is
-    var cs = pattern.substr(classStart + 1)
-      , sp = this.parse(cs, SUBPARSE)
-    re = re.substr(0, reClassStart) + "\\[" + sp[0]
-    hasMagic = hasMagic || sp[1]
-  }
-
-  // handle the case where we had a +( thing at the *end*
-  // of the pattern.
-  // each pattern list stack adds 3 chars, and we need to go through
-  // and escape any | chars that were passed through as-is for the regexp.
-  // Go through and escape them, taking care not to double-escape any
-  // | chars that were already escaped.
-  var pl
-  while (pl = patternListStack.pop()) {
-    var tail = re.slice(pl.reStart + 3)
-    // maybe some even number of \, then maybe 1 \, followed by a |
-    tail = tail.replace(/((?:\\{2})*)(\\?)\|/g, function (_, $1, $2) {
-      if (!$2) {
-        // the | isn't already escaped, so escape it.
-        $2 = "\\"
-      }
-
-      // need to escape all those slashes *again*, without escaping the
-      // one that we need for escaping the | character.  As it works out,
-      // escaping an even number of slashes can be done by simply repeating
-      // it exactly after itself.  That's why this trick works.
-      //
-      // I am sorry that you have to see this.
-      return $1 + $1 + $2 + "|"
-    })
-
-    // console.error("tail=%j\n   %s", tail, tail)
-    var t = pl.type === "*" ? star
-          : pl.type === "?" ? qmark
-          : "\\" + pl.type
-
-    hasMagic = true
-    re = re.slice(0, pl.reStart)
-       + t + "\\("
-       + tail
-  }
-
-  // handle trailing things that only matter at the very end.
-  clearStateChar()
-  if (escaping) {
-    // trailing \\
-    re += "\\\\"
-  }
-
-  // only need to apply the nodot start if the re starts with
-  // something that could conceivably capture a dot
-  var addPatternStart = false
-  switch (re.charAt(0)) {
-    case ".":
-    case "[":
-    case "(": addPatternStart = true
-  }
-
-  // if the re is not "" at this point, then we need to make sure
-  // it doesn't match against an empty path part.
-  // Otherwise a/* will match a/, which it should not.
-  if (re !== "" && hasMagic) re = "(?=.)" + re
-
-  if (addPatternStart) re = patternStart + re
-
-  // parsing just a piece of a larger pattern.
-  if (isSub === SUBPARSE) {
-    return [ re, hasMagic ]
-  }
-
-  // skip the regexp for non-magical patterns
-  // unescape anything in it, though, so that it'll be
-  // an exact match against a file etc.
-  if (!hasMagic) {
-    return globUnescape(pattern)
-  }
-
-  var flags = options.nocase ? "i" : ""
-    , regExp = new RegExp("^" + re + "$", flags)
-
-  regExp._glob = pattern
-  regExp._src = re
-
-  return regExp
-}
-
-minimatch.makeRe = function (pattern, options) {
-  return new Minimatch(pattern, options || {}).makeRe()
-}
-
-Minimatch.prototype.makeRe = makeRe
-function makeRe () {
-  if (this.regexp || this.regexp === false) return this.regexp
-
-  // at this point, this.set is a 2d array of partial
-  // pattern strings, or "**".
-  //
-  // It's better to use .match().  This function shouldn't
-  // be used, really, but it's pretty convenient sometimes,
-  // when you just want to work with a regex.
-  var set = this.set
-
-  if (!set.length) return this.regexp = false
-  var options = this.options
-
-  var twoStar = options.noglobstar ? star
-      : options.dot ? twoStarDot
-      : twoStarNoDot
-    , flags = options.nocase ? "i" : ""
-
-  var re = set.map(function (pattern) {
-    return pattern.map(function (p) {
-      return (p === GLOBSTAR) ? twoStar
-           : (typeof p === "string") ? regExpEscape(p)
-           : p._src
-    }).join("\\\/")
-  }).join("|")
-
-  // must match entire pattern
-  // ending in a * or ** will make it less strict.
-  re = "^(?:" + re + ")$"
-
-  // can match anything, as long as it's not this.
-  if (this.negate) re = "^(?!" + re + ").*$"
-
-  try {
-    return this.regexp = new RegExp(re, flags)
-  } catch (ex) {
-    return this.regexp = false
-  }
-}
-
-minimatch.match = function (list, pattern, options) {
-  var mm = new Minimatch(pattern, options)
-  list = list.filter(function (f) {
-    return mm.match(f)
-  })
-  if (options.nonull && !list.length) {
-    list.push(pattern)
-  }
-  return list
-}
-
-Minimatch.prototype.match = match
-function match (f, partial) {
-  // console.error("match", f, this.pattern)
-  // short-circuit in the case of busted things.
-  // comments, etc.
-  if (this.comment) return false
-  if (this.empty) return f === ""
-
-  if (f === "/" && partial) return true
-
-  var options = this.options
-
-  // windows: need to use /, not \
-  // On other platforms, \ is a valid (albeit bad) filename char.
-  if (platform === "win32") {
-    f = f.split("\\").join("/")
-  }
-
-  // treat the test path as a set of pathparts.
-  f = f.split(slashSplit)
-  if (options.debug) {
-    console.error(this.pattern, "split", f)
-  }
-
-  // just ONE of the pattern sets in this.set needs to match
-  // in order for it to be valid.  If negating, then just one
-  // match means that we have failed.
-  // Either way, return on the first hit.
-
-  var set = this.set
-  // console.error(this.pattern, "set", set)
-
-  for (var i = 0, l = set.length; i < l; i ++) {
-    var pattern = set[i]
-    var hit = this.matchOne(f, pattern, partial)
-    if (hit) {
-      if (options.flipNegate) return true
-      return !this.negate
-    }
-  }
-
-  // didn't get any hits.  this is success if it's a negative
-  // pattern, failure otherwise.
-  if (options.flipNegate) return false
-  return this.negate
-}
-
-// set partial to true to test if, for example,
-// "/a/b" matches the start of "/*/b/*/d"
-// Partial means, if you run out of file before you run
-// out of pattern, then that's fine, as long as all
-// the parts match.
-Minimatch.prototype.matchOne = function (file, pattern, partial) {
-  var options = this.options
-
-  if (options.debug) {
-    console.error("matchOne",
-                  { "this": this
-                  , file: file
-                  , pattern: pattern })
-  }
-
-  if (options.matchBase && pattern.length === 1) {
-    file = path.basename(file.join("/")).split("/")
-  }
-
-  if (options.debug) {
-    console.error("matchOne", file.length, pattern.length)
-  }
-
-  for ( var fi = 0
-          , pi = 0
-          , fl = file.length
-          , pl = pattern.length
-      ; (fi < fl) && (pi < pl)
-      ; fi ++, pi ++ ) {
-
-    if (options.debug) {
-      console.error("matchOne loop")
-    }
-    var p = pattern[pi]
-      , f = file[fi]
-
-    if (options.debug) {
-      console.error(pattern, p, f)
-    }
-
-    // should be impossible.
-    // some invalid regexp stuff in the set.
-    if (p === false) return false
-
-    if (p === GLOBSTAR) {
-      if (options.debug)
-        console.error('GLOBSTAR', [pattern, p, f])
-
-      // "**"
-      // a/**/b/**/c would match the following:
-      // a/b/x/y/z/c
-      // a/x/y/z/b/c
-      // a/b/x/b/x/c
-      // a/b/c
-      // To do this, take the rest of the pattern after
-      // the **, and see if it would match the file remainder.
-      // If so, return success.
-      // If not, the ** "swallows" a segment, and try again.
-      // This is recursively awful.
-      //
-      // a/**/b/**/c matching a/b/x/y/z/c
-      // - a matches a
-      // - doublestar
-      //   - matchOne(b/x/y/z/c, b/**/c)
-      //     - b matches b
-      //     - doublestar
-      //       - matchOne(x/y/z/c, c) -> no
-      //       - matchOne(y/z/c, c) -> no
-      //       - matchOne(z/c, c) -> no
-      //       - matchOne(c, c) yes, hit
-      var fr = fi
-        , pr = pi + 1
-      if (pr === pl) {
-        if (options.debug)
-          console.error('** at the end')
-        // a ** at the end will just swallow the rest.
-        // We have found a match.
-        // however, it will not swallow /.x, unless
-        // options.dot is set.
-        // . and .. are *never* matched by **, for explosively
-        // exponential reasons.
-        for ( ; fi < fl; fi ++) {
-          if (file[fi] === "." || file[fi] === ".." ||
-              (!options.dot && file[fi].charAt(0) === ".")) return false
-        }
-        return true
-      }
-
-      // ok, let's see if we can swallow whatever we can.
-      WHILE: while (fr < fl) {
-        var swallowee = file[fr]
-
-        if (options.debug) {
-          console.error('\nglobstar while',
-                        file, fr, pattern, pr, swallowee)
-        }
-
-        // XXX remove this slice.  Just pass the start index.
-        if (this.matchOne(file.slice(fr), pattern.slice(pr), partial)) {
-          if (options.debug)
-            console.error('globstar found match!', fr, fl, swallowee)
-          // found a match.
-          return true
-        } else {
-          // can't swallow "." or ".." ever.
-          // can only swallow ".foo" when explicitly asked.
-          if (swallowee === "." || swallowee === ".." ||
-              (!options.dot && swallowee.charAt(0) === ".")) {
-            if (options.debug)
-              console.error("dot detected!", file, fr, pattern, pr)
-            break WHILE
-          }
-
-          // ** swallows a segment, and continue.
-          if (options.debug)
-            console.error('globstar swallow a segment, and continue')
-          fr ++
-        }
-      }
-      // no match was found.
-      // However, in partial mode, we can't say this is necessarily over.
-      // If there's more *pattern* left, then 
-      if (partial) {
-        // ran out of file
-        // console.error("\n>>> no match, partial?", file, fr, pattern, pr)
-        if (fr === fl) return true
-      }
-      return false
-    }
-
-    // something other than **
-    // non-magic patterns just have to match exactly
-    // patterns with magic have been turned into regexps.
-    var hit
-    if (typeof p === "string") {
-      if (options.nocase) {
-        hit = f.toLowerCase() === p.toLowerCase()
-      } else {
-        hit = f === p
-      }
-      if (options.debug) {
-        console.error("string match", p, f, hit)
-      }
-    } else {
-      hit = f.match(p)
-      if (options.debug) {
-        console.error("pattern match", p, f, hit)
-      }
-    }
-
-    if (!hit) return false
-  }
-
-  // Note: ending in / means that we'll get a final ""
-  // at the end of the pattern.  This can only match a
-  // corresponding "" at the end of the file.
-  // If the file ends in /, then it can only match a
-  // a pattern that ends in /, unless the pattern just
-  // doesn't have any more for it. But, a/b/ should *not*
-  // match "a/b/*", even though "" matches against the
-  // [^/]*? pattern, except in partial mode, where it might
-  // simply not be reached yet.
-  // However, a/b/ should still satisfy a/*
-
-  // now either we fell off the end of the pattern, or we're done.
-  if (fi === fl && pi === pl) {
-    // ran out of pattern and filename at the same time.
-    // an exact hit!
-    return true
-  } else if (fi === fl) {
-    // ran out of file, but still had pattern left.
-    // this is ok if we're doing the match as part of
-    // a glob fs traversal.
-    return partial
-  } else if (pi === pl) {
-    // ran out of pattern, still have file left.
-    // this is only acceptable if we're on the very last
-    // empty segment of a file with a trailing slash.
-    // a/* should match a/b/
-    var emptyFileEnd = (fi === fl - 1) && (file[fi] === "")
-    return emptyFileEnd
-  }
-
-  // should be unreachable.
-  throw new Error("wtf?")
-}
-
-
-// replace stuff like \* with *
-function globUnescape (s) {
-  return s.replace(/\\(.)/g, "$1")
-}
-
-
-function regExpEscape (s) {
-  return s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, "\\$&")
-}
-
-})( typeof require === "function" ? require : null,
-    this,
-    typeof module === "object" ? module : null,
-    typeof process === "object" ? process.platform : "win32"
-  )
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/node_modules/sigmund/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/node_modules/sigmund/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-# sigmund
-
-Quick and dirty signatures for Objects.
-
-This is like a much faster `deepEquals` comparison, which returns a
-string key suitable for caches and the like.
-
-## Usage
-
-```javascript
-function doSomething (someObj) {
-  var key = sigmund(someObj, maxDepth) // max depth defaults to 10
-  var cached = cache.get(key)
-  if (cached) return cached)
-
-  var result = expensiveCalculation(someObj)
-  cache.set(key, result)
-  return result
-}
-```
-
-The resulting key will be as unique and reproducible as calling
-`JSON.stringify` or `util.inspect` on the object, but is much faster.
-In order to achieve this speed, some differences are glossed over.
-For example, the object `{0:'foo'}` will be treated identically to the
-array `['foo']`.
-
-Also, just as there is no way to summon the soul from the scribblings
-of a cocain-addled psychoanalyst, there is no way to revive the object
-from the signature string that sigmund gives you.  In fact, it's
-barely even readable.
-
-As with `sys.inspect` and `JSON.stringify`, larger objects will
-produce larger signature strings.
-
-Because sigmund is a bit less strict than the more thorough
-alternatives, the strings will be shorter, and also there is a
-slightly higher chance for collisions.  For example, these objects
-have the same signature:
-
-    var obj1 = {a:'b',c:/def/,g:['h','i',{j:'',k:'l'}]}
-    var obj2 = {a:'b',c:'/def/',g:['h','i','{jkl']}
-
-Like a good Freudian, sigmund is most effective when you already have
-some understanding of what you're looking for.  It can help you help
-yourself, but you must be willing to do some work as well.
-
-Cycles are handled, and cyclical objects are silently omitted (though
-the key is included in the signature output.)
-
-The second argument is the maximum depth, which defaults to 10,
-because that is the maximum object traversal depth covered by most
-insurance carriers.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/node_modules/sigmund/bench.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,283 +0,0 @@
-// different ways to id objects
-// use a req/res pair, since it's crazy deep and cyclical
-
-// sparseFE10 and sigmund are usually pretty close, which is to be expected,
-// since they are essentially the same algorithm, except that sigmund handles
-// regular expression objects properly.
-
-
-var http = require('http')
-var util = require('util')
-var sigmund = require('./sigmund.js')
-var sreq, sres, creq, cres, test
-
-http.createServer(function (q, s) {
-  sreq = q
-  sres = s
-  sres.end('ok')
-  this.close(function () { setTimeout(function () {
-    start()
-  }, 200) })
-}).listen(1337, function () {
-  creq = http.get({ port: 1337 })
-  creq.on('response', function (s) { cres = s })
-})
-
-function start () {
-  test = [sreq, sres, creq, cres]
-  // test = sreq
-  // sreq.sres = sres
-  // sreq.creq = creq
-  // sreq.cres = cres
-
-  for (var i in exports.compare) {
-    console.log(i)
-    var hash = exports.compare[i]()
-    console.log(hash)
-    console.log(hash.length)
-    console.log('')
-  }
-
-  require('bench').runMain()
-}
-
-function customWs (obj, md, d) {
-  d = d || 0
-  var to = typeof obj
-  if (to === 'undefined' || to === 'function' || to === null) return ''
-  if (d > md || !obj || to !== 'object') return ('' + obj).replace(/[\n ]+/g, '')
-
-  if (Array.isArray(obj)) {
-    return obj.map(function (i, _, __) {
-      return customWs(i, md, d + 1)
-    }).reduce(function (a, b) { return a + b }, '')
-  }
-
-  var keys = Object.keys(obj)
-  return keys.map(function (k, _, __) {
-    return k + ':' + customWs(obj[k], md, d + 1)
-  }).reduce(function (a, b) { return a + b }, '')
-}
-
-function custom (obj, md, d) {
-  d = d || 0
-  var to = typeof obj
-  if (to === 'undefined' || to === 'function' || to === null) return ''
-  if (d > md || !obj || to !== 'object') return '' + obj
-
-  if (Array.isArray(obj)) {
-    return obj.map(function (i, _, __) {
-      return custom(i, md, d + 1)
-    }).reduce(function (a, b) { return a + b }, '')
-  }
-
-  var keys = Object.keys(obj)
-  return keys.map(function (k, _, __) {
-    return k + ':' + custom(obj[k], md, d + 1)
-  }).reduce(function (a, b) { return a + b }, '')
-}
-
-function sparseFE2 (obj, maxDepth) {
-  var seen = []
-  var soFar = ''
-  function ch (v, depth) {
-    if (depth > maxDepth) return
-    if (typeof v === 'function' || typeof v === 'undefined') return
-    if (typeof v !== 'object' || !v) {
-      soFar += v
-      return
-    }
-    if (seen.indexOf(v) !== -1 || depth === maxDepth) return
-    seen.push(v)
-    soFar += '{'
-    Object.keys(v).forEach(function (k, _, __) {
-      // pseudo-private values.  skip those.
-      if (k.charAt(0) === '_') return
-      var to = typeof v[k]
-      if (to === 'function' || to === 'undefined') return
-      soFar += k + ':'
-      ch(v[k], depth + 1)
-    })
-    soFar += '}'
-  }
-  ch(obj, 0)
-  return soFar
-}
-
-function sparseFE (obj, maxDepth) {
-  var seen = []
-  var soFar = ''
-  function ch (v, depth) {
-    if (depth > maxDepth) return
-    if (typeof v === 'function' || typeof v === 'undefined') return
-    if (typeof v !== 'object' || !v) {
-      soFar += v
-      return
-    }
-    if (seen.indexOf(v) !== -1 || depth === maxDepth) return
-    seen.push(v)
-    soFar += '{'
-    Object.keys(v).forEach(function (k, _, __) {
-      // pseudo-private values.  skip those.
-      if (k.charAt(0) === '_') return
-      var to = typeof v[k]
-      if (to === 'function' || to === 'undefined') return
-      soFar += k
-      ch(v[k], depth + 1)
-    })
-  }
-  ch(obj, 0)
-  return soFar
-}
-
-function sparse (obj, maxDepth) {
-  var seen = []
-  var soFar = ''
-  function ch (v, depth) {
-    if (depth > maxDepth) return
-    if (typeof v === 'function' || typeof v === 'undefined') return
-    if (typeof v !== 'object' || !v) {
-      soFar += v
-      return
-    }
-    if (seen.indexOf(v) !== -1 || depth === maxDepth) return
-    seen.push(v)
-    soFar += '{'
-    for (var k in v) {
-      // pseudo-private values.  skip those.
-      if (k.charAt(0) === '_') continue
-      var to = typeof v[k]
-      if (to === 'function' || to === 'undefined') continue
-      soFar += k
-      ch(v[k], depth + 1)
-    }
-  }
-  ch(obj, 0)
-  return soFar
-}
-
-function noCommas (obj, maxDepth) {
-  var seen = []
-  var soFar = ''
-  function ch (v, depth) {
-    if (depth > maxDepth) return
-    if (typeof v === 'function' || typeof v === 'undefined') return
-    if (typeof v !== 'object' || !v) {
-      soFar += v
-      return
-    }
-    if (seen.indexOf(v) !== -1 || depth === maxDepth) return
-    seen.push(v)
-    soFar += '{'
-    for (var k in v) {
-      // pseudo-private values.  skip those.
-      if (k.charAt(0) === '_') continue
-      var to = typeof v[k]
-      if (to === 'function' || to === 'undefined') continue
-      soFar += k + ':'
-      ch(v[k], depth + 1)
-    }
-    soFar += '}'
-  }
-  ch(obj, 0)
-  return soFar
-}
-
-
-function flatten (obj, maxDepth) {
-  var seen = []
-  var soFar = ''
-  function ch (v, depth) {
-    if (depth > maxDepth) return
-    if (typeof v === 'function' || typeof v === 'undefined') return
-    if (typeof v !== 'object' || !v) {
-      soFar += v
-      return
-    }
-    if (seen.indexOf(v) !== -1 || depth === maxDepth) return
-    seen.push(v)
-    soFar += '{'
-    for (var k in v) {
-      // pseudo-private values.  skip those.
-      if (k.charAt(0) === '_') continue
-      var to = typeof v[k]
-      if (to === 'function' || to === 'undefined') continue
-      soFar += k + ':'
-      ch(v[k], depth + 1)
-      soFar += ','
-    }
-    soFar += '}'
-  }
-  ch(obj, 0)
-  return soFar
-}
-
-exports.compare =
-{
-  // 'custom 2': function () {
-  //   return custom(test, 2, 0)
-  // },
-  // 'customWs 2': function () {
-  //   return customWs(test, 2, 0)
-  // },
-  'JSON.stringify (guarded)': function () {
-    var seen = []
-    return JSON.stringify(test, function (k, v) {
-      if (typeof v !== 'object' || !v) return v
-      if (seen.indexOf(v) !== -1) return undefined
-      seen.push(v)
-      return v
-    })
-  },
-
-  'flatten 10': function () {
-    return flatten(test, 10)
-  },
-
-  // 'flattenFE 10': function () {
-  //   return flattenFE(test, 10)
-  // },
-
-  'noCommas 10': function () {
-    return noCommas(test, 10)
-  },
-
-  'sparse 10': function () {
-    return sparse(test, 10)
-  },
-
-  'sparseFE 10': function () {
-    return sparseFE(test, 10)
-  },
-
-  'sparseFE2 10': function () {
-    return sparseFE2(test, 10)
-  },
-
-  sigmund: function() {
-    return sigmund(test, 10)
-  },
-
-
-  // 'util.inspect 1': function () {
-  //   return util.inspect(test, false, 1, false)
-  // },
-  // 'util.inspect undefined': function () {
-  //   util.inspect(test)
-  // },
-  // 'util.inspect 2': function () {
-  //   util.inspect(test, false, 2, false)
-  // },
-  // 'util.inspect 3': function () {
-  //   util.inspect(test, false, 3, false)
-  // },
-  // 'util.inspect 4': function () {
-  //   util.inspect(test, false, 4, false)
-  // },
-  // 'util.inspect Infinity': function () {
-  //   util.inspect(test, false, Infinity, false)
-  // }
-}
-
-/** results
-**/
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/node_modules/sigmund/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,45 +0,0 @@
-{
-  "name": "sigmund",
-  "version": "1.0.0",
-  "description": "Quick and dirty signatures for Objects.",
-  "main": "sigmund.js",
-  "directories": {
-    "test": "test"
-  },
-  "dependencies": {},
-  "devDependencies": {
-    "tap": "~0.3.0"
-  },
-  "scripts": {
-    "test": "tap test/*.js",
-    "bench": "node bench.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/sigmund"
-  },
-  "keywords": [
-    "object",
-    "signature",
-    "key",
-    "data",
-    "psychoanalysis"
-  ],
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "license": "BSD",
-  "readme": "# sigmund\n\nQuick and dirty signatures for Objects.\n\nThis is like a much faster `deepEquals` comparison, which returns a\nstring key suitable for caches and the like.\n\n## Usage\n\n```javascript\nfunction doSomething (someObj) {\n  var key = sigmund(someObj, maxDepth) // max depth defaults to 10\n  var cached = cache.get(key)\n  if (cached) return cached)\n\n  var result = expensiveCalculation(someObj)\n  cache.set(key, result)\n  return result\n}\n```\n\nThe resulting key will be as unique and reproducible as calling\n`JSON.stringify` or `util.inspect` on the object, but is much faster.\nIn order to achieve this speed, some differences are glossed over.\nFor example, the object `{0:'foo'}` will be treated identically to the\narray `['foo']`.\n\nAlso, just as there is no way to summon the soul from the scribblings\nof a cocain-addled psychoanalyst, there is no way to revive the object\nfrom the signature string that sigmund gives you.  In fact, it's\nbarely even readable.\n\nAs with `sys.inspect` and `JSON.stringify`, larger objects will\nproduce larger signature strings.\n\nBecause sigmund is a bit less strict than the more thorough\nalternatives, the strings will be shorter, and also there is a\nslightly higher chance for collisions.  For example, these objects\nhave the same signature:\n\n    var obj1 = {a:'b',c:/def/,g:['h','i',{j:'',k:'l'}]}\n    var obj2 = {a:'b',c:'/def/',g:['h','i','{jkl']}\n\nLike a good Freudian, sigmund is most effective when you already have\nsome understanding of what you're looking for.  It can help you help\nyourself, but you must be willing to do some work as well.\n\nCycles are handled, and cyclical objects are silently omitted (though\nthe key is included in the signature output.)\n\nThe second argument is the maximum depth, which defaults to 10,\nbecause that is the maximum object traversal depth covered by most\ninsurance carriers.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/sigmund/issues"
-  },
-  "_id": "sigmund@1.0.0",
-  "dist": {
-    "shasum": "66a2b3a749ae8b5fb89efd4fcc01dc94fbe02296"
-  },
-  "_from": "sigmund@~1.0.0",
-  "_resolved": "https://registry.npmjs.org/sigmund/-/sigmund-1.0.0.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/node_modules/sigmund/sigmund.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,39 +0,0 @@
-module.exports = sigmund
-function sigmund (subject, maxSessions) {
-    maxSessions = maxSessions || 10;
-    var notes = [];
-    var analysis = '';
-    var RE = RegExp;
-
-    function psychoAnalyze (subject, session) {
-        if (session > maxSessions) return;
-
-        if (typeof subject === 'function' ||
-            typeof subject === 'undefined') {
-            return;
-        }
-
-        if (typeof subject !== 'object' || !subject ||
-            (subject instanceof RE)) {
-            analysis += subject;
-            return;
-        }
-
-        if (notes.indexOf(subject) !== -1 || session === maxSessions) return;
-
-        notes.push(subject);
-        analysis += '{';
-        Object.keys(subject).forEach(function (issue, _, __) {
-            // pseudo-private values.  skip those.
-            if (issue.charAt(0) === '_') return;
-            var to = typeof subject[issue];
-            if (to === 'function' || to === 'undefined') return;
-            analysis += issue;
-            psychoAnalyze(subject[issue], session + 1);
-        });
-    }
-    psychoAnalyze(subject, 0);
-    return analysis;
-}
-
-// vim: set softtabstop=4 shiftwidth=4:
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/minimatch/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,39 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me"
-  },
-  "name": "minimatch",
-  "description": "a glob matcher in javascript",
-  "version": "0.2.12",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/minimatch.git"
-  },
-  "main": "minimatch.js",
-  "scripts": {
-    "test": "tap test"
-  },
-  "engines": {
-    "node": "*"
-  },
-  "dependencies": {
-    "lru-cache": "2",
-    "sigmund": "~1.0.0"
-  },
-  "devDependencies": {
-    "tap": ""
-  },
-  "license": {
-    "type": "MIT",
-    "url": "http://github.com/isaacs/minimatch/raw/master/LICENSE"
-  },
-  "readme": "# minimatch\n\nA minimal matching utility.\n\n[![Build Status](https://secure.travis-ci.org/isaacs/minimatch.png)](http://travis-ci.org/isaacs/minimatch)\n\n\nThis is the matching library used internally by npm.\n\nEventually, it will replace the C binding in node-glob.\n\nIt works by converting glob expressions into JavaScript `RegExp`\nobjects.\n\n## Usage\n\n```javascript\nvar minimatch = require(\"minimatch\")\n\nminimatch(\"bar.foo\", \"*.foo\") // true!\nminimatch(\"bar.foo\", \"*.bar\") // false!\n```\n\n## Features\n\nSupports these glob features:\n\n* Brace Expansion\n* Extended glob matching\n* \"Globstar\" `**` matching\n\nSee:\n\n* `man sh`\n* `man bash`\n* `man 3 fnmatch`\n* `man 5 gitignore`\n\n### Comparisons to other fnmatch/glob implementations\n\nWhile strict compliance with the existing standards is a worthwhile\ngoal, some discrepancies exist between minimatch and other\nimplementations, and are intentional.\n\nIf the pattern starts with a `!` character, then it is negated.  Set the\n`nonegate` flag to suppress this behavior, and treat leading `!`\ncharacters normally.  This is perhaps relevant if you wish to start the\npattern with a negative extglob pattern like `!(a|B)`.  Multiple `!`\ncharacters at the start of a pattern will negate the pattern multiple\ntimes.\n\nIf a pattern starts with `#`, then it is treated as a comment, and\nwill not match anything.  Use `\\#` to match a literal `#` at the\nstart of a line, or set the `nocomment` flag to suppress this behavior.\n\nThe double-star character `**` is supported by default, unless the\n`noglobstar` flag is set.  This is supported in the manner of bsdglob\nand bash 4.1, where `**` only has special significance if it is the only\nthing in a path part.  That is, `a/**/b` will match `a/x/y/b`, but\n`a/**b` will not.  
**Note that this is different from the way that `**` is\nhandled by ruby's `Dir` class.**\n\nIf an escaped pattern has no matches, and the `nonull` flag is set,\nthen minimatch.match returns the pattern as-provided, rather than\ninterpreting the character escapes.  For example,\n`minimatch.match([], \"\\\\*a\\\\?\")` will return `\"\\\\*a\\\\?\"` rather than\n`\"*a?\"`.  This is akin to setting the `nullglob` option in bash, except\nthat it does not resolve escaped pattern characters.\n\nIf brace expansion is not disabled, then it is performed before any\nother interpretation of the glob pattern.  Thus, a pattern like\n`+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded\n**first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are\nchecked for validity.  Since those two are valid, matching proceeds.\n\n\n## Minimatch Class\n\nCreate a minimatch object by instanting the `minimatch.Minimatch` class.\n\n```javascript\nvar Minimatch = require(\"minimatch\").Minimatch\nvar mm = new Minimatch(pattern, options)\n```\n\n### Properties\n\n* `pattern` The original pattern the minimatch object represents.\n* `options` The options supplied to the constructor.\n* `set` A 2-dimensional array of regexp or string expressions.\n  Each row in the\n  array corresponds to a brace-expanded pattern.  Each item in the row\n  corresponds to a single path-part.  For example, the pattern\n  `{a,b/c}/d` would expand to a set of patterns like:\n\n        [ [ a, d ]\n        , [ b, c, d ] ]\n\n    If a portion of the pattern doesn't have any \"magic\" in it\n    (that is, it's something like `\"foo\"` rather than `fo*o?`), then it\n    will be left as a string rather than converted to a regular\n    expression.\n\n* `regexp` Created by the `makeRe` method.  A single regular expression\n  expressing the entire pattern.  
This is useful in cases where you wish\n  to use the pattern somewhat like `fnmatch(3)` with `FNM_PATH` enabled.\n* `negate` True if the pattern is negated.\n* `comment` True if the pattern is a comment.\n* `empty` True if the pattern is `\"\"`.\n\n### Methods\n\n* `makeRe` Generate the `regexp` member if necessary, and return it.\n  Will return `false` if the pattern is invalid.\n* `match(fname)` Return true if the filename matches the pattern, or\n  false otherwise.\n* `matchOne(fileArray, patternArray, partial)` Take a `/`-split\n  filename, and match it against a single row in the `regExpSet`.  This\n  method is mainly for internal use, but is exposed so that it can be\n  used by a glob-walker that needs to avoid excessive filesystem calls.\n\nAll other methods are internal, and will be called as necessary.\n\n## Functions\n\nThe top-level exported function has a `cache` property, which is an LRU\ncache set to store 100 items.  So, calling these methods repeatedly\nwith the same pattern and options will use the same Minimatch object,\nsaving the cost of parsing it multiple times.\n\n### minimatch(path, pattern, options)\n\nMain export.  Tests a path against the pattern using the options.\n\n```javascript\nvar isJS = minimatch(file, \"*.js\", { matchBase: true })\n```\n\n### minimatch.filter(pattern, options)\n\nReturns a function that tests its\nsupplied argument, suitable for use with `Array.filter`.  Example:\n\n```javascript\nvar javascripts = fileList.filter(minimatch.filter(\"*.js\", {matchBase: true}))\n```\n\n### minimatch.match(list, pattern, options)\n\nMatch against the list of\nfiles, in the style of fnmatch or glob.  
If nothing is matched, and\noptions.nonull is set, then return a list containing the pattern itself.\n\n```javascript\nvar javascripts = minimatch.match(fileList, \"*.js\", {matchBase: true}))\n```\n\n### minimatch.makeRe(pattern, options)\n\nMake a regular expression object from the pattern.\n\n## Options\n\nAll options are `false` by default.\n\n### debug\n\nDump a ton of stuff to stderr.\n\n### nobrace\n\nDo not expand `{a,b}` and `{1..3}` brace sets.\n\n### noglobstar\n\nDisable `**` matching against multiple folder names.\n\n### dot\n\nAllow patterns to match filenames starting with a period, even if\nthe pattern does not explicitly have a period in that spot.\n\nNote that by default, `a/**/b` will **not** match `a/.d/b`, unless `dot`\nis set.\n\n### noext\n\nDisable \"extglob\" style patterns like `+(a|b)`.\n\n### nocase\n\nPerform a case-insensitive match.\n\n### nonull\n\nWhen a match is not found by `minimatch.match`, return a list containing\nthe pattern itself.  When set, an empty list is returned if there are\nno matches.\n\n### matchBase\n\nIf set, then patterns without slashes will be matched\nagainst the basename of the path if it contains slashes.  For example,\n`a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`.\n\n### nocomment\n\nSuppress the behavior of treating `#` at the start of a pattern as a\ncomment.\n\n### nonegate\n\nSuppress the behavior of treating a leading `!` character as negation.\n\n### flipNegate\n\nReturns from negate expressions the same as if they were not negated.\n(Ie, true on a hit, false on a miss.)\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/minimatch/issues"
-  },
-  "_id": "minimatch@0.2.12",
-  "_from": "minimatch@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2 +0,0 @@
-node_modules/
-npm-debug.log
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/.travis.yml	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-language: node_js
-node_js:
-  - 0.6
-  - 0.8
-  - 0.9
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,21 +0,0 @@
-Copyright 2010 James Halliday (mail@substack.net)
-
-This project is free software released under the MIT/X11 license:
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/README.markdown	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,63 +0,0 @@
-# mkdirp
-
-Like `mkdir -p`, but in node.js!
-
-[![build status](https://secure.travis-ci.org/substack/node-mkdirp.png)](http://travis-ci.org/substack/node-mkdirp)
-
-# example
-
-## pow.js
-
-```js
-var mkdirp = require('mkdirp');
-    
-mkdirp('/tmp/foo/bar/baz', function (err) {
-    if (err) console.error(err)
-    else console.log('pow!')
-});
-```
-
-Output
-
-```
-pow!
-```
-
-And now /tmp/foo/bar/baz exists, huzzah!
-
-# methods
-
-```js
-var mkdirp = require('mkdirp');
-```
-
-## mkdirp(dir, mode, cb)
-
-Create a new directory and any necessary subdirectories at `dir` with octal
-permission string `mode`.
-
-If `mode` isn't specified, it defaults to `0777 & (~process.umask())`.
-
-`cb(err, made)` fires with the error or the first directory `made`
-that had to be created, if any.
-
-## mkdirp.sync(dir, mode)
-
-Synchronously create a new directory and any necessary subdirectories at `dir`
-with octal permission string `mode`.
-
-If `mode` isn't specified, it defaults to `0777 & (~process.umask())`.
-
-Returns the first directory that had to be created, if any.
-
-# install
-
-With [npm](http://npmjs.org) do:
-
-```
-npm install mkdirp
-```
-
-# license
-
-MIT
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/examples/pow.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-var mkdirp = require('mkdirp');
-
-mkdirp('/tmp/foo/bar/baz', function (err) {
-    if (err) console.error(err)
-    else console.log('pow!')
-});
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,82 +0,0 @@
-var path = require('path');
-var fs = require('fs');
-
-module.exports = mkdirP.mkdirp = mkdirP.mkdirP = mkdirP;
-
-function mkdirP (p, mode, f, made) {
-    if (typeof mode === 'function' || mode === undefined) {
-        f = mode;
-        mode = 0777 & (~process.umask());
-    }
-    if (!made) made = null;
-
-    var cb = f || function () {};
-    if (typeof mode === 'string') mode = parseInt(mode, 8);
-    p = path.resolve(p);
-
-    fs.mkdir(p, mode, function (er) {
-        if (!er) {
-            made = made || p;
-            return cb(null, made);
-        }
-        switch (er.code) {
-            case 'ENOENT':
-                mkdirP(path.dirname(p), mode, function (er, made) {
-                    if (er) cb(er, made);
-                    else mkdirP(p, mode, cb, made);
-                });
-                break;
-
-            // In the case of any other error, just see if there's a dir
-            // there already.  If so, then hooray!  If not, then something
-            // is borked.
-            default:
-                fs.stat(p, function (er2, stat) {
-                    // if the stat fails, then that's super weird.
-                    // let the original error be the failure reason.
-                    if (er2 || !stat.isDirectory()) cb(er, made)
-                    else cb(null, made);
-                });
-                break;
-        }
-    });
-}
-
-mkdirP.sync = function sync (p, mode, made) {
-    if (mode === undefined) {
-        mode = 0777 & (~process.umask());
-    }
-    if (!made) made = null;
-
-    if (typeof mode === 'string') mode = parseInt(mode, 8);
-    p = path.resolve(p);
-
-    try {
-        fs.mkdirSync(p, mode);
-        made = made || p;
-    }
-    catch (err0) {
-        switch (err0.code) {
-            case 'ENOENT' :
-                made = sync(path.dirname(p), mode, made);
-                sync(p, mode, made);
-                break;
-
-            // In the case of any other error, just see if there's a dir
-            // there already.  If so, then hooray!  If not, then something
-            // is borked.
-            default:
-                var stat;
-                try {
-                    stat = fs.statSync(p);
-                }
-                catch (err1) {
-                    throw err0;
-                }
-                if (!stat.isDirectory()) throw err0;
-                break;
-        }
-    }
-
-    return made;
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/mkdirp/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,33 +0,0 @@
-{
-  "name": "mkdirp",
-  "description": "Recursively mkdir, like `mkdir -p`",
-  "version": "0.3.5",
-  "author": {
-    "name": "James Halliday",
-    "email": "mail@substack.net",
-    "url": "http://substack.net"
-  },
-  "main": "./index",
-  "keywords": [
-    "mkdir",
-    "directory"
-  ],
-  "repository": {
-    "type": "git",
-    "url": "http://github.com/substack/node-mkdirp.git"
-  },
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "devDependencies": {
-    "tap": "~0.4.0"
-  },
-  "license": "MIT",
-  "readme": "# mkdirp\n\nLike `mkdir -p`, but in node.js!\n\n[![build status](https://secure.travis-ci.org/substack/node-mkdirp.png)](http://travis-ci.org/substack/node-mkdirp)\n\n# example\n\n## pow.js\n\n```js\nvar mkdirp = require('mkdirp');\n    \nmkdirp('/tmp/foo/bar/baz', function (err) {\n    if (err) console.error(err)\n    else console.log('pow!')\n});\n```\n\nOutput\n\n```\npow!\n```\n\nAnd now /tmp/foo/bar/baz exists, huzzah!\n\n# methods\n\n```js\nvar mkdirp = require('mkdirp');\n```\n\n## mkdirp(dir, mode, cb)\n\nCreate a new directory and any necessary subdirectories at `dir` with octal\npermission string `mode`.\n\nIf `mode` isn't specified, it defaults to `0777 & (~process.umask())`.\n\n`cb(err, made)` fires with the error or the first directory `made`\nthat had to be created, if any.\n\n## mkdirp.sync(dir, mode)\n\nSynchronously create a new directory and any necessary subdirectories at `dir`\nwith octal permission string `mode`.\n\nIf `mode` isn't specified, it defaults to `0777 & (~process.umask())`.\n\nReturns the first directory that had to be created, if any.\n\n# install\n\nWith [npm](http://npmjs.org) do:\n\n```\nnpm install mkdirp\n```\n\n# license\n\nMIT\n",
-  "readmeFilename": "readme.markdown",
-  "bugs": {
-    "url": "https://github.com/substack/node-mkdirp/issues"
-  },
-  "_id": "mkdirp@0.3.5",
-  "_from": "mkdirp@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/.jshintrc	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,7 +0,0 @@
-{
-  "asi": true,
-  "laxcomma": true,
-  "es5": true,
-  "node": true,
-  "strict": false
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-gyp/test
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-(The MIT License)
-
-Copyright (c) 2012 Nathan Rajlich <nathan@tootallnate.net>
-
-Permission is hereby granted, free of charge, to any person
-obtaining a copy of this software and associated documentation
-files (the "Software"), to deal in the Software without
-restriction, including without limitation the rights to use,
-copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the
-Software is furnished to do so, subject to the following
-conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
-NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
-HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
-WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
-OTHER DEALINGS IN THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,163 +0,0 @@
-node-gyp
-=========
-### Node.js native addon build tool
-
-`node-gyp` is a cross-platform command-line tool written in Node.js for compiling
-native addon modules for Node.js, which takes away the pain of dealing with the
-various differences in build platforms. It is the replacement to the `node-waf`
-program which is removed for node `v0.8`. If you have a native addon for node that
-still has a `wscript` file, then you should definitely add a `binding.gyp` file
-to support the latest versions of node.
-
-Multiple target versions of node are supported (i.e. `0.8`, `0.9`, `0.10`, ..., `1.0`,
-etc.), regardless of what version of node is actually installed on your system
-(`node-gyp` downloads the necessary development files for the target version).
-
-#### Features:
-
- * Easy to use, consistent interface
- * Same commands to build your module on every platform
- * Supports multiple target versions of Node
-
-
-Installation
-------------
-
-You can install with `npm`:
-
-``` bash
-$ npm install -g node-gyp
-```
-
-You will also need to install:
-
-  * On Unix:
-    * `python` (`v2.7` recommended, `v3.x.x` is __*not*__ supported)
-    * `make`
-    * A proper C/C++ compiler toolchain, like GCC
-  * On Windows:
-    * [Python][windows-python] ([`v2.7.3`][windows-python-v2.7.3] recommended, `v3.x.x` is __*not*__ supported)
-    * Windows XP/Vista/7:
-      * Microsoft Visual Studio C++ 2010 ([Express][msvc2010] version works well)
-      * For 64-bit builds of node and native modules you will _**also**_ need the [Windows 7 64-bit SDK][win7sdk]
-        * If the install fails, try uninstalling any C++ 2010 x64&x86 Redistributable that you have installed first.
-      * If you get errors that the 64-bit compilers are not installed you may also need the [compiler update for the Windows SDK 7.1]
-    * Windows 7/8:
-      * Microsoft Visual Studio C++ 2012 for Windows Desktop ([Express][msvc2012] version works well)
-
-Note that OS X is just a flavour of Unix and so needs `python`, `make`, and C/C++.
-An easy way to obtain these is to install XCode from Apple,
-and then use it to install the command line tools (under Preferences -> Downloads).
-
-How to Use
-----------
-
-To compile your native addon, first go to its root directory:
-
-``` bash
-$ cd my_node_addon
-```
-
-The next step is to generate the appropriate project build files for the current
-platform. Use `configure` for that:
-
-``` bash
-$ node-gyp configure
-```
-
-__Note__: The `configure` step looks for the `binding.gyp` file in the current
-directory to process. See below for instructions on creating the `binding.gyp` file.
-
-Now you will have either a `Makefile` (on Unix platforms) or a `vcxproj` file
-(on Windows) in the `build/` directory. Next invoke the `build` command:
-
-``` bash
-$ node-gyp build
-```
-
-Now you have your compiled `.node` bindings file! The compiled bindings end up
-in `build/Debug/` or `build/Release/`, depending on the build mode. At this point
-you can require the `.node` file with Node and run your tests!
-
-__Note:__ To create a _Debug_ build of the bindings file, pass the `--debug` (or
-`-d`) switch when running the either `configure` or `build` command.
-
-
-The "binding.gyp" file
-----------------------
-
-Previously when node had `node-waf` you had to write a `wscript` file. The
-replacement for that is the `binding.gyp` file, which describes the configuration
-to build your module in a JSON-like format. This file gets placed in the root of
-your package, alongside the `package.json` file.
-
-A barebones `gyp` file appropriate for building a node addon looks like:
-
-``` python
-{
-  "targets": [
-    {
-      "target_name": "binding",
-      "sources": [ "src/binding.cc" ]
-    }
-  ]
-}
-```
-
-Some additional resources for writing `gyp` files:
-
- * ["Hello World" node addon example](https://github.com/joyent/node/tree/master/test/addons/hello-world)
- * [gyp user documentation](http://code.google.com/p/gyp/wiki/GypUserDocumentation)
- * [gyp input format reference](http://code.google.com/p/gyp/wiki/InputFormatReference)
- * [*"binding.gyp" files out in the wild* wiki page](https://github.com/TooTallNate/node-gyp/wiki/%22binding.gyp%22-files-out-in-the-wild)
-
-
-Commands
---------
-
-`node-gyp` responds to the following commands:
-
-| **Command**   | **Description**
-|:--------------|:---------------------------------------------------------------
-| `build`       | Invokes `make`/`msbuild.exe` and builds the native addon
-| `clean`       | Removes the `build` dir if it exists
-| `configure`   | Generates project build files for the current platform
-| `rebuild`     | Runs "clean", "configure" and "build" all in a row
-| `install`     | Installs node development header files for the given version
-| `list`        | Lists the currently installed node development file versions
-| `remove`      | Removes the node development header files for the given version
-
-
-License
--------
-
-(The MIT License)
-
-Copyright (c) 2012 Nathan Rajlich &lt;nathan@tootallnate.net&gt;
-
-Permission is hereby granted, free of charge, to any person obtaining
-a copy of this software and associated documentation files (the
-'Software'), to deal in the Software without restriction, including
-without limitation the rights to use, copy, modify, merge, publish,
-distribute, sublicense, and/or sell copies of the Software, and to
-permit persons to whom the Software is furnished to do so, subject to
-the following conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
-IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
-CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
-TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
-SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
-
-
-[windows-python]: http://www.python.org/getit/windows
-[windows-python-v2.7.3]: http://www.python.org/download/releases/2.7.3#download
-[msvc2010]: http://go.microsoft.com/?linkid=9709949
-[msvc2012]: http://go.microsoft.com/?linkid=9816758
-[win7sdk]: http://www.microsoft.com/en-us/download/details.aspx?id=8279
-[compiler update for the Windows SDK 7.1]: http://www.microsoft.com/en-us/download/details.aspx?id=4422
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/addon.gypi	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
-{
-  'target_defaults': {
-    'type': 'loadable_module',
-    'product_prefix': '',
-    'include_dirs': [
-      '<(node_root_dir)/src',
-      '<(node_root_dir)/deps/uv/include',
-      '<(node_root_dir)/deps/v8/include'
-    ],
-
-    'target_conditions': [
-      ['_type=="loadable_module"', {
-        'product_extension': 'node',
-        'defines': [ 'BUILDING_NODE_EXTENSION' ],
-      }],
-      ['_type=="static_library"', {
-        # set to `1` to *disable* the -T thin archive 'ld' flag.
-        # older linkers don't support this flag.
-        'standalone_static_library': '<(standalone_static_library)'
-      }],
-    ],
-
-    'conditions': [
-      [ 'OS=="mac"', {
-        'defines': [ '_DARWIN_USE_64_BIT_INODE=1' ],
-        'libraries': [ '-undefined dynamic_lookup' ],
-        'xcode_settings': {
-          'DYLIB_INSTALL_NAME_BASE': '@rpath'
-        },
-      }],
-      [ 'OS=="win"', {
-        'libraries': [
-          '-lkernel32.lib',
-          '-luser32.lib',
-          '-lgdi32.lib',
-          '-lwinspool.lib',
-          '-lcomdlg32.lib',
-          '-ladvapi32.lib',
-          '-lshell32.lib',
-          '-lole32.lib',
-          '-loleaut32.lib',
-          '-luuid.lib',
-          '-lodbc32.lib',
-          '-lDelayImp.lib',
-          '-l<(node_root_dir)/$(Configuration)/node.lib'
-        ],
-        # warning C4251: 'node::ObjectWrap::handle_' : class 'v8::Persistent<T>'
-        # needs to have dll-interface to be used by clients of class 'node::ObjectWrap'
-        'msvs_disabled_warnings': [ 4251 ],
-      }, {
-        # OS!="win"
-        'defines': [ '_LARGEFILE_SOURCE', '_FILE_OFFSET_BITS=64' ],
-      }],
-      [ 'OS=="freebsd" or OS=="openbsd" or OS=="solaris" or (OS=="linux" and target_arch!="ia32")', {
-        'cflags': [ '-fPIC' ],
-      }]
-    ]
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/bin/node-gyp.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,133 +0,0 @@
-#!/usr/bin/env node
-
-/**
- * Set the title.
- */
-
-process.title = 'node-gyp'
-
-/**
- * Module dependencies.
- */
-
-var gyp = require('../')
-var log = require('npmlog')
-
-/**
- * Process and execute the selected commands.
- */
-
-var prog = gyp()
-var completed = false
-prog.parseArgv(process.argv)
-
-if (prog.todo.length === 0) {
-  if (~process.argv.indexOf('-v') || ~process.argv.indexOf('--version')) {
-    console.log('v%s', prog.version)
-  } else {
-    console.log('%s', prog.usage())
-  }
-  return process.exit(0)
-}
-
-log.info('it worked if it ends with', 'ok')
-log.verbose('cli', process.argv)
-log.info('using', 'node-gyp@%s', prog.version)
-log.info('using', 'node@%s | %s | %s', process.versions.node, process.platform, process.arch)
-
-
-/**
- * Change dir if -C/--directory was passed.
- */
-
-var dir = prog.opts.directory
-if (dir) {
-  var fs = require('fs')
-  try {
-    var stat = fs.statSync(dir)
-    if (stat.isDirectory()) {
-      log.info('chdir', dir)
-      process.chdir(dir)
-    } else {
-      log.warn('chdir', dir + ' is not a directory')
-    }
-  } catch (e) {
-    if (e.code === 'ENOENT') {
-      log.warn('chdir', dir + ' is not a directory')
-    } else {
-      log.warn('chdir', 'error during chdir() "%s"', e.message)
-    }
-  }
-}
-
-function run () {
-  var command = prog.todo.shift()
-  if (!command) {
-    // done!
-    completed = true
-    log.info('ok')
-    return
-  }
-
-  prog.commands[command.name](command.args, function (err) {
-    if (err) {
-      log.error(command.name + ' error')
-      log.error('stack', err.stack)
-      errorMessage()
-      log.error('not ok')
-      return process.exit(1)
-    }
-    if (command.name == 'list') {
-      var versions = arguments[1]
-      if (versions.length > 0) {
-        versions.forEach(function (version) {
-          console.log(version)
-        })
-      } else {
-        console.log('No node development files installed. Use `node-gyp install` to install a version.')
-      }
-    } else if (arguments.length >= 2) {
-      console.log.apply(console, [].slice.call(arguments, 1))
-    }
-
-    // now run the next command in the queue
-    process.nextTick(run)
-  })
-}
-
-process.on('exit', function (code) {
-  if (!completed && !code) {
-    log.error('Completion callback never invoked!')
-    issueMessage()
-    process.exit(6)
-  }
-})
-
-process.on('uncaughtException', function (err) {
-  log.error('UNCAUGHT EXCEPTION')
-  log.error('stack', err.stack)
-  issueMessage()
-  process.exit(7)
-})
-
-function errorMessage () {
-  // copied from npm's lib/util/error-handler.js
-  var os = require('os')
-  log.error('System', os.type() + ' ' + os.release())
-  log.error('command', process.argv
-            .map(JSON.stringify).join(' '))
-  log.error('cwd', process.cwd())
-  log.error('node -v', process.version)
-  log.error('node-gyp -v', 'v' + prog.package.version)
-}
-
-function issueMessage () {
-  errorMessage()
-  log.error('', [ 'This is a bug in `node-gyp`.'
-                , 'Try to update node-gyp and file an Issue if it does not help:'
-                , '    <https://github.com/TooTallNate/node-gyp/issues>'
-                ].join('\n'))
-}
-
-// start running the given commands!
-run()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-*.pyc
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/AUTHORS	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,8 +0,0 @@
-# Names should be added to this file like so:
-# Name or Organization <email address>
-
-Google Inc.
-Bloomberg Finance L.P.
-
-Steven Knight <knight@baldmt.com>
-Ryan Norton <rnorton10@gmail.com>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/DEPS	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,26 +0,0 @@
-# DEPS file for gclient use in buildbot execution of gyp tests.
-#
-# (You don't need to use gclient for normal GYP development work.)
-
-vars = {
-  "chrome_trunk": "http://src.chromium.org/svn/trunk",
-  "googlecode_url": "http://%s.googlecode.com/svn",
-}
-
-deps = {
-  "scons":
-    Var("chrome_trunk") + "/src/third_party/scons@44099",
-}
-
-deps_os = {
-  "win": {
-    "third_party/cygwin":
-      Var("chrome_trunk") + "/deps/third_party/cygwin@66844",
-
-    "third_party/python_26":
-      Var("chrome_trunk") + "/tools/third_party/python_26@89111",
-
-    "src/third_party/pefile":
-      (Var("googlecode_url") % "pefile") + "/trunk@63",
-  },
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) 2009 Google Inc. All rights reserved.
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions are
-met:
-
-   * Redistributions of source code must retain the above copyright
-notice, this list of conditions and the following disclaimer.
-   * Redistributions in binary form must reproduce the above
-copyright notice, this list of conditions and the following disclaimer
-in the documentation and/or other materials provided with the
-distribution.
-   * Neither the name of Google Inc. nor the names of its
-contributors may be used to endorse or promote products derived from
-this software without specific prior written permission.
-
-THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
-"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
-LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
-A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
-OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
-SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
-LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
-DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
-THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/MANIFEST	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,21 +0,0 @@
-setup.py
-gyp
-LICENSE
-AUTHORS
-pylib/gyp/MSVSNew.py
-pylib/gyp/MSVSProject.py
-pylib/gyp/MSVSToolFile.py
-pylib/gyp/MSVSUserFile.py
-pylib/gyp/MSVSVersion.py
-pylib/gyp/SCons.py
-pylib/gyp/__init__.py
-pylib/gyp/common.py
-pylib/gyp/input.py
-pylib/gyp/xcodeproj_file.py
-pylib/gyp/generator/__init__.py
-pylib/gyp/generator/gypd.py
-pylib/gyp/generator/gypsh.py
-pylib/gyp/generator/make.py
-pylib/gyp/generator/msvs.py
-pylib/gyp/generator/scons.py
-pylib/gyp/generator/xcode.py
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/OWNERS	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-*
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/PRESUBMIT.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,116 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-
-"""Top-level presubmit script for GYP.
-
-See http://dev.chromium.org/developers/how-tos/depottools/presubmit-scripts
-for more details about the presubmit API built into gcl.
-"""
-
-
-PYLINT_BLACKLIST = [
-    # TODO: fix me.
-    # From SCons, not done in google style.
-    'test/lib/TestCmd.py',
-    'test/lib/TestCommon.py',
-    'test/lib/TestGyp.py',
-    # Needs style fix.
-    'pylib/gyp/generator/scons.py',
-    'pylib/gyp/generator/xcode.py',
-]
-
-
-PYLINT_DISABLED_WARNINGS = [
-    # TODO: fix me.
-    # Many tests include modules they don't use.
-    'W0611',
-    # Include order doesn't properly include local files?
-    'F0401',
-    # Some use of built-in names.
-    'W0622',
-    # Some unused variables.
-    'W0612',
-    # Operator not preceded/followed by space.
-    'C0323',
-    'C0322',
-    # Unnecessary semicolon.
-    'W0301',
-    # Unused argument.
-    'W0613',
-    # String has no effect (docstring in wrong place).
-    'W0105',
-    # Comma not followed by space.
-    'C0324',
-    # Access to a protected member.
-    'W0212',
-    # Bad indent.
-    'W0311',
-    # Line too long.
-    'C0301',
-    # Undefined variable.
-    'E0602',
-    # Not exception type specified.
-    'W0702',
-    # No member of that name.
-    'E1101',
-    # Dangerous default {}.
-    'W0102',
-    # Others, too many to sort.
-    'W0201', 'W0232', 'E1103', 'W0621', 'W0108', 'W0223', 'W0231',
-    'R0201', 'E0101', 'C0321',
-    # ************* Module copy
-    # W0104:427,12:_test.odict.__setitem__: Statement seems to have no effect
-    'W0104',
-]
-
-
-def CheckChangeOnUpload(input_api, output_api):
-  report = []
-  report.extend(input_api.canned_checks.PanProjectChecks(
-      input_api, output_api))
-  return report
-
-
-def CheckChangeOnCommit(input_api, output_api):
-  report = []
-
-  # Accept any year number from 2009 to the current year.
-  current_year = int(input_api.time.strftime('%Y'))
-  allowed_years = (str(s) for s in reversed(xrange(2009, current_year + 1)))
-  years_re = '(' + '|'.join(allowed_years) + ')'
-
-  # The (c) is deprecated, but tolerate it until it's removed from all files.
-  license = (
-      r'.*? Copyright (\(c\) )?%(year)s Google Inc\. All rights reserved\.\n'
-      r'.*? Use of this source code is governed by a BSD-style license that '
-        r'can be\n'
-      r'.*? found in the LICENSE file\.\n'
-  ) % {
-      'year': years_re,
-  }
-
-  report.extend(input_api.canned_checks.PanProjectChecks(
-      input_api, output_api, license_header=license))
-  report.extend(input_api.canned_checks.CheckTreeIsOpen(
-      input_api, output_api,
-      'http://gyp-status.appspot.com/status',
-      'http://gyp-status.appspot.com/current'))
-
-  import sys
-  old_sys_path = sys.path
-  try:
-    sys.path = ['pylib', 'test/lib'] + sys.path
-    report.extend(input_api.canned_checks.RunPylint(
-        input_api,
-        output_api,
-        black_list=PYLINT_BLACKLIST,
-        disabled_warnings=PYLINT_DISABLED_WARNINGS))
-  finally:
-    sys.path = old_sys_path
-  return report
-
-
-def GetPreferredTrySlaves():
-  return ['gyp-win32', 'gyp-win64', 'gyp-linux', 'gyp-mac', 'gyp-android']
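The copyright check above builds an alternation of every allowed year with a generator over `xrange`. The same construction, sketched in Python 3 (where `xrange` is gone); the names `years_regex` and `license_re` are illustrative, not part of the original presubmit API:

```python
import re
import time

def years_regex(first_year=2009):
    # Alternation of every allowed copyright year, newest first,
    # e.g. '(2024|2023|...|2009)'.
    current_year = int(time.strftime('%Y'))
    allowed = (str(y) for y in range(current_year, first_year - 1, -1))
    return '(' + '|'.join(allowed) + ')'

# The (c) group is optional, mirroring the deprecated-but-tolerated form above.
license_re = re.compile(r'Copyright (\(c\) )?%s Google Inc\.' % years_regex())
```

Any header year from 2009 through the current year matches; 2008 and earlier do not.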
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/codereview.settings	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,10 +0,0 @@
-# This file is used by gcl to get repository specific information.
-CODE_REVIEW_SERVER: codereview.chromium.org
-CC_LIST: gyp-developer@googlegroups.com
-VIEW_VC: http://code.google.com/p/gyp/source/detail?r=
-TRY_ON_UPLOAD: True
-TRYSERVER_PROJECT: gyp
-TRYSERVER_PATCHLEVEL: 0
-TRYSERVER_ROOT: trunk
-TRYSERVER_SVN_URL: svn://svn.chromium.org/chrome-try/try-nacl
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/data/win/large-pdb-shim.cc	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,12 +0,0 @@
-// Copyright (c) 2013 Google Inc. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-// This file is used to generate an empty .pdb -- with a 4KB pagesize -- that is
-// then used during the final link for modules that have large PDBs. Otherwise,
-// the linker will generate a pdb with a page size of 1KB, which imposes a limit
-// of 1GB on the .pdb. By generating an initial empty .pdb with the compiler
-// (rather than the linker), this limit is avoided. With this in place PDBs may
-// grow to 2GB.
-//
-// This file is referenced by the msvs_large_pdb mechanism in MSVSUtil.py.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/gyp	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright (c) 2009 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import sys
-
-# TODO(mark): sys.path manipulation is some temporary testing stuff.
-try:
-  import gyp
-except ImportError, e:
-  import os.path
-  sys.path.append(os.path.join(os.path.dirname(sys.argv[0]), 'pylib'))
-  import gyp
-
-if __name__ == '__main__':
-  sys.exit(gyp.main(sys.argv[1:]))
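The bootstrap script above tries a plain `import gyp` and, on failure, appends a sibling `pylib` directory to `sys.path` and retries. A minimal Python 3 sketch of that fallback pattern; `import_with_fallback` is a hypothetical helper name, not anything node-gyp ships:

```python
import os
import sys

def import_with_fallback(name, fallback_dir='pylib'):
    # Plain import first; on ImportError, add a sibling directory
    # next to the running script and retry once.
    try:
        return __import__(name)
    except ImportError:
        script_dir = os.path.dirname(os.path.abspath(sys.argv[0]))
        sys.path.append(os.path.join(script_dir, fallback_dir))
        return __import__(name)
```

If the retry also fails, the second `ImportError` propagates, just as in the original script.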
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/gyp.bat	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-@rem Copyright (c) 2009 Google Inc. All rights reserved.
-@rem Use of this source code is governed by a BSD-style license that can be
-@rem found in the LICENSE file.
-
-@python "%~dp0/gyp" %*
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/gyp_dummy.c	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,7 +0,0 @@
-/* Copyright (c) 2009 Google Inc. All rights reserved.
- * Use of this source code is governed by a BSD-style license that can be
- * found in the LICENSE file. */
-
-int main() {
-  return 0;
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/gyptest.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,267 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-__doc__ = """
-gyptest.py -- test runner for GYP tests.
-"""
-
-import os
-import optparse
-import shlex
-import subprocess
-import sys
-
-class CommandRunner:
-  """
-  Executor class for commands, including "commands" implemented by
-  Python functions.
-  """
-  verbose = True
-  active = True
-
-  def __init__(self, dictionary={}):
-    self.subst_dictionary(dictionary)
-
-  def subst_dictionary(self, dictionary):
-    self._subst_dictionary = dictionary
-
-  def subst(self, string, dictionary=None):
-    """
-    Substitutes (via the format operator) the values in the specified
-    dictionary into the specified command.
-
-    The command can be an (action, string) tuple.  In all cases, we
-    perform substitution on strings and don't worry if something isn't
-    a string.  (It's probably a Python function to be executed.)
-    """
-    if dictionary is None:
-      dictionary = self._subst_dictionary
-    if dictionary:
-      try:
-        string = string % dictionary
-      except TypeError:
-        pass
-    return string
-
-  def display(self, command, stdout=None, stderr=None):
-    if not self.verbose:
-      return
-    if type(command) == type(()):
-      func = command[0]
-      args = command[1:]
-      s = '%s(%s)' % (func.__name__, ', '.join(map(repr, args)))
-    elif type(command) == type([]):
-      # TODO:  quote arguments containing spaces
-      # TODO:  handle meta characters?
-      s = ' '.join(command)
-    else:
-      s = self.subst(command)
-    if not s.endswith('\n'):
-      s += '\n'
-    sys.stdout.write(s)
-    sys.stdout.flush()
-
-  def execute(self, command, stdout=None, stderr=None):
-    """
-    Executes a single command.
-    """
-    if not self.active:
-      return 0
-    if type(command) == type(''):
-      command = self.subst(command)
-      cmdargs = shlex.split(command)
-      if cmdargs[0] == 'cd':
-        command = (os.chdir,) + tuple(cmdargs[1:])
-    if type(command) == type(()):
-      func = command[0]
-      args = command[1:]
-      return func(*args)
-    else:
-      if stdout is sys.stdout:
-        # Same as passing sys.stdout, except python2.4 doesn't fail on it.
-        subout = None
-      else:
-        # Open pipe for anything else so Popen works on python2.4.
-        subout = subprocess.PIPE
-      if stderr is sys.stderr:
-        # Same as passing sys.stderr, except python2.4 doesn't fail on it.
-        suberr = None
-      elif stderr is None:
-        # Merge with stdout if stderr isn't specified.
-        suberr = subprocess.STDOUT
-      else:
-        # Open pipe for anything else so Popen works on python2.4.
-        suberr = subprocess.PIPE
-      p = subprocess.Popen(command,
-                           shell=(sys.platform == 'win32'),
-                           stdout=subout,
-                           stderr=suberr)
-      p.wait()
-      if stdout is None:
-        self.stdout = p.stdout.read()
-      elif stdout is not sys.stdout:
-        stdout.write(p.stdout.read())
-      if stderr not in (None, sys.stderr):
-        stderr.write(p.stderr.read())
-      return p.returncode
-
-  def run(self, command, display=None, stdout=None, stderr=None):
-    """
-    Runs a single command, displaying it first.
-    """
-    if display is None:
-      display = command
-    self.display(display)
-    return self.execute(command, stdout, stderr)
-
-
-class Unbuffered:
-  def __init__(self, fp):
-    self.fp = fp
-  def write(self, arg):
-    self.fp.write(arg)
-    self.fp.flush()
-  def __getattr__(self, attr):
-    return getattr(self.fp, attr)
-
-sys.stdout = Unbuffered(sys.stdout)
-sys.stderr = Unbuffered(sys.stderr)
-
-
-def find_all_gyptest_files(directory):
-    result = []
-    for root, dirs, files in os.walk(directory):
-      if '.svn' in dirs:
-        dirs.remove('.svn')
-      result.extend([ os.path.join(root, f) for f in files
-                     if f.startswith('gyptest') and f.endswith('.py') ])
-    result.sort()
-    return result
-
-
-def main(argv=None):
-  if argv is None:
-    argv = sys.argv
-
-  usage = "gyptest.py [-ahlnq] [-f formats] [test ...]"
-  parser = optparse.OptionParser(usage=usage)
-  parser.add_option("-a", "--all", action="store_true",
-            help="run all tests")
-  parser.add_option("-C", "--chdir", action="store", default=None,
-            help="chdir to the specified directory")
-  parser.add_option("-f", "--format", action="store", default='',
-            help="run tests with the specified formats")
-  parser.add_option("-G", '--gyp_option', action="append", default=[],
-            help="Add -G options to the gyp command line")
-  parser.add_option("-l", "--list", action="store_true",
-            help="list available tests and exit")
-  parser.add_option("-n", "--no-exec", action="store_true",
-            help="no execute, just print the command line")
-  parser.add_option("--passed", action="store_true",
-            help="report passed tests")
-  parser.add_option("--path", action="append", default=[],
-            help="additional $PATH directory")
-  parser.add_option("-q", "--quiet", action="store_true",
-            help="quiet, don't print test command lines")
-  opts, args = parser.parse_args(argv[1:])
-
-  if opts.chdir:
-    os.chdir(opts.chdir)
-
-  if opts.path:
-    extra_path = [os.path.abspath(p) for p in opts.path]
-    extra_path = os.pathsep.join(extra_path)
-    os.environ['PATH'] += os.pathsep + extra_path
-
-  if not args:
-    if not opts.all:
-      sys.stderr.write('Specify -a to get all tests.\n')
-      return 1
-    args = ['test']
-
-  tests = []
-  for arg in args:
-    if os.path.isdir(arg):
-      tests.extend(find_all_gyptest_files(os.path.normpath(arg)))
-    else:
-      tests.append(arg)
-
-  if opts.list:
-    for test in tests:
-      print test
-    sys.exit(0)
-
-  CommandRunner.verbose = not opts.quiet
-  CommandRunner.active = not opts.no_exec
-  cr = CommandRunner()
-
-  os.environ['PYTHONPATH'] = os.path.abspath('test/lib')
-  if not opts.quiet:
-    sys.stdout.write('PYTHONPATH=%s\n' % os.environ['PYTHONPATH'])
-
-  passed = []
-  failed = []
-  no_result = []
-
-  if opts.format:
-    format_list = opts.format.split(',')
-  else:
-    # TODO:  not duplicate this mapping from pylib/gyp/__init__.py
-    format_list = {
-      'freebsd7': ['make'],
-      'freebsd8': ['make'],
-      'openbsd5': ['make'],
-      'cygwin':   ['msvs'],
-      'win32':    ['msvs', 'ninja'],
-      'linux2':   ['make', 'ninja'],
-      'linux3':   ['make', 'ninja'],
-      'darwin':   ['make', 'ninja', 'xcode'],
-    }[sys.platform]
-
-  for format in format_list:
-    os.environ['TESTGYP_FORMAT'] = format
-    if not opts.quiet:
-      sys.stdout.write('TESTGYP_FORMAT=%s\n' % format)
-
-    gyp_options = []
-    for option in opts.gyp_option:
-      gyp_options += ['-G', option]
-    if gyp_options and not opts.quiet:
-      sys.stdout.write('Extra Gyp options: %s\n' % gyp_options)
-
-    for test in tests:
-      status = cr.run([sys.executable, test] + gyp_options,
-                      stdout=sys.stdout,
-                      stderr=sys.stderr)
-      if status == 2:
-        no_result.append(test)
-      elif status:
-        failed.append(test)
-      else:
-        passed.append(test)
-
-  if not opts.quiet:
-    def report(description, tests):
-      if tests:
-        if len(tests) == 1:
-          sys.stdout.write("\n%s the following test:\n" % description)
-        else:
-          fmt = "\n%s the following %d tests:\n"
-          sys.stdout.write(fmt % (description, len(tests)))
-        sys.stdout.write("\t" + "\n\t".join(tests) + "\n")
-
-    if opts.passed:
-      report("Passed", passed)
-    report("Failed", failed)
-    report("No result from", no_result)
-
-  if failed:
-    return 1
-  else:
-    return 0
-
-
-if __name__ == "__main__":
-  sys.exit(main())
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSNew.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,339 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""New implementation of Visual Studio project generation for SCons."""
-
-import os
-import random
-
-import gyp.common
-
-# hashlib is supplied as of Python 2.5 as the replacement interface for md5
-# and other secure hashes.  In 2.6, md5 is deprecated.  Import hashlib if
-# available, avoiding a deprecation warning under 2.6.  Import md5 otherwise,
-# preserving 2.4 compatibility.
-try:
-  import hashlib
-  _new_md5 = hashlib.md5
-except ImportError:
-  import md5
-  _new_md5 = md5.new
-
-
-# Initialize random number generator
-random.seed()
-
-# GUIDs for project types
-ENTRY_TYPE_GUIDS = {
-    'project': '{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}',
-    'folder': '{2150E333-8FDC-42A3-9474-1A3956D46DE8}',
-}
-
-#------------------------------------------------------------------------------
-# Helper functions
-
-
-def MakeGuid(name, seed='msvs_new'):
-  """Returns a GUID for the specified target name.
-
-  Args:
-    name: Target name.
-    seed: Seed for MD5 hash.
-  Returns:
-    A GUID-like string calculated from the name and seed.
-
-  This generates something which looks like a GUID, but depends only on the
-  name and seed.  This means the same name/seed will always generate the same
-  GUID, so that projects and solutions which refer to each other can determine
-  the GUID to refer to explicitly.  It also means that the GUID will
-  not change when the project for a target is rebuilt.
-  """
-  # Calculate a MD5 signature for the seed and name.
-  d = _new_md5(str(seed) + str(name)).hexdigest().upper()
-  # Convert most of the signature to GUID form (discard the rest)
-  guid = ('{' + d[:8] + '-' + d[8:12] + '-' + d[12:16] + '-' + d[16:20]
-          + '-' + d[20:32] + '}')
-  return guid
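MakeGuid above derives the GUID from an MD5 of the seed and name, so regenerating a project never changes its GUID. A self-contained Python 3 sketch of the same derivation (using only `hashlib`, since the old `md5` module is gone; `make_guid` is an illustrative name):

```python
import hashlib

def make_guid(name, seed='msvs_new'):
    # Deterministic GUID-shaped string: the same name/seed pair
    # always hashes to the same value.
    d = hashlib.md5((str(seed) + str(name)).encode('utf-8')).hexdigest().upper()
    # Slice the 32 hex digits into the 8-4-4-4-12 GUID layout.
    return '{%s-%s-%s-%s-%s}' % (d[:8], d[8:12], d[12:16], d[16:20], d[20:32])
```

Because the value depends only on its inputs, two .gyp files that both name the same target compute the same GUID independently.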
-
-#------------------------------------------------------------------------------
-
-
-class MSVSSolutionEntry(object):
-  def __cmp__(self, other):
-    # Sort by name then guid (so things are in order on vs2008).
-    return cmp((self.name, self.get_guid()), (other.name, other.get_guid()))
-
-
-class MSVSFolder(MSVSSolutionEntry):
-  """Folder in a Visual Studio project or solution."""
-
-  def __init__(self, path, name = None, entries = None,
-               guid = None, items = None):
-    """Initializes the folder.
-
-    Args:
-      path: Full path to the folder.
-      name: Name of the folder.
-      entries: List of folder entries to nest inside this folder.  May contain
-          Folder or Project objects.  May be None, if the folder is empty.
-      guid: GUID to use for folder, if not None.
-      items: List of solution items to include in the folder project.  May be
-          None, if the folder does not directly contain items.
-    """
-    if name:
-      self.name = name
-    else:
-      # Use last layer.
-      self.name = os.path.basename(path)
-
-    self.path = path
-    self.guid = guid
-
-    # Copy passed lists (or set to empty lists)
-    self.entries = sorted(list(entries or []))
-    self.items = list(items or [])
-
-    self.entry_type_guid = ENTRY_TYPE_GUIDS['folder']
-
-  def get_guid(self):
-    if self.guid is None:
-      # Use consistent guids for folders (so things don't regenerate).
-      self.guid = MakeGuid(self.path, seed='msvs_folder')
-    return self.guid
-
-
-#------------------------------------------------------------------------------
-
-
-class MSVSProject(MSVSSolutionEntry):
-  """Visual Studio project."""
-
-  def __init__(self, path, name = None, dependencies = None, guid = None,
-               spec = None, build_file = None, config_platform_overrides = None,
-               fixpath_prefix = None):
-    """Initializes the project.
-
-    Args:
-      path: Absolute path to the project file.
-      name: Name of project.  If None, the name will be the same as the base
-          name of the project file.
-      dependencies: List of other Project objects this project is dependent
-          upon, if not None.
-      guid: GUID to use for project, if not None.
-      spec: Dictionary specifying how to build this project.
-      build_file: Filename of the .gyp file that the vcproj file comes from.
-      config_platform_overrides: optional dict of configuration platforms to
-          use in place of the default for this target.
-      fixpath_prefix: the path used to adjust the behavior of _fixpath
-    """
-    self.path = path
-    self.guid = guid
-    self.spec = spec
-    self.build_file = build_file
-    # Use project filename if name not specified
-    self.name = name or os.path.splitext(os.path.basename(path))[0]
-
-    # Copy passed lists (or set to empty lists)
-    self.dependencies = list(dependencies or [])
-
-    self.entry_type_guid = ENTRY_TYPE_GUIDS['project']
-
-    if config_platform_overrides:
-      self.config_platform_overrides = config_platform_overrides
-    else:
-      self.config_platform_overrides = {}
-    self.fixpath_prefix = fixpath_prefix
-    self.msbuild_toolset = None
-
-  def set_dependencies(self, dependencies):
-    self.dependencies = list(dependencies or [])
-
-  def get_guid(self):
-    if self.guid is None:
-      # Set GUID from path
-      # TODO(rspangler): This is fragile.
-      # 1. We can't just use the project filename sans path, since there could
-      #    be multiple projects with the same base name (for example,
-      #    foo/unittest.vcproj and bar/unittest.vcproj).
-      # 2. The path needs to be relative to $SOURCE_ROOT, so that the project
-      #    GUID is the same whether it's included from base/base.sln or
-      #    foo/bar/baz/baz.sln.
-      # 3. The GUID needs to be the same each time this builder is invoked, so
-      #    that we don't need to rebuild the solution when the project changes.
-      # 4. We should be able to handle pre-built project files by reading the
-      #    GUID from the files.
-      self.guid = MakeGuid(self.name)
-    return self.guid
-
-  def set_msbuild_toolset(self, msbuild_toolset):
-    self.msbuild_toolset = msbuild_toolset
-
-#------------------------------------------------------------------------------
-
-
-class MSVSSolution:
-  """Visual Studio solution."""
-
-  def __init__(self, path, version, entries=None, variants=None,
-               websiteProperties=True):
-    """Initializes the solution.
-
-    Args:
-      path: Path to solution file.
-      version: Format version to emit.
-      entries: List of entries in solution.  May contain Folder or Project
-          objects.  May be None, if the folder is empty.
-      variants: List of build variant strings.  If None, a default list will
-          be used.
-      websiteProperties: Flag to decide if the website properties section
-          is generated.
-    """
-    self.path = path
-    self.websiteProperties = websiteProperties
-    self.version = version
-
-    # Copy passed lists (or set to empty lists)
-    self.entries = list(entries or [])
-
-    if variants:
-      # Copy passed list
-      self.variants = variants[:]
-    else:
-      # Use default
-      self.variants = ['Debug|Win32', 'Release|Win32']
-    # TODO(rspangler): Need to be able to handle a mapping of solution config
-    # to project config.  Should we be able to handle variants being a dict,
-    # or add a separate variant_map variable?  If it's a dict, we can't
-    # guarantee the order of variants since dict keys aren't ordered.
-
-
-    # TODO(rspangler): Automatically write to disk for now; should delay until
-    # node-evaluation time.
-    self.Write()
-
-
-  def Write(self, writer=gyp.common.WriteOnDiff):
-    """Writes the solution file to disk.
-
-    Raises:
-      IndexError: An entry appears multiple times.
-    """
-    # Walk the entry tree and collect all the folders and projects.
-    all_entries = set()
-    entries_to_check = self.entries[:]
-    while entries_to_check:
-      e = entries_to_check.pop(0)
-
-      # If this entry has been visited, nothing to do.
-      if e in all_entries:
-        continue
-
-      all_entries.add(e)
-
-      # If this is a folder, check its entries too.
-      if isinstance(e, MSVSFolder):
-        entries_to_check += e.entries
-
-    all_entries = sorted(all_entries)
-
-    # Open file and print header
-    f = writer(self.path)
-    f.write('Microsoft Visual Studio Solution File, '
-            'Format Version %s\r\n' % self.version.SolutionVersion())
-    f.write('# %s\r\n' % self.version.Description())
-
-    # Project entries
-    sln_root = os.path.split(self.path)[0]
-    for e in all_entries:
-      relative_path = gyp.common.RelativePath(e.path, sln_root)
-      # msbuild does not accept an empty folder_name.
-      # use '.' in case relative_path is empty.
-      folder_name = relative_path.replace('/', '\\') or '.'
-      f.write('Project("%s") = "%s", "%s", "%s"\r\n' % (
-          e.entry_type_guid,          # Entry type GUID
-          e.name,                     # Folder name
-          folder_name,                # Folder name (again)
-          e.get_guid(),               # Entry GUID
-      ))
-
-      # TODO(rspangler): Need a way to configure this stuff
-      if self.websiteProperties:
-        f.write('\tProjectSection(WebsiteProperties) = preProject\r\n'
-                '\t\tDebug.AspNetCompiler.Debug = "True"\r\n'
-                '\t\tRelease.AspNetCompiler.Debug = "False"\r\n'
-                '\tEndProjectSection\r\n')
-
-      if isinstance(e, MSVSFolder):
-        if e.items:
-          f.write('\tProjectSection(SolutionItems) = preProject\r\n')
-          for i in e.items:
-            f.write('\t\t%s = %s\r\n' % (i, i))
-          f.write('\tEndProjectSection\r\n')
-
-      if isinstance(e, MSVSProject):
-        if e.dependencies:
-          f.write('\tProjectSection(ProjectDependencies) = postProject\r\n')
-          for d in e.dependencies:
-            f.write('\t\t%s = %s\r\n' % (d.get_guid(), d.get_guid()))
-          f.write('\tEndProjectSection\r\n')
-
-      f.write('EndProject\r\n')
-
-    # Global section
-    f.write('Global\r\n')
-
-    # Configurations (variants)
-    f.write('\tGlobalSection(SolutionConfigurationPlatforms) = preSolution\r\n')
-    for v in self.variants:
-      f.write('\t\t%s = %s\r\n' % (v, v))
-    f.write('\tEndGlobalSection\r\n')
-
-    # Sort config guids for easier diffing of solution changes.
-    config_guids = []
-    config_guids_overrides = {}
-    for e in all_entries:
-      if isinstance(e, MSVSProject):
-        config_guids.append(e.get_guid())
-        config_guids_overrides[e.get_guid()] = e.config_platform_overrides
-    config_guids.sort()
-
-    f.write('\tGlobalSection(ProjectConfigurationPlatforms) = postSolution\r\n')
-    for g in config_guids:
-      for v in self.variants:
-        nv = config_guids_overrides[g].get(v, v)
-        # Pick which project configuration to build for this solution
-        # configuration.
-        f.write('\t\t%s.%s.ActiveCfg = %s\r\n' % (
-            g,              # Project GUID
-            v,              # Solution build configuration
-            nv,             # Project build config for that solution config
-        ))
-
-        # Enable project in this solution configuration.
-        f.write('\t\t%s.%s.Build.0 = %s\r\n' % (
-            g,              # Project GUID
-            v,              # Solution build configuration
-            nv,             # Project build config for that solution config
-        ))
-    f.write('\tEndGlobalSection\r\n')
-
-    # TODO(rspangler): Should be able to configure this stuff too (though I've
-    # never seen this be any different)
-    f.write('\tGlobalSection(SolutionProperties) = preSolution\r\n')
-    f.write('\t\tHideSolutionNode = FALSE\r\n')
-    f.write('\tEndGlobalSection\r\n')
-
-    # Folder mappings
-    # TODO(rspangler): Should omit this section if there are no folders
-    f.write('\tGlobalSection(NestedProjects) = preSolution\r\n')
-    for e in all_entries:
-      if not isinstance(e, MSVSFolder):
-        continue        # Does not apply to projects, only folders
-      for subentry in e.entries:
-        f.write('\t\t%s = %s\r\n' % (subentry.get_guid(), e.get_guid()))
-    f.write('\tEndGlobalSection\r\n')
-
-    f.write('EndGlobal\r\n')
-
-    f.close()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSProject.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,208 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Visual Studio project reader/writer."""
-
-import gyp.common
-import gyp.easy_xml as easy_xml
-
-#------------------------------------------------------------------------------
-
-
-class Tool(object):
-  """Visual Studio tool."""
-
-  def __init__(self, name, attrs=None):
-    """Initializes the tool.
-
-    Args:
-      name: Tool name.
-      attrs: Dict of tool attributes; may be None.
-    """
-    self._attrs = attrs or {}
-    self._attrs['Name'] = name
-
-  def _GetSpecification(self):
-    """Creates an element for the tool.
-
-    Returns:
-      A new xml.dom.Element for the tool.
-    """
-    return ['Tool', self._attrs]
-
-class Filter(object):
-  """Visual Studio filter - that is, a virtual folder."""
-
-  def __init__(self, name, contents=None):
-    """Initializes the folder.
-
-    Args:
-      name: Filter (folder) name.
-      contents: List of filenames and/or Filter objects contained.
-    """
-    self.name = name
-    self.contents = list(contents or [])
-
-
-#------------------------------------------------------------------------------
-
-
-class Writer(object):
-  """Visual Studio XML project writer."""
-
-  def __init__(self, project_path, version, name, guid=None, platforms=None):
-    """Initializes the project.
-
-    Args:
-      project_path: Path to the project file.
-      version: Format version to emit.
-      name: Name of the project.
-      guid: GUID to use for project, if not None.
-      platforms: List of strings, the supported platforms.  If None, defaults
-          to ['Win32'].
-    """
-    self.project_path = project_path
-    self.version = version
-    self.name = name
-    self.guid = guid
-
-    # Default to Win32 for platforms.
-    if not platforms:
-      platforms = ['Win32']
-
-    # Initialize the specifications of the various sections.
-    self.platform_section = ['Platforms']
-    for platform in platforms:
-      self.platform_section.append(['Platform', {'Name': platform}])
-    self.tool_files_section = ['ToolFiles']
-    self.configurations_section = ['Configurations']
-    self.files_section = ['Files']
-
-    # Keep a dict keyed on filename to speed up access.
-    self.files_dict = dict()
-
-  def AddToolFile(self, path):
-    """Adds a tool file to the project.
-
-    Args:
-      path: Relative path from project to tool file.
-    """
-    self.tool_files_section.append(['ToolFile', {'RelativePath': path}])
-
-  def _GetSpecForConfiguration(self, config_type, config_name, attrs, tools):
-    """Returns the specification for a configuration.
-
-    Args:
-      config_type: Type of configuration node.
-      config_name: Configuration name.
-      attrs: Dict of configuration attributes; may be None.
-      tools: List of tools (strings or Tool objects); may be None.
-    Returns:
-      The specification as a nested list, suitable for easy_xml.
-    """
-    # Handle defaults
-    if not attrs:
-      attrs = {}
-    if not tools:
-      tools = []
-
-    # Add configuration node and its attributes
-    node_attrs = attrs.copy()
-    node_attrs['Name'] = config_name
-    specification = [config_type, node_attrs]
-
-    # Add tool nodes and their attributes
-    if tools:
-      for t in tools:
-        if isinstance(t, Tool):
-          specification.append(t._GetSpecification())
-        else:
-          specification.append(Tool(t)._GetSpecification())
-    return specification
-
-
-  def AddConfig(self, name, attrs=None, tools=None):
-    """Adds a configuration to the project.
-
-    Args:
-      name: Configuration name.
-      attrs: Dict of configuration attributes; may be None.
-      tools: List of tools (strings or Tool objects); may be None.
-    """
-    spec = self._GetSpecForConfiguration('Configuration', name, attrs, tools)
-    self.configurations_section.append(spec)
-
-  def _AddFilesToNode(self, parent, files):
-    """Adds files and/or filters to the parent node.
-
-    Args:
-      parent: Destination node
-      files: A list of Filter objects and/or relative paths to files.
-
-    Will call itself recursively, if the files list contains Filter objects.
-    """
-    for f in files:
-      if isinstance(f, Filter):
-        node = ['Filter', {'Name': f.name}]
-        self._AddFilesToNode(node, f.contents)
-      else:
-        node = ['File', {'RelativePath': f}]
-        self.files_dict[f] = node
-      parent.append(node)
-
-  def AddFiles(self, files):
-    """Adds files to the project.
-
-    Args:
-      files: A list of Filter objects and/or relative paths to files.
-
-    This makes a copy of the file/filter tree at the time of this call.  If you
-    later add files to a Filter object which was passed into a previous call
-    to AddFiles(), it will not be reflected in this project.
-    """
-    self._AddFilesToNode(self.files_section, files)
-    # TODO(rspangler) This also doesn't handle adding files to an existing
-    # filter.  That is, it doesn't merge the trees.
-
-  def AddFileConfig(self, path, config, attrs=None, tools=None):
-    """Adds a configuration to a file.
-
-    Args:
-      path: Relative path to the file.
-      config: Name of configuration to add.
-      attrs: Dict of configuration attributes; may be None.
-      tools: List of tools (strings or Tool objects); may be None.
-
-    Raises:
-      ValueError: Relative path does not match any file added via AddFiles().
-    """
-    # Find the file node with the right relative path
-    parent = self.files_dict.get(path)
-    if not parent:
-      raise ValueError('AddFileConfig: file "%s" not in project.' % path)
-
-    # Add the config to the file node
-    spec = self._GetSpecForConfiguration('FileConfiguration', config, attrs,
-                                         tools)
-    parent.append(spec)
-
-  def WriteIfChanged(self):
-    """Writes the project file."""
-    # First create XML content definition
-    content = [
-        'VisualStudioProject',
-        {'ProjectType': 'Visual C++',
-         'Version': self.version.ProjectVersion(),
-         'Name': self.name,
-         'ProjectGUID': self.guid,
-         'RootNamespace': self.name,
-         'Keyword': 'Win32Proj'
-        },
-        self.platform_section,
-        self.tool_files_section,
-        self.configurations_section,
-        ['References'],  # empty section
-        self.files_section,
-        ['Globals']  # empty section
-    ]
-    easy_xml.WriteXmlIfChanged(content, self.project_path,
-                               encoding="Windows-1252")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSSettings.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1046 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Code to validate and convert settings of the Microsoft build tools.
-
-This file contains code to validate and convert settings of the Microsoft
-build tools.  The function ConvertToMSBuildSettings(), ValidateMSVSSettings(),
-and ValidateMSBuildSettings() are the entry points.
-
-This file was created by comparing the projects created by Visual Studio 2008
-and Visual Studio 2010 for all available settings through the user interface.
-The MSBuild schemas were also considered.  They are typically found in the
-MSBuild install directory, e.g. c:\Program Files (x86)\MSBuild
-"""
-
-import sys
-import re
-
-# Dictionaries of settings validators. The key is the tool name, the value is
-# a dictionary mapping setting names to validation functions.
-_msvs_validators = {}
-_msbuild_validators = {}
-
-
-# A dictionary of settings converters. The key is the tool name, the value is
-# a dictionary mapping setting names to conversion functions.
-_msvs_to_msbuild_converters = {}
-
-
-# Tool name mapping from MSVS to MSBuild.
-_msbuild_name_of_tool = {}
-
-
-class _Tool(object):
-  """Represents a tool used by MSVS or MSBuild.
-
-  Attributes:
-      msvs_name: The name of the tool in MSVS.
-      msbuild_name: The name of the tool in MSBuild.
-  """
-
-  def __init__(self, msvs_name, msbuild_name):
-    self.msvs_name = msvs_name
-    self.msbuild_name = msbuild_name
-
-
-def _AddTool(tool):
-  """Adds a tool to the four dictionaries used to process settings.
-
-  This only defines the tool.  Each setting also needs to be added.
-
-  Args:
-    tool: The _Tool object to be added.
-  """
-  _msvs_validators[tool.msvs_name] = {}
-  _msbuild_validators[tool.msbuild_name] = {}
-  _msvs_to_msbuild_converters[tool.msvs_name] = {}
-  _msbuild_name_of_tool[tool.msvs_name] = tool.msbuild_name
-
-
-def _GetMSBuildToolSettings(msbuild_settings, tool):
-  """Returns an MSBuild tool dictionary.  Creates it if needed."""
-  return msbuild_settings.setdefault(tool.msbuild_name, {})
-
-
-class _Type(object):
-  """Type of settings (Base class)."""
-
-  def ValidateMSVS(self, value):
-    """Verifies that the value is legal for MSVS.
-
-    Args:
-      value: the value to check for this type.
-
-    Raises:
-      ValueError if value is not valid for MSVS.
-    """
-
-  def ValidateMSBuild(self, value):
-    """Verifies that the value is legal for MSBuild.
-
-    Args:
-      value: the value to check for this type.
-
-    Raises:
-      ValueError if value is not valid for MSBuild.
-    """
-
-  def ConvertToMSBuild(self, value):
-    """Returns the MSBuild equivalent of the MSVS value given.
-
-    Args:
-      value: the MSVS value to convert.
-
-    Returns:
-      the MSBuild equivalent.
-
-    Raises:
-      ValueError if value is not valid.
-    """
-    return value
-
-
-class _String(_Type):
-  """A setting that's just a string."""
-
-  def ValidateMSVS(self, value):
-    if not isinstance(value, basestring):
-      raise ValueError('expected string; got %r' % value)
-
-  def ValidateMSBuild(self, value):
-    if not isinstance(value, basestring):
-      raise ValueError('expected string; got %r' % value)
-
-  def ConvertToMSBuild(self, value):
-    # Convert the macros
-    return ConvertVCMacrosToMSBuild(value)
-
-
-class _StringList(_Type):
-  """A setting that's a list of strings."""
-
-  def ValidateMSVS(self, value):
-    if not isinstance(value, basestring) and not isinstance(value, list):
-      raise ValueError('expected string list; got %r' % value)
-
-  def ValidateMSBuild(self, value):
-    if not isinstance(value, basestring) and not isinstance(value, list):
-      raise ValueError('expected string list; got %r' % value)
-
-  def ConvertToMSBuild(self, value):
-    # Convert the macros
-    if isinstance(value, list):
-      return [ConvertVCMacrosToMSBuild(i) for i in value]
-    else:
-      return ConvertVCMacrosToMSBuild(value)
-
-
-class _Boolean(_Type):
-  """Boolean settings, can have the values 'false' or 'true'."""
-
-  def _Validate(self, value):
-    if value != 'true' and value != 'false':
-      raise ValueError('expected bool; got %r' % value)
-
-  def ValidateMSVS(self, value):
-    self._Validate(value)
-
-  def ValidateMSBuild(self, value):
-    self._Validate(value)
-
-  def ConvertToMSBuild(self, value):
-    self._Validate(value)
-    return value
-
-
-class _Integer(_Type):
-  """Integer settings."""
-
-  def __init__(self, msbuild_base=10):
-    _Type.__init__(self)
-    self._msbuild_base = msbuild_base
-
-  def ValidateMSVS(self, value):
-    # Try to convert, this will raise ValueError if invalid.
-    self.ConvertToMSBuild(value)
-
-  def ValidateMSBuild(self, value):
-    # Try to convert, this will raise ValueError if invalid.
-    int(value, self._msbuild_base)
-
-  def ConvertToMSBuild(self, value):
-    msbuild_format = (self._msbuild_base == 10) and '%d' or '0x%04x'
-    return msbuild_format % int(value)
-
-
-class _Enumeration(_Type):
-  """Type of settings that is an enumeration.
-
-  In MSVS, the values are indexes like '0', '1', and '2'.
-  MSBuild uses text labels that are more representative, like 'Win32'.
-
-  Constructor args:
-    label_list: an array of MSBuild labels that correspond to the MSVS index.
-        In the rare cases where MSVS has skipped an index value, None is
-        used in the array to indicate the unused spot.
-    new: an array of labels that are new to MSBuild.
-  """
-
-  def __init__(self, label_list, new=None):
-    _Type.__init__(self)
-    self._label_list = label_list
-    self._msbuild_values = set(value for value in label_list
-                               if value is not None)
-    if new is not None:
-      self._msbuild_values.update(new)
-
-  def ValidateMSVS(self, value):
-    # Try to convert.  It will raise an exception if not valid.
-    self.ConvertToMSBuild(value)
-
-  def ValidateMSBuild(self, value):
-    if value not in self._msbuild_values:
-      raise ValueError('unrecognized enumerated value %s' % value)
-
-  def ConvertToMSBuild(self, value):
-    index = int(value)
-    if index < 0 or index >= len(self._label_list):
-      raise ValueError('index value (%d) not in expected range [0, %d)' %
-                       (index, len(self._label_list)))
-    label = self._label_list[index]
-    if label is None:
-      raise ValueError('converted value for %s not specified.' % value)
-    return label
-
-
-# Instantiate the various generic types.
-_boolean = _Boolean()
-_integer = _Integer()
-# For now, we don't do any special validation on these types:
-_string = _String()
-_file_name = _String()
-_folder_name = _String()
-_file_list = _StringList()
-_folder_list = _StringList()
-_string_list = _StringList()
-# Some boolean settings went from numerical values to boolean.  The
-# mapping is 0: default, 1: false, 2: true.
-_newly_boolean = _Enumeration(['', 'false', 'true'])
-
-
-def _Same(tool, name, setting_type):
-  """Defines a setting that has the same name in MSVS and MSBuild.
-
-  Args:
-    tool: a dictionary that gives the names of the tool for MSVS and MSBuild.
-    name: the name of the setting.
-    setting_type: the type of this setting.
-  """
-  _Renamed(tool, name, name, setting_type)
-
-
-def _Renamed(tool, msvs_name, msbuild_name, setting_type):
-  """Defines a setting for which the name has changed.
-
-  Args:
-    tool: a dictionary that gives the names of the tool for MSVS and MSBuild.
-    msvs_name: the name of the MSVS setting.
-    msbuild_name: the name of the MSBuild setting.
-    setting_type: the type of this setting.
-  """
-
-  def _Translate(value, msbuild_settings):
-    msbuild_tool_settings = _GetMSBuildToolSettings(msbuild_settings, tool)
-    msbuild_tool_settings[msbuild_name] = setting_type.ConvertToMSBuild(value)
-
-  _msvs_validators[tool.msvs_name][msvs_name] = setting_type.ValidateMSVS
-  _msbuild_validators[tool.msbuild_name][msbuild_name] = (
-      setting_type.ValidateMSBuild)
-  _msvs_to_msbuild_converters[tool.msvs_name][msvs_name] = _Translate
-
-
-def _Moved(tool, settings_name, msbuild_tool_name, setting_type):
-  _MovedAndRenamed(tool, settings_name, msbuild_tool_name, settings_name,
-                   setting_type)
-
-
-def _MovedAndRenamed(tool, msvs_settings_name, msbuild_tool_name,
-                     msbuild_settings_name, setting_type):
-  """Defines a setting that may have moved to a new section.
-
-  Args:
-    tool: a dictionary that gives the names of the tool for MSVS and MSBuild.
-    msvs_settings_name: the MSVS name of the setting.
-    msbuild_tool_name: the name of the MSBuild tool to place the setting under.
-    msbuild_settings_name: the MSBuild name of the setting.
-    setting_type: the type of this setting.
-  """
-
-  def _Translate(value, msbuild_settings):
-    tool_settings = msbuild_settings.setdefault(msbuild_tool_name, {})
-    tool_settings[msbuild_settings_name] = setting_type.ConvertToMSBuild(value)
-
-  _msvs_validators[tool.msvs_name][msvs_settings_name] = (
-      setting_type.ValidateMSVS)
-  validator = setting_type.ValidateMSBuild
-  _msbuild_validators[msbuild_tool_name][msbuild_settings_name] = validator
-  _msvs_to_msbuild_converters[tool.msvs_name][msvs_settings_name] = _Translate
-
-
-def _MSVSOnly(tool, name, setting_type):
-  """Defines a setting that is only found in MSVS.
-
-  Args:
-    tool: a dictionary that gives the names of the tool for MSVS and MSBuild.
-    name: the name of the setting.
-    setting_type: the type of this setting.
-  """
-
-  def _Translate(unused_value, unused_msbuild_settings):
-    # Since this is for MSVS only settings, no translation will happen.
-    pass
-
-  _msvs_validators[tool.msvs_name][name] = setting_type.ValidateMSVS
-  _msvs_to_msbuild_converters[tool.msvs_name][name] = _Translate
-
-
-def _MSBuildOnly(tool, name, setting_type):
-  """Defines a setting that is only found in MSBuild.
-
-  Args:
-    tool: a dictionary that gives the names of the tool for MSVS and MSBuild.
-    name: the name of the setting.
-    setting_type: the type of this setting.
-  """
-  _msbuild_validators[tool.msbuild_name][name] = setting_type.ValidateMSBuild
-
-
-def _ConvertedToAdditionalOption(tool, msvs_name, flag):
-  """Defines a setting that's handled via a command line option in MSBuild.
-
-  Args:
-    tool: a dictionary that gives the names of the tool for MSVS and MSBuild.
-    msvs_name: the name of the MSVS setting that if 'true' becomes a flag
-    flag: the flag to insert at the end of the AdditionalOptions
-  """
-
-  def _Translate(value, msbuild_settings):
-    if value == 'true':
-      tool_settings = _GetMSBuildToolSettings(msbuild_settings, tool)
-      if 'AdditionalOptions' in tool_settings:
-        new_flags = '%s %s' % (tool_settings['AdditionalOptions'], flag)
-      else:
-        new_flags = flag
-      tool_settings['AdditionalOptions'] = new_flags
-  _msvs_validators[tool.msvs_name][msvs_name] = _boolean.ValidateMSVS
-  _msvs_to_msbuild_converters[tool.msvs_name][msvs_name] = _Translate
-
-
-def _CustomGeneratePreprocessedFile(tool, msvs_name):
-  def _Translate(value, msbuild_settings):
-    tool_settings = _GetMSBuildToolSettings(msbuild_settings, tool)
-    if value == '0':
-      tool_settings['PreprocessToFile'] = 'false'
-      tool_settings['PreprocessSuppressLineNumbers'] = 'false'
-    elif value == '1':  # /P
-      tool_settings['PreprocessToFile'] = 'true'
-      tool_settings['PreprocessSuppressLineNumbers'] = 'false'
-    elif value == '2':  # /EP /P
-      tool_settings['PreprocessToFile'] = 'true'
-      tool_settings['PreprocessSuppressLineNumbers'] = 'true'
-    else:
-      raise ValueError('value must be one of [0, 1, 2]; got %s' % value)
-  # Create a bogus validator that looks for '0', '1', or '2'
-  msvs_validator = _Enumeration(['a', 'b', 'c']).ValidateMSVS
-  _msvs_validators[tool.msvs_name][msvs_name] = msvs_validator
-  msbuild_validator = _boolean.ValidateMSBuild
-  msbuild_tool_validators = _msbuild_validators[tool.msbuild_name]
-  msbuild_tool_validators['PreprocessToFile'] = msbuild_validator
-  msbuild_tool_validators['PreprocessSuppressLineNumbers'] = msbuild_validator
-  _msvs_to_msbuild_converters[tool.msvs_name][msvs_name] = _Translate
-
-
-fix_vc_macro_slashes_regex_list = ('IntDir', 'OutDir')
-fix_vc_macro_slashes_regex = re.compile(
-  r'(\$\((?:%s)\))(?:[\\/]+)' % "|".join(fix_vc_macro_slashes_regex_list)
-)
-
-def FixVCMacroSlashes(s):
-  """Replace macros which have excessive following slashes.
-
-  These macros are known to have a built-in trailing slash. Furthermore, many
-  scripts hiccup on processing paths with extra slashes in the middle.
-
-  This list is probably not exhaustive.  Add as needed.
-  """
-  if '$' in s:
-    s = fix_vc_macro_slashes_regex.sub(r'\1', s)
-  return s
-
-
-def ConvertVCMacrosToMSBuild(s):
-  """Convert the MSVS macros found in the string to the MSBuild equivalent.
-
-  This list is probably not exhaustive.  Add as needed.
-  """
-  if '$' in s:
-    replace_map = {
-        '$(ConfigurationName)': '$(Configuration)',
-        '$(InputDir)': '%(RootDir)%(Directory)',
-        '$(InputExt)': '%(Extension)',
-        '$(InputFileName)': '%(Filename)%(Extension)',
-        '$(InputName)': '%(Filename)',
-        '$(InputPath)': '%(FullPath)',
-        '$(ParentName)': '$(ProjectFileName)',
-        '$(PlatformName)': '$(Platform)',
-        '$(SafeInputName)': '%(Filename)',
-    }
-    for old, new in replace_map.iteritems():
-      s = s.replace(old, new)
-    s = FixVCMacroSlashes(s)
-  return s
-
-
-def ConvertToMSBuildSettings(msvs_settings, stderr=sys.stderr):
-  """Converts MSVS settings (VS2008 and earlier) to MSBuild settings (VS2010+).
-
-  Args:
-      msvs_settings: A dictionary.  The key is the tool name.  The values are
-          themselves dictionaries of settings and their values.
-      stderr: The stream receiving the error messages.
-
-  Returns:
-      A dictionary of MSBuild settings.  The key is either the MSBuild tool name
-      or the empty string (for the global settings).  The values are themselves
-      dictionaries of settings and their values.
-  """
-  msbuild_settings = {}
-  for msvs_tool_name, msvs_tool_settings in msvs_settings.iteritems():
-    if msvs_tool_name in _msvs_to_msbuild_converters:
-      msvs_tool = _msvs_to_msbuild_converters[msvs_tool_name]
-      for msvs_setting, msvs_value in msvs_tool_settings.iteritems():
-        if msvs_setting in msvs_tool:
-          # Invoke the translation function.
-          try:
-            msvs_tool[msvs_setting](msvs_value, msbuild_settings)
-          except ValueError, e:
-            print >> stderr, ('Warning: while converting %s/%s to MSBuild, '
-                              '%s' % (msvs_tool_name, msvs_setting, e))
-        else:
-          # We don't know this setting.  Give a warning.
-          print >> stderr, ('Warning: unrecognized setting %s/%s '
-                            'while converting to MSBuild.' %
-                            (msvs_tool_name, msvs_setting))
-    else:
-      print >> stderr, ('Warning: unrecognized tool %s while converting to '
-                        'MSBuild.' % msvs_tool_name)
-  return msbuild_settings
-
-
-def ValidateMSVSSettings(settings, stderr=sys.stderr):
-  """Validates that the names of the settings are valid for MSVS.
-
-  Args:
-      settings: A dictionary.  The key is the tool name.  The values are
-          themselves dictionaries of settings and their values.
-      stderr: The stream receiving the error messages.
-  """
-  _ValidateSettings(_msvs_validators, settings, stderr)
-
-
-def ValidateMSBuildSettings(settings, stderr=sys.stderr):
-  """Validates that the names of the settings are valid for MSBuild.
-
-  Args:
-      settings: A dictionary.  The key is the tool name.  The values are
-          themselves dictionaries of settings and their values.
-      stderr: The stream receiving the error messages.
-  """
-  _ValidateSettings(_msbuild_validators, settings, stderr)
-
-
-def _ValidateSettings(validators, settings, stderr):
-  """Validates that the settings are valid for MSBuild or MSVS.
-
-  We currently only validate the names of the settings, not their values.
-
-  Args:
-      validators: A dictionary of tools and their validators.
-      settings: A dictionary.  The key is the tool name.  The values are
-          themselves dictionaries of settings and their values.
-      stderr: The stream receiving the error messages.
-  """
-  for tool_name in settings:
-    if tool_name in validators:
-      tool_validators = validators[tool_name]
-      for setting, value in settings[tool_name].iteritems():
-        if setting in tool_validators:
-          try:
-            tool_validators[setting](value)
-          except ValueError, e:
-            print >> stderr, ('Warning: for %s/%s, %s' %
-                              (tool_name, setting, e))
-        else:
-          print >> stderr, ('Warning: unrecognized setting %s/%s' %
-                            (tool_name, setting))
-    else:
-      print >> stderr, ('Warning: unrecognized tool %s' % tool_name)
-
-
-# MSVS and MSBuild names of the tools.
-_compile = _Tool('VCCLCompilerTool', 'ClCompile')
-_link = _Tool('VCLinkerTool', 'Link')
-_midl = _Tool('VCMIDLTool', 'Midl')
-_rc = _Tool('VCResourceCompilerTool', 'ResourceCompile')
-_lib = _Tool('VCLibrarianTool', 'Lib')
-_manifest = _Tool('VCManifestTool', 'Manifest')
-
-
-_AddTool(_compile)
-_AddTool(_link)
-_AddTool(_midl)
-_AddTool(_rc)
-_AddTool(_lib)
-_AddTool(_manifest)
-# Add sections only found in the MSBuild settings.
-_msbuild_validators[''] = {}
-_msbuild_validators['ProjectReference'] = {}
-_msbuild_validators['ManifestResourceCompile'] = {}
-
-# Descriptions of the compiler options, i.e. VCCLCompilerTool in MSVS and
-# ClCompile in MSBuild.
-# See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\cl.xml" for
-# the schema of the MSBuild ClCompile settings.
-
-# Options that have the same name in MSVS and MSBuild
-_Same(_compile, 'AdditionalIncludeDirectories', _folder_list)  # /I
-_Same(_compile, 'AdditionalOptions', _string_list)
-_Same(_compile, 'AdditionalUsingDirectories', _folder_list)  # /AI
-_Same(_compile, 'AssemblerListingLocation', _file_name)  # /Fa
-_Same(_compile, 'BrowseInformationFile', _file_name)
-_Same(_compile, 'BufferSecurityCheck', _boolean)  # /GS
-_Same(_compile, 'DisableLanguageExtensions', _boolean)  # /Za
-_Same(_compile, 'DisableSpecificWarnings', _string_list)  # /wd
-_Same(_compile, 'EnableFiberSafeOptimizations', _boolean)  # /GT
-_Same(_compile, 'EnablePREfast', _boolean)  # /analyze Visible='false'
-_Same(_compile, 'ExpandAttributedSource', _boolean)  # /Fx
-_Same(_compile, 'FloatingPointExceptions', _boolean)  # /fp:except
-_Same(_compile, 'ForceConformanceInForLoopScope', _boolean)  # /Zc:forScope
-_Same(_compile, 'ForcedIncludeFiles', _file_list)  # /FI
-_Same(_compile, 'ForcedUsingFiles', _file_list)  # /FU
-_Same(_compile, 'GenerateXMLDocumentationFiles', _boolean)  # /doc
-_Same(_compile, 'IgnoreStandardIncludePath', _boolean)  # /X
-_Same(_compile, 'MinimalRebuild', _boolean)  # /Gm
-_Same(_compile, 'OmitDefaultLibName', _boolean)  # /Zl
-_Same(_compile, 'OmitFramePointers', _boolean)  # /Oy
-_Same(_compile, 'PreprocessorDefinitions', _string_list)  # /D
-_Same(_compile, 'ProgramDataBaseFileName', _file_name)  # /Fd
-_Same(_compile, 'RuntimeTypeInfo', _boolean)  # /GR
-_Same(_compile, 'ShowIncludes', _boolean)  # /showIncludes
-_Same(_compile, 'SmallerTypeCheck', _boolean)  # /RTCc
-_Same(_compile, 'StringPooling', _boolean)  # /GF
-_Same(_compile, 'SuppressStartupBanner', _boolean)  # /nologo
-_Same(_compile, 'TreatWChar_tAsBuiltInType', _boolean)  # /Zc:wchar_t
-_Same(_compile, 'UndefineAllPreprocessorDefinitions', _boolean)  # /u
-_Same(_compile, 'UndefinePreprocessorDefinitions', _string_list)  # /U
-_Same(_compile, 'UseFullPaths', _boolean)  # /FC
-_Same(_compile, 'WholeProgramOptimization', _boolean)  # /GL
-_Same(_compile, 'XMLDocumentationFileName', _file_name)
-
-_Same(_compile, 'AssemblerOutput',
-      _Enumeration(['NoListing',
-                    'AssemblyCode',  # /FA
-                    'All',  # /FAcs
-                    'AssemblyAndMachineCode',  # /FAc
-                    'AssemblyAndSourceCode']))  # /FAs
-_Same(_compile, 'BasicRuntimeChecks',
-      _Enumeration(['Default',
-                    'StackFrameRuntimeCheck',  # /RTCs
-                    'UninitializedLocalUsageCheck',  # /RTCu
-                    'EnableFastChecks']))  # /RTC1
-_Same(_compile, 'BrowseInformation',
-      _Enumeration(['false',
-                    'true',  # /FR
-                    'true']))  # /Fr
-_Same(_compile, 'CallingConvention',
-      _Enumeration(['Cdecl',  # /Gd
-                    'FastCall',  # /Gr
-                    'StdCall']))  # /Gz
-_Same(_compile, 'CompileAs',
-      _Enumeration(['Default',
-                    'CompileAsC',  # /TC
-                    'CompileAsCpp']))  # /TP
-_Same(_compile, 'DebugInformationFormat',
-      _Enumeration(['',  # Disabled
-                    'OldStyle',  # /Z7
-                    None,
-                    'ProgramDatabase',  # /Zi
-                    'EditAndContinue']))  # /ZI
-_Same(_compile, 'EnableEnhancedInstructionSet',
-      _Enumeration(['NotSet',
-                    'StreamingSIMDExtensions',  # /arch:SSE
-                    'StreamingSIMDExtensions2']))  # /arch:SSE2
-_Same(_compile, 'ErrorReporting',
-      _Enumeration(['None',  # /errorReport:none
-                    'Prompt',  # /errorReport:prompt
-                    'Queue'],  # /errorReport:queue
-                   new=['Send']))  # /errorReport:send
-_Same(_compile, 'ExceptionHandling',
-      _Enumeration(['false',
-                    'Sync',  # /EHsc
-                    'Async'],  # /EHa
-                   new=['SyncCThrow']))  # /EHs
-_Same(_compile, 'FavorSizeOrSpeed',
-      _Enumeration(['Neither',
-                    'Speed',  # /Ot
-                    'Size']))  # /Os
-_Same(_compile, 'FloatingPointModel',
-      _Enumeration(['Precise',  # /fp:precise
-                    'Strict',  # /fp:strict
-                    'Fast']))  # /fp:fast
-_Same(_compile, 'InlineFunctionExpansion',
-      _Enumeration(['Default',
-                    'OnlyExplicitInline',  # /Ob1
-                    'AnySuitable'],  # /Ob2
-                   new=['Disabled']))  # /Ob0
-_Same(_compile, 'Optimization',
-      _Enumeration(['Disabled',  # /Od
-                    'MinSpace',  # /O1
-                    'MaxSpeed',  # /O2
-                    'Full']))  # /Ox
-_Same(_compile, 'RuntimeLibrary',
-      _Enumeration(['MultiThreaded',  # /MT
-                    'MultiThreadedDebug',  # /MTd
-                    'MultiThreadedDLL',  # /MD
-                    'MultiThreadedDebugDLL']))  # /MDd
-_Same(_compile, 'StructMemberAlignment',
-      _Enumeration(['Default',
-                    '1Byte',  # /Zp1
-                    '2Bytes',  # /Zp2
-                    '4Bytes',  # /Zp4
-                    '8Bytes',  # /Zp8
-                    '16Bytes']))  # /Zp16
-_Same(_compile, 'WarningLevel',
-      _Enumeration(['TurnOffAllWarnings',  # /W0
-                    'Level1',  # /W1
-                    'Level2',  # /W2
-                    'Level3',  # /W3
-                    'Level4'],  # /W4
-                   new=['EnableAllWarnings']))  # /Wall
-
-# Options found in MSVS that have been renamed in MSBuild.
-_Renamed(_compile, 'EnableFunctionLevelLinking', 'FunctionLevelLinking',
-         _boolean)  # /Gy
-_Renamed(_compile, 'EnableIntrinsicFunctions', 'IntrinsicFunctions',
-         _boolean)  # /Oi
-_Renamed(_compile, 'KeepComments', 'PreprocessKeepComments', _boolean)  # /C
-_Renamed(_compile, 'ObjectFile', 'ObjectFileName', _file_name)  # /Fo
-_Renamed(_compile, 'OpenMP', 'OpenMPSupport', _boolean)  # /openmp
-_Renamed(_compile, 'PrecompiledHeaderThrough', 'PrecompiledHeaderFile',
-         _file_name)  # Used with /Yc and /Yu
-_Renamed(_compile, 'PrecompiledHeaderFile', 'PrecompiledHeaderOutputFile',
-         _file_name)  # /Fp
-_Renamed(_compile, 'UsePrecompiledHeader', 'PrecompiledHeader',
-         _Enumeration(['NotUsing',  # VS recognized '' for this value too.
-                       'Create',   # /Yc
-                       'Use']))  # /Yu
-_Renamed(_compile, 'WarnAsError', 'TreatWarningAsError', _boolean)  # /WX
-
-_ConvertedToAdditionalOption(_compile, 'DefaultCharIsUnsigned', '/J')
-
-# MSVS options not found in MSBuild.
-_MSVSOnly(_compile, 'Detect64BitPortabilityProblems', _boolean)
-_MSVSOnly(_compile, 'UseUnicodeResponseFiles', _boolean)
-
-# MSBuild options not found in MSVS.
-_MSBuildOnly(_compile, 'BuildingInIDE', _boolean)
-_MSBuildOnly(_compile, 'CompileAsManaged',
-             _Enumeration([], new=['false',
-                                   'true',  # /clr
-                                   'Pure',  # /clr:pure
-                                   'Safe',  # /clr:safe
-                                   'OldSyntax']))  # /clr:oldSyntax
-_MSBuildOnly(_compile, 'CreateHotpatchableImage', _boolean)  # /hotpatch
-_MSBuildOnly(_compile, 'MultiProcessorCompilation', _boolean)  # /MP
-_MSBuildOnly(_compile, 'PreprocessOutputPath', _string)  # /Fi
-_MSBuildOnly(_compile, 'ProcessorNumber', _integer)  # the number of processors
-_MSBuildOnly(_compile, 'TrackerLogDirectory', _folder_name)
-_MSBuildOnly(_compile, 'TreatSpecificWarningsAsErrors', _string_list)  # /we
-_MSBuildOnly(_compile, 'UseUnicodeForAssemblerListing', _boolean)  # /FAu
-
-# Defines a setting that needs very customized processing
-_CustomGeneratePreprocessedFile(_compile, 'GeneratePreprocessedFile')
-
-
-# Directives for converting MSVS VCLinkerTool to MSBuild Link.
-# See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\link.xml" for
-# the schema of the MSBuild Link settings.
-
-# Options that have the same name in MSVS and MSBuild
-_Same(_link, 'AdditionalDependencies', _file_list)
-_Same(_link, 'AdditionalLibraryDirectories', _folder_list)  # /LIBPATH
-#  /MANIFESTDEPENDENCY:
-_Same(_link, 'AdditionalManifestDependencies', _file_list)
-_Same(_link, 'AdditionalOptions', _string_list)
-_Same(_link, 'AddModuleNamesToAssembly', _file_list)  # /ASSEMBLYMODULE
-_Same(_link, 'AllowIsolation', _boolean)  # /ALLOWISOLATION
-_Same(_link, 'AssemblyLinkResource', _file_list)  # /ASSEMBLYLINKRESOURCE
-_Same(_link, 'BaseAddress', _string)  # /BASE
-_Same(_link, 'CLRUnmanagedCodeCheck', _boolean)  # /CLRUNMANAGEDCODECHECK
-_Same(_link, 'DelayLoadDLLs', _file_list)  # /DELAYLOAD
-_Same(_link, 'DelaySign', _boolean)  # /DELAYSIGN
-_Same(_link, 'EmbedManagedResourceFile', _file_list)  # /ASSEMBLYRESOURCE
-_Same(_link, 'EnableUAC', _boolean)  # /MANIFESTUAC
-_Same(_link, 'EntryPointSymbol', _string)  # /ENTRY
-_Same(_link, 'ForceSymbolReferences', _file_list)  # /INCLUDE
-_Same(_link, 'FunctionOrder', _file_name)  # /ORDER
-_Same(_link, 'GenerateDebugInformation', _boolean)  # /DEBUG
-_Same(_link, 'GenerateMapFile', _boolean)  # /MAP
-_Same(_link, 'HeapCommitSize', _string)
-_Same(_link, 'HeapReserveSize', _string)  # /HEAP
-_Same(_link, 'IgnoreAllDefaultLibraries', _boolean)  # /NODEFAULTLIB
-_Same(_link, 'IgnoreEmbeddedIDL', _boolean)  # /IGNOREIDL
-_Same(_link, 'ImportLibrary', _file_name)  # /IMPLIB
-_Same(_link, 'KeyContainer', _file_name)  # /KEYCONTAINER
-_Same(_link, 'KeyFile', _file_name)  # /KEYFILE
-_Same(_link, 'ManifestFile', _file_name)  # /ManifestFile
-_Same(_link, 'MapExports', _boolean)  # /MAPINFO:EXPORTS
-_Same(_link, 'MapFileName', _file_name)
-_Same(_link, 'MergedIDLBaseFileName', _file_name)  # /IDLOUT
-_Same(_link, 'MergeSections', _string)  # /MERGE
-_Same(_link, 'MidlCommandFile', _file_name)  # /MIDL
-_Same(_link, 'ModuleDefinitionFile', _file_name)  # /DEF
-_Same(_link, 'OutputFile', _file_name)  # /OUT
-_Same(_link, 'PerUserRedirection', _boolean)
-_Same(_link, 'Profile', _boolean)  # /PROFILE
-_Same(_link, 'ProfileGuidedDatabase', _file_name)  # /PGD
-_Same(_link, 'ProgramDatabaseFile', _file_name)  # /PDB
-_Same(_link, 'RegisterOutput', _boolean)
-_Same(_link, 'SetChecksum', _boolean)  # /RELEASE
-_Same(_link, 'StackCommitSize', _string)
-_Same(_link, 'StackReserveSize', _string)  # /STACK
-_Same(_link, 'StripPrivateSymbols', _file_name)  # /PDBSTRIPPED
-_Same(_link, 'SupportUnloadOfDelayLoadedDLL', _boolean)  # /DELAY:UNLOAD
-_Same(_link, 'SuppressStartupBanner', _boolean)  # /NOLOGO
-_Same(_link, 'SwapRunFromCD', _boolean)  # /SWAPRUN:CD
-_Same(_link, 'TurnOffAssemblyGeneration', _boolean)  # /NOASSEMBLY
-_Same(_link, 'TypeLibraryFile', _file_name)  # /TLBOUT
-_Same(_link, 'TypeLibraryResourceID', _integer)  # /TLBID
-_Same(_link, 'UACUIAccess', _boolean)  # /uiAccess='true'
-_Same(_link, 'Version', _string)  # /VERSION
-
-_Same(_link, 'EnableCOMDATFolding', _newly_boolean)  # /OPT:ICF
-_Same(_link, 'FixedBaseAddress', _newly_boolean)  # /FIXED
-_Same(_link, 'LargeAddressAware', _newly_boolean)  # /LARGEADDRESSAWARE
-_Same(_link, 'OptimizeReferences', _newly_boolean)  # /OPT:REF
-_Same(_link, 'RandomizedBaseAddress', _newly_boolean)  # /DYNAMICBASE
-_Same(_link, 'TerminalServerAware', _newly_boolean)  # /TSAWARE
-
-_subsystem_enumeration = _Enumeration(
-    ['NotSet',
-     'Console',  # /SUBSYSTEM:CONSOLE
-     'Windows',  # /SUBSYSTEM:WINDOWS
-     'Native',  # /SUBSYSTEM:NATIVE
-     'EFI Application',  # /SUBSYSTEM:EFI_APPLICATION
-     'EFI Boot Service Driver',  # /SUBSYSTEM:EFI_BOOT_SERVICE_DRIVER
-     'EFI ROM',  # /SUBSYSTEM:EFI_ROM
-     'EFI Runtime',  # /SUBSYSTEM:EFI_RUNTIME_DRIVER
-     'WindowsCE'],  # /SUBSYSTEM:WINDOWSCE
-    new=['POSIX'])  # /SUBSYSTEM:POSIX
-
-_target_machine_enumeration = _Enumeration(
-    ['NotSet',
-     'MachineX86',  # /MACHINE:X86
-     None,
-     'MachineARM',  # /MACHINE:ARM
-     'MachineEBC',  # /MACHINE:EBC
-     'MachineIA64',  # /MACHINE:IA64
-     None,
-     'MachineMIPS',  # /MACHINE:MIPS
-     'MachineMIPS16',  # /MACHINE:MIPS16
-     'MachineMIPSFPU',  # /MACHINE:MIPSFPU
-     'MachineMIPSFPU16',  # /MACHINE:MIPSFPU16
-     None,
-     None,
-     None,
-     'MachineSH4',  # /MACHINE:SH4
-     None,
-     'MachineTHUMB',  # /MACHINE:THUMB
-     'MachineX64'])  # /MACHINE:X64
-
-_Same(_link, 'AssemblyDebug',
-      _Enumeration(['',
-                    'true',  # /ASSEMBLYDEBUG
-                    'false']))  # /ASSEMBLYDEBUG:DISABLE
-_Same(_link, 'CLRImageType',
-      _Enumeration(['Default',
-                    'ForceIJWImage',  # /CLRIMAGETYPE:IJW
-                    'ForcePureILImage',  # /Switch="CLRIMAGETYPE:PURE
-                    'ForceSafeILImage']))  # /Switch="CLRIMAGETYPE:SAFE
-_Same(_link, 'CLRThreadAttribute',
-      _Enumeration(['DefaultThreadingAttribute',  # /CLRTHREADATTRIBUTE:NONE
-                    'MTAThreadingAttribute',  # /CLRTHREADATTRIBUTE:MTA
-                    'STAThreadingAttribute']))  # /CLRTHREADATTRIBUTE:STA
-_Same(_link, 'DataExecutionPrevention',
-      _Enumeration(['',
-                    'false',  # /NXCOMPAT:NO
-                    'true']))  # /NXCOMPAT
-_Same(_link, 'Driver',
-      _Enumeration(['NotSet',
-                    'Driver',  # /Driver
-                    'UpOnly',  # /DRIVER:UPONLY
-                    'WDM']))  # /DRIVER:WDM
-_Same(_link, 'LinkTimeCodeGeneration',
-      _Enumeration(['Default',
-                    'UseLinkTimeCodeGeneration',  # /LTCG
-                    'PGInstrument',  # /LTCG:PGInstrument
-                    'PGOptimization',  # /LTCG:PGOptimize
-                    'PGUpdate']))  # /LTCG:PGUpdate
-_Same(_link, 'ShowProgress',
-      _Enumeration(['NotSet',
-                    'LinkVerbose',  # /VERBOSE
-                    'LinkVerboseLib'],  # /VERBOSE:Lib
-                   new=['LinkVerboseICF',  # /VERBOSE:ICF
-                        'LinkVerboseREF',  # /VERBOSE:REF
-                        'LinkVerboseSAFESEH',  # /VERBOSE:SAFESEH
-                        'LinkVerboseCLR']))  # /VERBOSE:CLR
-_Same(_link, 'SubSystem', _subsystem_enumeration)
-_Same(_link, 'TargetMachine', _target_machine_enumeration)
-_Same(_link, 'UACExecutionLevel',
-      _Enumeration(['AsInvoker',  # /level='asInvoker'
-                    'HighestAvailable',  # /level='highestAvailable'
-                    'RequireAdministrator']))  # /level='requireAdministrator'
-
-
-# Options found in MSVS that have been renamed in MSBuild.
-_Renamed(_link, 'ErrorReporting', 'LinkErrorReporting',
-         _Enumeration(['NoErrorReport',  # /ERRORREPORT:NONE
-                       'PromptImmediately',  # /ERRORREPORT:PROMPT
-                       'QueueForNextLogin'],  # /ERRORREPORT:QUEUE
-                      new=['SendErrorReport']))  # /ERRORREPORT:SEND
-_Renamed(_link, 'IgnoreDefaultLibraryNames', 'IgnoreSpecificDefaultLibraries',
-         _file_list)  # /NODEFAULTLIB
-_Renamed(_link, 'ResourceOnlyDLL', 'NoEntryPoint', _boolean)  # /NOENTRY
-_Renamed(_link, 'SwapRunFromNet', 'SwapRunFromNET', _boolean)  # /SWAPRUN:NET
-
-_Moved(_link, 'GenerateManifest', '', _boolean)
-_Moved(_link, 'IgnoreImportLibrary', '', _boolean)
-_Moved(_link, 'LinkIncremental', '', _newly_boolean)
-_Moved(_link, 'LinkLibraryDependencies', 'ProjectReference', _boolean)
-_Moved(_link, 'UseLibraryDependencyInputs', 'ProjectReference', _boolean)
-
-# MSVS options not found in MSBuild.
-_MSVSOnly(_link, 'OptimizeForWindows98', _newly_boolean)
-_MSVSOnly(_link, 'UseUnicodeResponseFiles', _boolean)
-# TODO(jeanluc) I don't think these are genuine settings but byproducts of Gyp.
-_MSVSOnly(_link, 'AdditionalLibraryDirectories_excluded', _folder_list)
-
-# MSBuild options not found in MSVS.
-_MSBuildOnly(_link, 'BuildingInIDE', _boolean)
-_MSBuildOnly(_link, 'ImageHasSafeExceptionHandlers', _boolean)  # /SAFESEH
-_MSBuildOnly(_link, 'LinkDLL', _boolean)  # /DLL Visible='false'
-_MSBuildOnly(_link, 'LinkStatus', _boolean)  # /LTCG:STATUS
-_MSBuildOnly(_link, 'PreventDllBinding', _boolean)  # /ALLOWBIND
-_MSBuildOnly(_link, 'SupportNobindOfDelayLoadedDLL', _boolean)  # /DELAY:NOBIND
-_MSBuildOnly(_link, 'TrackerLogDirectory', _folder_name)
-_MSBuildOnly(_link, 'TreatLinkerWarningAsErrors', _boolean)  # /WX
-_MSBuildOnly(_link, 'MinimumRequiredVersion', _string)
-_MSBuildOnly(_link, 'MSDOSStubFileName', _file_name)  # /STUB Visible='false'
-_MSBuildOnly(_link, 'SectionAlignment', _integer)  # /ALIGN
-_MSBuildOnly(_link, 'SpecifySectionAttributes', _string)  # /SECTION
-_MSBuildOnly(_link, 'ForceFileOutput',
-             _Enumeration([], new=['Enabled',  # /FORCE
-                                   # /FORCE:MULTIPLE
-                                   'MultiplyDefinedSymbolOnly',
-                                   'UndefinedSymbolOnly']))  # /FORCE:UNRESOLVED
-_MSBuildOnly(_link, 'CreateHotPatchableImage',
-             _Enumeration([], new=['Enabled',  # /FUNCTIONPADMIN
-                                   'X86Image',  # /FUNCTIONPADMIN:5
-                                   'X64Image',  # /FUNCTIONPADMIN:6
-                                   'ItaniumImage']))  # /FUNCTIONPADMIN:16
-_MSBuildOnly(_link, 'CLRSupportLastError',
-             _Enumeration([], new=['Enabled',  # /CLRSupportLastError
-                                   'Disabled',  # /CLRSupportLastError:NO
-                                   # /CLRSupportLastError:SYSTEMDLL
-                                   'SystemDlls']))
-
-
-# Directives for converting VCResourceCompilerTool to ResourceCompile.
-# See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\rc.xml" for
-# the schema of the MSBuild ResourceCompile settings.
-
-_Same(_rc, 'AdditionalOptions', _string_list)
-_Same(_rc, 'AdditionalIncludeDirectories', _folder_list)  # /I
-_Same(_rc, 'Culture', _Integer(msbuild_base=16))
-_Same(_rc, 'IgnoreStandardIncludePath', _boolean)  # /X
-_Same(_rc, 'PreprocessorDefinitions', _string_list)  # /D
-_Same(_rc, 'ResourceOutputFileName', _string)  # /fo
-_Same(_rc, 'ShowProgress', _boolean)  # /v
-# There is no UI in VisualStudio 2008 to set the following properties.
-# However they are found in CL and other tools.  Include them here for
-# completeness, as they are very likely to have the same usage pattern.
-_Same(_rc, 'SuppressStartupBanner', _boolean)  # /nologo
-_Same(_rc, 'UndefinePreprocessorDefinitions', _string_list)  # /u
-
-# MSBuild options not found in MSVS.
-_MSBuildOnly(_rc, 'NullTerminateStrings', _boolean)  # /n
-_MSBuildOnly(_rc, 'TrackerLogDirectory', _folder_name)
-
-
-# Directives for converting VCMIDLTool to Midl.
-# See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\midl.xml" for
-# the schema of the MSBuild Midl settings.
-
-_Same(_midl, 'AdditionalIncludeDirectories', _folder_list)  # /I
-_Same(_midl, 'AdditionalOptions', _string_list)
-_Same(_midl, 'CPreprocessOptions', _string)  # /cpp_opt
-_Same(_midl, 'ErrorCheckAllocations', _boolean)  # /error allocation
-_Same(_midl, 'ErrorCheckBounds', _boolean)  # /error bounds_check
-_Same(_midl, 'ErrorCheckEnumRange', _boolean)  # /error enum
-_Same(_midl, 'ErrorCheckRefPointers', _boolean)  # /error ref
-_Same(_midl, 'ErrorCheckStubData', _boolean)  # /error stub_data
-_Same(_midl, 'GenerateStublessProxies', _boolean)  # /Oicf
-_Same(_midl, 'GenerateTypeLibrary', _boolean)
-_Same(_midl, 'HeaderFileName', _file_name)  # /h
-_Same(_midl, 'IgnoreStandardIncludePath', _boolean)  # /no_def_idir
-_Same(_midl, 'InterfaceIdentifierFileName', _file_name)  # /iid
-_Same(_midl, 'MkTypLibCompatible', _boolean)  # /mktyplib203
-_Same(_midl, 'OutputDirectory', _string)  # /out
-_Same(_midl, 'PreprocessorDefinitions', _string_list)  # /D
-_Same(_midl, 'ProxyFileName', _file_name)  # /proxy
-_Same(_midl, 'RedirectOutputAndErrors', _file_name)  # /o
-_Same(_midl, 'SuppressStartupBanner', _boolean)  # /nologo
-_Same(_midl, 'TypeLibraryName', _file_name)  # /tlb
-_Same(_midl, 'UndefinePreprocessorDefinitions', _string_list)  # /U
-_Same(_midl, 'WarnAsError', _boolean)  # /WX
-
-_Same(_midl, 'DefaultCharType',
-      _Enumeration(['Unsigned',  # /char unsigned
-                    'Signed',  # /char signed
-                    'Ascii']))  # /char ascii7
-_Same(_midl, 'TargetEnvironment',
-      _Enumeration(['NotSet',
-                    'Win32',  # /env win32
-                    'Itanium',  # /env ia64
-                    'X64']))  # /env x64
-_Same(_midl, 'EnableErrorChecks',
-      _Enumeration(['EnableCustom',
-                    'None',  # /error none
-                    'All']))  # /error all
-_Same(_midl, 'StructMemberAlignment',
-      _Enumeration(['NotSet',
-                    '1',  # Zp1
-                    '2',  # Zp2
-                    '4',  # Zp4
-                    '8']))  # Zp8
-_Same(_midl, 'WarningLevel',
-      _Enumeration(['0',  # /W0
-                    '1',  # /W1
-                    '2',  # /W2
-                    '3',  # /W3
-                    '4']))  # /W4
-
-_Renamed(_midl, 'DLLDataFileName', 'DllDataFileName', _file_name)  # /dlldata
-_Renamed(_midl, 'ValidateParameters', 'ValidateAllParameters',
-         _boolean)  # /robust
-
-# MSBuild options not found in MSVS.
-_MSBuildOnly(_midl, 'ApplicationConfigurationMode', _boolean)  # /app_config
-_MSBuildOnly(_midl, 'ClientStubFile', _file_name)  # /cstub
-_MSBuildOnly(_midl, 'GenerateClientFiles',
-             _Enumeration([], new=['Stub',  # /client stub
-                                   'None']))  # /client none
-_MSBuildOnly(_midl, 'GenerateServerFiles',
-             _Enumeration([], new=['Stub',  # /server stub
-                                   'None']))  # /server none
-_MSBuildOnly(_midl, 'LocaleID', _integer)  # /lcid DECIMAL
-_MSBuildOnly(_midl, 'ServerStubFile', _file_name)  # /sstub
-_MSBuildOnly(_midl, 'SuppressCompilerWarnings', _boolean)  # /no_warn
-_MSBuildOnly(_midl, 'TrackerLogDirectory', _folder_name)
-_MSBuildOnly(_midl, 'TypeLibFormat',
-             _Enumeration([], new=['NewFormat',  # /newtlb
-                                   'OldFormat']))  # /oldtlb
-
-
-# Directives for converting VCLibrarianTool to Lib.
-# See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\lib.xml" for
-# the schema of the MSBuild Lib settings.
-
-_Same(_lib, 'AdditionalDependencies', _file_list)
-_Same(_lib, 'AdditionalLibraryDirectories', _folder_list)  # /LIBPATH
-_Same(_lib, 'AdditionalOptions', _string_list)
-_Same(_lib, 'ExportNamedFunctions', _string_list)  # /EXPORT
-_Same(_lib, 'ForceSymbolReferences', _string)  # /INCLUDE
-_Same(_lib, 'IgnoreAllDefaultLibraries', _boolean)  # /NODEFAULTLIB
-_Same(_lib, 'IgnoreSpecificDefaultLibraries', _file_list)  # /NODEFAULTLIB
-_Same(_lib, 'ModuleDefinitionFile', _file_name)  # /DEF
-_Same(_lib, 'OutputFile', _file_name)  # /OUT
-_Same(_lib, 'SuppressStartupBanner', _boolean)  # /NOLOGO
-_Same(_lib, 'UseUnicodeResponseFiles', _boolean)
-_Same(_lib, 'LinkTimeCodeGeneration', _boolean)  # /LTCG
-
-# TODO(jeanluc) _link defines the same value that gets moved to
-# ProjectReference.  We may want to validate that they are consistent.
-_Moved(_lib, 'LinkLibraryDependencies', 'ProjectReference', _boolean)
-
-# TODO(jeanluc) I don't think these are genuine settings but byproducts of Gyp.
-_MSVSOnly(_lib, 'AdditionalLibraryDirectories_excluded', _folder_list)
-
-_MSBuildOnly(_lib, 'DisplayLibrary', _string)  # /LIST Visible='false'
-_MSBuildOnly(_lib, 'ErrorReporting',
-             _Enumeration([], new=['PromptImmediately',  # /ERRORREPORT:PROMPT
-                                   'QueueForNextLogin',  # /ERRORREPORT:QUEUE
-                                   'SendErrorReport',  # /ERRORREPORT:SEND
-                                   'NoErrorReport']))  # /ERRORREPORT:NONE
-_MSBuildOnly(_lib, 'MinimumRequiredVersion', _string)
-_MSBuildOnly(_lib, 'Name', _file_name)  # /NAME
-_MSBuildOnly(_lib, 'RemoveObjects', _file_list)  # /REMOVE
-_MSBuildOnly(_lib, 'SubSystem', _subsystem_enumeration)
-_MSBuildOnly(_lib, 'TargetMachine', _target_machine_enumeration)
-_MSBuildOnly(_lib, 'TrackerLogDirectory', _folder_name)
-_MSBuildOnly(_lib, 'TreatLibWarningAsErrors', _boolean)  # /WX
-_MSBuildOnly(_lib, 'Verbose', _boolean)
-
-
-# Directives for converting VCManifestTool to Mt.
-# See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\mt.xml" for
-# the schema of the MSBuild Mt settings.
-
-# Options that have the same name in MSVS and MSBuild
-_Same(_manifest, 'AdditionalManifestFiles', _file_list)  # /manifest
-_Same(_manifest, 'AdditionalOptions', _string_list)
-_Same(_manifest, 'AssemblyIdentity', _string)  # /identity:
-_Same(_manifest, 'ComponentFileName', _file_name)  # /dll
-_Same(_manifest, 'GenerateCatalogFiles', _boolean)  # /makecdfs
-_Same(_manifest, 'InputResourceManifests', _string)  # /inputresource
-_Same(_manifest, 'OutputManifestFile', _file_name)  # /out
-_Same(_manifest, 'RegistrarScriptFile', _file_name)  # /rgs
-_Same(_manifest, 'ReplacementsFile', _file_name)  # /replacements
-_Same(_manifest, 'SuppressStartupBanner', _boolean)  # /nologo
-_Same(_manifest, 'TypeLibraryFile', _file_name)  # /tlb:
-_Same(_manifest, 'UpdateFileHashes', _boolean)  # /hashupdate
-_Same(_manifest, 'UpdateFileHashesSearchPath', _file_name)
-_Same(_manifest, 'VerboseOutput', _boolean)  # /verbose
-
-# Options that have moved location.
-_MovedAndRenamed(_manifest, 'ManifestResourceFile',
-                 'ManifestResourceCompile',
-                 'ResourceOutputFileName',
-                 _file_name)
-_Moved(_manifest, 'EmbedManifest', '', _boolean)
-
-# MSVS options not found in MSBuild.
-_MSVSOnly(_manifest, 'DependencyInformationFile', _file_name)
-_MSVSOnly(_manifest, 'UseFAT32Workaround', _boolean)
-_MSVSOnly(_manifest, 'UseUnicodeResponseFiles', _boolean)
-
-# MSBuild options not found in MSVS.
-_MSBuildOnly(_manifest, 'EnableDPIAwareness', _boolean)
-_MSBuildOnly(_manifest, 'GenerateCategoryTags', _boolean)  # /category
-_MSBuildOnly(_manifest, 'ManifestFromManagedAssembly',
-             _file_name)  # /managedassemblyname
-_MSBuildOnly(_manifest, 'OutputResourceManifests', _string)  # /outputresource
-_MSBuildOnly(_manifest, 'SuppressDependencyElement', _boolean)  # /nodependency
-_MSBuildOnly(_manifest, 'TrackerLogDirectory', _folder_name)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSSettings_test.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1482 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Unit tests for the MSVSSettings.py file."""
-
-import StringIO
-import unittest
-import gyp.MSVSSettings as MSVSSettings
-
-
-class TestSequenceFunctions(unittest.TestCase):
-
-  def setUp(self):
-    self.stderr = StringIO.StringIO()
-
-  def _ExpectedWarnings(self, expected):
-    """Compares recorded lines to expected warnings."""
-    self.stderr.seek(0)
-    actual = self.stderr.read().split('\n')
-    actual = [line for line in actual if line]
-    self.assertEqual(sorted(expected), sorted(actual))
-
-  def testValidateMSVSSettings_tool_names(self):
-    """Tests that only MSVS tool names are allowed."""
-    MSVSSettings.ValidateMSVSSettings(
-        {'VCCLCompilerTool': {},
-         'VCLinkerTool': {},
-         'VCMIDLTool': {},
-         'foo': {},
-         'VCResourceCompilerTool': {},
-         'VCLibrarianTool': {},
-         'VCManifestTool': {},
-         'ClCompile': {}},
-        self.stderr)
-    self._ExpectedWarnings([
-        'Warning: unrecognized tool foo',
-        'Warning: unrecognized tool ClCompile'])
-
-  def testValidateMSVSSettings_settings(self):
-    """Tests warnings for invalid MSVS settings."""
-    MSVSSettings.ValidateMSVSSettings(
-        {'VCCLCompilerTool': {
-            'AdditionalIncludeDirectories': 'folder1;folder2',
-            'AdditionalOptions': ['string1', 'string2'],
-            'AdditionalUsingDirectories': 'folder1;folder2',
-            'AssemblerListingLocation': 'a_file_name',
-            'AssemblerOutput': '0',
-            'BasicRuntimeChecks': '5',
-            'BrowseInformation': 'fdkslj',
-            'BrowseInformationFile': 'a_file_name',
-            'BufferSecurityCheck': 'true',
-            'CallingConvention': '-1',
-            'CompileAs': '1',
-            'DebugInformationFormat': '2',
-            'DefaultCharIsUnsigned': 'true',
-            'Detect64BitPortabilityProblems': 'true',
-            'DisableLanguageExtensions': 'true',
-            'DisableSpecificWarnings': 'string1;string2',
-            'EnableEnhancedInstructionSet': '1',
-            'EnableFiberSafeOptimizations': 'true',
-            'EnableFunctionLevelLinking': 'true',
-            'EnableIntrinsicFunctions': 'true',
-            'EnablePREfast': 'true',
-            'Enableprefast': 'bogus',
-            'ErrorReporting': '1',
-            'ExceptionHandling': '1',
-            'ExpandAttributedSource': 'true',
-            'FavorSizeOrSpeed': '1',
-            'FloatingPointExceptions': 'true',
-            'FloatingPointModel': '1',
-            'ForceConformanceInForLoopScope': 'true',
-            'ForcedIncludeFiles': 'file1;file2',
-            'ForcedUsingFiles': 'file1;file2',
-            'GeneratePreprocessedFile': '1',
-            'GenerateXMLDocumentationFiles': 'true',
-            'IgnoreStandardIncludePath': 'true',
-            'InlineFunctionExpansion': '1',
-            'KeepComments': 'true',
-            'MinimalRebuild': 'true',
-            'ObjectFile': 'a_file_name',
-            'OmitDefaultLibName': 'true',
-            'OmitFramePointers': 'true',
-            'OpenMP': 'true',
-            'Optimization': '1',
-            'PrecompiledHeaderFile': 'a_file_name',
-            'PrecompiledHeaderThrough': 'a_file_name',
-            'PreprocessorDefinitions': 'string1;string2',
-            'ProgramDataBaseFileName': 'a_file_name',
-            'RuntimeLibrary': '1',
-            'RuntimeTypeInfo': 'true',
-            'ShowIncludes': 'true',
-            'SmallerTypeCheck': 'true',
-            'StringPooling': 'true',
-            'StructMemberAlignment': '1',
-            'SuppressStartupBanner': 'true',
-            'TreatWChar_tAsBuiltInType': 'true',
-            'UndefineAllPreprocessorDefinitions': 'true',
-            'UndefinePreprocessorDefinitions': 'string1;string2',
-            'UseFullPaths': 'true',
-            'UsePrecompiledHeader': '1',
-            'UseUnicodeResponseFiles': 'true',
-            'WarnAsError': 'true',
-            'WarningLevel': '1',
-            'WholeProgramOptimization': 'true',
-            'XMLDocumentationFileName': 'a_file_name',
-            'ZZXYZ': 'bogus'},
-         'VCLinkerTool': {
-             'AdditionalDependencies': 'file1;file2',
-             'AdditionalLibraryDirectories': 'folder1;folder2',
-             'AdditionalManifestDependencies': 'file1;file2',
-             'AdditionalOptions': 'a string1',
-             'AddModuleNamesToAssembly': 'file1;file2',
-             'AllowIsolation': 'true',
-             'AssemblyDebug': '2',
-             'AssemblyLinkResource': 'file1;file2',
-             'BaseAddress': 'a string1',
-             'CLRImageType': '2',
-             'CLRThreadAttribute': '2',
-             'CLRUnmanagedCodeCheck': 'true',
-             'DataExecutionPrevention': '2',
-             'DelayLoadDLLs': 'file1;file2',
-             'DelaySign': 'true',
-             'Driver': '2',
-             'EmbedManagedResourceFile': 'file1;file2',
-             'EnableCOMDATFolding': '2',
-             'EnableUAC': 'true',
-             'EntryPointSymbol': 'a string1',
-             'ErrorReporting': '2',
-             'FixedBaseAddress': '2',
-             'ForceSymbolReferences': 'file1;file2',
-             'FunctionOrder': 'a_file_name',
-             'GenerateDebugInformation': 'true',
-             'GenerateManifest': 'true',
-             'GenerateMapFile': 'true',
-             'HeapCommitSize': 'a string1',
-             'HeapReserveSize': 'a string1',
-             'IgnoreAllDefaultLibraries': 'true',
-             'IgnoreDefaultLibraryNames': 'file1;file2',
-             'IgnoreEmbeddedIDL': 'true',
-             'IgnoreImportLibrary': 'true',
-             'ImportLibrary': 'a_file_name',
-             'KeyContainer': 'a_file_name',
-             'KeyFile': 'a_file_name',
-             'LargeAddressAware': '2',
-             'LinkIncremental': '2',
-             'LinkLibraryDependencies': 'true',
-             'LinkTimeCodeGeneration': '2',
-             'ManifestFile': 'a_file_name',
-             'MapExports': 'true',
-             'MapFileName': 'a_file_name',
-             'MergedIDLBaseFileName': 'a_file_name',
-             'MergeSections': 'a string1',
-             'MidlCommandFile': 'a_file_name',
-             'ModuleDefinitionFile': 'a_file_name',
-             'OptimizeForWindows98': '1',
-             'OptimizeReferences': '2',
-             'OutputFile': 'a_file_name',
-             'PerUserRedirection': 'true',
-             'Profile': 'true',
-             'ProfileGuidedDatabase': 'a_file_name',
-             'ProgramDatabaseFile': 'a_file_name',
-             'RandomizedBaseAddress': '2',
-             'RegisterOutput': 'true',
-             'ResourceOnlyDLL': 'true',
-             'SetChecksum': 'true',
-             'ShowProgress': '2',
-             'StackCommitSize': 'a string1',
-             'StackReserveSize': 'a string1',
-             'StripPrivateSymbols': 'a_file_name',
-             'SubSystem': '2',
-             'SupportUnloadOfDelayLoadedDLL': 'true',
-             'SuppressStartupBanner': 'true',
-             'SwapRunFromCD': 'true',
-             'SwapRunFromNet': 'true',
-             'TargetMachine': '2',
-             'TerminalServerAware': '2',
-             'TurnOffAssemblyGeneration': 'true',
-             'TypeLibraryFile': 'a_file_name',
-             'TypeLibraryResourceID': '33',
-             'UACExecutionLevel': '2',
-             'UACUIAccess': 'true',
-             'UseLibraryDependencyInputs': 'true',
-             'UseUnicodeResponseFiles': 'true',
-             'Version': 'a string1'},
-         'VCMIDLTool': {
-             'AdditionalIncludeDirectories': 'folder1;folder2',
-             'AdditionalOptions': 'a string1',
-             'CPreprocessOptions': 'a string1',
-             'DefaultCharType': '1',
-             'DLLDataFileName': 'a_file_name',
-             'EnableErrorChecks': '1',
-             'ErrorCheckAllocations': 'true',
-             'ErrorCheckBounds': 'true',
-             'ErrorCheckEnumRange': 'true',
-             'ErrorCheckRefPointers': 'true',
-             'ErrorCheckStubData': 'true',
-             'GenerateStublessProxies': 'true',
-             'GenerateTypeLibrary': 'true',
-             'HeaderFileName': 'a_file_name',
-             'IgnoreStandardIncludePath': 'true',
-             'InterfaceIdentifierFileName': 'a_file_name',
-             'MkTypLibCompatible': 'true',
-             'notgood': 'bogus',
-             'OutputDirectory': 'a string1',
-             'PreprocessorDefinitions': 'string1;string2',
-             'ProxyFileName': 'a_file_name',
-             'RedirectOutputAndErrors': 'a_file_name',
-             'StructMemberAlignment': '1',
-             'SuppressStartupBanner': 'true',
-             'TargetEnvironment': '1',
-             'TypeLibraryName': 'a_file_name',
-             'UndefinePreprocessorDefinitions': 'string1;string2',
-             'ValidateParameters': 'true',
-             'WarnAsError': 'true',
-             'WarningLevel': '1'},
-         'VCResourceCompilerTool': {
-             'AdditionalOptions': 'a string1',
-             'AdditionalIncludeDirectories': 'folder1;folder2',
-             'Culture': '1003',
-             'IgnoreStandardIncludePath': 'true',
-             'notgood2': 'bogus',
-             'PreprocessorDefinitions': 'string1;string2',
-             'ResourceOutputFileName': 'a string1',
-             'ShowProgress': 'true',
-             'SuppressStartupBanner': 'true',
-             'UndefinePreprocessorDefinitions': 'string1;string2'},
-         'VCLibrarianTool': {
-             'AdditionalDependencies': 'file1;file2',
-             'AdditionalLibraryDirectories': 'folder1;folder2',
-             'AdditionalOptions': 'a string1',
-             'ExportNamedFunctions': 'string1;string2',
-             'ForceSymbolReferences': 'a string1',
-             'IgnoreAllDefaultLibraries': 'true',
-             'IgnoreSpecificDefaultLibraries': 'file1;file2',
-             'LinkLibraryDependencies': 'true',
-             'ModuleDefinitionFile': 'a_file_name',
-             'OutputFile': 'a_file_name',
-             'SuppressStartupBanner': 'true',
-             'UseUnicodeResponseFiles': 'true'},
-         'VCManifestTool': {
-             'AdditionalManifestFiles': 'file1;file2',
-             'AdditionalOptions': 'a string1',
-             'AssemblyIdentity': 'a string1',
-             'ComponentFileName': 'a_file_name',
-             'DependencyInformationFile': 'a_file_name',
-             'GenerateCatalogFiles': 'true',
-             'InputResourceManifests': 'a string1',
-             'ManifestResourceFile': 'a_file_name',
-             'OutputManifestFile': 'a_file_name',
-             'RegistrarScriptFile': 'a_file_name',
-             'ReplacementsFile': 'a_file_name',
-             'SuppressStartupBanner': 'true',
-             'TypeLibraryFile': 'a_file_name',
-             'UpdateFileHashes': 'truel',
-             'UpdateFileHashesSearchPath': 'a_file_name',
-             'UseFAT32Workaround': 'true',
-             'UseUnicodeResponseFiles': 'true',
-             'VerboseOutput': 'true'}},
-        self.stderr)
-    self._ExpectedWarnings([
-        'Warning: for VCCLCompilerTool/BasicRuntimeChecks, '
-        'index value (5) not in expected range [0, 4)',
-        'Warning: for VCCLCompilerTool/BrowseInformation, '
-        "invalid literal for int() with base 10: 'fdkslj'",
-        'Warning: for VCCLCompilerTool/CallingConvention, '
-        'index value (-1) not in expected range [0, 3)',
-        'Warning: for VCCLCompilerTool/DebugInformationFormat, '
-        'converted value for 2 not specified.',
-        'Warning: unrecognized setting VCCLCompilerTool/Enableprefast',
-        'Warning: unrecognized setting VCCLCompilerTool/ZZXYZ',
-        'Warning: for VCLinkerTool/TargetMachine, '
-        'converted value for 2 not specified.',
-        'Warning: unrecognized setting VCMIDLTool/notgood',
-        'Warning: unrecognized setting VCResourceCompilerTool/notgood2',
-        'Warning: for VCManifestTool/UpdateFileHashes, '
-        "expected bool; got 'truel'"
-        ''])
-
-  def testValidateMSBuildSettings_settings(self):
-    """Tests warnings for invalid MSBuild settings."""
-    MSVSSettings.ValidateMSBuildSettings(
-        {'ClCompile': {
-            'AdditionalIncludeDirectories': 'folder1;folder2',
-            'AdditionalOptions': ['string1', 'string2'],
-            'AdditionalUsingDirectories': 'folder1;folder2',
-            'AssemblerListingLocation': 'a_file_name',
-            'AssemblerOutput': 'NoListing',
-            'BasicRuntimeChecks': 'StackFrameRuntimeCheck',
-            'BrowseInformation': 'false',
-            'BrowseInformationFile': 'a_file_name',
-            'BufferSecurityCheck': 'true',
-            'BuildingInIDE': 'true',
-            'CallingConvention': 'Cdecl',
-            'CompileAs': 'CompileAsC',
-            'CompileAsManaged': 'Pure',
-            'CreateHotpatchableImage': 'true',
-            'DebugInformationFormat': 'ProgramDatabase',
-            'DisableLanguageExtensions': 'true',
-            'DisableSpecificWarnings': 'string1;string2',
-            'EnableEnhancedInstructionSet': 'StreamingSIMDExtensions',
-            'EnableFiberSafeOptimizations': 'true',
-            'EnablePREfast': 'true',
-            'Enableprefast': 'bogus',
-            'ErrorReporting': 'Prompt',
-            'ExceptionHandling': 'SyncCThrow',
-            'ExpandAttributedSource': 'true',
-            'FavorSizeOrSpeed': 'Neither',
-            'FloatingPointExceptions': 'true',
-            'FloatingPointModel': 'Precise',
-            'ForceConformanceInForLoopScope': 'true',
-            'ForcedIncludeFiles': 'file1;file2',
-            'ForcedUsingFiles': 'file1;file2',
-            'FunctionLevelLinking': 'false',
-            'GenerateXMLDocumentationFiles': 'true',
-            'IgnoreStandardIncludePath': 'true',
-            'InlineFunctionExpansion': 'OnlyExplicitInline',
-            'IntrinsicFunctions': 'false',
-            'MinimalRebuild': 'true',
-            'MultiProcessorCompilation': 'true',
-            'ObjectFileName': 'a_file_name',
-            'OmitDefaultLibName': 'true',
-            'OmitFramePointers': 'true',
-            'OpenMPSupport': 'true',
-            'Optimization': 'Disabled',
-            'PrecompiledHeader': 'NotUsing',
-            'PrecompiledHeaderFile': 'a_file_name',
-            'PrecompiledHeaderOutputFile': 'a_file_name',
-            'PreprocessKeepComments': 'true',
-            'PreprocessorDefinitions': 'string1;string2',
-            'PreprocessOutputPath': 'a string1',
-            'PreprocessSuppressLineNumbers': 'false',
-            'PreprocessToFile': 'false',
-            'ProcessorNumber': '33',
-            'ProgramDataBaseFileName': 'a_file_name',
-            'RuntimeLibrary': 'MultiThreaded',
-            'RuntimeTypeInfo': 'true',
-            'ShowIncludes': 'true',
-            'SmallerTypeCheck': 'true',
-            'StringPooling': 'true',
-            'StructMemberAlignment': '1Byte',
-            'SuppressStartupBanner': 'true',
-            'TrackerLogDirectory': 'a_folder',
-            'TreatSpecificWarningsAsErrors': 'string1;string2',
-            'TreatWarningAsError': 'true',
-            'TreatWChar_tAsBuiltInType': 'true',
-            'UndefineAllPreprocessorDefinitions': 'true',
-            'UndefinePreprocessorDefinitions': 'string1;string2',
-            'UseFullPaths': 'true',
-            'UseUnicodeForAssemblerListing': 'true',
-            'WarningLevel': 'TurnOffAllWarnings',
-            'WholeProgramOptimization': 'true',
-            'XMLDocumentationFileName': 'a_file_name',
-            'ZZXYZ': 'bogus'},
-         'Link': {
-             'AdditionalDependencies': 'file1;file2',
-             'AdditionalLibraryDirectories': 'folder1;folder2',
-             'AdditionalManifestDependencies': 'file1;file2',
-             'AdditionalOptions': 'a string1',
-             'AddModuleNamesToAssembly': 'file1;file2',
-             'AllowIsolation': 'true',
-             'AssemblyDebug': '',
-             'AssemblyLinkResource': 'file1;file2',
-             'BaseAddress': 'a string1',
-             'BuildingInIDE': 'true',
-             'CLRImageType': 'ForceIJWImage',
-             'CLRSupportLastError': 'Enabled',
-             'CLRThreadAttribute': 'MTAThreadingAttribute',
-             'CLRUnmanagedCodeCheck': 'true',
-             'CreateHotPatchableImage': 'X86Image',
-             'DataExecutionPrevention': 'false',
-             'DelayLoadDLLs': 'file1;file2',
-             'DelaySign': 'true',
-             'Driver': 'NotSet',
-             'EmbedManagedResourceFile': 'file1;file2',
-             'EnableCOMDATFolding': 'false',
-             'EnableUAC': 'true',
-             'EntryPointSymbol': 'a string1',
-             'FixedBaseAddress': 'false',
-             'ForceFileOutput': 'Enabled',
-             'ForceSymbolReferences': 'file1;file2',
-             'FunctionOrder': 'a_file_name',
-             'GenerateDebugInformation': 'true',
-             'GenerateMapFile': 'true',
-             'HeapCommitSize': 'a string1',
-             'HeapReserveSize': 'a string1',
-             'IgnoreAllDefaultLibraries': 'true',
-             'IgnoreEmbeddedIDL': 'true',
-             'IgnoreSpecificDefaultLibraries': 'a_file_list',
-             'ImageHasSafeExceptionHandlers': 'true',
-             'ImportLibrary': 'a_file_name',
-             'KeyContainer': 'a_file_name',
-             'KeyFile': 'a_file_name',
-             'LargeAddressAware': 'false',
-             'LinkDLL': 'true',
-             'LinkErrorReporting': 'SendErrorReport',
-             'LinkStatus': 'true',
-             'LinkTimeCodeGeneration': 'UseLinkTimeCodeGeneration',
-             'ManifestFile': 'a_file_name',
-             'MapExports': 'true',
-             'MapFileName': 'a_file_name',
-             'MergedIDLBaseFileName': 'a_file_name',
-             'MergeSections': 'a string1',
-             'MidlCommandFile': 'a_file_name',
-             'MinimumRequiredVersion': 'a string1',
-             'ModuleDefinitionFile': 'a_file_name',
-             'MSDOSStubFileName': 'a_file_name',
-             'NoEntryPoint': 'true',
-             'OptimizeReferences': 'false',
-             'OutputFile': 'a_file_name',
-             'PerUserRedirection': 'true',
-             'PreventDllBinding': 'true',
-             'Profile': 'true',
-             'ProfileGuidedDatabase': 'a_file_name',
-             'ProgramDatabaseFile': 'a_file_name',
-             'RandomizedBaseAddress': 'false',
-             'RegisterOutput': 'true',
-             'SectionAlignment': '33',
-             'SetChecksum': 'true',
-             'ShowProgress': 'LinkVerboseREF',
-             'SpecifySectionAttributes': 'a string1',
-             'StackCommitSize': 'a string1',
-             'StackReserveSize': 'a string1',
-             'StripPrivateSymbols': 'a_file_name',
-             'SubSystem': 'Console',
-             'SupportNobindOfDelayLoadedDLL': 'true',
-             'SupportUnloadOfDelayLoadedDLL': 'true',
-             'SuppressStartupBanner': 'true',
-             'SwapRunFromCD': 'true',
-             'SwapRunFromNET': 'true',
-             'TargetMachine': 'MachineX86',
-             'TerminalServerAware': 'false',
-             'TrackerLogDirectory': 'a_folder',
-             'TreatLinkerWarningAsErrors': 'true',
-             'TurnOffAssemblyGeneration': 'true',
-             'TypeLibraryFile': 'a_file_name',
-             'TypeLibraryResourceID': '33',
-             'UACExecutionLevel': 'AsInvoker',
-             'UACUIAccess': 'true',
-             'Version': 'a string1'},
-         'ResourceCompile': {
-             'AdditionalIncludeDirectories': 'folder1;folder2',
-             'AdditionalOptions': 'a string1',
-             'Culture': '0x236',
-             'IgnoreStandardIncludePath': 'true',
-             'NullTerminateStrings': 'true',
-             'PreprocessorDefinitions': 'string1;string2',
-             'ResourceOutputFileName': 'a string1',
-             'ShowProgress': 'true',
-             'SuppressStartupBanner': 'true',
-             'TrackerLogDirectory': 'a_folder',
-             'UndefinePreprocessorDefinitions': 'string1;string2'},
-         'Midl': {
-             'AdditionalIncludeDirectories': 'folder1;folder2',
-             'AdditionalOptions': 'a string1',
-             'ApplicationConfigurationMode': 'true',
-             'ClientStubFile': 'a_file_name',
-             'CPreprocessOptions': 'a string1',
-             'DefaultCharType': 'Signed',
-             'DllDataFileName': 'a_file_name',
-             'EnableErrorChecks': 'EnableCustom',
-             'ErrorCheckAllocations': 'true',
-             'ErrorCheckBounds': 'true',
-             'ErrorCheckEnumRange': 'true',
-             'ErrorCheckRefPointers': 'true',
-             'ErrorCheckStubData': 'true',
-             'GenerateClientFiles': 'Stub',
-             'GenerateServerFiles': 'None',
-             'GenerateStublessProxies': 'true',
-             'GenerateTypeLibrary': 'true',
-             'HeaderFileName': 'a_file_name',
-             'IgnoreStandardIncludePath': 'true',
-             'InterfaceIdentifierFileName': 'a_file_name',
-             'LocaleID': '33',
-             'MkTypLibCompatible': 'true',
-             'OutputDirectory': 'a string1',
-             'PreprocessorDefinitions': 'string1;string2',
-             'ProxyFileName': 'a_file_name',
-             'RedirectOutputAndErrors': 'a_file_name',
-             'ServerStubFile': 'a_file_name',
-             'StructMemberAlignment': 'NotSet',
-             'SuppressCompilerWarnings': 'true',
-             'SuppressStartupBanner': 'true',
-             'TargetEnvironment': 'Itanium',
-             'TrackerLogDirectory': 'a_folder',
-             'TypeLibFormat': 'NewFormat',
-             'TypeLibraryName': 'a_file_name',
-             'UndefinePreprocessorDefinitions': 'string1;string2',
-             'ValidateAllParameters': 'true',
-             'WarnAsError': 'true',
-             'WarningLevel': '1'},
-         'Lib': {
-             'AdditionalDependencies': 'file1;file2',
-             'AdditionalLibraryDirectories': 'folder1;folder2',
-             'AdditionalOptions': 'a string1',
-             'DisplayLibrary': 'a string1',
-             'ErrorReporting': 'PromptImmediately',
-             'ExportNamedFunctions': 'string1;string2',
-             'ForceSymbolReferences': 'a string1',
-             'IgnoreAllDefaultLibraries': 'true',
-             'IgnoreSpecificDefaultLibraries': 'file1;file2',
-             'LinkTimeCodeGeneration': 'true',
-             'MinimumRequiredVersion': 'a string1',
-             'ModuleDefinitionFile': 'a_file_name',
-             'Name': 'a_file_name',
-             'OutputFile': 'a_file_name',
-             'RemoveObjects': 'file1;file2',
-             'SubSystem': 'Console',
-             'SuppressStartupBanner': 'true',
-             'TargetMachine': 'MachineX86i',
-             'TrackerLogDirectory': 'a_folder',
-             'TreatLibWarningAsErrors': 'true',
-             'UseUnicodeResponseFiles': 'true',
-             'Verbose': 'true'},
-         'Manifest': {
-             'AdditionalManifestFiles': 'file1;file2',
-             'AdditionalOptions': 'a string1',
-             'AssemblyIdentity': 'a string1',
-             'ComponentFileName': 'a_file_name',
-             'EnableDPIAwareness': 'fal',
-             'GenerateCatalogFiles': 'truel',
-             'GenerateCategoryTags': 'true',
-             'InputResourceManifests': 'a string1',
-             'ManifestFromManagedAssembly': 'a_file_name',
-             'notgood3': 'bogus',
-             'OutputManifestFile': 'a_file_name',
-             'OutputResourceManifests': 'a string1',
-             'RegistrarScriptFile': 'a_file_name',
-             'ReplacementsFile': 'a_file_name',
-             'SuppressDependencyElement': 'true',
-             'SuppressStartupBanner': 'true',
-             'TrackerLogDirectory': 'a_folder',
-             'TypeLibraryFile': 'a_file_name',
-             'UpdateFileHashes': 'true',
-             'UpdateFileHashesSearchPath': 'a_file_name',
-             'VerboseOutput': 'true'},
-         'ProjectReference': {
-             'LinkLibraryDependencies': 'true',
-             'UseLibraryDependencyInputs': 'true'},
-         'ManifestResourceCompile': {
-             'ResourceOutputFileName': 'a_file_name'},
-         '': {
-             'EmbedManifest': 'true',
-             'GenerateManifest': 'true',
-             'IgnoreImportLibrary': 'true',
-             'LinkIncremental': 'false'}},
-        self.stderr)
-    self._ExpectedWarnings([
-        'Warning: unrecognized setting ClCompile/Enableprefast',
-        'Warning: unrecognized setting ClCompile/ZZXYZ',
-        'Warning: unrecognized setting Manifest/notgood3',
-        'Warning: for Manifest/GenerateCatalogFiles, '
-        "expected bool; got 'truel'",
-        'Warning: for Lib/TargetMachine, unrecognized enumerated value '
-        'MachineX86i',
-        "Warning: for Manifest/EnableDPIAwareness, expected bool; got 'fal'"])
-
-  def testConvertToMSBuildSettings_empty(self):
-    """Tests an empty conversion."""
-    msvs_settings = {}
-    expected_msbuild_settings = {}
-    actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings(
-        msvs_settings,
-        self.stderr)
-    self.assertEqual(expected_msbuild_settings, actual_msbuild_settings)
-    self._ExpectedWarnings([])
-
-  def testConvertToMSBuildSettings_minimal(self):
-    """Tests a minimal conversion."""
-    msvs_settings = {
-        'VCCLCompilerTool': {
-            'AdditionalIncludeDirectories': 'dir1',
-            'AdditionalOptions': '/foo',
-            'BasicRuntimeChecks': '0',
-            },
-        'VCLinkerTool': {
-            'LinkTimeCodeGeneration': '1',
-            'ErrorReporting': '1',
-            'DataExecutionPrevention': '2',
-            },
-        }
-    expected_msbuild_settings = {
-        'ClCompile': {
-            'AdditionalIncludeDirectories': 'dir1',
-            'AdditionalOptions': '/foo',
-            'BasicRuntimeChecks': 'Default',
-            },
-        'Link': {
-            'LinkTimeCodeGeneration': 'UseLinkTimeCodeGeneration',
-            'LinkErrorReporting': 'PromptImmediately',
-            'DataExecutionPrevention': 'true',
-            },
-        }
-    actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings(
-        msvs_settings,
-        self.stderr)
-    self.assertEqual(expected_msbuild_settings, actual_msbuild_settings)
-    self._ExpectedWarnings([])
-
-  def testConvertToMSBuildSettings_warnings(self):
-    """Tests conversion that generates warnings."""
-    msvs_settings = {
-        'VCCLCompilerTool': {
-            'AdditionalIncludeDirectories': '1',
-            'AdditionalOptions': '2',
-            # These are incorrect values:
-            'BasicRuntimeChecks': '12',
-            'BrowseInformation': '21',
-            'UsePrecompiledHeader': '13',
-            'GeneratePreprocessedFile': '14'},
-        'VCLinkerTool': {
-            # These are incorrect values:
-            'Driver': '10',
-            'LinkTimeCodeGeneration': '31',
-            'ErrorReporting': '21',
-            'FixedBaseAddress': '6'},
-        'VCResourceCompilerTool': {
-            # Custom
-            'Culture': '1003'}}
-    expected_msbuild_settings = {
-        'ClCompile': {
-            'AdditionalIncludeDirectories': '1',
-            'AdditionalOptions': '2'},
-        'Link': {},
-        'ResourceCompile': {
-            # Custom
-            'Culture': '0x03eb'}}
-    actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings(
-        msvs_settings,
-        self.stderr)
-    self.assertEqual(expected_msbuild_settings, actual_msbuild_settings)
-    self._ExpectedWarnings([
-        'Warning: while converting VCCLCompilerTool/BasicRuntimeChecks to '
-        'MSBuild, index value (12) not in expected range [0, 4)',
-        'Warning: while converting VCCLCompilerTool/BrowseInformation to '
-        'MSBuild, index value (21) not in expected range [0, 3)',
-        'Warning: while converting VCCLCompilerTool/UsePrecompiledHeader to '
-        'MSBuild, index value (13) not in expected range [0, 3)',
-        'Warning: while converting VCCLCompilerTool/GeneratePreprocessedFile to '
-        'MSBuild, value must be one of [0, 1, 2]; got 14',
-
-        'Warning: while converting VCLinkerTool/Driver to '
-        'MSBuild, index value (10) not in expected range [0, 4)',
-        'Warning: while converting VCLinkerTool/LinkTimeCodeGeneration to '
-        'MSBuild, index value (31) not in expected range [0, 5)',
-        'Warning: while converting VCLinkerTool/ErrorReporting to '
-        'MSBuild, index value (21) not in expected range [0, 3)',
-        'Warning: while converting VCLinkerTool/FixedBaseAddress to '
-        'MSBuild, index value (6) not in expected range [0, 3)',
-        ])
-
-  def testConvertToMSBuildSettings_full_synthetic(self):
-    """Tests conversion of all the MSBuild settings."""
-    msvs_settings = {
-        'VCCLCompilerTool': {
-            'AdditionalIncludeDirectories': 'folder1;folder2;folder3',
-            'AdditionalOptions': 'a_string',
-            'AdditionalUsingDirectories': 'folder1;folder2;folder3',
-            'AssemblerListingLocation': 'a_file_name',
-            'AssemblerOutput': '0',
-            'BasicRuntimeChecks': '1',
-            'BrowseInformation': '2',
-            'BrowseInformationFile': 'a_file_name',
-            'BufferSecurityCheck': 'true',
-            'CallingConvention': '0',
-            'CompileAs': '1',
-            'DebugInformationFormat': '4',
-            'DefaultCharIsUnsigned': 'true',
-            'Detect64BitPortabilityProblems': 'true',
-            'DisableLanguageExtensions': 'true',
-            'DisableSpecificWarnings': 'd1;d2;d3',
-            'EnableEnhancedInstructionSet': '0',
-            'EnableFiberSafeOptimizations': 'true',
-            'EnableFunctionLevelLinking': 'true',
-            'EnableIntrinsicFunctions': 'true',
-            'EnablePREfast': 'true',
-            'ErrorReporting': '1',
-            'ExceptionHandling': '2',
-            'ExpandAttributedSource': 'true',
-            'FavorSizeOrSpeed': '0',
-            'FloatingPointExceptions': 'true',
-            'FloatingPointModel': '1',
-            'ForceConformanceInForLoopScope': 'true',
-            'ForcedIncludeFiles': 'file1;file2;file3',
-            'ForcedUsingFiles': 'file1;file2;file3',
-            'GeneratePreprocessedFile': '1',
-            'GenerateXMLDocumentationFiles': 'true',
-            'IgnoreStandardIncludePath': 'true',
-            'InlineFunctionExpansion': '2',
-            'KeepComments': 'true',
-            'MinimalRebuild': 'true',
-            'ObjectFile': 'a_file_name',
-            'OmitDefaultLibName': 'true',
-            'OmitFramePointers': 'true',
-            'OpenMP': 'true',
-            'Optimization': '3',
-            'PrecompiledHeaderFile': 'a_file_name',
-            'PrecompiledHeaderThrough': 'a_file_name',
-            'PreprocessorDefinitions': 'd1;d2;d3',
-            'ProgramDataBaseFileName': 'a_file_name',
-            'RuntimeLibrary': '0',
-            'RuntimeTypeInfo': 'true',
-            'ShowIncludes': 'true',
-            'SmallerTypeCheck': 'true',
-            'StringPooling': 'true',
-            'StructMemberAlignment': '1',
-            'SuppressStartupBanner': 'true',
-            'TreatWChar_tAsBuiltInType': 'true',
-            'UndefineAllPreprocessorDefinitions': 'true',
-            'UndefinePreprocessorDefinitions': 'd1;d2;d3',
-            'UseFullPaths': 'true',
-            'UsePrecompiledHeader': '1',
-            'UseUnicodeResponseFiles': 'true',
-            'WarnAsError': 'true',
-            'WarningLevel': '2',
-            'WholeProgramOptimization': 'true',
-            'XMLDocumentationFileName': 'a_file_name'},
-        'VCLinkerTool': {
-            'AdditionalDependencies': 'file1;file2;file3',
-            'AdditionalLibraryDirectories': 'folder1;folder2;folder3',
-            'AdditionalLibraryDirectories_excluded': 'folder1;folder2;folder3',
-            'AdditionalManifestDependencies': 'file1;file2;file3',
-            'AdditionalOptions': 'a_string',
-            'AddModuleNamesToAssembly': 'file1;file2;file3',
-            'AllowIsolation': 'true',
-            'AssemblyDebug': '0',
-            'AssemblyLinkResource': 'file1;file2;file3',
-            'BaseAddress': 'a_string',
-            'CLRImageType': '1',
-            'CLRThreadAttribute': '2',
-            'CLRUnmanagedCodeCheck': 'true',
-            'DataExecutionPrevention': '0',
-            'DelayLoadDLLs': 'file1;file2;file3',
-            'DelaySign': 'true',
-            'Driver': '1',
-            'EmbedManagedResourceFile': 'file1;file2;file3',
-            'EnableCOMDATFolding': '0',
-            'EnableUAC': 'true',
-            'EntryPointSymbol': 'a_string',
-            'ErrorReporting': '0',
-            'FixedBaseAddress': '1',
-            'ForceSymbolReferences': 'file1;file2;file3',
-            'FunctionOrder': 'a_file_name',
-            'GenerateDebugInformation': 'true',
-            'GenerateManifest': 'true',
-            'GenerateMapFile': 'true',
-            'HeapCommitSize': 'a_string',
-            'HeapReserveSize': 'a_string',
-            'IgnoreAllDefaultLibraries': 'true',
-            'IgnoreDefaultLibraryNames': 'file1;file2;file3',
-            'IgnoreEmbeddedIDL': 'true',
-            'IgnoreImportLibrary': 'true',
-            'ImportLibrary': 'a_file_name',
-            'KeyContainer': 'a_file_name',
-            'KeyFile': 'a_file_name',
-            'LargeAddressAware': '2',
-            'LinkIncremental': '1',
-            'LinkLibraryDependencies': 'true',
-            'LinkTimeCodeGeneration': '2',
-            'ManifestFile': 'a_file_name',
-            'MapExports': 'true',
-            'MapFileName': 'a_file_name',
-            'MergedIDLBaseFileName': 'a_file_name',
-            'MergeSections': 'a_string',
-            'MidlCommandFile': 'a_file_name',
-            'ModuleDefinitionFile': 'a_file_name',
-            'OptimizeForWindows98': '1',
-            'OptimizeReferences': '0',
-            'OutputFile': 'a_file_name',
-            'PerUserRedirection': 'true',
-            'Profile': 'true',
-            'ProfileGuidedDatabase': 'a_file_name',
-            'ProgramDatabaseFile': 'a_file_name',
-            'RandomizedBaseAddress': '1',
-            'RegisterOutput': 'true',
-            'ResourceOnlyDLL': 'true',
-            'SetChecksum': 'true',
-            'ShowProgress': '0',
-            'StackCommitSize': 'a_string',
-            'StackReserveSize': 'a_string',
-            'StripPrivateSymbols': 'a_file_name',
-            'SubSystem': '2',
-            'SupportUnloadOfDelayLoadedDLL': 'true',
-            'SuppressStartupBanner': 'true',
-            'SwapRunFromCD': 'true',
-            'SwapRunFromNet': 'true',
-            'TargetMachine': '3',
-            'TerminalServerAware': '2',
-            'TurnOffAssemblyGeneration': 'true',
-            'TypeLibraryFile': 'a_file_name',
-            'TypeLibraryResourceID': '33',
-            'UACExecutionLevel': '1',
-            'UACUIAccess': 'true',
-            'UseLibraryDependencyInputs': 'false',
-            'UseUnicodeResponseFiles': 'true',
-            'Version': 'a_string'},
-        'VCResourceCompilerTool': {
-            'AdditionalIncludeDirectories': 'folder1;folder2;folder3',
-            'AdditionalOptions': 'a_string',
-            'Culture': '1003',
-            'IgnoreStandardIncludePath': 'true',
-            'PreprocessorDefinitions': 'd1;d2;d3',
-            'ResourceOutputFileName': 'a_string',
-            'ShowProgress': 'true',
-            'SuppressStartupBanner': 'true',
-            'UndefinePreprocessorDefinitions': 'd1;d2;d3'},
-        'VCMIDLTool': {
-            'AdditionalIncludeDirectories': 'folder1;folder2;folder3',
-            'AdditionalOptions': 'a_string',
-            'CPreprocessOptions': 'a_string',
-            'DefaultCharType': '0',
-            'DLLDataFileName': 'a_file_name',
-            'EnableErrorChecks': '2',
-            'ErrorCheckAllocations': 'true',
-            'ErrorCheckBounds': 'true',
-            'ErrorCheckEnumRange': 'true',
-            'ErrorCheckRefPointers': 'true',
-            'ErrorCheckStubData': 'true',
-            'GenerateStublessProxies': 'true',
-            'GenerateTypeLibrary': 'true',
-            'HeaderFileName': 'a_file_name',
-            'IgnoreStandardIncludePath': 'true',
-            'InterfaceIdentifierFileName': 'a_file_name',
-            'MkTypLibCompatible': 'true',
-            'OutputDirectory': 'a_string',
-            'PreprocessorDefinitions': 'd1;d2;d3',
-            'ProxyFileName': 'a_file_name',
-            'RedirectOutputAndErrors': 'a_file_name',
-            'StructMemberAlignment': '3',
-            'SuppressStartupBanner': 'true',
-            'TargetEnvironment': '1',
-            'TypeLibraryName': 'a_file_name',
-            'UndefinePreprocessorDefinitions': 'd1;d2;d3',
-            'ValidateParameters': 'true',
-            'WarnAsError': 'true',
-            'WarningLevel': '4'},
-        'VCLibrarianTool': {
-            'AdditionalDependencies': 'file1;file2;file3',
-            'AdditionalLibraryDirectories': 'folder1;folder2;folder3',
-            'AdditionalLibraryDirectories_excluded': 'folder1;folder2;folder3',
-            'AdditionalOptions': 'a_string',
-            'ExportNamedFunctions': 'd1;d2;d3',
-            'ForceSymbolReferences': 'a_string',
-            'IgnoreAllDefaultLibraries': 'true',
-            'IgnoreSpecificDefaultLibraries': 'file1;file2;file3',
-            'LinkLibraryDependencies': 'true',
-            'ModuleDefinitionFile': 'a_file_name',
-            'OutputFile': 'a_file_name',
-            'SuppressStartupBanner': 'true',
-            'UseUnicodeResponseFiles': 'true'},
-        'VCManifestTool': {
-            'AdditionalManifestFiles': 'file1;file2;file3',
-            'AdditionalOptions': 'a_string',
-            'AssemblyIdentity': 'a_string',
-            'ComponentFileName': 'a_file_name',
-            'DependencyInformationFile': 'a_file_name',
-            'EmbedManifest': 'true',
-            'GenerateCatalogFiles': 'true',
-            'InputResourceManifests': 'a_string',
-            'ManifestResourceFile': 'my_name',
-            'OutputManifestFile': 'a_file_name',
-            'RegistrarScriptFile': 'a_file_name',
-            'ReplacementsFile': 'a_file_name',
-            'SuppressStartupBanner': 'true',
-            'TypeLibraryFile': 'a_file_name',
-            'UpdateFileHashes': 'true',
-            'UpdateFileHashesSearchPath': 'a_file_name',
-            'UseFAT32Workaround': 'true',
-            'UseUnicodeResponseFiles': 'true',
-            'VerboseOutput': 'true'}}
-    expected_msbuild_settings = {
-        'ClCompile': {
-            'AdditionalIncludeDirectories': 'folder1;folder2;folder3',
-            'AdditionalOptions': 'a_string /J',
-            'AdditionalUsingDirectories': 'folder1;folder2;folder3',
-            'AssemblerListingLocation': 'a_file_name',
-            'AssemblerOutput': 'NoListing',
-            'BasicRuntimeChecks': 'StackFrameRuntimeCheck',
-            'BrowseInformation': 'true',
-            'BrowseInformationFile': 'a_file_name',
-            'BufferSecurityCheck': 'true',
-            'CallingConvention': 'Cdecl',
-            'CompileAs': 'CompileAsC',
-            'DebugInformationFormat': 'EditAndContinue',
-            'DisableLanguageExtensions': 'true',
-            'DisableSpecificWarnings': 'd1;d2;d3',
-            'EnableEnhancedInstructionSet': 'NotSet',
-            'EnableFiberSafeOptimizations': 'true',
-            'EnablePREfast': 'true',
-            'ErrorReporting': 'Prompt',
-            'ExceptionHandling': 'Async',
-            'ExpandAttributedSource': 'true',
-            'FavorSizeOrSpeed': 'Neither',
-            'FloatingPointExceptions': 'true',
-            'FloatingPointModel': 'Strict',
-            'ForceConformanceInForLoopScope': 'true',
-            'ForcedIncludeFiles': 'file1;file2;file3',
-            'ForcedUsingFiles': 'file1;file2;file3',
-            'FunctionLevelLinking': 'true',
-            'GenerateXMLDocumentationFiles': 'true',
-            'IgnoreStandardIncludePath': 'true',
-            'InlineFunctionExpansion': 'AnySuitable',
-            'IntrinsicFunctions': 'true',
-            'MinimalRebuild': 'true',
-            'ObjectFileName': 'a_file_name',
-            'OmitDefaultLibName': 'true',
-            'OmitFramePointers': 'true',
-            'OpenMPSupport': 'true',
-            'Optimization': 'Full',
-            'PrecompiledHeader': 'Create',
-            'PrecompiledHeaderFile': 'a_file_name',
-            'PrecompiledHeaderOutputFile': 'a_file_name',
-            'PreprocessKeepComments': 'true',
-            'PreprocessorDefinitions': 'd1;d2;d3',
-            'PreprocessSuppressLineNumbers': 'false',
-            'PreprocessToFile': 'true',
-            'ProgramDataBaseFileName': 'a_file_name',
-            'RuntimeLibrary': 'MultiThreaded',
-            'RuntimeTypeInfo': 'true',
-            'ShowIncludes': 'true',
-            'SmallerTypeCheck': 'true',
-            'StringPooling': 'true',
-            'StructMemberAlignment': '1Byte',
-            'SuppressStartupBanner': 'true',
-            'TreatWarningAsError': 'true',
-            'TreatWChar_tAsBuiltInType': 'true',
-            'UndefineAllPreprocessorDefinitions': 'true',
-            'UndefinePreprocessorDefinitions': 'd1;d2;d3',
-            'UseFullPaths': 'true',
-            'WarningLevel': 'Level2',
-            'WholeProgramOptimization': 'true',
-            'XMLDocumentationFileName': 'a_file_name'},
-        'Link': {
-            'AdditionalDependencies': 'file1;file2;file3',
-            'AdditionalLibraryDirectories': 'folder1;folder2;folder3',
-            'AdditionalManifestDependencies': 'file1;file2;file3',
-            'AdditionalOptions': 'a_string',
-            'AddModuleNamesToAssembly': 'file1;file2;file3',
-            'AllowIsolation': 'true',
-            'AssemblyDebug': '',
-            'AssemblyLinkResource': 'file1;file2;file3',
-            'BaseAddress': 'a_string',
-            'CLRImageType': 'ForceIJWImage',
-            'CLRThreadAttribute': 'STAThreadingAttribute',
-            'CLRUnmanagedCodeCheck': 'true',
-            'DataExecutionPrevention': '',
-            'DelayLoadDLLs': 'file1;file2;file3',
-            'DelaySign': 'true',
-            'Driver': 'Driver',
-            'EmbedManagedResourceFile': 'file1;file2;file3',
-            'EnableCOMDATFolding': '',
-            'EnableUAC': 'true',
-            'EntryPointSymbol': 'a_string',
-            'FixedBaseAddress': 'false',
-            'ForceSymbolReferences': 'file1;file2;file3',
-            'FunctionOrder': 'a_file_name',
-            'GenerateDebugInformation': 'true',
-            'GenerateMapFile': 'true',
-            'HeapCommitSize': 'a_string',
-            'HeapReserveSize': 'a_string',
-            'IgnoreAllDefaultLibraries': 'true',
-            'IgnoreEmbeddedIDL': 'true',
-            'IgnoreSpecificDefaultLibraries': 'file1;file2;file3',
-            'ImportLibrary': 'a_file_name',
-            'KeyContainer': 'a_file_name',
-            'KeyFile': 'a_file_name',
-            'LargeAddressAware': 'true',
-            'LinkErrorReporting': 'NoErrorReport',
-            'LinkTimeCodeGeneration': 'PGInstrument',
-            'ManifestFile': 'a_file_name',
-            'MapExports': 'true',
-            'MapFileName': 'a_file_name',
-            'MergedIDLBaseFileName': 'a_file_name',
-            'MergeSections': 'a_string',
-            'MidlCommandFile': 'a_file_name',
-            'ModuleDefinitionFile': 'a_file_name',
-            'NoEntryPoint': 'true',
-            'OptimizeReferences': '',
-            'OutputFile': 'a_file_name',
-            'PerUserRedirection': 'true',
-            'Profile': 'true',
-            'ProfileGuidedDatabase': 'a_file_name',
-            'ProgramDatabaseFile': 'a_file_name',
-            'RandomizedBaseAddress': 'false',
-            'RegisterOutput': 'true',
-            'SetChecksum': 'true',
-            'ShowProgress': 'NotSet',
-            'StackCommitSize': 'a_string',
-            'StackReserveSize': 'a_string',
-            'StripPrivateSymbols': 'a_file_name',
-            'SubSystem': 'Windows',
-            'SupportUnloadOfDelayLoadedDLL': 'true',
-            'SuppressStartupBanner': 'true',
-            'SwapRunFromCD': 'true',
-            'SwapRunFromNET': 'true',
-            'TargetMachine': 'MachineARM',
-            'TerminalServerAware': 'true',
-            'TurnOffAssemblyGeneration': 'true',
-            'TypeLibraryFile': 'a_file_name',
-            'TypeLibraryResourceID': '33',
-            'UACExecutionLevel': 'HighestAvailable',
-            'UACUIAccess': 'true',
-            'Version': 'a_string'},
-        'ResourceCompile': {
-            'AdditionalIncludeDirectories': 'folder1;folder2;folder3',
-            'AdditionalOptions': 'a_string',
-            'Culture': '0x03eb',
-            'IgnoreStandardIncludePath': 'true',
-            'PreprocessorDefinitions': 'd1;d2;d3',
-            'ResourceOutputFileName': 'a_string',
-            'ShowProgress': 'true',
-            'SuppressStartupBanner': 'true',
-            'UndefinePreprocessorDefinitions': 'd1;d2;d3'},
-        'Midl': {
-            'AdditionalIncludeDirectories': 'folder1;folder2;folder3',
-            'AdditionalOptions': 'a_string',
-            'CPreprocessOptions': 'a_string',
-            'DefaultCharType': 'Unsigned',
-            'DllDataFileName': 'a_file_name',
-            'EnableErrorChecks': 'All',
-            'ErrorCheckAllocations': 'true',
-            'ErrorCheckBounds': 'true',
-            'ErrorCheckEnumRange': 'true',
-            'ErrorCheckRefPointers': 'true',
-            'ErrorCheckStubData': 'true',
-            'GenerateStublessProxies': 'true',
-            'GenerateTypeLibrary': 'true',
-            'HeaderFileName': 'a_file_name',
-            'IgnoreStandardIncludePath': 'true',
-            'InterfaceIdentifierFileName': 'a_file_name',
-            'MkTypLibCompatible': 'true',
-            'OutputDirectory': 'a_string',
-            'PreprocessorDefinitions': 'd1;d2;d3',
-            'ProxyFileName': 'a_file_name',
-            'RedirectOutputAndErrors': 'a_file_name',
-            'StructMemberAlignment': '4',
-            'SuppressStartupBanner': 'true',
-            'TargetEnvironment': 'Win32',
-            'TypeLibraryName': 'a_file_name',
-            'UndefinePreprocessorDefinitions': 'd1;d2;d3',
-            'ValidateAllParameters': 'true',
-            'WarnAsError': 'true',
-            'WarningLevel': '4'},
-        'Lib': {
-            'AdditionalDependencies': 'file1;file2;file3',
-            'AdditionalLibraryDirectories': 'folder1;folder2;folder3',
-            'AdditionalOptions': 'a_string',
-            'ExportNamedFunctions': 'd1;d2;d3',
-            'ForceSymbolReferences': 'a_string',
-            'IgnoreAllDefaultLibraries': 'true',
-            'IgnoreSpecificDefaultLibraries': 'file1;file2;file3',
-            'ModuleDefinitionFile': 'a_file_name',
-            'OutputFile': 'a_file_name',
-            'SuppressStartupBanner': 'true',
-            'UseUnicodeResponseFiles': 'true'},
-        'Manifest': {
-            'AdditionalManifestFiles': 'file1;file2;file3',
-            'AdditionalOptions': 'a_string',
-            'AssemblyIdentity': 'a_string',
-            'ComponentFileName': 'a_file_name',
-            'GenerateCatalogFiles': 'true',
-            'InputResourceManifests': 'a_string',
-            'OutputManifestFile': 'a_file_name',
-            'RegistrarScriptFile': 'a_file_name',
-            'ReplacementsFile': 'a_file_name',
-            'SuppressStartupBanner': 'true',
-            'TypeLibraryFile': 'a_file_name',
-            'UpdateFileHashes': 'true',
-            'UpdateFileHashesSearchPath': 'a_file_name',
-            'VerboseOutput': 'true'},
-        'ManifestResourceCompile': {
-            'ResourceOutputFileName': 'my_name'},
-        'ProjectReference': {
-            'LinkLibraryDependencies': 'true',
-            'UseLibraryDependencyInputs': 'false'},
-        '': {
-            'EmbedManifest': 'true',
-            'GenerateManifest': 'true',
-            'IgnoreImportLibrary': 'true',
-            'LinkIncremental': 'false'}}
-    actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings(
-        msvs_settings,
-        self.stderr)
-    self.assertEqual(expected_msbuild_settings, actual_msbuild_settings)
-    self._ExpectedWarnings([])
-
-  def testConvertToMSBuildSettings_actual(self):
-    """Tests the conversion of an actual project.
-
-    A VS2008 project with most of the options defined was created through the
-    VS2008 IDE.  It was then converted to VS2010.  The tool settings found in
-    the .vcproj and .vcxproj files were converted to the two dictionaries
-    msvs_settings and expected_msbuild_settings.
-
-    Note that for many settings, the VS2010 converter adds macros like
-    %(AdditionalIncludeDirectories) to make sure than inherited values are
-    included.  Since the Gyp projects we generate do not use inheritance,
-    we removed these macros.  They were:
-        ClCompile:
-            AdditionalIncludeDirectories:  ';%(AdditionalIncludeDirectories)'
-            AdditionalOptions:  ' %(AdditionalOptions)'
-            AdditionalUsingDirectories:  ';%(AdditionalUsingDirectories)'
-            DisableSpecificWarnings: ';%(DisableSpecificWarnings)',
-            ForcedIncludeFiles:  ';%(ForcedIncludeFiles)',
-            ForcedUsingFiles:  ';%(ForcedUsingFiles)',
-            PreprocessorDefinitions:  ';%(PreprocessorDefinitions)',
-            UndefinePreprocessorDefinitions:
-                ';%(UndefinePreprocessorDefinitions)',
-        Link:
-            AdditionalDependencies:  ';%(AdditionalDependencies)',
-            AdditionalLibraryDirectories:  ';%(AdditionalLibraryDirectories)',
-            AdditionalManifestDependencies:
-                ';%(AdditionalManifestDependencies)',
-            AdditionalOptions:  ' %(AdditionalOptions)',
-            AddModuleNamesToAssembly:  ';%(AddModuleNamesToAssembly)',
-            AssemblyLinkResource:  ';%(AssemblyLinkResource)',
-            DelayLoadDLLs:  ';%(DelayLoadDLLs)',
-            EmbedManagedResourceFile:  ';%(EmbedManagedResourceFile)',
-            ForceSymbolReferences:  ';%(ForceSymbolReferences)',
-            IgnoreSpecificDefaultLibraries:
-                ';%(IgnoreSpecificDefaultLibraries)',
-        ResourceCompile:
-            AdditionalIncludeDirectories:  ';%(AdditionalIncludeDirectories)',
-            AdditionalOptions:  ' %(AdditionalOptions)',
-            PreprocessorDefinitions:  ';%(PreprocessorDefinitions)',
-        Manifest:
-            AdditionalManifestFiles:  ';%(AdditionalManifestFiles)',
-            AdditionalOptions:  ' %(AdditionalOptions)',
-            InputResourceManifests:  ';%(InputResourceManifests)',
-    """
-    msvs_settings = {
-        'VCCLCompilerTool': {
-            'AdditionalIncludeDirectories': 'dir1',
-            'AdditionalOptions': '/more',
-            'AdditionalUsingDirectories': 'test',
-            'AssemblerListingLocation': '$(IntDir)\\a',
-            'AssemblerOutput': '1',
-            'BasicRuntimeChecks': '3',
-            'BrowseInformation': '1',
-            'BrowseInformationFile': '$(IntDir)\\e',
-            'BufferSecurityCheck': 'false',
-            'CallingConvention': '1',
-            'CompileAs': '1',
-            'DebugInformationFormat': '4',
-            'DefaultCharIsUnsigned': 'true',
-            'Detect64BitPortabilityProblems': 'true',
-            'DisableLanguageExtensions': 'true',
-            'DisableSpecificWarnings': 'abc',
-            'EnableEnhancedInstructionSet': '1',
-            'EnableFiberSafeOptimizations': 'true',
-            'EnableFunctionLevelLinking': 'true',
-            'EnableIntrinsicFunctions': 'true',
-            'EnablePREfast': 'true',
-            'ErrorReporting': '2',
-            'ExceptionHandling': '2',
-            'ExpandAttributedSource': 'true',
-            'FavorSizeOrSpeed': '2',
-            'FloatingPointExceptions': 'true',
-            'FloatingPointModel': '1',
-            'ForceConformanceInForLoopScope': 'false',
-            'ForcedIncludeFiles': 'def',
-            'ForcedUsingFiles': 'ge',
-            'GeneratePreprocessedFile': '2',
-            'GenerateXMLDocumentationFiles': 'true',
-            'IgnoreStandardIncludePath': 'true',
-            'InlineFunctionExpansion': '1',
-            'KeepComments': 'true',
-            'MinimalRebuild': 'true',
-            'ObjectFile': '$(IntDir)\\b',
-            'OmitDefaultLibName': 'true',
-            'OmitFramePointers': 'true',
-            'OpenMP': 'true',
-            'Optimization': '3',
-            'PrecompiledHeaderFile': '$(IntDir)\\$(TargetName).pche',
-            'PrecompiledHeaderThrough': 'StdAfx.hd',
-            'PreprocessorDefinitions': 'WIN32;_DEBUG;_CONSOLE',
-            'ProgramDataBaseFileName': '$(IntDir)\\vc90b.pdb',
-            'RuntimeLibrary': '3',
-            'RuntimeTypeInfo': 'false',
-            'ShowIncludes': 'true',
-            'SmallerTypeCheck': 'true',
-            'StringPooling': 'true',
-            'StructMemberAlignment': '3',
-            'SuppressStartupBanner': 'false',
-            'TreatWChar_tAsBuiltInType': 'false',
-            'UndefineAllPreprocessorDefinitions': 'true',
-            'UndefinePreprocessorDefinitions': 'wer',
-            'UseFullPaths': 'true',
-            'UsePrecompiledHeader': '0',
-            'UseUnicodeResponseFiles': 'false',
-            'WarnAsError': 'true',
-            'WarningLevel': '3',
-            'WholeProgramOptimization': 'true',
-            'XMLDocumentationFileName': '$(IntDir)\\c'},
-        'VCLinkerTool': {
-            'AdditionalDependencies': 'zx',
-            'AdditionalLibraryDirectories': 'asd',
-            'AdditionalManifestDependencies': 's2',
-            'AdditionalOptions': '/mor2',
-            'AddModuleNamesToAssembly': 'd1',
-            'AllowIsolation': 'false',
-            'AssemblyDebug': '1',
-            'AssemblyLinkResource': 'd5',
-            'BaseAddress': '23423',
-            'CLRImageType': '3',
-            'CLRThreadAttribute': '1',
-            'CLRUnmanagedCodeCheck': 'true',
-            'DataExecutionPrevention': '0',
-            'DelayLoadDLLs': 'd4',
-            'DelaySign': 'true',
-            'Driver': '2',
-            'EmbedManagedResourceFile': 'd2',
-            'EnableCOMDATFolding': '1',
-            'EnableUAC': 'false',
-            'EntryPointSymbol': 'f5',
-            'ErrorReporting': '2',
-            'FixedBaseAddress': '1',
-            'ForceSymbolReferences': 'd3',
-            'FunctionOrder': 'fssdfsd',
-            'GenerateDebugInformation': 'true',
-            'GenerateManifest': 'false',
-            'GenerateMapFile': 'true',
-            'HeapCommitSize': '13',
-            'HeapReserveSize': '12',
-            'IgnoreAllDefaultLibraries': 'true',
-            'IgnoreDefaultLibraryNames': 'flob;flok',
-            'IgnoreEmbeddedIDL': 'true',
-            'IgnoreImportLibrary': 'true',
-            'ImportLibrary': 'f4',
-            'KeyContainer': 'f7',
-            'KeyFile': 'f6',
-            'LargeAddressAware': '2',
-            'LinkIncremental': '0',
-            'LinkLibraryDependencies': 'false',
-            'LinkTimeCodeGeneration': '1',
-            'ManifestFile':
-            '$(IntDir)\\$(TargetFileName).2intermediate.manifest',
-            'MapExports': 'true',
-            'MapFileName': 'd5',
-            'MergedIDLBaseFileName': 'f2',
-            'MergeSections': 'f5',
-            'MidlCommandFile': 'f1',
-            'ModuleDefinitionFile': 'sdsd',
-            'OptimizeForWindows98': '2',
-            'OptimizeReferences': '2',
-            'OutputFile': '$(OutDir)\\$(ProjectName)2.exe',
-            'PerUserRedirection': 'true',
-            'Profile': 'true',
-            'ProfileGuidedDatabase': '$(TargetDir)$(TargetName).pgdd',
-            'ProgramDatabaseFile': 'Flob.pdb',
-            'RandomizedBaseAddress': '1',
-            'RegisterOutput': 'true',
-            'ResourceOnlyDLL': 'true',
-            'SetChecksum': 'false',
-            'ShowProgress': '1',
-            'StackCommitSize': '15',
-            'StackReserveSize': '14',
-            'StripPrivateSymbols': 'd3',
-            'SubSystem': '1',
-            'SupportUnloadOfDelayLoadedDLL': 'true',
-            'SuppressStartupBanner': 'false',
-            'SwapRunFromCD': 'true',
-            'SwapRunFromNet': 'true',
-            'TargetMachine': '1',
-            'TerminalServerAware': '1',
-            'TurnOffAssemblyGeneration': 'true',
-            'TypeLibraryFile': 'f3',
-            'TypeLibraryResourceID': '12',
-            'UACExecutionLevel': '2',
-            'UACUIAccess': 'true',
-            'UseLibraryDependencyInputs': 'true',
-            'UseUnicodeResponseFiles': 'false',
-            'Version': '333'},
-        'VCResourceCompilerTool': {
-            'AdditionalIncludeDirectories': 'f3',
-            'AdditionalOptions': '/more3',
-            'Culture': '3084',
-            'IgnoreStandardIncludePath': 'true',
-            'PreprocessorDefinitions': '_UNICODE;UNICODE2',
-            'ResourceOutputFileName': '$(IntDir)/$(InputName)3.res',
-            'ShowProgress': 'true'},
-        'VCManifestTool': {
-            'AdditionalManifestFiles': 'sfsdfsd',
-            'AdditionalOptions': 'afdsdafsd',
-            'AssemblyIdentity': 'sddfdsadfsa',
-            'ComponentFileName': 'fsdfds',
-            'DependencyInformationFile': '$(IntDir)\\mt.depdfd',
-            'EmbedManifest': 'false',
-            'GenerateCatalogFiles': 'true',
-            'InputResourceManifests': 'asfsfdafs',
-            'ManifestResourceFile':
-            '$(IntDir)\\$(TargetFileName).embed.manifest.resfdsf',
-            'OutputManifestFile': '$(TargetPath).manifestdfs',
-            'RegistrarScriptFile': 'sdfsfd',
-            'ReplacementsFile': 'sdffsd',
-            'SuppressStartupBanner': 'false',
-            'TypeLibraryFile': 'sfsd',
-            'UpdateFileHashes': 'true',
-            'UpdateFileHashesSearchPath': 'sfsd',
-            'UseFAT32Workaround': 'true',
-            'UseUnicodeResponseFiles': 'false',
-            'VerboseOutput': 'true'}}
-    expected_msbuild_settings = {
-        'ClCompile': {
-            'AdditionalIncludeDirectories': 'dir1',
-            'AdditionalOptions': '/more /J',
-            'AdditionalUsingDirectories': 'test',
-            'AssemblerListingLocation': '$(IntDir)a',
-            'AssemblerOutput': 'AssemblyCode',
-            'BasicRuntimeChecks': 'EnableFastChecks',
-            'BrowseInformation': 'true',
-            'BrowseInformationFile': '$(IntDir)e',
-            'BufferSecurityCheck': 'false',
-            'CallingConvention': 'FastCall',
-            'CompileAs': 'CompileAsC',
-            'DebugInformationFormat': 'EditAndContinue',
-            'DisableLanguageExtensions': 'true',
-            'DisableSpecificWarnings': 'abc',
-            'EnableEnhancedInstructionSet': 'StreamingSIMDExtensions',
-            'EnableFiberSafeOptimizations': 'true',
-            'EnablePREfast': 'true',
-            'ErrorReporting': 'Queue',
-            'ExceptionHandling': 'Async',
-            'ExpandAttributedSource': 'true',
-            'FavorSizeOrSpeed': 'Size',
-            'FloatingPointExceptions': 'true',
-            'FloatingPointModel': 'Strict',
-            'ForceConformanceInForLoopScope': 'false',
-            'ForcedIncludeFiles': 'def',
-            'ForcedUsingFiles': 'ge',
-            'FunctionLevelLinking': 'true',
-            'GenerateXMLDocumentationFiles': 'true',
-            'IgnoreStandardIncludePath': 'true',
-            'InlineFunctionExpansion': 'OnlyExplicitInline',
-            'IntrinsicFunctions': 'true',
-            'MinimalRebuild': 'true',
-            'ObjectFileName': '$(IntDir)b',
-            'OmitDefaultLibName': 'true',
-            'OmitFramePointers': 'true',
-            'OpenMPSupport': 'true',
-            'Optimization': 'Full',
-            'PrecompiledHeader': 'NotUsing',  # Actual conversion gives ''
-            'PrecompiledHeaderFile': 'StdAfx.hd',
-            'PrecompiledHeaderOutputFile': '$(IntDir)$(TargetName).pche',
-            'PreprocessKeepComments': 'true',
-            'PreprocessorDefinitions': 'WIN32;_DEBUG;_CONSOLE',
-            'PreprocessSuppressLineNumbers': 'true',
-            'PreprocessToFile': 'true',
-            'ProgramDataBaseFileName': '$(IntDir)vc90b.pdb',
-            'RuntimeLibrary': 'MultiThreadedDebugDLL',
-            'RuntimeTypeInfo': 'false',
-            'ShowIncludes': 'true',
-            'SmallerTypeCheck': 'true',
-            'StringPooling': 'true',
-            'StructMemberAlignment': '4Bytes',
-            'SuppressStartupBanner': 'false',
-            'TreatWarningAsError': 'true',
-            'TreatWChar_tAsBuiltInType': 'false',
-            'UndefineAllPreprocessorDefinitions': 'true',
-            'UndefinePreprocessorDefinitions': 'wer',
-            'UseFullPaths': 'true',
-            'WarningLevel': 'Level3',
-            'WholeProgramOptimization': 'true',
-            'XMLDocumentationFileName': '$(IntDir)c'},
-        'Link': {
-            'AdditionalDependencies': 'zx',
-            'AdditionalLibraryDirectories': 'asd',
-            'AdditionalManifestDependencies': 's2',
-            'AdditionalOptions': '/mor2',
-            'AddModuleNamesToAssembly': 'd1',
-            'AllowIsolation': 'false',
-            'AssemblyDebug': 'true',
-            'AssemblyLinkResource': 'd5',
-            'BaseAddress': '23423',
-            'CLRImageType': 'ForceSafeILImage',
-            'CLRThreadAttribute': 'MTAThreadingAttribute',
-            'CLRUnmanagedCodeCheck': 'true',
-            'DataExecutionPrevention': '',
-            'DelayLoadDLLs': 'd4',
-            'DelaySign': 'true',
-            'Driver': 'UpOnly',
-            'EmbedManagedResourceFile': 'd2',
-            'EnableCOMDATFolding': 'false',
-            'EnableUAC': 'false',
-            'EntryPointSymbol': 'f5',
-            'FixedBaseAddress': 'false',
-            'ForceSymbolReferences': 'd3',
-            'FunctionOrder': 'fssdfsd',
-            'GenerateDebugInformation': 'true',
-            'GenerateMapFile': 'true',
-            'HeapCommitSize': '13',
-            'HeapReserveSize': '12',
-            'IgnoreAllDefaultLibraries': 'true',
-            'IgnoreEmbeddedIDL': 'true',
-            'IgnoreSpecificDefaultLibraries': 'flob;flok',
-            'ImportLibrary': 'f4',
-            'KeyContainer': 'f7',
-            'KeyFile': 'f6',
-            'LargeAddressAware': 'true',
-            'LinkErrorReporting': 'QueueForNextLogin',
-            'LinkTimeCodeGeneration': 'UseLinkTimeCodeGeneration',
-            'ManifestFile': '$(IntDir)$(TargetFileName).2intermediate.manifest',
-            'MapExports': 'true',
-            'MapFileName': 'd5',
-            'MergedIDLBaseFileName': 'f2',
-            'MergeSections': 'f5',
-            'MidlCommandFile': 'f1',
-            'ModuleDefinitionFile': 'sdsd',
-            'NoEntryPoint': 'true',
-            'OptimizeReferences': 'true',
-            'OutputFile': '$(OutDir)$(ProjectName)2.exe',
-            'PerUserRedirection': 'true',
-            'Profile': 'true',
-            'ProfileGuidedDatabase': '$(TargetDir)$(TargetName).pgdd',
-            'ProgramDatabaseFile': 'Flob.pdb',
-            'RandomizedBaseAddress': 'false',
-            'RegisterOutput': 'true',
-            'SetChecksum': 'false',
-            'ShowProgress': 'LinkVerbose',
-            'StackCommitSize': '15',
-            'StackReserveSize': '14',
-            'StripPrivateSymbols': 'd3',
-            'SubSystem': 'Console',
-            'SupportUnloadOfDelayLoadedDLL': 'true',
-            'SuppressStartupBanner': 'false',
-            'SwapRunFromCD': 'true',
-            'SwapRunFromNET': 'true',
-            'TargetMachine': 'MachineX86',
-            'TerminalServerAware': 'false',
-            'TurnOffAssemblyGeneration': 'true',
-            'TypeLibraryFile': 'f3',
-            'TypeLibraryResourceID': '12',
-            'UACExecutionLevel': 'RequireAdministrator',
-            'UACUIAccess': 'true',
-            'Version': '333'},
-        'ResourceCompile': {
-            'AdditionalIncludeDirectories': 'f3',
-            'AdditionalOptions': '/more3',
-            'Culture': '0x0c0c',
-            'IgnoreStandardIncludePath': 'true',
-            'PreprocessorDefinitions': '_UNICODE;UNICODE2',
-            'ResourceOutputFileName': '$(IntDir)%(Filename)3.res',
-            'ShowProgress': 'true'},
-        'Manifest': {
-            'AdditionalManifestFiles': 'sfsdfsd',
-            'AdditionalOptions': 'afdsdafsd',
-            'AssemblyIdentity': 'sddfdsadfsa',
-            'ComponentFileName': 'fsdfds',
-            'GenerateCatalogFiles': 'true',
-            'InputResourceManifests': 'asfsfdafs',
-            'OutputManifestFile': '$(TargetPath).manifestdfs',
-            'RegistrarScriptFile': 'sdfsfd',
-            'ReplacementsFile': 'sdffsd',
-            'SuppressStartupBanner': 'false',
-            'TypeLibraryFile': 'sfsd',
-            'UpdateFileHashes': 'true',
-            'UpdateFileHashesSearchPath': 'sfsd',
-            'VerboseOutput': 'true'},
-        'ProjectReference': {
-            'LinkLibraryDependencies': 'false',
-            'UseLibraryDependencyInputs': 'true'},
-        '': {
-            'EmbedManifest': 'false',
-            'GenerateManifest': 'false',
-            'IgnoreImportLibrary': 'true',
-            'LinkIncremental': ''
-            },
-        'ManifestResourceCompile': {
-            'ResourceOutputFileName':
-            '$(IntDir)$(TargetFileName).embed.manifest.resfdsf'}
-        }
-    actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings(
-        msvs_settings,
-        self.stderr)
-    self.assertEqual(expected_msbuild_settings, actual_msbuild_settings)
-    self._ExpectedWarnings([])
-
-
-if __name__ == '__main__':
-  unittest.main()
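For reference, the shape of the conversion this deleted test exercised can be sketched as a table-driven rename of tool sections plus per-setting value translation. The mapping entries below are a hypothetical subset for illustration, not gyp's actual `MSVSSettings` tables.

```python
# Illustrative sketch of MSVS -> MSBuild settings conversion.
# TOOL_MAP and VALUE_MAP here are hypothetical subsets, not gyp's real tables.

TOOL_MAP = {'VCCLCompilerTool': 'ClCompile', 'VCLinkerTool': 'Link'}
VALUE_MAP = {
    ('ClCompile', 'WarningLevel'): {'3': 'Level3'},
    ('Link', 'SubSystem'): {'1': 'Console'},
}

def convert(msvs_settings):
    """Rename tool sections and translate enum-like values; pass through the rest."""
    msbuild = {}
    for tool, settings in msvs_settings.items():
        section = TOOL_MAP.get(tool, tool)
        out = msbuild.setdefault(section, {})
        for name, value in settings.items():
            out[name] = VALUE_MAP.get((section, name), {}).get(value, value)
    return msbuild
```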
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSToolFile.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,58 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Visual Studio project reader/writer."""
-
-import gyp.common
-import gyp.easy_xml as easy_xml
-
-
-class Writer(object):
-  """Visual Studio XML tool file writer."""
-
-  def __init__(self, tool_file_path, name):
-    """Initializes the tool file.
-
-    Args:
-      tool_file_path: Path to the tool file.
-      name: Name of the tool file.
-    """
-    self.tool_file_path = tool_file_path
-    self.name = name
-    self.rules_section = ['Rules']
-
-  def AddCustomBuildRule(self, name, cmd, description,
-                         additional_dependencies,
-                         outputs, extensions):
-    """Adds a rule to the tool file.
-
-    Args:
-      name: Name of the rule.
-      name: Name of the rule.
-      cmd: Command line of the rule.
-      description: Description of the rule.
-      additional_dependencies: other files which may trigger the rule.
-      outputs: outputs of the rule.
-      extensions: extensions handled by the rule.
-    """
-    rule = ['CustomBuildRule',
-            {'Name': name,
-             'ExecutionDescription': description,
-             'CommandLine': cmd,
-             'Outputs': ';'.join(outputs),
-             'FileExtensions': ';'.join(extensions),
-             'AdditionalDependencies':
-                 ';'.join(additional_dependencies)
-            }]
-    self.rules_section.append(rule)
-
-  def WriteIfChanged(self):
-    """Writes the tool file."""
-    content = ['VisualStudioToolFile',
-               {'Version': '8.00',
-                'Name': self.name
-               },
-               self.rules_section
-               ]
-    easy_xml.WriteXmlIfChanged(content, self.tool_file_path,
-                               encoding="Windows-1252")
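The deleted `Writer` above builds XML as nested Python lists (`['Tag', {attrs}, child, ...]`) and hands them to `gyp.easy_xml`. A minimal standalone serializer for that list shape, as an illustrative stand-in rather than gyp's actual `easy_xml` module, looks like:

```python
# Sketch of serializing the ['Tag', {attrs}, child, ...] nested-list XML
# representation used by the deleted Writer classes. This is an illustrative
# stand-in for gyp.easy_xml, not its real implementation.

def to_xml(spec):
    name = spec[0]
    rest = list(spec[1:])
    attrs = {}
    if rest and isinstance(rest[0], dict):
        attrs = rest.pop(0)
    # Sort attributes for deterministic output.
    attr_str = ''.join(' %s="%s"' % (k, v) for k, v in sorted(attrs.items()))
    if not rest:
        return '<%s%s/>' % (name, attr_str)
    body = ''.join(to_xml(c) if isinstance(c, list) else str(c) for c in rest)
    return '<%s%s>%s</%s>' % (name, attr_str, body, name)
```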
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSUserFile.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,147 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Visual Studio user preferences file writer."""
-
-import os
-import re
-import socket # for gethostname
-
-import gyp.common
-import gyp.easy_xml as easy_xml
-
-
-#------------------------------------------------------------------------------
-
-def _FindCommandInPath(command):
-  """If there are no slashes in the command given, this function
-     searches the PATH env to find the given command, and converts it
-     to an absolute path.  We have to do this because MSVS is looking
-     for an actual file to launch a debugger on, not just a command
-     line.  Note that this happens at GYP time, so anything needing to
-     be built needs to have a full path."""
-  if '/' in command or '\\' in command:
-    # If the command already has path elements (either relative or
-    # absolute), then assume it is constructed properly.
-    return command
-  else:
-    # Search through the path list and find an existing file that
-    # we can access.
-    paths = os.environ.get('PATH','').split(os.pathsep)
-    for path in paths:
-      item = os.path.join(path, command)
-      if os.path.isfile(item) and os.access(item, os.X_OK):
-        return item
-  return command
-
-def _QuoteWin32CommandLineArgs(args):
-  new_args = []
-  for arg in args:
-    # Replace all double-quotes with double-double-quotes to escape
-    # them for cmd shell, and then quote the whole thing if there
-    # are any.
-    if arg.find('"') != -1:
-      arg = '""'.join(arg.split('"'))
-      arg = '"%s"' % arg
-
-    # Otherwise, if there are any spaces, quote the whole arg.
-    elif re.search(r'[ \t\n]', arg):
-      arg = '"%s"' % arg
-    new_args.append(arg)
-  return new_args
-
-class Writer(object):
-  """Visual Studio XML user file writer."""
-
-  def __init__(self, user_file_path, version, name):
-    """Initializes the user file.
-
-    Args:
-      user_file_path: Path to the user file.
-      version: Version info.
-      name: Name of the user file.
-    """
-    self.user_file_path = user_file_path
-    self.version = version
-    self.name = name
-    self.configurations = {}
-
-  def AddConfig(self, name):
-    """Adds a configuration to the project.
-
-    Args:
-      name: Configuration name.
-    """
-    self.configurations[name] = ['Configuration', {'Name': name}]
-
-  def AddDebugSettings(self, config_name, command, environment=None,
-                       working_directory=""):
-    """Adds a DebugSettings node to the user file for a particular config.
-
-    Args:
-      command: command line to run.  First element in the list is the
-        executable.  All elements of the command will be quoted if
-        necessary.
-      environment: dict of environment variables to set. (optional)
-      working_directory: working directory for the command. (optional)
-    """
-    command = _QuoteWin32CommandLineArgs(command)
-
-    abs_command = _FindCommandInPath(command[0])
-
-    if environment and isinstance(environment, dict):
-      env_list = ['%s="%s"' % (key, val)
-                  for (key,val) in environment.iteritems()]
-      environment = ' '.join(env_list)
-    else:
-      environment = ''
-
-    n_cmd = ['DebugSettings',
-             {'Command': abs_command,
-              'WorkingDirectory': working_directory,
-              'CommandArguments': " ".join(command[1:]),
-              'RemoteMachine': socket.gethostname(),
-              'Environment': environment,
-              'EnvironmentMerge': 'true',
-              # Currently these are all "dummy" values that we're just setting
-              # in the default manner that MSVS does it.  We could use some of
-              # these to add additional capabilities, I suppose, but they might
-              # not have parity with other platforms then.
-              'Attach': 'false',
-              'DebuggerType': '3',  # 'auto' debugger
-              'Remote': '1',
-              'RemoteCommand': '',
-              'HttpUrl': '',
-              'PDBPath': '',
-              'SQLDebugging': '',
-              'DebuggerFlavor': '0',
-              'MPIRunCommand': '',
-              'MPIRunArguments': '',
-              'MPIRunWorkingDirectory': '',
-              'ApplicationCommand': '',
-              'ApplicationArguments': '',
-              'ShimCommand': '',
-              'MPIAcceptMode': '',
-              'MPIAcceptFilter': ''
-             }]
-
-    # Find the config, and add it if it doesn't exist.
-    if config_name not in self.configurations:
-      self.AddConfig(config_name)
-
-    # Add the DebugSettings onto the appropriate config.
-    self.configurations[config_name].append(n_cmd)
-
-  def WriteIfChanged(self):
-    """Writes the user file."""
-    configs = ['Configurations']
-    for config, spec in sorted(self.configurations.iteritems()):
-      configs.append(spec)
-
-    content = ['VisualStudioUserFile',
-               {'Version': self.version.ProjectVersion(),
-                'Name': self.name
-               },
-               configs]
-    easy_xml.WriteXmlIfChanged(content, self.user_file_path,
-                               encoding="Windows-1252")
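The Win32 quoting rule implemented by the deleted `_QuoteWin32CommandLineArgs` can be restated standalone: embedded double-quotes are doubled and the argument wrapped in quotes; otherwise an argument containing whitespace is simply wrapped. A self-contained sketch of the same rule:

```python
# Standalone sketch of the cmd-shell quoting rule from the deleted
# _QuoteWin32CommandLineArgs helper.
import re

def quote_win32_args(args):
    out = []
    for arg in args:
        if '"' in arg:
            # Double embedded quotes, then wrap the whole argument.
            arg = '"%s"' % arg.replace('"', '""')
        elif re.search(r'[ \t\n]', arg):
            # No quotes, but whitespace present: wrap as-is.
            arg = '"%s"' % arg
        out.append(arg)
    return out
```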
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSUtil.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,212 +0,0 @@
-# Copyright (c) 2013 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Utility functions shared amongst the Windows generators."""
-
-import copy
-import os
-
-
-_TARGET_TYPE_EXT = {
-  'executable': '.exe',
-  'shared_library': '.dll'
-}
-
-
-def _GetLargePdbShimCcPath():
-  """Returns the path of the large_pdb_shim.cc file."""
-  this_dir = os.path.abspath(os.path.dirname(__file__))
-  src_dir = os.path.abspath(os.path.join(this_dir, '..', '..'))
-  win_data_dir = os.path.join(src_dir, 'data', 'win')
-  large_pdb_shim_cc = os.path.join(win_data_dir, 'large-pdb-shim.cc')
-  return large_pdb_shim_cc
-
-
-def _DeepCopySomeKeys(in_dict, keys):
-  """Performs a partial deep-copy on |in_dict|, only copying the keys in |keys|.
-
-  Arguments:
-    in_dict: The dictionary to copy.
-    keys: The keys to be copied. If a key is in this list and doesn't exist in
-        |in_dict| this is not an error.
-  Returns:
-    The partially deep-copied dictionary.
-  """
-  d = {}
-  for key in keys:
-    if key not in in_dict:
-      continue
-    d[key] = copy.deepcopy(in_dict[key])
-  return d
-
-
-def _SuffixName(name, suffix):
-  """Add a suffix to the end of a target.
-
-  Arguments:
-    name: name of the target (foo#target)
-    suffix: the suffix to be added
-  Returns:
-    Target name with suffix added (foo_suffix#target)
-  """
-  parts = name.rsplit('#', 1)
-  parts[0] = '%s_%s' % (parts[0], suffix)
-  return '#'.join(parts)
-
-
-def _ShardName(name, number):
-  """Add a shard number to the end of a target.
-
-  Arguments:
-    name: name of the target (foo#target)
-    number: shard number
-  Returns:
-    Target name with shard added (foo_1#target)
-  """
-  return _SuffixName(name, str(number))
-
-
-def ShardTargets(target_list, target_dicts):
-  """Shard some targets apart to work around the linker's limits.
-
-  Arguments:
-    target_list: List of target pairs: 'base/base.gyp:base'.
-    target_dicts: Dict of target properties keyed on target pair.
-  Returns:
-    Tuple of the new sharded versions of the inputs.
-  """
-  # Gather the targets to shard, and how many pieces.
-  targets_to_shard = {}
-  for t in target_dicts:
-    shards = int(target_dicts[t].get('msvs_shard', 0))
-    if shards:
-      targets_to_shard[t] = shards
-  # Shard target_list.
-  new_target_list = []
-  for t in target_list:
-    if t in targets_to_shard:
-      for i in range(targets_to_shard[t]):
-        new_target_list.append(_ShardName(t, i))
-    else:
-      new_target_list.append(t)
-  # Shard target_dict.
-  new_target_dicts = {}
-  for t in target_dicts:
-    if t in targets_to_shard:
-      for i in range(targets_to_shard[t]):
-        name = _ShardName(t, i)
-        new_target_dicts[name] = copy.copy(target_dicts[t])
-        new_target_dicts[name]['target_name'] = _ShardName(
-             new_target_dicts[name]['target_name'], i)
-        sources = new_target_dicts[name].get('sources', [])
-        new_sources = []
-        for pos in range(i, len(sources), targets_to_shard[t]):
-          new_sources.append(sources[pos])
-        new_target_dicts[name]['sources'] = new_sources
-    else:
-      new_target_dicts[t] = target_dicts[t]
-  # Shard dependencies.
-  for t in new_target_dicts:
-    dependencies = copy.copy(new_target_dicts[t].get('dependencies', []))
-    new_dependencies = []
-    for d in dependencies:
-      if d in targets_to_shard:
-        for i in range(targets_to_shard[d]):
-          new_dependencies.append(_ShardName(d, i))
-      else:
-        new_dependencies.append(d)
-    new_target_dicts[t]['dependencies'] = new_dependencies
-
-  return (new_target_list, new_target_dicts)
-
-
-def InsertLargePdbShims(target_list, target_dicts, vars):
-  """Insert a shim target that forces the linker to use 4KB pagesize PDBs.
-
-  This is a workaround for targets with PDBs greater than 1GB in size, the
-  limit for the 1KB pagesize PDBs created by the linker by default.
-
-  Arguments:
-    target_list: List of target pairs: 'base/base.gyp:base'.
-    target_dicts: Dict of target properties keyed on target pair.
-    vars: A dictionary of common GYP variables with generator-specific values.
-  Returns:
-    Tuple of the shimmed version of the inputs.
-  """
-  # Determine which targets need shimming.
-  targets_to_shim = []
-  for t in target_dicts:
-    target_dict = target_dicts[t]
-    # We only want to shim targets that have msvs_large_pdb enabled.
-    if not int(target_dict.get('msvs_large_pdb', 0)):
-      continue
-    # This is intended for executable, shared_library and loadable_module
-    # targets where every configuration is set up to produce a PDB output.
-    # If any of these conditions is not true then the shim logic will fail
-    # below.
-    targets_to_shim.append(t)
-
-  large_pdb_shim_cc = _GetLargePdbShimCcPath()
-
-  for t in targets_to_shim:
-    target_dict = target_dicts[t]
-    target_name = target_dict.get('target_name')
-
-    base_dict = _DeepCopySomeKeys(target_dict,
-          ['configurations', 'default_configuration', 'toolset'])
-
-    # This is the dict for copying the source file (part of the GYP tree)
-    # to the intermediate directory of the project. This is necessary because
-    # we can't always build a relative path to the shim source file (on Windows
-    # GYP and the project may be on different drives), and Ninja hates absolute
-    # paths (it ends up generating the .obj and .obj.d alongside the source
-    # file, polluting GYP's tree).
-    copy_suffix = '_large_pdb_copy'
-    copy_target_name = target_name + '_' + copy_suffix
-    full_copy_target_name = _SuffixName(t, copy_suffix)
-    shim_cc_basename = os.path.basename(large_pdb_shim_cc)
-    shim_cc_dir = vars['SHARED_INTERMEDIATE_DIR'] + '/' + copy_target_name
-    shim_cc_path = shim_cc_dir + '/' + shim_cc_basename
-    copy_dict = copy.deepcopy(base_dict)
-    copy_dict['target_name'] = copy_target_name
-    copy_dict['type'] = 'none'
-    copy_dict['sources'] = [ large_pdb_shim_cc ]
-    copy_dict['copies'] = [{
-      'destination': shim_cc_dir,
-      'files': [ large_pdb_shim_cc ]
-    }]
-
-    # This is the dict for the PDB generating shim target. It depends on the
-    # copy target.
-    shim_suffix = '_large_pdb_shim'
-    shim_target_name = target_name + '_' + shim_suffix
-    full_shim_target_name = _SuffixName(t, shim_suffix)
-    shim_dict = copy.deepcopy(base_dict)
-    shim_dict['target_name'] = shim_target_name
-    shim_dict['type'] = 'static_library'
-    shim_dict['sources'] = [ shim_cc_path ]
-    shim_dict['dependencies'] = [ full_copy_target_name ]
-
-    # Set up the shim to output its PDB to the same location as the final linker
-    # target.
-    for config in shim_dict.get('configurations').itervalues():
-      msvs = config.setdefault('msvs_settings', {})
-
-      linker = msvs.pop('VCLinkerTool', {})  # We want to clear this dict.
-      pdb_path = linker.get('ProgramDatabaseFile')
-
-      compiler = msvs.setdefault('VCCLCompilerTool', {})
-      compiler.setdefault('DebugInformationFormat', '3')
-      compiler.setdefault('ProgramDataBaseFileName', pdb_path)
-
-    # Add the new targets.
-    target_list.append(full_copy_target_name)
-    target_list.append(full_shim_target_name)
-    target_dicts[full_copy_target_name] = copy_dict
-    target_dicts[full_shim_target_name] = shim_dict
-
-    # Update the original target to depend on the shim target.
-    target_dict.setdefault('dependencies', []).append(full_shim_target_name)
-
-  return (target_list, target_dicts)
\ No newline at end of file
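The source-splitting step inside the deleted `ShardTargets` assigns sources round-robin: shard `i` of `N` takes every `N`-th source starting at index `i`. Isolated from the target-dict bookkeeping, that step reduces to a one-liner:

```python
# Round-robin source sharding as performed inside the deleted ShardTargets:
# shard i of N takes sources[i], sources[i+N], sources[i+2N], ...

def shard_sources(sources, shards):
    return [sources[i::shards] for i in range(shards)]
```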
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSVersion.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,373 +0,0 @@
-# Copyright (c) 2013 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Handle version information related to Visual Studio."""
-
-import errno
-import os
-import re
-import subprocess
-import sys
-import gyp
-
-
-class VisualStudioVersion(object):
-  """Information regarding a version of Visual Studio."""
-
-  def __init__(self, short_name, description,
-               solution_version, project_version, flat_sln, uses_vcxproj,
-               path, sdk_based, default_toolset=None):
-    self.short_name = short_name
-    self.description = description
-    self.solution_version = solution_version
-    self.project_version = project_version
-    self.flat_sln = flat_sln
-    self.uses_vcxproj = uses_vcxproj
-    self.path = path
-    self.sdk_based = sdk_based
-    self.default_toolset = default_toolset
-
-  def ShortName(self):
-    return self.short_name
-
-  def Description(self):
-    """Get the full description of the version."""
-    return self.description
-
-  def SolutionVersion(self):
-    """Get the version number of the sln files."""
-    return self.solution_version
-
-  def ProjectVersion(self):
-    """Get the version number of the vcproj or vcxproj files."""
-    return self.project_version
-
-  def FlatSolution(self):
-    return self.flat_sln
-
-  def UsesVcxproj(self):
-    """Returns true if this version uses a vcxproj file."""
-    return self.uses_vcxproj
-
-  def ProjectExtension(self):
-    """Returns the file extension for the project."""
-    return self.uses_vcxproj and '.vcxproj' or '.vcproj'
-
-  def Path(self):
-    """Returns the path to Visual Studio installation."""
-    return self.path
-
-  def ToolPath(self, tool):
-    """Returns the path to a given compiler tool."""
-    return os.path.normpath(os.path.join(self.path, "VC/bin", tool))
-
-  def DefaultToolset(self):
-    """Returns the msbuild toolset version that will be used in the absence
-    of a user override."""
-    return self.default_toolset
-
-  def SetupScript(self, target_arch):
-    """Returns a command (with arguments) to be used to set up the
-    environment."""
-    # Check if we are running in the SDK command line environment and use
-    # the setup script from the SDK if so. |target_arch| should be either
-    # 'x86' or 'x64'.
-    assert target_arch in ('x86', 'x64')
-    sdk_dir = os.environ.get('WindowsSDKDir')
-    if self.sdk_based and sdk_dir:
-      return [os.path.normpath(os.path.join(sdk_dir, 'Bin/SetEnv.Cmd')),
-              '/' + target_arch]
-    else:
-      # We don't use VC/vcvarsall.bat for x86 because vcvarsall calls
-      # vcvars32, which it can only find if VS??COMNTOOLS is set, which it
-      # isn't always.
-      if target_arch == 'x86':
-        return [os.path.normpath(
-          os.path.join(self.path, 'Common7/Tools/vsvars32.bat'))]
-      else:
-        assert target_arch == 'x64'
-        arg = 'x86_amd64'
-        if (os.environ.get('PROCESSOR_ARCHITECTURE') == 'AMD64' or
-            os.environ.get('PROCESSOR_ARCHITEW6432') == 'AMD64'):
-          # Use the 64-on-64 compiler if we can.
-          arg = 'amd64'
-        return [os.path.normpath(
-            os.path.join(self.path, 'VC/vcvarsall.bat')), arg]
-
-
-def _RegistryQueryBase(sysdir, key, value):
-  """Use reg.exe to read a particular key.
-
-  Ideally we would use the win32 module, but we want gyp to remain usable
-  from any Python build; Cygwin's Python, for instance, lacks that module.
-
-  Arguments:
-    sysdir: The system subdirectory to attempt to launch reg.exe from.
-    key: The registry key to read from.
-    value: The particular value to read.
-  Return:
-    stdout from reg.exe, or None for failure.
-  """
-  # Skip if not on Windows or Python Win32 setup issue
-  if sys.platform not in ('win32', 'cygwin'):
-    return None
-  # Setup params to pass to and attempt to launch reg.exe
-  cmd = [os.path.join(os.environ.get('WINDIR', ''), sysdir, 'reg.exe'),
-         'query', key]
-  if value:
-    cmd.extend(['/v', value])
-  p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
-  # Obtain the stdout from reg.exe, reading to the end so p.returncode is valid
-  # Note that the error text may be in [1] in some cases
-  text = p.communicate()[0]
-  # Check return code from reg.exe; officially 0==success and 1==error
-  if p.returncode:
-    return None
-  return text
-
-
-def _RegistryQuery(key, value=None):
-  """Use reg.exe to read a particular key through _RegistryQueryBase.
-
-  First tries to launch from %WinDir%\Sysnative to avoid WoW64 redirection. If
-  that fails, it falls back to System32.  Sysnative is available on Vista and
-  up, and on Windows Server 2003 and XP through KB patch 942589.  Note that
-  Sysnative always fails under 64-bit Python (it is a virtual directory), in
-  which case System32 works correctly in the first place.
-
-  KB 942589 - http://support.microsoft.com/kb/942589/en-us.
-
-  Arguments:
-    key: The registry key.
-    value: The particular registry value to read (optional).
-  Return:
-    stdout from reg.exe, or None for failure.
-  """
-  text = None
-  try:
-    text = _RegistryQueryBase('Sysnative', key, value)
-  except OSError, e:
-    if e.errno == errno.ENOENT:
-      text = _RegistryQueryBase('System32', key, value)
-    else:
-      raise
-  return text
-
-
-def _RegistryGetValue(key, value):
-  """Use reg.exe to obtain the value of a registry key.
-
-  Args:
-    key: The registry key.
-    value: The particular registry value to read.
-  Return:
-    contents of the registry key's value, or None on failure.
-  """
-  text = _RegistryQuery(key, value)
-  if not text:
-    return None
-  # Extract value.
-  match = re.search(r'REG_\w+\s+([^\r]+)\r\n', text)
-  if not match:
-    return None
-  return match.group(1)
-
-
-def _RegistryKeyExists(key):
-  """Use reg.exe to see if a key exists.
-
-  Args:
-    key: The registry key to check.
-  Return:
-    True if the key exists
-  """
-  if not _RegistryQuery(key):
-    return False
-  return True
-
-
-def _CreateVersion(name, path, sdk_based=False):
-  """Sets up MSVS project generation.
-
-  Setup is based on the GYP_MSVS_VERSION environment variable or whatever is
-  autodetected if GYP_MSVS_VERSION is not explicitly specified. If a version is
-  passed in that doesn't match a value in versions, Python will throw an error.
-  """
-  if path:
-    path = os.path.normpath(path)
-  versions = {
-      '2012': VisualStudioVersion('2012',
-                                  'Visual Studio 2012',
-                                  solution_version='12.00',
-                                  project_version='4.0',
-                                  flat_sln=False,
-                                  uses_vcxproj=True,
-                                  path=path,
-                                  sdk_based=sdk_based,
-                                  default_toolset='v110'),
-      '2012e': VisualStudioVersion('2012e',
-                                   'Visual Studio 2012',
-                                   solution_version='12.00',
-                                   project_version='4.0',
-                                   flat_sln=True,
-                                   uses_vcxproj=True,
-                                   path=path,
-                                   sdk_based=sdk_based,
-                                   default_toolset='v110'),
-      '2010': VisualStudioVersion('2010',
-                                  'Visual Studio 2010',
-                                  solution_version='11.00',
-                                  project_version='4.0',
-                                  flat_sln=False,
-                                  uses_vcxproj=True,
-                                  path=path,
-                                  sdk_based=sdk_based),
-      '2010e': VisualStudioVersion('2010e',
-                                   'Visual Studio 2010',
-                                   solution_version='11.00',
-                                   project_version='4.0',
-                                   flat_sln=True,
-                                   uses_vcxproj=True,
-                                   path=path,
-                                   sdk_based=sdk_based),
-      '2008': VisualStudioVersion('2008',
-                                  'Visual Studio 2008',
-                                  solution_version='10.00',
-                                  project_version='9.00',
-                                  flat_sln=False,
-                                  uses_vcxproj=False,
-                                  path=path,
-                                  sdk_based=sdk_based),
-      '2008e': VisualStudioVersion('2008e',
-                                   'Visual Studio 2008',
-                                   solution_version='10.00',
-                                   project_version='9.00',
-                                   flat_sln=True,
-                                   uses_vcxproj=False,
-                                   path=path,
-                                   sdk_based=sdk_based),
-      '2005': VisualStudioVersion('2005',
-                                  'Visual Studio 2005',
-                                  solution_version='9.00',
-                                  project_version='8.00',
-                                  flat_sln=False,
-                                  uses_vcxproj=False,
-                                  path=path,
-                                  sdk_based=sdk_based),
-      '2005e': VisualStudioVersion('2005e',
-                                   'Visual Studio 2005',
-                                   solution_version='9.00',
-                                   project_version='8.00',
-                                   flat_sln=True,
-                                   uses_vcxproj=False,
-                                   path=path,
-                                   sdk_based=sdk_based),
-  }
-  return versions[str(name)]
-
-
-def _ConvertToCygpath(path):
-  """Convert to cygwin path if we are using cygwin."""
-  if sys.platform == 'cygwin':
-    p = subprocess.Popen(['cygpath', path], stdout=subprocess.PIPE)
-    path = p.communicate()[0].strip()
-  return path
-
-
-def _DetectVisualStudioVersions(versions_to_check, force_express):
-  """Collect the list of installed visual studio versions.
-
-  Returns:
-    A list of visual studio versions installed in descending order of
-    usage preference.
-    Base this on the registry and a quick check if devenv.exe exists.
-    Only versions 8-11 are considered.
-    Possibilities are:
-      2005(e) - Visual Studio 2005 (8)
-      2008(e) - Visual Studio 2008 (9)
-      2010(e) - Visual Studio 2010 (10)
-      2012(e) - Visual Studio 2012 (11)
-    Where (e) is e for express editions of MSVS and blank otherwise.
-  """
-  version_to_year = {
-      '8.0': '2005', '9.0': '2008', '10.0': '2010', '11.0': '2012'}
-  versions = []
-  for version in versions_to_check:
-    # Old method of searching for which VS version is installed
-    # We don't use the 2010-encouraged-way because we also want to get the
-    # path to the binaries, which it doesn't offer.
-    keys = [r'HKLM\Software\Microsoft\VisualStudio\%s' % version,
-            r'HKLM\Software\Wow6432Node\Microsoft\VisualStudio\%s' % version,
-            r'HKLM\Software\Microsoft\VCExpress\%s' % version,
-            r'HKLM\Software\Wow6432Node\Microsoft\VCExpress\%s' % version]
-    for index in range(len(keys)):
-      path = _RegistryGetValue(keys[index], 'InstallDir')
-      if not path:
-        continue
-      path = _ConvertToCygpath(path)
-      # Check for full.
-      full_path = os.path.join(path, 'devenv.exe')
-      express_path = os.path.join(path, 'vcexpress.exe')
-      if not force_express and os.path.exists(full_path):
-        # Add this one.
-        versions.append(_CreateVersion(version_to_year[version],
-            os.path.join(path, '..', '..')))
-      # Check for express.
-      elif os.path.exists(express_path):
-        # Add this one.
-        versions.append(_CreateVersion(version_to_year[version] + 'e',
-            os.path.join(path, '..', '..')))
-
-    # The old method above does not work when only SDK is installed.
-    keys = [r'HKLM\Software\Microsoft\VisualStudio\SxS\VC7',
-            r'HKLM\Software\Wow6432Node\Microsoft\VisualStudio\SxS\VC7']
-    for index in range(len(keys)):
-      path = _RegistryGetValue(keys[index], version)
-      if not path:
-        continue
-      path = _ConvertToCygpath(path)
-      versions.append(_CreateVersion(version_to_year[version] + 'e',
-          os.path.join(path, '..'), sdk_based=True))
-
-  return versions
-
-
-def SelectVisualStudioVersion(version='auto'):
-  """Select which version of Visual Studio projects to generate.
-
-  Arguments:
-    version: Hook to allow caller to force a particular version (vs auto).
-  Returns:
-    An object representing a visual studio project format version.
-  """
-  # In auto mode, check environment variable for override.
-  if version == 'auto':
-    version = os.environ.get('GYP_MSVS_VERSION', 'auto')
-  version_map = {
-    'auto': ('10.0', '9.0', '8.0', '11.0'),
-    '2005': ('8.0',),
-    '2005e': ('8.0',),
-    '2008': ('9.0',),
-    '2008e': ('9.0',),
-    '2010': ('10.0',),
-    '2010e': ('10.0',),
-    '2012': ('11.0',),
-    '2012e': ('11.0',),
-  }
-  override_path = os.environ.get('GYP_MSVS_OVERRIDE_PATH')
-  if override_path:
-    msvs_version = os.environ.get('GYP_MSVS_VERSION')
-    if not msvs_version or 'e' not in msvs_version:
-      raise ValueError('GYP_MSVS_OVERRIDE_PATH requires GYP_MSVS_VERSION to be '
-                       'set to an "e" version (e.g. 2010e)')
-    return _CreateVersion(msvs_version, override_path, sdk_based=True)
-  version = str(version)
-  versions = _DetectVisualStudioVersions(version_map[version], 'e' in version)
-  if not versions:
-    if version == 'auto':
-      # Default to 2005 if we couldn't find anything
-      return _CreateVersion('2005', None)
-    else:
-      return _CreateVersion(version, None)
-  return versions[0]
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/SCons.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,199 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""
-SCons generator.
-
-This contains class definitions and supporting functions for generating
-pieces of SCons files for the different types of GYP targets.
-"""
-
-import os
-
-
-def WriteList(fp, list, prefix='',
-                        separator=',\n    ',
-                        preamble=None,
-                        postamble=None):
-  fp.write(preamble or '')
-  fp.write((separator or ' ').join([prefix + l for l in list]))
-  fp.write(postamble or '')
-
-
-class TargetBase(object):
-  """
-  Base class for a SCons representation of a GYP target.
-  """
-  is_ignored = False
-  target_prefix = ''
-  target_suffix = ''
-  def __init__(self, spec):
-    self.spec = spec
-  def full_product_name(self):
-    """
-    Returns the full name of the product being built:
-
-      * Uses 'product_name' if it's set, else prefix + 'target_name'.
-      * Prepends 'product_dir' if set.
-      * Appends SCons suffix variables for the target type (or
-        product_extension).
-    """
-    suffix = self.target_suffix
-    product_extension = self.spec.get('product_extension')
-    if product_extension:
-      suffix = '.' + product_extension
-    prefix = self.spec.get('product_prefix', self.target_prefix)
-    name = self.spec['target_name']
-    name = prefix + self.spec.get('product_name', name) + suffix
-    product_dir = self.spec.get('product_dir')
-    if product_dir:
-      name = os.path.join(product_dir, name)
-    else:
-      name = os.path.join(self.out_dir, name)
-    return name
-
-  def write_input_files(self, fp):
-    """
-    Writes the definition of the input files (sources).
-    """
-    sources = self.spec.get('sources')
-    if not sources:
-      fp.write('\ninput_files = []\n')
-      return
-    preamble = '\ninput_files = [\n    '
-    postamble = ',\n]\n'
-    WriteList(fp, map(repr, sources), preamble=preamble, postamble=postamble)
-
-  def builder_call(self):
-    """
-    Returns the actual SCons builder call to build this target.
-    """
-    name = self.full_product_name()
-    return 'env.%s(env.File(%r), input_files)' % (self.builder_name, name)
-  def write_target(self, fp, src_dir='', pre=''):
-    """
-    Writes the lines necessary to build this target.
-    """
-    fp.write('\n' + pre)
-    fp.write('_outputs = %s\n' % self.builder_call())
-    fp.write('target_files.extend(_outputs)\n')
-
-
-class NoneTarget(TargetBase):
-  """
-  A GYP target type of 'none', implicitly or explicitly.
-  """
-  def write_target(self, fp, src_dir='', pre=''):
-    fp.write('\ntarget_files.extend(input_files)\n')
-
-
-class SettingsTarget(TargetBase):
-  """
-  A GYP target type of 'settings'.
-  """
-  is_ignored = True
-
-
-compilable_sources_template = """
-_result = []
-for infile in input_files:
-  if env.compilable(infile):
-    if (type(infile) == type('')
-        and (infile.startswith(%(src_dir)r)
-             or not os.path.isabs(env.subst(infile)))):
-      # Force files below the build directory by replacing all '..'
-      # elements in the path with '__':
-      base, ext = os.path.splitext(os.path.normpath(infile))
-      base = [d == '..' and '__' or d for d in base.split('/')]
-      base = os.path.join(*base)
-      object = '${OBJ_DIR}/${COMPONENT_NAME}/${TARGET_NAME}/' + base
-      if not infile.startswith(%(src_dir)r):
-        infile = %(src_dir)r + infile
-      infile = env.%(name)s(object, infile)[0]
-    else:
-      infile = env.%(name)s(infile)[0]
-  _result.append(infile)
-input_files = _result
-"""
-
-class CompilableSourcesTargetBase(TargetBase):
-  """
-  An abstract base class for targets that compile their source files.
-
-  We explicitly transform compilable files into object files,
-  even though SCons could infer that for us, because we want
-  to control where the object file ends up.  (The implicit rules
-  in SCons always put the object file next to the source file.)
-  """
-  intermediate_builder_name = None
-  def write_target(self, fp, src_dir='', pre=''):
-    if self.intermediate_builder_name is None:
-      raise NotImplementedError
-    if src_dir and not src_dir.endswith('/'):
-      src_dir += '/'
-    variables = {
-        'src_dir': src_dir,
-        'name': self.intermediate_builder_name,
-    }
-    fp.write(compilable_sources_template % variables)
-    super(CompilableSourcesTargetBase, self).write_target(fp)
-
-
-class ProgramTarget(CompilableSourcesTargetBase):
-  """
-  A GYP target type of 'executable'.
-  """
-  builder_name = 'GypProgram'
-  intermediate_builder_name = 'StaticObject'
-  target_prefix = '${PROGPREFIX}'
-  target_suffix = '${PROGSUFFIX}'
-  out_dir = '${TOP_BUILDDIR}'
-
-
-class StaticLibraryTarget(CompilableSourcesTargetBase):
-  """
-  A GYP target type of 'static_library'.
-  """
-  builder_name = 'GypStaticLibrary'
-  intermediate_builder_name = 'StaticObject'
-  target_prefix = '${LIBPREFIX}'
-  target_suffix = '${LIBSUFFIX}'
-  out_dir = '${LIB_DIR}'
-
-
-class SharedLibraryTarget(CompilableSourcesTargetBase):
-  """
-  A GYP target type of 'shared_library'.
-  """
-  builder_name = 'GypSharedLibrary'
-  intermediate_builder_name = 'SharedObject'
-  target_prefix = '${SHLIBPREFIX}'
-  target_suffix = '${SHLIBSUFFIX}'
-  out_dir = '${LIB_DIR}'
-
-
-class LoadableModuleTarget(CompilableSourcesTargetBase):
-  """
-  A GYP target type of 'loadable_module'.
-  """
-  builder_name = 'GypLoadableModule'
-  intermediate_builder_name = 'SharedObject'
-  target_prefix = '${SHLIBPREFIX}'
-  target_suffix = '${SHLIBSUFFIX}'
-  out_dir = '${TOP_BUILDDIR}'
-
-
-TargetMap = {
-  None : NoneTarget,
-  'none' : NoneTarget,
-  'settings' : SettingsTarget,
-  'executable' : ProgramTarget,
-  'static_library' : StaticLibraryTarget,
-  'shared_library' : SharedLibraryTarget,
-  'loadable_module' : LoadableModuleTarget,
-}
-
-
-def Target(spec):
-  return TargetMap[spec.get('type')](spec)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/__init__.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,532 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import copy
-import gyp.input
-import optparse
-import os.path
-import re
-import shlex
-import sys
-import traceback
-from gyp.common import GypError
-
-# Default debug modes for GYP
-debug = {}
-
-# List of "official" debug modes, but you can use anything you like.
-DEBUG_GENERAL = 'general'
-DEBUG_VARIABLES = 'variables'
-DEBUG_INCLUDES = 'includes'
-
-
-def DebugOutput(mode, message, *args):
-  if 'all' in gyp.debug or mode in gyp.debug:
-    ctx = ('unknown', 0, 'unknown')
-    try:
-      f = traceback.extract_stack(limit=2)
-      if f:
-        ctx = f[0][:3]
-    except:
-      pass
-    if args:
-      message %= args
-    print '%s:%s:%d:%s %s' % (mode.upper(), os.path.basename(ctx[0]),
-                              ctx[1], ctx[2], message)
-
-def FindBuildFiles():
-  extension = '.gyp'
-  files = os.listdir(os.getcwd())
-  build_files = []
-  for file in files:
-    if file.endswith(extension):
-      build_files.append(file)
-  return build_files
-
-
-def Load(build_files, format, default_variables={},
-         includes=[], depth='.', params=None, check=False,
-         circular_check=True):
-  """
-  Loads one or more specified build files.
-  default_variables and includes will be copied before use.
-  Returns the generator for the specified format and the
-  data returned by loading the specified build files.
-  """
-  if params is None:
-    params = {}
-
-  flavor = None
-  if '-' in format:
-    format, params['flavor'] = format.split('-', 1)
-
-  default_variables = copy.copy(default_variables)
-
-  # Default variables provided by this program and its modules should be
-  # named WITH_CAPITAL_LETTERS to provide a distinct "best practice" namespace,
-  # avoiding collisions with user and automatic variables.
-  default_variables['GENERATOR'] = format
-
-  # Format can be a custom python file, or by default the name of a module
-  # within gyp.generator.
-  if format.endswith('.py'):
-    generator_name = os.path.splitext(format)[0]
-    path, generator_name = os.path.split(generator_name)
-
-    # Make sure the path to the custom generator is in sys.path
-    # Don't worry about removing it once we are done.  Keeping the path
-    # to each generator that is used in sys.path is likely harmless and
-    # arguably a good idea.
-    path = os.path.abspath(path)
-    if path not in sys.path:
-      sys.path.insert(0, path)
-  else:
-    generator_name = 'gyp.generator.' + format
-
-  # These parameters are passed in order (as opposed to by key)
-  # because ActivePython cannot handle key parameters to __import__.
-  generator = __import__(generator_name, globals(), locals(), generator_name)
-  for (key, val) in generator.generator_default_variables.items():
-    default_variables.setdefault(key, val)
-
-  # Give the generator the opportunity to set additional variables based on
-  # the params it will receive in the output phase.
-  if getattr(generator, 'CalculateVariables', None):
-    generator.CalculateVariables(default_variables, params)
-
-  # Give the generator the opportunity to set generator_input_info based on
-  # the params it will receive in the output phase.
-  if getattr(generator, 'CalculateGeneratorInputInfo', None):
-    generator.CalculateGeneratorInputInfo(params)
-
-  # Fetch the generator specific info that gets fed to input, we use getattr
-  # so we can default things and the generators only have to provide what
-  # they need.
-  generator_input_info = {
-    'generator_wants_absolute_build_file_paths':
-        getattr(generator, 'generator_wants_absolute_build_file_paths', False),
-    'generator_handles_variants':
-        getattr(generator, 'generator_handles_variants', False),
-    'non_configuration_keys':
-        getattr(generator, 'generator_additional_non_configuration_keys', []),
-    'path_sections':
-        getattr(generator, 'generator_additional_path_sections', []),
-    'extra_sources_for_rules':
-        getattr(generator, 'generator_extra_sources_for_rules', []),
-    'generator_supports_multiple_toolsets':
-        getattr(generator, 'generator_supports_multiple_toolsets', False),
-    'generator_wants_static_library_dependencies_adjusted':
-        getattr(generator,
-                'generator_wants_static_library_dependencies_adjusted', True),
-    'generator_wants_sorted_dependencies':
-        getattr(generator, 'generator_wants_sorted_dependencies', False),
-  }
-
-  # Process the input specific to this generator.
-  result = gyp.input.Load(build_files, default_variables, includes[:],
-                          depth, generator_input_info, check, circular_check,
-                          params['parallel'])
-  return [generator] + result
-
-def NameValueListToDict(name_value_list):
-  """
-  Takes an array of strings of the form 'NAME=VALUE' and creates a dictionary
-  of the pairs.  If a string is simply NAME, then the value in the dictionary
-  is set to True.  If VALUE can be converted to an integer, it is.
-  """
-  result = { }
-  for item in name_value_list:
-    tokens = item.split('=', 1)
-    if len(tokens) == 2:
-      # If we can make it an int, use that, otherwise, use the string.
-      try:
-        token_value = int(tokens[1])
-      except ValueError:
-        token_value = tokens[1]
-      # Set the variable to the supplied value.
-      result[tokens[0]] = token_value
-    else:
-      # No value supplied, treat it as a boolean and set it.
-      result[tokens[0]] = True
-  return result
-
-def ShlexEnv(env_name):
-  flags = os.environ.get(env_name, [])
-  if flags:
-    flags = shlex.split(flags)
-  return flags
-
-def FormatOpt(opt, value):
-  if opt.startswith('--'):
-    return '%s=%s' % (opt, value)
-  return opt + value
-
-def RegenerateAppendFlag(flag, values, predicate, env_name, options):
-  """Regenerate a list of command line flags, for an option of action='append'.
-
-  The |env_name|, if given, is checked in the environment and used to generate
-  an initial list of options, then the options that were specified on the
-  command line (given in |values|) are appended.  This matches the handling of
-  environment variables and command line flags where command line flags override
-  the environment, while not requiring the environment to be set when the flags
-  are used again.
-  """
-  flags = []
-  if options.use_environment and env_name:
-    for flag_value in ShlexEnv(env_name):
-      value = FormatOpt(flag, predicate(flag_value))
-      if value in flags:
-        flags.remove(value)
-      flags.append(value)
-  if values:
-    for flag_value in values:
-      flags.append(FormatOpt(flag, predicate(flag_value)))
-  return flags
-
-def RegenerateFlags(options):
-  """Given a parsed options object, and taking the environment variables into
-  account, returns a list of flags that should regenerate an equivalent options
-  object (even in the absence of the environment variables.)
-
-  Any path options will be normalized relative to depth.
-
-  The format flag is not included, as it is assumed the calling generator will
-  set that as appropriate.
-  """
-  def FixPath(path):
-    path = gyp.common.FixIfRelativePath(path, options.depth)
-    if not path:
-      return os.path.curdir
-    return path
-
-  def Noop(value):
-    return value
-
-  # We always want to ignore the environment when regenerating, to avoid
-  # duplicate or changed flags in the environment at the time of regeneration.
-  flags = ['--ignore-environment']
-  for name, metadata in options._regeneration_metadata.iteritems():
-    opt = metadata['opt']
-    value = getattr(options, name)
-    value_predicate = metadata['type'] == 'path' and FixPath or Noop
-    action = metadata['action']
-    env_name = metadata['env_name']
-    if action == 'append':
-      flags.extend(RegenerateAppendFlag(opt, value, value_predicate,
-                                        env_name, options))
-    elif action in ('store', None):  # None is a synonym for 'store'.
-      if value:
-        flags.append(FormatOpt(opt, value_predicate(value)))
-      elif options.use_environment and env_name and os.environ.get(env_name):
-        flags.append(FormatOpt(opt, value_predicate(os.environ.get(env_name))))
-    elif action in ('store_true', 'store_false'):
-      if ((action == 'store_true' and value) or
-          (action == 'store_false' and not value)):
-        flags.append(opt)
-      elif options.use_environment and env_name:
-        print >>sys.stderr, ('Warning: environment regeneration unimplemented '
-                             'for %s flag %r env_name %r' % (action, opt,
-                                                             env_name))
-    else:
-      print >>sys.stderr, ('Warning: regeneration unimplemented for action %r '
-                           'flag %r' % (action, opt))
-
-  return flags
-
-class RegeneratableOptionParser(optparse.OptionParser):
-  def __init__(self):
-    self.__regeneratable_options = {}
-    optparse.OptionParser.__init__(self)
-
-  def add_option(self, *args, **kw):
-    """Add an option to the parser.
-
-    This accepts the same arguments as OptionParser.add_option, plus the
-    following:
-      regenerate: can be set to False to prevent this option from being included
-                  in regeneration.
-      env_name: name of environment variable that additional values for this
-                option come from.
-      type: adds type='path', to tell the regenerator that the values of
-            this option need to be made relative to options.depth
-    """
-    env_name = kw.pop('env_name', None)
-    if 'dest' in kw and kw.pop('regenerate', True):
-      dest = kw['dest']
-
-      # The path type is needed for regenerating, for optparse we can just treat
-      # it as a string.
-      type = kw.get('type')
-      if type == 'path':
-        kw['type'] = 'string'
-
-      self.__regeneratable_options[dest] = {
-          'action': kw.get('action'),
-          'type': type,
-          'env_name': env_name,
-          'opt': args[0],
-        }
-
-    optparse.OptionParser.add_option(self, *args, **kw)
-
-  def parse_args(self, *args):
-    values, args = optparse.OptionParser.parse_args(self, *args)
-    values._regeneration_metadata = self.__regeneratable_options
-    return values, args
-
-def gyp_main(args):
-  my_name = os.path.basename(sys.argv[0])
-
-  parser = RegeneratableOptionParser()
-  usage = 'usage: %s [options ...] [build_file ...]'
-  parser.set_usage(usage.replace('%s', '%prog'))
-  parser.add_option('-D', dest='defines', action='append', metavar='VAR=VAL',
-                    env_name='GYP_DEFINES',
-                    help='sets variable VAR to value VAL')
-  parser.add_option('-f', '--format', dest='formats', action='append',
-                    env_name='GYP_GENERATORS', regenerate=False,
-                    help='output formats to generate')
-  parser.add_option('--msvs-version', dest='msvs_version',
-                    regenerate=False,
-                    help='Deprecated; use -G msvs_version=MSVS_VERSION instead')
-  parser.add_option('-I', '--include', dest='includes', action='append',
-                    metavar='INCLUDE', type='path',
-                    help='files to include in all loaded .gyp files')
-  parser.add_option('--depth', dest='depth', metavar='PATH', type='path',
-                    help='set DEPTH gyp variable to a relative path to PATH')
-  parser.add_option('-d', '--debug', dest='debug', metavar='DEBUGMODE',
-                    action='append', default=[], help='turn on a debugging '
-                    'mode for debugging GYP.  Supported modes are "variables", '
-                    '"includes" and "general" or "all" for all of them.')
-  parser.add_option('-S', '--suffix', dest='suffix', default='',
-                    help='suffix to add to generated files')
-  parser.add_option('-G', dest='generator_flags', action='append', default=[],
-                    metavar='FLAG=VAL', env_name='GYP_GENERATOR_FLAGS',
-                    help='sets generator flag FLAG to VAL')
-  parser.add_option('--generator-output', dest='generator_output',
-                    action='store', default=None, metavar='DIR', type='path',
-                    env_name='GYP_GENERATOR_OUTPUT',
-                    help='puts generated build files under DIR')
-  parser.add_option('--ignore-environment', dest='use_environment',
-                    action='store_false', default=True, regenerate=False,
-                    help='do not read options from environment variables')
-  parser.add_option('--check', dest='check', action='store_true',
-                    help='check format of gyp files')
-  parser.add_option('--parallel', action='store_true',
-                    env_name='GYP_PARALLEL',
-                    help='Use multiprocessing for speed (experimental)')
-  parser.add_option('--toplevel-dir', dest='toplevel_dir', action='store',
-                    default=None, metavar='DIR', type='path',
-                    help='directory to use as the root of the source tree')
-  parser.add_option('--build', dest='configs', action='append',
-                    help='configuration for build after project generation')
-  # --no-circular-check disables the check for circular relationships between
-  # .gyp files.  These relationships should not exist, but they've only been
-  # observed to be harmful with the Xcode generator.  Chromium's .gyp files
-  # currently have some circular relationships on non-Mac platforms, so this
-  # option allows the strict behavior to be used on Macs and the lenient
-  # behavior to be used elsewhere.
-  # TODO(mark): Remove this option when http://crbug.com/35878 is fixed.
-  parser.add_option('--no-circular-check', dest='circular_check',
-                    action='store_false', default=True, regenerate=False,
-                    help="don't check for circular relationships between files")
-
-  # We read a few things from ~/.gyp, so set up a var for that.
-  home_vars = ['HOME']
-  if sys.platform in ('cygwin', 'win32'):
-    home_vars.append('USERPROFILE')
-  home = None
-  home_dot_gyp = None
-  for home_var in home_vars:
-    home = os.getenv(home_var)
-    if home != None:
-      home_dot_gyp = os.path.join(home, '.gyp')
-      if not os.path.exists(home_dot_gyp):
-        home_dot_gyp = None
-      else:
-        break
-
-  # TODO(thomasvl): add support for ~/.gyp/defaults
-
-  options, build_files_arg = parser.parse_args(args)
-  build_files = build_files_arg
-
-  if not options.formats:
-    # If no format was given on the command line, then check the env variable.
-    generate_formats = []
-    if options.use_environment:
-      generate_formats = os.environ.get('GYP_GENERATORS', [])
-    if generate_formats:
-      generate_formats = re.split('[\s,]', generate_formats)
-    if generate_formats:
-      options.formats = generate_formats
-    else:
-      # Nothing in the variable, default based on platform.
-      if sys.platform == 'darwin':
-        options.formats = ['xcode']
-      elif sys.platform in ('win32', 'cygwin'):
-        options.formats = ['msvs']
-      else:
-        options.formats = ['make']
-
-  if not options.generator_output and options.use_environment:
-    g_o = os.environ.get('GYP_GENERATOR_OUTPUT')
-    if g_o:
-      options.generator_output = g_o
-
-  if not options.parallel and options.use_environment:
-    p = os.environ.get('GYP_PARALLEL')
-    options.parallel = bool(p and p != '0')
-
-  for mode in options.debug:
-    gyp.debug[mode] = 1
-
-  # Do an extra check to avoid work when we're not debugging.
-  if DEBUG_GENERAL in gyp.debug:
-    DebugOutput(DEBUG_GENERAL, 'running with these options:')
-    for option, value in sorted(options.__dict__.items()):
-      if option[0] == '_':
-        continue
-      if isinstance(value, basestring):
-        DebugOutput(DEBUG_GENERAL, "  %s: '%s'", option, value)
-      else:
-        DebugOutput(DEBUG_GENERAL, "  %s: %s", option, value)
-
-  if not build_files:
-    build_files = FindBuildFiles()
-  if not build_files:
-    raise GypError((usage + '\n\n%s: error: no build_file') %
-                   (my_name, my_name))
-
-  # TODO(mark): Chromium-specific hack!
-  # For Chromium, the gyp "depth" variable should always be a relative path
-  # to Chromium's top-level "src" directory.  If no depth variable was set
-  # on the command line, try to find a "src" directory by looking at the
-  # absolute path to each build file's directory.  The first "src" component
-  # found will be treated as though it were the path used for --depth.
-  if not options.depth:
-    for build_file in build_files:
-      build_file_dir = os.path.abspath(os.path.dirname(build_file))
-      build_file_dir_components = build_file_dir.split(os.path.sep)
-      components_len = len(build_file_dir_components)
-      for index in xrange(components_len - 1, -1, -1):
-        if build_file_dir_components[index] == 'src':
-          options.depth = os.path.sep.join(build_file_dir_components)
-          break
-        del build_file_dir_components[index]
-
-      # If the inner loop found something, break without advancing to another
-      # build file.
-      if options.depth:
-        break
-
-    if not options.depth:
-      raise GypError('Could not automatically locate src directory.  This is '
-                     'a temporary Chromium feature that will be removed.  Use '
-                     '--depth as a workaround.')
-
-  # If toplevel-dir is not set, we assume that depth is the root of our source
-  # tree.
-  if not options.toplevel_dir:
-    options.toplevel_dir = options.depth
-
-  # -D on the command line sets variable defaults - D isn't just for define,
-  # it's for default.  Perhaps there should be a way to force (-F?) a
-  # variable's value so that it can't be overridden by anything else.
-  cmdline_default_variables = {}
-  defines = []
-  if options.use_environment:
-    defines += ShlexEnv('GYP_DEFINES')
-  if options.defines:
-    defines += options.defines
-  cmdline_default_variables = NameValueListToDict(defines)
-  if DEBUG_GENERAL in gyp.debug:
-    DebugOutput(DEBUG_GENERAL,
-                "cmdline_default_variables: %s", cmdline_default_variables)
-
-  # Set up includes.
-  includes = []
-
-  # If ~/.gyp/include.gypi exists, it'll be forcibly included into every
-  # .gyp file that's loaded, before anything else is included.
-  if home_dot_gyp != None:
-    default_include = os.path.join(home_dot_gyp, 'include.gypi')
-    if os.path.exists(default_include):
-      print 'Using overrides found in ' + default_include
-      includes.append(default_include)
-
-  # Command-line --include files come after the default include.
-  if options.includes:
-    includes.extend(options.includes)
-
-  # Generator flags should be prefixed with the target generator since they
-  # are global across all generator runs.
-  gen_flags = []
-  if options.use_environment:
-    gen_flags += ShlexEnv('GYP_GENERATOR_FLAGS')
-  if options.generator_flags:
-    gen_flags += options.generator_flags
-  generator_flags = NameValueListToDict(gen_flags)
-  if DEBUG_GENERAL in gyp.debug:
-    DebugOutput(DEBUG_GENERAL, "generator_flags: %s", generator_flags)
-
-  # TODO: Remove this and the option after we've gotten folks to move to the
-  # generator flag.
-  if options.msvs_version:
-    print >>sys.stderr, \
-      'DEPRECATED: Use generator flag (-G msvs_version=' + \
-      options.msvs_version + ') instead of --msvs-version=' + \
-      options.msvs_version
-    generator_flags['msvs_version'] = options.msvs_version
-
-  # Generate all requested formats (use a set in case we got one format request
-  # twice)
-  for format in set(options.formats):
-    params = {'options': options,
-              'build_files': build_files,
-              'generator_flags': generator_flags,
-              'cwd': os.getcwd(),
-              'build_files_arg': build_files_arg,
-              'gyp_binary': sys.argv[0],
-              'home_dot_gyp': home_dot_gyp,
-              'parallel': options.parallel}
-
-    # Start with the default variables from the command line.
-    [generator, flat_list, targets, data] = Load(build_files, format,
-                                                 cmdline_default_variables,
-                                                 includes, options.depth,
-                                                 params, options.check,
-                                                 options.circular_check)
-
-    # TODO(mark): Pass |data| for now because the generator needs a list of
-    # build files that came in.  In the future, maybe it should just accept
-    # a list, and not the whole data dict.
-    # NOTE: flat_list is the flattened dependency graph specifying the order
-    # that targets may be built.  Build systems that operate serially or that
-    # need to have dependencies defined before dependents reference them should
-    # generate targets in the order specified in flat_list.
-    generator.GenerateOutput(flat_list, targets, data, params)
-
-    if options.configs:
-      valid_configs = targets[flat_list[0]]['configurations'].keys()
-      for conf in options.configs:
-        if conf not in valid_configs:
-          raise GypError('Invalid config specified via --build: %s' % conf)
-      generator.PerformBuild(data, options.configs, params)
-
-  # Done
-  return 0
-
-
-def main(args):
-  try:
-    return gyp_main(args)
-  except GypError, e:
-    sys.stderr.write("gyp: %s\n" % e)
-    return 1
-
-if __name__ == '__main__':
-  sys.exit(main(sys.argv[1:]))
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/common.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,491 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-from __future__ import with_statement
-
-import errno
-import filecmp
-import os.path
-import re
-import tempfile
-import sys
-
-
-# A minimal memoizing decorator. It'll blow up if the args aren't immutable,
-# among other "problems".
-class memoize(object):
-  def __init__(self, func):
-    self.func = func
-    self.cache = {}
-  def __call__(self, *args):
-    try:
-      return self.cache[args]
-    except KeyError:
-      result = self.func(*args)
-      self.cache[args] = result
-      return result
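The memoize decorator above caches results keyed by the positional-argument tuple, so arguments must be hashable. A minimal usage sketch, restated in Python 3; `slow_double` and `calls` are illustrative names, not part of gyp:

```python
class memoize(object):
    """Cache results keyed by the (hashable) positional arguments."""
    def __init__(self, func):
        self.func = func
        self.cache = {}

    def __call__(self, *args):
        try:
            return self.cache[args]
        except KeyError:
            result = self.func(*args)
            self.cache[args] = result
            return result

calls = []  # records how often the wrapped function body actually runs

@memoize
def slow_double(x):
    calls.append(x)
    return x * 2

slow_double(21)
slow_double(21)  # second call is served from the cache
print(calls)     # -> [21]
```
Note that keyword arguments are not supported: they never reach the `args` tuple used as the cache key, which is one of the "problems" the comment above alludes to.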
-
-
-class GypError(Exception):
-  """Error class for errors that should be presented to the user.
-  The main entry point will catch and display these.
-  """
-  pass
-
-
-def ExceptionAppend(e, msg):
-  """Append a message to the given exception's message."""
-  if not e.args:
-    e.args = (msg,)
-  elif len(e.args) == 1:
-    e.args = (str(e.args[0]) + ' ' + msg,)
-  else:
-    e.args = (str(e.args[0]) + ' ' + msg,) + e.args[1:]
-
-
-def ParseQualifiedTarget(target):
-  # Splits a qualified target into a build file, target name and toolset.
-
-  # NOTE: rsplit is used to disambiguate the Windows drive letter separator.
-  target_split = target.rsplit(':', 1)
-  if len(target_split) == 2:
-    [build_file, target] = target_split
-  else:
-    build_file = None
-
-  target_split = target.rsplit('#', 1)
-  if len(target_split) == 2:
-    [target, toolset] = target_split
-  else:
-    toolset = None
-
-  return [build_file, target, toolset]
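The parsing logic can be restated as a standalone Python 3 sketch, showing why `rsplit` matters for Windows drive-letter paths (the function name is illustrative, not gyp's API):

```python
def parse_qualified_target(target):
    """Split 'build_file:target#toolset'; missing parts come back as None."""
    build_file = None
    toolset = None
    # rsplit, so the colon after a Windows drive letter ('c:/foo.gyp')
    # is never mistaken for the build-file/target separator.
    parts = target.rsplit(':', 1)
    if len(parts) == 2:
        build_file, target = parts
    parts = target.rsplit('#', 1)
    if len(parts) == 2:
        target, toolset = parts
    return [build_file, target, toolset]

print(parse_qualified_target('/path/to/file.gyp:target_name#toolset'))
# -> ['/path/to/file.gyp', 'target_name', 'toolset']
```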
-
-
-def ResolveTarget(build_file, target, toolset):
-  # This function resolves a target into a canonical form:
-  # - a fully defined build file, either absolute or relative to the current
-  # directory
-  # - a target name
-  # - a toolset
-  #
-  # build_file is the file relative to which 'target' is defined.
-  # target is the qualified target.
-  # toolset is the default toolset for that target.
-  [parsed_build_file, target, parsed_toolset] = ParseQualifiedTarget(target)
-
-  if parsed_build_file:
-    if build_file:
-      # If a relative path, parsed_build_file is relative to the directory
-      # containing build_file.  If build_file is not in the current directory,
-      # parsed_build_file is not a usable path as-is.  Resolve it by
-      # interpreting it as relative to build_file.  If parsed_build_file is
-      # absolute, it is usable as a path regardless of the current directory,
-      # and os.path.join will return it as-is.
-      build_file = os.path.normpath(os.path.join(os.path.dirname(build_file),
-                                                 parsed_build_file))
-      # Further (to handle cases like ../cwd), make it relative to cwd.
-      if not os.path.isabs(build_file):
-        build_file = RelativePath(build_file, '.')
-    else:
-      build_file = parsed_build_file
-
-  if parsed_toolset:
-    toolset = parsed_toolset
-
-  return [build_file, target, toolset]
-
-
-def BuildFile(fully_qualified_target):
-  # Extracts the build file from the fully qualified target.
-  return ParseQualifiedTarget(fully_qualified_target)[0]
-
-
-def GetEnvironFallback(var_list, default):
-  """Look up a key in the environment, with fallback to secondary keys
-  and finally falling back to a default value."""
-  for var in var_list:
-    if var in os.environ:
-      return os.environ[var]
-  return default
-
-
-def QualifiedTarget(build_file, target, toolset):
-  # "Qualified" means the file that a target was defined in and the target
-  # name, separated by a colon, suffixed by a # and the toolset name:
-  # /path/to/file.gyp:target_name#toolset
-  fully_qualified = build_file + ':' + target
-  if toolset:
-    fully_qualified = fully_qualified + '#' + toolset
-  return fully_qualified
-
-
-@memoize
-def RelativePath(path, relative_to):
-  # Assuming both |path| and |relative_to| are relative to the current
-  # directory, returns a relative path that identifies path relative to
-  # relative_to.
-
-  # Convert to normalized (and therefore absolute paths).
-  path = os.path.realpath(path)
-  relative_to = os.path.realpath(relative_to)
-
-  # Split the paths into components.
-  path_split = path.split(os.path.sep)
-  relative_to_split = relative_to.split(os.path.sep)
-
-  # Determine how much of the prefix the two paths share.
-  prefix_len = len(os.path.commonprefix([path_split, relative_to_split]))
-
-  # Put enough ".." components to back up out of relative_to to the common
-  # prefix, and then append the part of path_split after the common prefix.
-  relative_split = [os.path.pardir] * (len(relative_to_split) - prefix_len) + \
-                   path_split[prefix_len:]
-
-  if len(relative_split) == 0:
-    # The paths were the same.
-    return ''
-
-  # Turn it back into a string and we're done.
-  return os.path.join(*relative_split)
-
-
-@memoize
-def InvertRelativePath(path, toplevel_dir=None):
-  """Given a path like foo/bar that is relative to toplevel_dir, return
-  the inverse relative path back to the toplevel_dir.
-
-  E.g. os.path.normpath(os.path.join(path, InvertRelativePath(path)))
-  should always produce '.', unless the path contains symlinks.
-  """
-  if not path:
-    return path
-  toplevel_dir = '.' if toplevel_dir is None else toplevel_dir
-  return RelativePath(toplevel_dir, os.path.join(toplevel_dir, path))
-
-
-def FixIfRelativePath(path, relative_to):
-  # Like RelativePath but returns |path| unchanged if it is absolute.
-  if os.path.isabs(path):
-    return path
-  return RelativePath(path, relative_to)
-
-
-def UnrelativePath(path, relative_to):
-  # Assuming that |relative_to| is relative to the current directory, and |path|
-  # is a path relative to the dirname of |relative_to|, returns a path that
-  # identifies |path| relative to the current directory.
-  rel_dir = os.path.dirname(relative_to)
-  return os.path.normpath(os.path.join(rel_dir, path))
-
-
-# re objects used by EncodePOSIXShellArgument.  See IEEE 1003.1 XCU.2.2 at
-# http://www.opengroup.org/onlinepubs/009695399/utilities/xcu_chap02.html#tag_02_02
-# and the documentation for various shells.
-
-# _quote is a pattern that should match any argument that needs to be quoted
-# with double-quotes by EncodePOSIXShellArgument.  It matches the following
-# characters appearing anywhere in an argument:
-#   \t, \n, space  parameter separators
-#   #              comments
-#   $              expansions (quoted to always expand within one argument)
-#   %              called out by IEEE 1003.1 XCU.2.2
-#   &              job control
-#   '              quoting
-#   (, )           subshell execution
-#   *, ?, [        pathname expansion
-#   ;              command delimiter
-#   <, >, |        redirection
-#   =              assignment
-#   {, }           brace expansion (bash)
-#   ~              tilde expansion
-# It also matches the empty string, because "" (or '') is the only way to
-# represent an empty string literal argument to a POSIX shell.
-#
-# This does not match the characters in _escape, because those need to be
-# backslash-escaped regardless of whether they appear in a double-quoted
-# string.
-_quote = re.compile('[\t\n #$%&\'()*;<=>?[{|}~]|^$')
-
-# _escape is a pattern that should match any character that needs to be
-# escaped with a backslash, whether or not the argument matched the _quote
-# pattern.  _escape is used with re.sub to backslash anything in _escape's
-# first match group, hence the (parentheses) in the regular expression.
-#
-# _escape matches the following characters appearing anywhere in an argument:
-#   "  to prevent POSIX shells from interpreting this character for quoting
-#   \  to prevent POSIX shells from interpreting this character for escaping
-#   `  to prevent POSIX shells from interpreting this character for command
-#      substitution
-# Missing from this list is $, because the desired behavior of
-# EncodePOSIXShellArgument is to permit parameter (variable) expansion.
-#
-# Also missing from this list is !, which bash will interpret as the history
-# expansion character when history is enabled.  bash does not enable history
-# by default in non-interactive shells, so this is not thought to be a problem.
-# ! was omitted from this list because bash interprets "\!" as a literal string
-# including the backslash character (avoiding history expansion but retaining
-# the backslash), which would not be correct for argument encoding.  Handling
-# this case properly would also be problematic because bash allows the history
-# character to be changed with the histchars shell variable.  Fortunately,
-# as history is not enabled in non-interactive shells and
-# EncodePOSIXShellArgument is only expected to encode for non-interactive
-# shells, there is no room for error here by ignoring !.
-_escape = re.compile(r'(["\\`])')
-
-def EncodePOSIXShellArgument(argument):
-  """Encodes |argument| suitably for consumption by POSIX shells.
-
-  argument may be quoted and escaped as necessary to ensure that POSIX shells
-  treat the returned value as a literal representing the argument passed to
-  this function.  Parameter (variable) expansions beginning with $ are allowed
-  to remain intact without escaping the $, to allow the argument to contain
-  references to variables to be expanded by the shell.
-  """
-
-  if not isinstance(argument, str):
-    argument = str(argument)
-
-  if _quote.search(argument):
-    quote = '"'
-  else:
-    quote = ''
-
-  encoded = quote + re.sub(_escape, r'\\\1', argument) + quote
-
-  return encoded
-
-
-def EncodePOSIXShellList(list):
-  """Encodes |list| suitably for consumption by POSIX shells.
-
-  Returns EncodePOSIXShellArgument for each item in list, and joins them
-  together using the space character as an argument separator.
-  """
-
-  encoded_arguments = []
-  for argument in list:
-    encoded_arguments.append(EncodePOSIXShellArgument(argument))
-  return ' '.join(encoded_arguments)
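A runnable Python 3 sketch of the quoting scheme, using the same `_quote`/`_escape` patterns described above (the snake_case name is illustrative):

```python
import re

# _quote decides whether the whole argument needs double quotes;
# _escape backslash-protects ", \ and ` even inside quotes.
_quote = re.compile('[\t\n #$%&\'()*;<=>?[{|}~]|^$')
_escape = re.compile(r'(["\\`])')

def encode_posix_shell_argument(argument):
    argument = str(argument)
    quote = '"' if _quote.search(argument) else ''
    return quote + _escape.sub(r'\\\1', argument) + quote

print(encode_posix_shell_argument('hello'))        # -> hello
print(encode_posix_shell_argument('hello world'))  # -> "hello world"
print(encode_posix_shell_argument('say "hi"'))     # -> "say \"hi\""
```
Because `$` is deliberately neither quoted away nor escaped, `$HOME`-style parameter expansions survive and are expanded by the shell, which is the documented intent of EncodePOSIXShellArgument.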
-
-
-def DeepDependencyTargets(target_dicts, roots):
-  """Returns the recursive list of target dependencies."""
-  dependencies = set()
-  pending = set(roots)
-  while pending:
-    # Pluck out one.
-    r = pending.pop()
-    # Skip if visited already.
-    if r in dependencies:
-      continue
-    # Add it.
-    dependencies.add(r)
-    # Add its children.
-    spec = target_dicts[r]
-    pending.update(set(spec.get('dependencies', [])))
-    pending.update(set(spec.get('dependencies_original', [])))
-  return list(dependencies - set(roots))
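The worklist traversal can be exercised on a toy target dict (Python 3 sketch; the `dependencies_original` key and gyp's qualified-target names are omitted for brevity, and the result is sorted here only to make it deterministic):

```python
def deep_dependency_targets(target_dicts, roots):
    """Collect every target reachable from |roots|, excluding the roots."""
    dependencies = set()
    pending = set(roots)
    while pending:
        r = pending.pop()
        if r in dependencies:
            continue  # already visited
        dependencies.add(r)
        pending.update(target_dicts[r].get('dependencies', []))
    return sorted(dependencies - set(roots))

targets = {
    'app':  {'dependencies': ['lib']},
    'lib':  {'dependencies': ['base']},
    'base': {'dependencies': []},
}
print(deep_dependency_targets(targets, ['app']))  # -> ['base', 'lib']
```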
-
-
-def BuildFileTargets(target_list, build_file):
-  """From a target_list, returns the subset from the specified build_file.
-  """
-  return [p for p in target_list if BuildFile(p) == build_file]
-
-
-def AllTargets(target_list, target_dicts, build_file):
-  """Returns all targets (direct and dependencies) for the specified build_file.
-  """
-  bftargets = BuildFileTargets(target_list, build_file)
-  deptargets = DeepDependencyTargets(target_dicts, bftargets)
-  return bftargets + deptargets
-
-
-def WriteOnDiff(filename):
-  """Write to a file only if the new contents differ.
-
-  Arguments:
-    filename: name of the file to potentially write to.
-  Returns:
-    A file like object which will write to temporary file and only overwrite
-    the target if it differs (on close).
-  """
-
-  class Writer:
-    """Wrapper around file that only overwrites the target if it differs."""
-    def __init__(self):
-      # Pick temporary file.
-      tmp_fd, self.tmp_path = tempfile.mkstemp(
-          suffix='.tmp',
-          prefix=os.path.split(filename)[1] + '.gyp.',
-          dir=os.path.split(filename)[0])
-      try:
-        self.tmp_file = os.fdopen(tmp_fd, 'wb')
-      except Exception:
-        # Don't leave turds behind.
-        os.unlink(self.tmp_path)
-        raise
-
-    def __getattr__(self, attrname):
-      # Delegate everything else to self.tmp_file
-      return getattr(self.tmp_file, attrname)
-
-    def close(self):
-      try:
-        # Close tmp file.
-        self.tmp_file.close()
-        # Determine if different.
-        same = False
-        try:
-          same = filecmp.cmp(self.tmp_path, filename, False)
-        except OSError, e:
-          if e.errno != errno.ENOENT:
-            raise
-
-        if same:
-          # The new file is identical to the old one, just get rid of the new
-          # one.
-          os.unlink(self.tmp_path)
-        else:
-          # The new file is different from the old one, or there is no old one.
-          # Rename the new file to the permanent name.
-          #
-          # tempfile.mkstemp uses an overly restrictive mode, resulting in a
-          # file that can only be read by the owner, regardless of the umask.
-          # There's no reason to not respect the umask here, which means that
-          # an extra hoop is required to fetch it and reset the new file's mode.
-          #
-          # No way to get the umask without setting a new one?  Set a safe one
-          # and then set it back to the old value.
-          umask = os.umask(077)
-          os.umask(umask)
-          os.chmod(self.tmp_path, 0666 & ~umask)
-          if sys.platform == 'win32' and os.path.exists(filename):
-            # NOTE: on windows (but not cygwin) rename will not replace an
-            # existing file, so it must be preceded with a remove. Sadly there
-            # is no way to make the switch atomic.
-            os.remove(filename)
-          os.rename(self.tmp_path, filename)
-      except Exception:
-        # Don't leave turds behind.
-        os.unlink(self.tmp_path)
-        raise
-
-  return Writer()
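The write-to-temp-then-rename pattern above, reduced to a Python 3 sketch. This version uses `os.replace` and skips the umask/chmod dance and the Windows pre-remove that WriteOnDiff performs; the function name is illustrative:

```python
import filecmp
import os
import tempfile

def write_on_diff(filename, new_contents):
    """Write new_contents to filename only if it would actually change it."""
    dirname, basename = os.path.split(os.path.abspath(filename))
    # Temp file in the same directory, so the final rename stays on one
    # filesystem.
    fd, tmp_path = tempfile.mkstemp(prefix=basename + '.tmp.', dir=dirname)
    with os.fdopen(fd, 'w') as tmp_file:
        tmp_file.write(new_contents)
    if os.path.exists(filename) and filecmp.cmp(tmp_path, filename, False):
        os.unlink(tmp_path)         # identical: discard, keep the old mtime
        return False
    os.replace(tmp_path, filename)  # atomic on POSIX; replaces on Windows
    return True
```
Keeping the mtime untouched when nothing changed is the whole point: downstream build systems use timestamps to decide what to rebuild.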
-
-
-def GetFlavor(params):
-  """Returns |params.flavor| if set, otherwise the system's default flavor."""
-  flavors = {
-    'cygwin': 'win',
-    'win32': 'win',
-    'darwin': 'mac',
-  }
-
-  if 'flavor' in params:
-    return params['flavor']
-  if sys.platform in flavors:
-    return flavors[sys.platform]
-  if sys.platform.startswith('sunos'):
-    return 'solaris'
-  if sys.platform.startswith('freebsd'):
-    return 'freebsd'
-  if sys.platform.startswith('openbsd'):
-    return 'openbsd'
-  if sys.platform.startswith('aix'):
-    return 'aix'
-
-  return 'linux'
-
-
-def CopyTool(flavor, out_path):
-  """Finds (mac|sun|win)_tool.gyp in the gyp directory and copies it
-  to |out_path|."""
-  prefix = { 'solaris': 'sun', 'mac': 'mac', 'win': 'win' }.get(flavor, None)
-  if not prefix:
-    return
-
-  # Slurp input file.
-  source_path = os.path.join(
-      os.path.dirname(os.path.abspath(__file__)), '%s_tool.py' % prefix)
-  with open(source_path) as source_file:
-    source = source_file.readlines()
-
-  # Add header and write it out.
-  tool_path = os.path.join(out_path, 'gyp-%s-tool' % prefix)
-  with open(tool_path, 'w') as tool_file:
-    tool_file.write(
-        ''.join([source[0], '# Generated by gyp. Do not edit.\n'] + source[1:]))
-
-  # Make file executable.
-  os.chmod(tool_path, 0755)
-
-
-# From Alex Martelli,
-# http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/52560
-# ASPN: Python Cookbook: Remove duplicates from a sequence
-# First comment, dated 2001/10/13.
-# (Also in the printed Python Cookbook.)
-
-def uniquer(seq, idfun=None):
-    if idfun is None:
-        idfun = lambda x: x
-    seen = {}
-    result = []
-    for item in seq:
-        marker = idfun(item)
-        if marker in seen: continue
-        seen[marker] = 1
-        result.append(item)
-    return result
-
-
-class CycleError(Exception):
-  """An exception raised when an unexpected cycle is detected."""
-  def __init__(self, nodes):
-    self.nodes = nodes
-  def __str__(self):
-    return 'CycleError: cycle involving: ' + str(self.nodes)
-
-
-def TopologicallySorted(graph, get_edges):
-  """Topologically sort based on a user provided edge definition.
-
-  Args:
-    graph: A list of node names.
-    get_edges: A function mapping from node name to a hashable collection
-               of node names which this node has outgoing edges to.
-  Returns:
-    A list containing all of the nodes in graph in topological order.
-    It is assumed that calling get_edges once for each node and caching is
-    cheaper than repeatedly calling get_edges.
-  Raises:
-    CycleError in the event of a cycle.
-  Example:
-    graph = {'a': '$(b) $(c)', 'b': 'hi', 'c': '$(b)'}
-    def GetEdges(node):
-      return re.findall(r'\$\(([^)]*)\)', graph[node])
-    print TopologicallySorted(graph.keys(), GetEdges)
-    ==>
-    ['a', 'c', 'b']
-  """
-  get_edges = memoize(get_edges)
-  visited = set()
-  visiting = set()
-  ordered_nodes = []
-  def Visit(node):
-    if node in visiting:
-      raise CycleError(visiting)
-    if node in visited:
-      return
-    visited.add(node)
-    visiting.add(node)
-    for neighbor in get_edges(node):
-      Visit(neighbor)
-    visiting.remove(node)
-    ordered_nodes.insert(0, node)
-  for node in sorted(graph):
-    Visit(node)
-  return ordered_nodes
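A Python 3 restatement of the depth-first sort above, exercised on a small `$(var)` dependency graph. `CycleError` is replaced by a plain `ValueError` to keep the sketch self-contained:

```python
import re

def topologically_sorted(graph, get_edges):
    """Depth-first post-order; each finished node is prepended, as above."""
    visited, visiting, ordered = set(), set(), []
    def visit(node):
        if node in visiting:
            raise ValueError('cycle involving ' + str(sorted(visiting)))
        if node in visited:
            return
        visited.add(node)
        visiting.add(node)
        for neighbor in get_edges(node):
            visit(neighbor)
        visiting.remove(node)
        ordered.insert(0, node)
    for node in sorted(graph):
        visit(node)
    return ordered

graph = {'a': '$(b) $(c)', 'b': 'hi', 'c': '$(b)'}
def get_edges(node):
    return re.findall(r'\$\(([^)]*)\)', graph[node])

print(topologically_sorted(list(graph), get_edges))  # -> ['a', 'c', 'b']
```
Iterating `sorted(graph)` makes the output deterministic across runs, which matters for generators that want reproducible build files.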
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/common_test.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,72 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Unit tests for the common.py file."""
-
-import gyp.common
-import unittest
-import sys
-
-
-class TestTopologicallySorted(unittest.TestCase):
-  def test_Valid(self):
-    """Test that sorting works on a valid graph with one possible order."""
-    graph = {
-        'a': ['b', 'c'],
-        'b': [],
-        'c': ['d'],
-        'd': ['b'],
-        }
-    def GetEdge(node):
-      return tuple(graph[node])
-    self.assertEqual(
-      gyp.common.TopologicallySorted(graph.keys(), GetEdge),
-      ['a', 'c', 'd', 'b'])
-
-  def test_Cycle(self):
-    """Test that an exception is thrown on a cyclic graph."""
-    graph = {
-        'a': ['b'],
-        'b': ['c'],
-        'c': ['d'],
-        'd': ['a'],
-        }
-    def GetEdge(node):
-      return tuple(graph[node])
-    self.assertRaises(
-      gyp.common.CycleError, gyp.common.TopologicallySorted,
-      graph.keys(), GetEdge)
-
-
-class TestGetFlavor(unittest.TestCase):
-  """Test that gyp.common.GetFlavor works as intended"""
-  original_platform = ''
-
-  def setUp(self):
-    self.original_platform = sys.platform
-
-  def tearDown(self):
-    sys.platform = self.original_platform
-
-  def assertFlavor(self, expected, argument, param):
-    sys.platform = argument
-    self.assertEqual(expected, gyp.common.GetFlavor(param))
-
-  def test_platform_default(self):
-    self.assertFlavor('freebsd', 'freebsd9' , {})
-    self.assertFlavor('freebsd', 'freebsd10', {})
-    self.assertFlavor('openbsd', 'openbsd5' , {})
-    self.assertFlavor('solaris', 'sunos5'   , {})
-    self.assertFlavor('solaris', 'sunos'    , {})
-    self.assertFlavor('linux'  , 'linux2'   , {})
-    self.assertFlavor('linux'  , 'linux3'   , {})
-
-  def test_param(self):
-    self.assertFlavor('foobar', 'linux2' , {'flavor': 'foobar'})
-
-
-if __name__ == '__main__':
-  unittest.main()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/easy_xml.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,157 +0,0 @@
-# Copyright (c) 2011 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import re
-import os
-
-
-def XmlToString(content, encoding='utf-8', pretty=False):
-  """ Converts the structured content into an XML string.
-
-  Visual Studio files have a lot of pre-defined structures.  This function makes
-  it easy to represent these structures as Python data structures, instead of
-  having to create a lot of function calls.
-
-  Each XML element of the content is represented as a list composed of:
-  1. The name of the element, a string,
-  2. The attributes of the element, a dictionary (optional), and
-  3+. The content of the element, if any.  Strings are simple text nodes and
-      lists are child elements.
-
-  Example 1:
-      <test/>
-  becomes
-      ['test']
-
-  Example 2:
-      <myelement a='value1' b='value2'>
-         <childtype>This is</childtype>
-         <childtype>it!</childtype>
-      </myelement>
-
-  becomes
-      ['myelement', {'a':'value1', 'b':'value2'},
-         ['childtype', 'This is'],
-         ['childtype', 'it!'],
-      ]
-
-  Args:
-    content:  The structured content to be converted.
-    encoding: The encoding to report on the first XML line.
-    pretty: True if we want pretty printing with indents and new lines.
-
-  Returns:
-    The XML content as a string.
-  """
-  # We create a huge list of all the elements of the file.
-  xml_parts = ['<?xml version="1.0" encoding="%s"?>' % encoding]
-  if pretty:
-    xml_parts.append('\n')
-  _ConstructContentList(xml_parts, content, pretty)
-
-  # Convert it to a string
-  return ''.join(xml_parts)
-
-
-def _ConstructContentList(xml_parts, specification, pretty, level=0):
-  """ Appends the XML parts corresponding to the specification.
-
-  Args:
-    xml_parts: A list of XML parts to be appended to.
-    specification:  The specification of the element.  See EasyXml docs.
-    pretty: True if we want pretty printing with indents and new lines.
-    level: Indentation level.
-  """
-  # The first item in a specification is the name of the element.
-  if pretty:
-    indentation = '  ' * level
-    new_line = '\n'
-  else:
-    indentation = ''
-    new_line = ''
-  name = specification[0]
-  if not isinstance(name, str):
-    raise Exception('The first item of an EasyXml specification should be '
-                    'a string.  Specification was ' + str(specification))
-  xml_parts.append(indentation + '<' + name)
-
-  # Optionally in second position is a dictionary of the attributes.
-  rest = specification[1:]
-  if rest and isinstance(rest[0], dict):
-    for at, val in sorted(rest[0].iteritems()):
-      xml_parts.append(' %s="%s"' % (at, _XmlEscape(val, attr=True)))
-    rest = rest[1:]
-  if rest:
-    xml_parts.append('>')
-    all_strings = reduce(lambda x, y: x and isinstance(y, str), rest, True)
-    multi_line = not all_strings
-    if multi_line and new_line:
-      xml_parts.append(new_line)
-    for child_spec in rest:
-      # If it's a string, append a text node.
-      # Otherwise recurse over that child definition
-      if isinstance(child_spec, str):
-        xml_parts.append(_XmlEscape(child_spec))
-      else:
-        _ConstructContentList(xml_parts, child_spec, pretty, level + 1)
-    if multi_line and indentation:
-      xml_parts.append(indentation)
-    xml_parts.append('</%s>%s' % (name, new_line))
-  else:
-    xml_parts.append('/>%s' % new_line)
-
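The two functions above turn the nested-list specification into XML text. A minimal standalone sketch of that conversion (re-implemented for Python 3, since the deleted module targets Python 2, and omitting the escaping and pretty-printing the real code performs):

```python
# Element name first, optional attribute dict second, then string text nodes
# and nested lists as child elements -- the same shape as the EasyXml docs.
def spec_to_xml(spec):
    name = spec[0]
    rest = list(spec[1:])
    attrs = ''
    if rest and isinstance(rest[0], dict):
        # Attributes are sorted, matching the deleted code's sorted() call.
        attrs = ''.join(' %s="%s"' % kv for kv in sorted(rest.pop(0).items()))
    if not rest:
        return '<%s%s/>' % (name, attrs)  # self-closing element
    body = ''.join(c if isinstance(c, str) else spec_to_xml(c) for c in rest)
    return '<%s%s>%s</%s>' % (name, attrs, body, name)

print(spec_to_xml(['myelement', {'a': 'value1'},
                   ['childtype', 'This is'],
                   ['childtype', 'it!']]))
```

Unlike `_ConstructContentList`, this sketch builds strings directly rather than appending parts to a shared list, and it does not escape text, so it is only for illustrating the specification format.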
-
-def WriteXmlIfChanged(content, path, encoding='utf-8', pretty=False,
-                      win32=False):
-  """ Writes the XML content to disk, touching the file only if it has changed.
-
-  Args:
-    content:  The structured content to be written.
-    path: Location of the file.
-    encoding: The encoding to report on the first line of the XML file.
-    pretty: True if we want pretty printing with indents and new lines.
-  """
-  xml_string = XmlToString(content, encoding, pretty)
-  if win32 and os.linesep != '\r\n':
-    xml_string = xml_string.replace('\n', '\r\n')
-
-  # Get the old content
-  try:
-    f = open(path, 'r')
-    existing = f.read()
-    f.close()
-  except IOError:
-    existing = None
-
-  # It has changed, write it
-  if existing != xml_string:
-    f = open(path, 'w')
-    f.write(xml_string)
-    f.close()
-
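The body of `WriteXmlIfChanged` follows a common generator pattern: rewrite the file only when its contents differ, so the mtime (and anything a build system keys off it) is left alone on a no-op regeneration. A self-contained Python 3 sketch of that pattern, with an illustrative temp path:

```python
import os
import tempfile

def write_if_changed(path, data):
    try:
        with open(path, 'r') as f:
            existing = f.read()
    except IOError:               # file missing (or unreadable): treat as changed
        existing = None
    if existing == data:
        return False              # identical content: leave mtime untouched
    with open(path, 'w') as f:
        f.write(data)
    return True                   # file was created or rewritten

p = os.path.join(tempfile.mkdtemp(), 'out.xml')   # hypothetical output path
print(write_if_changed(p, '<x/>'), write_if_changed(p, '<x/>'))
```

The second call returns `False` because nothing changed, which is exactly what keeps downstream make rules from re-firing.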
-
-_xml_escape_map = {
-    '"': '&quot;',
-    "'": '&apos;',
-    '<': '&lt;',
-    '>': '&gt;',
-    '&': '&amp;',
-    '\n': '&#xA;',
-    '\r': '&#xD;',
-}
-
-
-_xml_escape_re = re.compile(
-    "(%s)" % "|".join(map(re.escape, _xml_escape_map.keys())))
-
-
-def _XmlEscape(value, attr=False):
-  """ Escape a string for inclusion in XML."""
-  def replace(match):
-    m = match.string[match.start() : match.end()]
-    # don't replace single quotes in attrs
-    if attr and m == "'":
-      return m
-    return _xml_escape_map[m]
-  return _xml_escape_re.sub(replace, value)
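The `attr=True` path above deliberately leaves single quotes alone, since the generated attributes are double-quoted. A standalone Python 3 sketch of the same escaping behaviour (the deleted module itself is Python 2):

```python
import re

_escape_map = {
    '"': '&quot;', "'": '&apos;', '<': '&lt;', '>': '&gt;',
    '&': '&amp;', '\n': '&#xA;', '\r': '&#xD;',
}
_escape_re = re.compile('(%s)' % '|'.join(map(re.escape, _escape_map)))

def xml_escape(value, attr=False):
    def replace(match):
        m = match.group(0)
        if attr and m == "'":
            return m  # single quotes survive inside double-quoted attributes
        return _escape_map[m]
    return _escape_re.sub(replace, value)

print(xml_escape('<a & b>'))           # text-node escaping
print(xml_escape("it's", attr=True))   # apostrophe kept in attribute context
```

Newlines and carriage returns are escaped to character references so they survive round-tripping through XML parsers that normalize attribute whitespace.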
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/easy_xml_test.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,103 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright (c) 2011 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-""" Unit tests for the easy_xml.py file. """
-
-import gyp.easy_xml as easy_xml
-import unittest
-import StringIO
-
-
-class TestSequenceFunctions(unittest.TestCase):
-
-  def setUp(self):
-    self.stderr = StringIO.StringIO()
-
-  def test_EasyXml_simple(self):
-    self.assertEqual(
-      easy_xml.XmlToString(['test']),
-      '<?xml version="1.0" encoding="utf-8"?><test/>')
-
-    self.assertEqual(
-      easy_xml.XmlToString(['test'], encoding='Windows-1252'),
-      '<?xml version="1.0" encoding="Windows-1252"?><test/>')
-
-  def test_EasyXml_simple_with_attributes(self):
-    self.assertEqual(
-      easy_xml.XmlToString(['test2', {'a': 'value1', 'b': 'value2'}]),
-      '<?xml version="1.0" encoding="utf-8"?><test2 a="value1" b="value2"/>')
-
-  def test_EasyXml_escaping(self):
-    original = '<test>\'"\r&\nfoo'
-    converted = '&lt;test&gt;\'&quot;&#xD;&amp;&#xA;foo'
-    converted_apos = converted.replace("'", '&apos;')
-    self.assertEqual(
-      easy_xml.XmlToString(['test3', {'a': original}, original]),
-      '<?xml version="1.0" encoding="utf-8"?><test3 a="%s">%s</test3>' %
-      (converted, converted_apos))
-
-  def test_EasyXml_pretty(self):
-    self.assertEqual(
-      easy_xml.XmlToString(
-          ['test3',
-            ['GrandParent',
-              ['Parent1',
-                ['Child']
-              ],
-              ['Parent2']
-            ]
-          ],
-          pretty=True),
-      '<?xml version="1.0" encoding="utf-8"?>\n'
-      '<test3>\n'
-      '  <GrandParent>\n'
-      '    <Parent1>\n'
-      '      <Child/>\n'
-      '    </Parent1>\n'
-      '    <Parent2/>\n'
-      '  </GrandParent>\n'
-      '</test3>\n')
-
-
-  def test_EasyXml_complex(self):
-    # We want to create:
-    target = (
-      '<?xml version="1.0" encoding="utf-8"?>'
-      '<Project>'
-        '<PropertyGroup Label="Globals">'
-          '<ProjectGuid>{D2250C20-3A94-4FB9-AF73-11BC5B73884B}</ProjectGuid>'
-          '<Keyword>Win32Proj</Keyword>'
-          '<RootNamespace>automated_ui_tests</RootNamespace>'
-        '</PropertyGroup>'
-        '<Import Project="$(VCTargetsPath)\\Microsoft.Cpp.props"/>'
-        '<PropertyGroup '
-            'Condition="\'$(Configuration)|$(Platform)\'=='
-                       '\'Debug|Win32\'" Label="Configuration">'
-          '<ConfigurationType>Application</ConfigurationType>'
-          '<CharacterSet>Unicode</CharacterSet>'
-        '</PropertyGroup>'
-      '</Project>')
-
-    xml = easy_xml.XmlToString(
-        ['Project',
-          ['PropertyGroup', {'Label': 'Globals'},
-            ['ProjectGuid', '{D2250C20-3A94-4FB9-AF73-11BC5B73884B}'],
-            ['Keyword', 'Win32Proj'],
-            ['RootNamespace', 'automated_ui_tests']
-          ],
-          ['Import', {'Project': '$(VCTargetsPath)\\Microsoft.Cpp.props'}],
-          ['PropertyGroup',
-            {'Condition': "'$(Configuration)|$(Platform)'=='Debug|Win32'",
-             'Label': 'Configuration'},
-            ['ConfigurationType', 'Application'],
-            ['CharacterSet', 'Unicode']
-          ]
-        ])
-    self.assertEqual(xml, target)
-
-
-if __name__ == '__main__':
-  unittest.main()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/android.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1099 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# Notes:
-#
-# This generates makefiles suitable for inclusion into the Android build system
-# via an Android.mk file. It is based on make.py, the standard makefile
-# generator.
-#
-# The code below generates a separate .mk file for each target, but
-# all are sourced by the top-level GypAndroid.mk.  This means that all
-# variables in .mk-files clobber one another, and furthermore that any
-# variables set potentially clash with other Android build system variables.
-# Try to avoid setting global variables where possible.
-
-import gyp
-import gyp.common
-import gyp.generator.make as make  # Reuse global functions from make backend.
-import os
-import re
-import subprocess
-
-generator_default_variables = {
-  'OS': 'android',
-  'EXECUTABLE_PREFIX': '',
-  'EXECUTABLE_SUFFIX': '',
-  'STATIC_LIB_PREFIX': 'lib',
-  'SHARED_LIB_PREFIX': 'lib',
-  'STATIC_LIB_SUFFIX': '.a',
-  'SHARED_LIB_SUFFIX': '.so',
-  'INTERMEDIATE_DIR': '$(gyp_intermediate_dir)',
-  'SHARED_INTERMEDIATE_DIR': '$(gyp_shared_intermediate_dir)',
-  'PRODUCT_DIR': '$(gyp_shared_intermediate_dir)',
-  'SHARED_LIB_DIR': '$(builddir)/lib.$(TOOLSET)',
-  'LIB_DIR': '$(obj).$(TOOLSET)',
-  'RULE_INPUT_ROOT': '%(INPUT_ROOT)s',  # This gets expanded by Python.
-  'RULE_INPUT_DIRNAME': '%(INPUT_DIRNAME)s',  # This gets expanded by Python.
-  'RULE_INPUT_PATH': '$(RULE_SOURCES)',
-  'RULE_INPUT_EXT': '$(suffix $<)',
-  'RULE_INPUT_NAME': '$(notdir $<)',
-  'CONFIGURATION_NAME': '$(GYP_DEFAULT_CONFIGURATION)',
-}
-
-# Make supports multiple toolsets
-generator_supports_multiple_toolsets = True
-
-
-# Generator-specific gyp specs.
-generator_additional_non_configuration_keys = [
-    # Boolean to declare that this target does not want its name mangled.
-    'android_unmangled_name',
-]
-generator_additional_path_sections = []
-generator_extra_sources_for_rules = []
-
-
-SHARED_FOOTER = """\
-# "gyp_all_modules" is a concatenation of the "gyp_all_modules" targets from
-# all the included sub-makefiles. It is declared here so the aggregate
-# target always exists, even with no sub-makefiles included.
-gyp_all_modules:
-"""
-
-header = """\
-# This file is generated by gyp; do not edit.
-
-"""
-
-android_standard_include_paths = set([
-    # JNI_H_INCLUDE in build/core/binary.mk
-    'dalvik/libnativehelper/include/nativehelper',
-    # from SRC_HEADERS in build/core/config.mk
-    'system/core/include',
-    'hardware/libhardware/include',
-    'hardware/libhardware_legacy/include',
-    'hardware/ril/include',
-    'dalvik/libnativehelper/include',
-    'frameworks/native/include',
-    'frameworks/native/opengl/include',
-    'frameworks/base/include',
-    'frameworks/base/opengl/include',
-    'frameworks/base/native/include',
-    'external/skia/include',
-    # TARGET_C_INCLUDES in build/core/combo/TARGET_linux-arm.mk
-    'bionic/libc/arch-arm/include',
-    'bionic/libc/include',
-    'bionic/libstdc++/include',
-    'bionic/libc/kernel/common',
-    'bionic/libc/kernel/arch-arm',
-    'bionic/libm/include',
-    'bionic/libm/include/arm',
-    'bionic/libthread_db/include',
-    ])
-
-
-# Map gyp target types to Android module classes.
-MODULE_CLASSES = {
-    'static_library': 'STATIC_LIBRARIES',
-    'shared_library': 'SHARED_LIBRARIES',
-    'executable': 'EXECUTABLES',
-}
-
-
-def IsCPPExtension(ext):
-  return make.COMPILABLE_EXTENSIONS.get(ext) == 'cxx'
-
-
-def Sourceify(path):
-  """Convert a path to its source directory form. The Android backend does not
-     support options.generator_output, so this function is a noop."""
-  return path
-
-
-# Map from qualified target to path to output.
-# For Android, the target of these maps is a tuple ('static', 'modulename'),
-# ('dynamic', 'modulename'), or ('path', 'some/path') instead of a string,
-# since we link by module.
-target_outputs = {}
-# Map from qualified target to any linkable output.  A subset
-# of target_outputs.  E.g. when mybinary depends on liba, we want to
-# include liba in the linker line; when otherbinary depends on
-# mybinary, we just want to build mybinary first.
-target_link_deps = {}
-
-
-class AndroidMkWriter(object):
-  """AndroidMkWriter packages up the writing of one target-specific Android.mk.
-
-  Its only real entry point is Write(); the class mostly serves as a
-  namespace.
-  """
-
-  def __init__(self, android_top_dir):
-    self.android_top_dir = android_top_dir
-
-  def Write(self, qualified_target, relative_target, base_path, output_filename,
-            spec, configs, part_of_all):
-    """The main entry point: writes a .mk file for a single target.
-
-    Arguments:
-      qualified_target: target we're generating
-      relative_target: qualified target name relative to the root
-      base_path: path relative to source root we're building in, used to resolve
-                 target-relative paths
-      output_filename: output .mk file name to write
-      spec, configs: gyp info
-      part_of_all: flag indicating this target is part of 'all'
-    """
-    make.ensure_directory_exists(output_filename)
-
-    self.fp = open(output_filename, 'w')
-
-    self.fp.write(header)
-
-    self.qualified_target = qualified_target
-    self.relative_target = relative_target
-    self.path = base_path
-    self.target = spec['target_name']
-    self.type = spec['type']
-    self.toolset = spec['toolset']
-
-    deps, link_deps = self.ComputeDeps(spec)
-
-    # Some of the generation below can add extra output, sources, or
-    # link dependencies.  All of the out params of the functions that
-    # follow use names like extra_foo.
-    extra_outputs = []
-    extra_sources = []
-
-    self.android_class = MODULE_CLASSES.get(self.type, 'GYP')
-    self.android_module = self.ComputeAndroidModule(spec)
-    (self.android_stem, self.android_suffix) = self.ComputeOutputParts(spec)
-    self.output = self.output_binary = self.ComputeOutput(spec)
-
-    # Standard header.
-    self.WriteLn('include $(CLEAR_VARS)\n')
-
-    # Module class and name.
-    self.WriteLn('LOCAL_MODULE_CLASS := ' + self.android_class)
-    self.WriteLn('LOCAL_MODULE := ' + self.android_module)
-    # Only emit LOCAL_MODULE_STEM if it's different to LOCAL_MODULE.
-    # The library module classes fail if the stem is set. ComputeOutputParts
-    # makes sure that stem == modulename in these cases.
-    if self.android_stem != self.android_module:
-      self.WriteLn('LOCAL_MODULE_STEM := ' + self.android_stem)
-    self.WriteLn('LOCAL_MODULE_SUFFIX := ' + self.android_suffix)
-    self.WriteLn('LOCAL_MODULE_TAGS := optional')
-    if self.toolset == 'host':
-      self.WriteLn('LOCAL_IS_HOST_MODULE := true')
-
-    # Grab output directories; needed for Actions and Rules.
-    self.WriteLn('gyp_intermediate_dir := $(call local-intermediates-dir)')
-    self.WriteLn('gyp_shared_intermediate_dir := '
-                 '$(call intermediates-dir-for,GYP,shared)')
-    self.WriteLn()
-
-    # List files this target depends on so that actions/rules/copies/sources
-    # can depend on the list.
-    # TODO: doesn't pull in things through transitive link deps; needed?
-    target_dependencies = [x[1] for x in deps if x[0] == 'path']
-    self.WriteLn('# Make sure our deps are built first.')
-    self.WriteList(target_dependencies, 'GYP_TARGET_DEPENDENCIES',
-                   local_pathify=True)
-
-    # Actions must come first, since they can generate more OBJs for use below.
-    if 'actions' in spec:
-      self.WriteActions(spec['actions'], extra_sources, extra_outputs)
-
-    # Rules must be early like actions.
-    if 'rules' in spec:
-      self.WriteRules(spec['rules'], extra_sources, extra_outputs)
-
-    if 'copies' in spec:
-      self.WriteCopies(spec['copies'], extra_outputs)
-
-    # GYP generated outputs.
-    self.WriteList(extra_outputs, 'GYP_GENERATED_OUTPUTS', local_pathify=True)
-
-    # Set LOCAL_ADDITIONAL_DEPENDENCIES so that Android's build rules depend
-    # on both our dependency targets and our generated files.
-    self.WriteLn('# Make sure our deps and generated files are built first.')
-    self.WriteLn('LOCAL_ADDITIONAL_DEPENDENCIES := $(GYP_TARGET_DEPENDENCIES) '
-                 '$(GYP_GENERATED_OUTPUTS)')
-    self.WriteLn()
-
-    # Sources.
-    if spec.get('sources', []) or extra_sources:
-      self.WriteSources(spec, configs, extra_sources)
-
-    self.WriteTarget(spec, configs, deps, link_deps, part_of_all)
-
-    # Update global list of target outputs, used in dependency tracking.
-    target_outputs[qualified_target] = ('path', self.output_binary)
-
-    # Update global list of link dependencies.
-    if self.type == 'static_library':
-      target_link_deps[qualified_target] = ('static', self.android_module)
-    elif self.type == 'shared_library':
-      target_link_deps[qualified_target] = ('shared', self.android_module)
-
-    self.fp.close()
-    return self.android_module
-
-
-  def WriteActions(self, actions, extra_sources, extra_outputs):
-    """Write Makefile code for any 'actions' from the gyp input.
-
-    extra_sources: a list that will be filled in with newly generated source
-                   files, if any
-    extra_outputs: a list that will be filled in with any outputs of these
-                   actions (used to make other pieces dependent on these
-                   actions)
-    """
-    for action in actions:
-      name = make.StringToMakefileVariable('%s_%s' % (self.relative_target,
-                                                      action['action_name']))
-      self.WriteLn('### Rules for action "%s":' % action['action_name'])
-      inputs = action['inputs']
-      outputs = action['outputs']
-
-      # Build up a list of outputs.
-      # Collect the output dirs we'll need.
-      dirs = set()
-      for out in outputs:
-        if not out.startswith('$'):
-          print ('WARNING: Action for target "%s" writes output to local path '
-                 '"%s".' % (self.target, out))
-        dir = os.path.split(out)[0]
-        if dir:
-          dirs.add(dir)
-      if int(action.get('process_outputs_as_sources', False)):
-        extra_sources += outputs
-
-      # Prepare the actual command.
-      command = gyp.common.EncodePOSIXShellList(action['action'])
-      if 'message' in action:
-        quiet_cmd = 'Gyp action: %s ($@)' % action['message']
-      else:
-        quiet_cmd = 'Gyp action: %s ($@)' % name
-      if len(dirs) > 0:
-        command = 'mkdir -p %s' % ' '.join(dirs) + '; ' + command
-
-      cd_action = 'cd $(gyp_local_path)/%s; ' % self.path
-      command = cd_action + command
-
-      # The makefile rules are all relative to the top dir, but the gyp actions
-      # are defined relative to their containing dir.  This replaces the gyp_*
-      # variables for the action rule with an absolute version so that the
-      # output goes in the right place.
-      # Only write the gyp_* rules for the "primary" output (:1);
-      # it's superfluous for the "extra outputs", and this avoids accidentally
-      # writing duplicate dummy rules for those outputs.
-      main_output = make.QuoteSpaces(self.LocalPathify(outputs[0]))
-      self.WriteLn('%s: gyp_local_path := $(LOCAL_PATH)' % main_output)
-      self.WriteLn('%s: gyp_intermediate_dir := '
-                   '$(GYP_ABS_ANDROID_TOP_DIR)/$(gyp_intermediate_dir)' %
-                   main_output)
-      self.WriteLn('%s: gyp_shared_intermediate_dir := '
-                   '$(GYP_ABS_ANDROID_TOP_DIR)/$(gyp_shared_intermediate_dir)' %
-                   main_output)
-
-      # Android's envsetup.sh adds a number of directories to the path including
-      # the built host binary directory. This causes actions/rules invoked by
-      # gyp to sometimes use these instead of system versions, e.g. bison.
-      # The built host binaries may not be suitable, and can cause errors.
-      # So, we remove them from the PATH using the ANDROID_BUILD_PATHS variable
-      # set by envsetup.
-      self.WriteLn('%s: export PATH := $(subst $(ANDROID_BUILD_PATHS),,$(PATH))'
-                   % main_output)
-
-      for input in inputs:
-        assert ' ' not in input, (
-            "Spaces in action input filenames not supported (%s)"  % input)
-      for output in outputs:
-        assert ' ' not in output, (
-            "Spaces in action output filenames not supported (%s)"  % output)
-
-      self.WriteLn('%s: %s $(GYP_TARGET_DEPENDENCIES)' %
-                   (main_output, ' '.join(map(self.LocalPathify, inputs))))
-      self.WriteLn('\t@echo "%s"' % quiet_cmd)
-      self.WriteLn('\t$(hide)%s\n' % command)
-      for output in outputs[1:]:
-        # Make each output depend on the main output, with an empty command
-        # to force make to notice that the mtime has changed.
-        self.WriteLn('%s: %s ;' % (self.LocalPathify(output), main_output))
-
-      extra_outputs += outputs
-      self.WriteLn()
-
-    self.WriteLn()
-
-
-  def WriteRules(self, rules, extra_sources, extra_outputs):
-    """Write Makefile code for any 'rules' from the gyp input.
-
-    extra_sources: a list that will be filled in with newly generated source
-                   files, if any
-    extra_outputs: a list that will be filled in with any outputs of these
-                   rules (used to make other pieces dependent on these rules)
-    """
-    if len(rules) == 0:
-      return
-    rule_trigger = '%s_rule_trigger' % self.android_module
-
-    did_write_rule = False
-    for rule in rules:
-      if len(rule.get('rule_sources', [])) == 0:
-        continue
-      did_write_rule = True
-      name = make.StringToMakefileVariable('%s_%s' % (self.relative_target,
-                                                      rule['rule_name']))
-      self.WriteLn('\n### Generated for rule "%s":' % name)
-      self.WriteLn('# "%s":' % rule)
-
-      inputs = rule.get('inputs')
-      for rule_source in rule.get('rule_sources', []):
-        (rule_source_dirname, rule_source_basename) = os.path.split(rule_source)
-        (rule_source_root, rule_source_ext) = \
-            os.path.splitext(rule_source_basename)
-
-        outputs = [self.ExpandInputRoot(out, rule_source_root,
-                                        rule_source_dirname)
-                   for out in rule['outputs']]
-
-        dirs = set()
-        for out in outputs:
-          if not out.startswith('$'):
-            print ('WARNING: Rule for target %s writes output to local path %s'
-                   % (self.target, out))
-          dir = os.path.dirname(out)
-          if dir:
-            dirs.add(dir)
-        extra_outputs += outputs
-        if int(rule.get('process_outputs_as_sources', False)):
-          extra_sources.extend(outputs)
-
-        components = []
-        for component in rule['action']:
-          component = self.ExpandInputRoot(component, rule_source_root,
-                                           rule_source_dirname)
-          if '$(RULE_SOURCES)' in component:
-            component = component.replace('$(RULE_SOURCES)',
-                                          rule_source)
-          components.append(component)
-
-        command = gyp.common.EncodePOSIXShellList(components)
-        cd_action = 'cd $(gyp_local_path)/%s; ' % self.path
-        command = cd_action + command
-        if dirs:
-          command = 'mkdir -p %s' % ' '.join(dirs) + '; ' + command
-
-        # We set up a rule to build the first output, and then set up
-        # a rule for each additional output to depend on the first.
-        outputs = map(self.LocalPathify, outputs)
-        main_output = outputs[0]
-        self.WriteLn('%s: gyp_local_path := $(LOCAL_PATH)' % main_output)
-        self.WriteLn('%s: gyp_intermediate_dir := '
-                     '$(GYP_ABS_ANDROID_TOP_DIR)/$(gyp_intermediate_dir)'
-                     % main_output)
-        self.WriteLn('%s: gyp_shared_intermediate_dir := '
-                     '$(GYP_ABS_ANDROID_TOP_DIR)/$(gyp_shared_intermediate_dir)'
-                     % main_output)
-
-        # See explanation in WriteActions.
-        self.WriteLn('%s: export PATH := '
-                     '$(subst $(ANDROID_BUILD_PATHS),,$(PATH))' % main_output)
-
-        main_output_deps = self.LocalPathify(rule_source)
-        if inputs:
-          main_output_deps += ' '
-          main_output_deps += ' '.join([self.LocalPathify(f) for f in inputs])
-
-        self.WriteLn('%s: %s $(GYP_TARGET_DEPENDENCIES)' %
-                     (main_output, main_output_deps))
-        self.WriteLn('\t%s\n' % command)
-        for output in outputs[1:]:
-          self.WriteLn('%s: %s' % (output, main_output))
-        self.WriteLn('.PHONY: %s' % (rule_trigger))
-        self.WriteLn('%s: %s' % (rule_trigger, main_output))
-        self.WriteLn('')
-    if did_write_rule:
-      extra_sources.append(rule_trigger)  # Force all rules to run.
-      self.WriteLn('### Finished generating for all rules')
-      self.WriteLn('')
-
-
-  def WriteCopies(self, copies, extra_outputs):
-    """Write Makefile code for any 'copies' from the gyp input.
-
-    extra_outputs: a list that will be filled in with any outputs of this action
-                   (used to make other pieces dependent on this action)
-    """
-    self.WriteLn('### Generated for copy rule.')
-
-    variable = make.StringToMakefileVariable(self.relative_target + '_copies')
-    outputs = []
-    for copy in copies:
-      for path in copy['files']:
-        # The Android build system does not allow generation of files into the
-        # source tree. The destination should start with a variable, which will
-        # typically be $(gyp_intermediate_dir) or
-        # $(gyp_shared_intermediate_dir). Note that we can't use an assertion
-        # because some of the gyp tests depend on this.
-        if not copy['destination'].startswith('$'):
-          print ('WARNING: Copy rule for target %s writes output to '
-                 'local path %s' % (self.target, copy['destination']))
-
-        # LocalPathify() calls normpath, stripping trailing slashes.
-        path = Sourceify(self.LocalPathify(path))
-        filename = os.path.split(path)[1]
-        output = Sourceify(self.LocalPathify(os.path.join(copy['destination'],
-                                                          filename)))
-
-        self.WriteLn('%s: %s $(GYP_TARGET_DEPENDENCIES) | $(ACP)' %
-                     (output, path))
-        self.WriteLn('\t@echo Copying: $@')
-        self.WriteLn('\t$(hide) mkdir -p $(dir $@)')
-        self.WriteLn('\t$(hide) $(ACP) -r $< $@')
-        self.WriteLn()
-        outputs.append(output)
-    self.WriteLn('%s = %s' % (variable,
-                              ' '.join(map(make.QuoteSpaces, outputs))))
-    extra_outputs.append('$(%s)' % variable)
-    self.WriteLn()
-
-
-  def WriteSourceFlags(self, spec, configs):
-    """Write out the flags and include paths used to compile source files for
-    the current target.
-
-    Args:
-      spec, configs: input from gyp.
-    """
-    config = configs[spec['default_configuration']]
-    extracted_includes = []
-
-    self.WriteLn('\n# Flags passed to both C and C++ files.')
-    cflags, includes_from_cflags = self.ExtractIncludesFromCFlags(
-        config.get('cflags'))
-    extracted_includes.extend(includes_from_cflags)
-    self.WriteList(cflags, 'MY_CFLAGS')
-
-    cflags_c, includes_from_cflags_c = self.ExtractIncludesFromCFlags(
-        config.get('cflags_c'))
-    extracted_includes.extend(includes_from_cflags_c)
-    self.WriteList(cflags_c, 'MY_CFLAGS_C')
-
-    self.WriteList(config.get('defines'), 'MY_DEFS', prefix='-D',
-                   quoter=make.EscapeCppDefine)
-    self.WriteLn('LOCAL_CFLAGS := $(MY_CFLAGS_C) $(MY_CFLAGS) $(MY_DEFS)')
-
-    # Undefine ANDROID for host modules
-    # TODO: the source code should not use the ANDROID macro to tell whether
-    # it is a host or a target module.
-    if self.toolset == 'host':
-      self.WriteLn('# Undefine ANDROID for host modules')
-      self.WriteLn('LOCAL_CFLAGS += -UANDROID')
-
-    self.WriteLn('\n# Include paths placed before CFLAGS/CPPFLAGS')
-    includes = list(config.get('include_dirs', []))
-    includes.extend(extracted_includes)
-    includes = map(Sourceify, map(self.LocalPathify, includes))
-    includes = self.NormalizeIncludePaths(includes)
-    self.WriteList(includes, 'LOCAL_C_INCLUDES')
-    self.WriteLn('LOCAL_C_INCLUDES := $(GYP_COPIED_SOURCE_ORIGIN_DIRS) '
-                                     '$(LOCAL_C_INCLUDES)')
-
-    self.WriteLn('\n# Flags passed to only C++ (and not C) files.')
-    self.WriteList(config.get('cflags_cc'), 'LOCAL_CPPFLAGS')
-
-
-  def WriteSources(self, spec, configs, extra_sources):
-    """Write Makefile code for any 'sources' from the gyp input.
-    These are source files necessary to build the current target.
-    We need to handle shared_intermediate directory source files as
-    a special case by copying them to the intermediate directory and
-    treating them as generated sources. Otherwise the Android build
-    rules won't pick them up.
-
-    Args:
-      spec, configs: input from gyp.
-      extra_sources: Sources generated from Actions or Rules.
-    """
-    sources = filter(make.Compilable, spec.get('sources', []))
-    generated_not_sources = [x for x in extra_sources if not make.Compilable(x)]
-    extra_sources = filter(make.Compilable, extra_sources)
-
-    # Determine and output the C++ extension used by these sources.
-    # We simply find the first C++ file and use that extension.
-    all_sources = sources + extra_sources
-    local_cpp_extension = '.cpp'
-    for source in all_sources:
-      (root, ext) = os.path.splitext(source)
-      if IsCPPExtension(ext):
-        local_cpp_extension = ext
-        break
-    if local_cpp_extension != '.cpp':
-      self.WriteLn('LOCAL_CPP_EXTENSION := %s' % local_cpp_extension)
-
-    # We need to move any non-generated sources that are coming from the
-    # shared intermediate directory out of LOCAL_SRC_FILES and put them
-    # into LOCAL_GENERATED_SOURCES. We also need to move over any C++ files
-    # that don't match our local_cpp_extension, since Android will only
-    # generate Makefile rules for a single LOCAL_CPP_EXTENSION.
-    local_files = []
-    for source in sources:
-      (root, ext) = os.path.splitext(source)
-      if '$(gyp_shared_intermediate_dir)' in source:
-        extra_sources.append(source)
-      elif '$(gyp_intermediate_dir)' in source:
-        extra_sources.append(source)
-      elif IsCPPExtension(ext) and ext != local_cpp_extension:
-        extra_sources.append(source)
-      else:
-        local_files.append(os.path.normpath(os.path.join(self.path, source)))
-
-    # For any generated source, if it is coming from the shared intermediate
-    # directory then we add a Make rule to copy them to the local intermediate
-    # directory first. This is because the Android LOCAL_GENERATED_SOURCES
-    # must be in the local module intermediate directory for the compile rules
-    # to work properly. If the file has the wrong C++ extension, then we add
-    # a rule to copy that to intermediates and use the new version.
-    final_generated_sources = []
-    # If a source file gets copied, we still need to add the original source
-    # directory as a header search path, since GCC searches for headers in the
-    # directory that contains the source file by default.
-    origin_src_dirs = []
-    for source in extra_sources:
-      local_file = source
-      if not '$(gyp_intermediate_dir)/' in local_file:
-        basename = os.path.basename(local_file)
-        local_file = '$(gyp_intermediate_dir)/' + basename
-      (root, ext) = os.path.splitext(local_file)
-      if IsCPPExtension(ext) and ext != local_cpp_extension:
-        local_file = root + local_cpp_extension
-      if local_file != source:
-        self.WriteLn('%s: %s' % (local_file, self.LocalPathify(source)))
-        self.WriteLn('\tmkdir -p $(@D); cp $< $@')
-        origin_src_dirs.append(os.path.dirname(source))
-      final_generated_sources.append(local_file)
-
-    # We add back in all of the non-compilable stuff to make sure that the
-    # make rules have dependencies on them.
-    final_generated_sources.extend(generated_not_sources)
-    self.WriteList(final_generated_sources, 'LOCAL_GENERATED_SOURCES')
-
-    origin_src_dirs = gyp.common.uniquer(origin_src_dirs)
-    origin_src_dirs = map(Sourceify, map(self.LocalPathify, origin_src_dirs))
-    self.WriteList(origin_src_dirs, 'GYP_COPIED_SOURCE_ORIGIN_DIRS')
-
-    self.WriteList(local_files, 'LOCAL_SRC_FILES')
-
-    # Write out the flags used to compile the source; this must be done last
-    # so that GYP_COPIED_SOURCE_ORIGIN_DIRS can be used as an include path.
-    self.WriteSourceFlags(spec, configs)
-
-
-  def ComputeAndroidModule(self, spec):
-    """Return the Android module name used for a gyp spec.
-
-    We use the complete qualified target name to avoid collisions between
-    duplicate targets in different directories. We also add a suffix to
-    distinguish gyp-generated module names.
-    """
-
-    if int(spec.get('android_unmangled_name', 0)):
-      assert self.type != 'shared_library' or self.target.startswith('lib')
-      return self.target
-
-    if self.type == 'shared_library':
-      # For reasons of convention, the Android build system requires that all
-      # shared library modules are named 'libfoo' when generating -l flags.
-      prefix = 'lib_'
-    else:
-      prefix = ''
-
-    if spec['toolset'] == 'host':
-      suffix = '_host_gyp'
-    else:
-      suffix = '_gyp'
-
-    if self.path:
-      name = '%s%s_%s%s' % (prefix, self.path, self.target, suffix)
-    else:
-      name = '%s%s%s' % (prefix, self.target, suffix)
-
-    return make.StringToMakefileVariable(name)
-
-
-  def ComputeOutputParts(self, spec):
-    """Return the 'output basename' of a gyp spec, split into filename + ext.
-
-    Android libraries must be named the same thing as their module name,
-    otherwise the linker can't find them, so product_name and so on must be
-    ignored if we are building a library, and the "lib" prepending is
-    not done for Android.
-    """
-    assert self.type != 'loadable_module' # TODO: not supported?
-
-    target = spec['target_name']
-    target_prefix = ''
-    target_ext = ''
-    if self.type == 'static_library':
-      target = self.ComputeAndroidModule(spec)
-      target_ext = '.a'
-    elif self.type == 'shared_library':
-      target = self.ComputeAndroidModule(spec)
-      target_ext = '.so'
-    elif self.type == 'none':
-      target_ext = '.stamp'
-    elif self.type != 'executable':
-      print ("ERROR: What output file should be generated?",
-             "type", self.type, "target", target)
-
-    if self.type != 'static_library' and self.type != 'shared_library':
-      target_prefix = spec.get('product_prefix', target_prefix)
-      target = spec.get('product_name', target)
-      product_ext = spec.get('product_extension')
-      if product_ext:
-        target_ext = '.' + product_ext
-
-    target_stem = target_prefix + target
-    return (target_stem, target_ext)
-
-
-  def ComputeOutputBasename(self, spec):
-    """Return the 'output basename' of a gyp spec.
-
-    E.g., the loadable module 'foobar' in directory 'baz' will produce
-      'libfoobar.so'
-    """
-    return ''.join(self.ComputeOutputParts(spec))
-
-
-  def ComputeOutput(self, spec):
-    """Return the 'output' (full output path) of a gyp spec.
-
-    E.g., the loadable module 'foobar' in directory 'baz' will produce
-      '$(obj)/baz/libfoobar.so'
-    """
-    if self.type == 'executable' and self.toolset == 'host':
-      # We install host executables into shared_intermediate_dir so they can be
-      # run by gyp rules that refer to PRODUCT_DIR.
-      path = '$(gyp_shared_intermediate_dir)'
-    elif self.type == 'shared_library':
-      if self.toolset == 'host':
-        path = '$(HOST_OUT_INTERMEDIATE_LIBRARIES)'
-      else:
-        path = '$(TARGET_OUT_INTERMEDIATE_LIBRARIES)'
-    else:
-      # Other targets just get built into their intermediate dir.
-      if self.toolset == 'host':
-        path = '$(call intermediates-dir-for,%s,%s,true)' % (self.android_class,
-                                                            self.android_module)
-      else:
-        path = '$(call intermediates-dir-for,%s,%s)' % (self.android_class,
-                                                        self.android_module)
-
-    assert spec.get('product_dir') is None # TODO: not supported?
-    return os.path.join(path, self.ComputeOutputBasename(spec))
-
-
-  def NormalizeLdFlags(self, ld_flags):
-    """ Clean up ldflags from gyp file.
-    Remove any ldflags that contain android_top_dir.
-
-    Args:
-      ld_flags: ldflags from gyp files.
-
-    Returns:
-      clean ldflags
-    """
-    clean_ldflags = []
-    for flag in ld_flags:
-      if self.android_top_dir in flag:
-        continue
-      clean_ldflags.append(flag)
-    return clean_ldflags
-
-  def NormalizeIncludePaths(self, include_paths):
-    """ Normalize include_paths.
-    Convert absolute paths to relative to the Android top directory;
-    filter out include paths that are already brought in by the Android build
-    system.
-
-    Args:
-      include_paths: A list of unprocessed include paths.
-    Returns:
-      A list of normalized include paths.
-    """
-    normalized = []
-    for path in include_paths:
-      if path[0] == '/':
-        path = gyp.common.RelativePath(path, self.android_top_dir)
-
-      # Filter out the Android standard search path.
-      if path not in android_standard_include_paths:
-        normalized.append(path)
-    return normalized
-
-  def ExtractIncludesFromCFlags(self, cflags):
-    """Extract includes "-I..." out from cflags
-
-    Args:
-      cflags: A list of compiler flags, which may be mixed with "-I.."
-    Returns:
-      A tuple of lists: (clean_cflags, include_paths). "-I.." is trimmed.
-    """
-    clean_cflags = []
-    include_paths = []
-    if cflags:
-      for flag in cflags:
-        if flag.startswith('-I'):
-          include_paths.append(flag[2:])
-        else:
-          clean_cflags.append(flag)
-
-    return (clean_cflags, include_paths)
-
-  def ComputeAndroidLibraryModuleNames(self, libraries):
-    """Compute the Android module names from libraries, ie spec.get('libraries')
-
-    Args:
-      libraries: the value of spec.get('libraries')
-    Returns:
-      A tuple (static_lib_modules, dynamic_lib_modules)
-    """
-    static_lib_modules = []
-    dynamic_lib_modules = []
-    for libs in libraries:
-      # Libs can have multiple words.
-      for lib in libs.split():
-        # Filter the system libraries, which are added by default by the Android
-        # build system.
-        if (lib == '-lc' or lib == '-lstdc++' or lib == '-lm' or
-            lib.endswith('libgcc.a')):
-          continue
-        match = re.search(r'([^/]+)\.a$', lib)
-        if match:
-          static_lib_modules.append(match.group(1))
-          continue
-        match = re.search(r'([^/]+)\.so$', lib)
-        if match:
-          dynamic_lib_modules.append(match.group(1))
-          continue
-        # "-lstlport" -> libstlport
-        if lib.startswith('-l'):
-          if lib.endswith('_static'):
-            static_lib_modules.append('lib' + lib[2:])
-          else:
-            dynamic_lib_modules.append('lib' + lib[2:])
-    return (static_lib_modules, dynamic_lib_modules)
-
-
-  def ComputeDeps(self, spec):
-    """Compute the dependencies of a gyp spec.
-
-    Returns a tuple (deps, link_deps), where each is a list of
-    filenames that will need to be put in front of make for either
-    building (deps) or linking (link_deps).
-    """
-    deps = []
-    link_deps = []
-    if 'dependencies' in spec:
-      deps.extend([target_outputs[dep] for dep in spec['dependencies']
-                   if target_outputs[dep]])
-      for dep in spec['dependencies']:
-        if dep in target_link_deps:
-          link_deps.append(target_link_deps[dep])
-      deps.extend(link_deps)
-    return (gyp.common.uniquer(deps), gyp.common.uniquer(link_deps))
-
-
-  def WriteTargetFlags(self, spec, configs, link_deps):
-    """Write Makefile code to specify the link flags and library dependencies.
-
-    spec, configs: input from gyp.
-    link_deps: link dependency list; see ComputeDeps()
-    """
-    config = configs[spec['default_configuration']]
-
-    # LDFLAGS
-    ldflags = list(config.get('ldflags', []))
-    static_flags, dynamic_flags = self.ComputeAndroidLibraryModuleNames(
-        ldflags)
-    self.WriteLn('')
-    self.WriteList(self.NormalizeLdFlags(ldflags), 'LOCAL_LDFLAGS')
-
-    # Libraries (i.e. -lfoo)
-    libraries = gyp.common.uniquer(spec.get('libraries', []))
-    static_libs, dynamic_libs = self.ComputeAndroidLibraryModuleNames(
-        libraries)
-
-    # Link dependencies (i.e. libfoo.a, libfoo.so)
-    static_link_deps = [x[1] for x in link_deps if x[0] == 'static']
-    shared_link_deps = [x[1] for x in link_deps if x[0] == 'shared']
-    self.WriteLn('')
-    self.WriteList(static_flags + static_libs + static_link_deps,
-                   'LOCAL_STATIC_LIBRARIES')
-    self.WriteLn('# Enable grouping to fix circular references')
-    self.WriteLn('LOCAL_GROUP_STATIC_LIBRARIES := true')
-    self.WriteLn('')
-    self.WriteList(dynamic_flags + dynamic_libs + shared_link_deps,
-                   'LOCAL_SHARED_LIBRARIES')
-
-
-  def WriteTarget(self, spec, configs, deps, link_deps, part_of_all):
-    """Write Makefile code to produce the final target of the gyp spec.
-
-    spec, configs: input from gyp.
-    deps, link_deps: dependency lists; see ComputeDeps()
-    part_of_all: flag indicating this target is part of 'all'
-    """
-    self.WriteLn('### Rules for final target.')
-
-    if self.type != 'none':
-      self.WriteTargetFlags(spec, configs, link_deps)
-
-    # Add to the set of targets which represent the gyp 'all' target. We use the
-    # name 'gyp_all_modules' as the Android build system doesn't allow the use
-    # of the Make target 'all' and because 'all_modules' is the equivalent of
-    # the Make target 'all' on Android.
-    if part_of_all:
-      self.WriteLn('# Add target alias to "gyp_all_modules" target.')
-      self.WriteLn('.PHONY: gyp_all_modules')
-      self.WriteLn('gyp_all_modules: %s' % self.android_module)
-      self.WriteLn('')
-
-    # Add an alias from the gyp target name to the Android module name. This
-    # simplifies manual builds of the target, and is required by the test
-    # framework.
-    if self.target != self.android_module:
-      self.WriteLn('# Alias gyp target name.')
-      self.WriteLn('.PHONY: %s' % self.target)
-      self.WriteLn('%s: %s' % (self.target, self.android_module))
-      self.WriteLn('')
-
-    # Add the command to trigger build of the target type depending
-    # on the toolset. Ex: BUILD_STATIC_LIBRARY vs. BUILD_HOST_STATIC_LIBRARY
-    # NOTE: This has to come last!
-    modifier = ''
-    if self.toolset == 'host':
-      modifier = 'HOST_'
-    if self.type == 'static_library':
-      self.WriteLn('include $(BUILD_%sSTATIC_LIBRARY)' % modifier)
-    elif self.type == 'shared_library':
-      self.WriteLn('LOCAL_PRELINK_MODULE := false')
-      self.WriteLn('include $(BUILD_%sSHARED_LIBRARY)' % modifier)
-    elif self.type == 'executable':
-      if self.toolset == 'host':
-        self.WriteLn('LOCAL_MODULE_PATH := $(gyp_shared_intermediate_dir)')
-      else:
-        # Don't install target executables for now, as it results in them being
-        # included in ROM. This can be revisited if there's a reason to install
-        # them later.
-        self.WriteLn('LOCAL_UNINSTALLABLE_MODULE := true')
-      self.WriteLn('include $(BUILD_%sEXECUTABLE)' % modifier)
-    else:
-      self.WriteLn('LOCAL_MODULE_PATH := $(PRODUCT_OUT)/gyp_stamp')
-      self.WriteLn('LOCAL_UNINSTALLABLE_MODULE := true')
-      self.WriteLn()
-      self.WriteLn('include $(BUILD_SYSTEM)/base_rules.mk')
-      self.WriteLn()
-      self.WriteLn('$(LOCAL_BUILT_MODULE): $(LOCAL_ADDITIONAL_DEPENDENCIES)')
-      self.WriteLn('\t$(hide) echo "Gyp timestamp: $@"')
-      self.WriteLn('\t$(hide) mkdir -p $(dir $@)')
-      self.WriteLn('\t$(hide) touch $@')
-
-
-  def WriteList(self, value_list, variable=None, prefix='',
-                quoter=make.QuoteIfNecessary, local_pathify=False):
-    """Write a variable definition that is a list of values.
-
-    E.g. WriteList(['a','b'], 'foo', prefix='blah') writes out
-         foo = blaha blahb
-    but in a pretty-printed style.
-    """
-    values = ''
-    if value_list:
-      value_list = [quoter(prefix + l) for l in value_list]
-      if local_pathify:
-        value_list = [self.LocalPathify(l) for l in value_list]
-      values = ' \\\n\t' + ' \\\n\t'.join(value_list)
-    self.fp.write('%s :=%s\n\n' % (variable, values))
-
-
-  def WriteLn(self, text=''):
-    self.fp.write(text + '\n')
-
-
-  def LocalPathify(self, path):
-    """Convert a subdirectory-relative path into a normalized path which starts
-    with the make variable $(LOCAL_PATH) (i.e. the top of the project tree).
-    Absolute paths, or paths that contain variables, are just normalized."""
-    if '$(' in path or os.path.isabs(path):
-      # path is not a file in the project tree in this case, but calling
-      # normpath is still important for trimming trailing slashes.
-      return os.path.normpath(path)
-    local_path = os.path.join('$(LOCAL_PATH)', self.path, path)
-    local_path = os.path.normpath(local_path)
-    # Check that normalizing the path didn't ../ itself out of $(LOCAL_PATH)
-    # - i.e. that the resulting path is still inside the project tree. The
-    # path may legitimately have ended up containing just $(LOCAL_PATH), though,
-    # so we don't look for a slash.
-    assert local_path.startswith('$(LOCAL_PATH)'), (
-           'Path %s attempts to escape from gyp path %s!' % (path, self.path))
-    return local_path
-
-
-  def ExpandInputRoot(self, template, expansion, dirname):
-    if '%(INPUT_ROOT)s' not in template and '%(INPUT_DIRNAME)s' not in template:
-      return template
-    path = template % {
-        'INPUT_ROOT': expansion,
-        'INPUT_DIRNAME': dirname,
-        }
-    return path
-
-
-def PerformBuild(data, configurations, params):
-  # The android backend only supports the default configuration.
-  options = params['options']
-  makefile = os.path.abspath(os.path.join(options.toplevel_dir,
-                                          'GypAndroid.mk'))
-  env = dict(os.environ)
-  env['ONE_SHOT_MAKEFILE'] = makefile
-  arguments = ['make', '-C', os.environ['ANDROID_BUILD_TOP'], 'gyp_all_modules']
-  print 'Building: %s' % arguments
-  subprocess.check_call(arguments, env=env)
-
-
-def GenerateOutput(target_list, target_dicts, data, params):
-  options = params['options']
-  generator_flags = params.get('generator_flags', {})
-  builddir_name = generator_flags.get('output_dir', 'out')
-  limit_to_target_all = generator_flags.get('limit_to_target_all', False)
-  android_top_dir = os.environ.get('ANDROID_BUILD_TOP')
-  assert android_top_dir, '$ANDROID_BUILD_TOP not set; you need to run lunch.'
-
-  def CalculateMakefilePath(build_file, base_name):
-    """Determine where to write a Makefile for a given gyp file."""
-    # Paths in gyp files are relative to the .gyp file, but we want
-    # paths relative to the source root for the master makefile.  Grab
-    # the path of the .gyp file as the base to relativize against.
-    # E.g. "foo/bar" when we're constructing targets for "foo/bar/baz.gyp".
-    base_path = gyp.common.RelativePath(os.path.dirname(build_file),
-                                        options.depth)
-    # We write the file in the base_path directory.
-    output_file = os.path.join(options.depth, base_path, base_name)
-    assert not options.generator_output, (
-        'The Android backend does not support options.generator_output.')
-    base_path = gyp.common.RelativePath(os.path.dirname(build_file),
-                                        options.toplevel_dir)
-    return base_path, output_file
-
-  # TODO:  search for the first non-'Default' target.  This can go
-  # away when we add verification that all targets have the
-  # necessary configurations.
-  default_configuration = None
-  toolsets = set([target_dicts[target]['toolset'] for target in target_list])
-  for target in target_list:
-    spec = target_dicts[target]
-    if spec['default_configuration'] != 'Default':
-      default_configuration = spec['default_configuration']
-      break
-  if not default_configuration:
-    default_configuration = 'Default'
-
-  srcdir = '.'
-  makefile_name = 'GypAndroid' + options.suffix + '.mk'
-  makefile_path = os.path.join(options.toplevel_dir, makefile_name)
-  assert not options.generator_output, (
-      'The Android backend does not support options.generator_output.')
-  make.ensure_directory_exists(makefile_path)
-  root_makefile = open(makefile_path, 'w')
-
-  root_makefile.write(header)
-
-  # We set LOCAL_PATH just once, here, to the top of the project tree. This
-  # allows all the other paths we use to be relative to the Android.mk file,
-  # as the Android build system expects.
-  root_makefile.write('\nLOCAL_PATH := $(call my-dir)\n')
-
-  # Find the list of targets that derive from the gyp file(s) being built.
-  needed_targets = set()
-  for build_file in params['build_files']:
-    for target in gyp.common.AllTargets(target_list, target_dicts, build_file):
-      needed_targets.add(target)
-
-  build_files = set()
-  include_list = set()
-  android_modules = {}
-  for qualified_target in target_list:
-    build_file, target, toolset = gyp.common.ParseQualifiedTarget(
-        qualified_target)
-    relative_build_file = gyp.common.RelativePath(build_file,
-                                                  options.toplevel_dir)
-    build_files.add(relative_build_file)
-    included_files = data[build_file]['included_files']
-    for included_file in included_files:
-      # The included_files entries are relative to the dir of the build file
-      # that included them, so we have to undo that and then make them relative
-      # to the root dir.
-      relative_include_file = gyp.common.RelativePath(
-          gyp.common.UnrelativePath(included_file, build_file),
-          options.toplevel_dir)
-      abs_include_file = os.path.abspath(relative_include_file)
-      # If the include file is from the ~/.gyp dir, we should use absolute path
-      # so that relocating the src dir doesn't break the path.
-      if (params['home_dot_gyp'] and
-          abs_include_file.startswith(params['home_dot_gyp'])):
-        build_files.add(abs_include_file)
-      else:
-        build_files.add(relative_include_file)
-
-    base_path, output_file = CalculateMakefilePath(build_file,
-        target + '.' + toolset + options.suffix + '.mk')
-
-    spec = target_dicts[qualified_target]
-    configs = spec['configurations']
-
-    part_of_all = (qualified_target in needed_targets and
-                   not int(spec.get('suppress_wildcard', False)))
-    if limit_to_target_all and not part_of_all:
-      continue
-
-    relative_target = gyp.common.QualifiedTarget(relative_build_file, target,
-                                                 toolset)
-    writer = AndroidMkWriter(android_top_dir)
-    android_module = writer.Write(qualified_target, relative_target, base_path,
-                                  output_file, spec, configs,
-                                  part_of_all=part_of_all)
-    if android_module in android_modules:
-      print ('ERROR: Android module names must be unique. The following '
-             'targets both generate Android module name %s.\n  %s\n  %s' %
-             (android_module, android_modules[android_module],
-              qualified_target))
-      return
-    android_modules[android_module] = qualified_target
-
-    # Our root_makefile lives at the source root.  Compute the relative path
-    # from there to the output_file for including.
-    mkfile_rel_path = gyp.common.RelativePath(output_file,
-                                              os.path.dirname(makefile_path))
-    include_list.add(mkfile_rel_path)
-
-  # Some tools need to know the absolute path of the top directory.
-  root_makefile.write('GYP_ABS_ANDROID_TOP_DIR := $(shell pwd)\n')
-  root_makefile.write('GYP_DEFAULT_CONFIGURATION := %s\n' %
-                      default_configuration)
-
-  # Write out the sorted list of includes.
-  root_makefile.write('\n')
-  for include_file in sorted(include_list):
-    root_makefile.write('include $(LOCAL_PATH)/' + include_file + '\n')
-  root_makefile.write('\n')
-
-  root_makefile.write(SHARED_FOOTER)
-
-  root_makefile.close()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/dump_dependency_json.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,93 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import collections
-import os
-import gyp
-import gyp.common
-import gyp.msvs_emulation
-import json
-import sys
-
-generator_supports_multiple_toolsets = True
-
-generator_wants_static_library_dependencies_adjusted = False
-
-generator_default_variables = {
-}
-for dirname in ['INTERMEDIATE_DIR', 'SHARED_INTERMEDIATE_DIR', 'PRODUCT_DIR',
-                'LIB_DIR', 'SHARED_LIB_DIR']:
-  # Some gyp steps fail if these are empty(!).
-  generator_default_variables[dirname] = 'dir'
-for unused in ['RULE_INPUT_PATH', 'RULE_INPUT_ROOT', 'RULE_INPUT_NAME',
-               'RULE_INPUT_DIRNAME', 'RULE_INPUT_EXT',
-               'EXECUTABLE_PREFIX', 'EXECUTABLE_SUFFIX',
-               'STATIC_LIB_PREFIX', 'STATIC_LIB_SUFFIX',
-               'SHARED_LIB_PREFIX', 'SHARED_LIB_SUFFIX',
-               'CONFIGURATION_NAME']:
-  generator_default_variables[unused] = ''
-
-
-def CalculateVariables(default_variables, params):
-  generator_flags = params.get('generator_flags', {})
-  for key, val in generator_flags.items():
-    default_variables.setdefault(key, val)
-  default_variables.setdefault('OS', gyp.common.GetFlavor(params))
-
-  flavor = gyp.common.GetFlavor(params)
-  if flavor == 'win':
-    # Copy additional generator configuration data from VS, which is shared
-    # by the Windows Ninja generator.
-    import gyp.generator.msvs as msvs_generator
-    generator_additional_non_configuration_keys = getattr(msvs_generator,
-        'generator_additional_non_configuration_keys', [])
-    generator_additional_path_sections = getattr(msvs_generator,
-        'generator_additional_path_sections', [])
-
-    # Set a variable so conditions can be based on msvs_version.
-    msvs_version = gyp.msvs_emulation.GetVSVersion(generator_flags)
-    default_variables['MSVS_VERSION'] = msvs_version.ShortName()
-
-    # To determine processor word size on Windows, in addition to checking
-    # PROCESSOR_ARCHITECTURE (which reflects the word size of the current
-    # process), it is also necessary to check PROCESSOR_ARCHITEW6432 (which
-    # contains the actual word size of the system when running through WOW64).
-    if ('64' in os.environ.get('PROCESSOR_ARCHITECTURE', '') or
-        '64' in os.environ.get('PROCESSOR_ARCHITEW6432', '')):
-      default_variables['MSVS_OS_BITS'] = 64
-    else:
-      default_variables['MSVS_OS_BITS'] = 32
-
-
-def CalculateGeneratorInputInfo(params):
-  """Calculate the generator specific info that gets fed to input (called by
-  gyp)."""
-  generator_flags = params.get('generator_flags', {})
-  if generator_flags.get('adjust_static_libraries', False):
-    global generator_wants_static_library_dependencies_adjusted
-    generator_wants_static_library_dependencies_adjusted = True
-
-
-def GenerateOutput(target_list, target_dicts, data, params):
-  # Map of target -> list of targets it depends on.
-  edges = {}
-
-  # Queue of targets to visit.
-  targets_to_visit = target_list[:]
-
-  while len(targets_to_visit) > 0:
-    target = targets_to_visit.pop()
-    if target in edges:
-      continue
-    edges[target] = []
-
-    for dep in target_dicts[target].get('dependencies', []):
-      edges[target].append(dep)
-      targets_to_visit.append(dep)
-
-  filename = 'dump.json'
-  f = open(filename, 'w')
-  json.dump(edges, f)
-  f.close()
-  print 'Wrote json to %s.' % filename
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/eclipse.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,277 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""GYP backend that generates Eclipse CDT settings files.
-
-This backend DOES NOT generate Eclipse CDT projects. Instead, it generates XML
-files that can be imported into an Eclipse CDT project. The XML file contains a
-list of include paths and symbols (i.e. defines).
-
-Because a full .cproject definition is not created by this generator, it's not
-possible to properly define the include dirs and symbols for each file
-individually.  Instead, one set of includes/symbols is generated for the entire
-project.  This works fairly well (and is a vast improvement in general), but may
-still result in a few indexer issues here and there.
-
-This generator has no automated tests, so expect it to be broken.
-"""
-
-from xml.sax.saxutils import escape
-import os.path
-import subprocess
-import gyp
-import gyp.common
-import shlex
-
-generator_wants_static_library_dependencies_adjusted = False
-
-generator_default_variables = {
-}
-
-for dirname in ['INTERMEDIATE_DIR', 'PRODUCT_DIR', 'LIB_DIR', 'SHARED_LIB_DIR']:
-  # Some gyp steps fail if these are empty(!).
-  generator_default_variables[dirname] = 'dir'
-
-for unused in ['RULE_INPUT_PATH', 'RULE_INPUT_ROOT', 'RULE_INPUT_NAME',
-               'RULE_INPUT_DIRNAME', 'RULE_INPUT_EXT',
-               'EXECUTABLE_PREFIX', 'EXECUTABLE_SUFFIX',
-               'STATIC_LIB_PREFIX', 'STATIC_LIB_SUFFIX',
-               'SHARED_LIB_PREFIX', 'SHARED_LIB_SUFFIX',
-               'CONFIGURATION_NAME']:
-  generator_default_variables[unused] = ''
-
-# Include dirs will occasionally use the SHARED_INTERMEDIATE_DIR variable as
-# part of the path when dealing with generated headers.  This value will be
-# replaced dynamically for each configuration.
-generator_default_variables['SHARED_INTERMEDIATE_DIR'] = \
-    '$SHARED_INTERMEDIATE_DIR'
-
-
-def CalculateVariables(default_variables, params):
-  generator_flags = params.get('generator_flags', {})
-  for key, val in generator_flags.items():
-    default_variables.setdefault(key, val)
-  default_variables.setdefault('OS', gyp.common.GetFlavor(params))
-
-
-def CalculateGeneratorInputInfo(params):
-  """Calculate the generator specific info that gets fed to input (called by
-  gyp)."""
-  generator_flags = params.get('generator_flags', {})
-  if generator_flags.get('adjust_static_libraries', False):
-    global generator_wants_static_library_dependencies_adjusted
-    generator_wants_static_library_dependencies_adjusted = True
-
-
-def GetAllIncludeDirectories(target_list, target_dicts,
-                             shared_intermediate_dirs, config_name):
-  """Calculate the set of include directories to be used.
-
-  Returns:
-    A list including all the include_dir's specified for every target followed
-    by any include directories that were added as cflag compiler options.
-  """
-
-  gyp_includes_set = set()
-  compiler_includes_list = []
-
-  for target_name in target_list:
-    target = target_dicts[target_name]
-    if config_name in target['configurations']:
-      config = target['configurations'][config_name]
-
-      # Look for any include dirs that were explicitly added via cflags. This
-      # may be done in gyp files to force certain includes to come at the end.
-      # TODO(jgreenwald): Change the gyp files to not abuse cflags for this, and
-      # remove this.
-      cflags = config['cflags']
-      for cflag in cflags:
-        include_dir = ''
-        if cflag.startswith('-I'):
-          include_dir = cflag[2:]
-        if include_dir and not include_dir in compiler_includes_list:
-          compiler_includes_list.append(include_dir)
-
-      # Find standard gyp include dirs.
-      if config.has_key('include_dirs'):
-        include_dirs = config['include_dirs']
-        for shared_intermediate_dir in shared_intermediate_dirs:
-          for include_dir in include_dirs:
-            include_dir = include_dir.replace('$SHARED_INTERMEDIATE_DIR',
-                                              shared_intermediate_dir)
-            if not os.path.isabs(include_dir):
-              base_dir = os.path.dirname(target_name)
-
-              include_dir = base_dir + '/' + include_dir
-              include_dir = os.path.abspath(include_dir)
-
-            if not include_dir in gyp_includes_set:
-              gyp_includes_set.add(include_dir)
-
-
-  # Generate a list that has all the include dirs.
-  all_includes_list = list(gyp_includes_set)
-  all_includes_list.sort()
-  for compiler_include in compiler_includes_list:
-    if not compiler_include in gyp_includes_set:
-      all_includes_list.append(compiler_include)
-
-  # All done.
-  return all_includes_list
-
-
-def GetCompilerPath(target_list, target_dicts, data):
-  """Determine a command that can be used to invoke the compiler.
-
-  Returns:
-    If this is a gyp project that has explicit make settings, try to determine
-    the compiler from that.  Otherwise, see if a compiler was specified via the
-    CC_target environment variable.
-  """
-
-  # First, see if the compiler is configured in make's settings.
-  build_file, _, _ = gyp.common.ParseQualifiedTarget(target_list[0])
-  make_global_settings_dict = data[build_file].get('make_global_settings', {})
-  for key, value in make_global_settings_dict:
-    if key in ['CC', 'CXX']:
-      return value
-
-  # Check to see if the compiler was specified as an environment variable.
-  for key in ['CC_target', 'CC', 'CXX']:
-    compiler = os.environ.get(key)
-    if compiler:
-      return compiler
-
-  return 'gcc'
-
-
-def GetAllDefines(target_list, target_dicts, data, config_name):
-  """Calculate the defines for a project.
-
-  Returns:
-    A dict that includes explicit defines declared in gyp files along with all of
-    the default defines that the compiler uses.
-  """
-
-  # Get defines declared in the gyp files.
-  all_defines = {}
-  for target_name in target_list:
-    target = target_dicts[target_name]
-
-    if config_name in target['configurations']:
-      config = target['configurations'][config_name]
-      for define in config['defines']:
-        split_define = define.split('=', 1)
-        if len(split_define) == 1:
-          split_define.append('1')
-        if split_define[0].strip() in all_defines:
-          # Already defined
-          continue
-
-        all_defines[split_define[0].strip()] = split_define[1].strip()
-
-  # Get default compiler defines (if possible).
-  cc_target = GetCompilerPath(target_list, target_dicts, data)
-  if cc_target:
-    command = shlex.split(cc_target)
-    command.extend(['-E', '-dM', '-'])
-    cpp_proc = subprocess.Popen(args=command, cwd='.',
-                                stdin=subprocess.PIPE, stdout=subprocess.PIPE)
-    cpp_output = cpp_proc.communicate()[0]
-    cpp_lines = cpp_output.split('\n')
-    for cpp_line in cpp_lines:
-      if not cpp_line.strip():
-        continue
-      cpp_line_parts = cpp_line.split(' ', 2)
-      key = cpp_line_parts[1]
-      if len(cpp_line_parts) >= 3:
-        val = cpp_line_parts[2]
-      else:
-        val = '1'
-      all_defines[key] = val
-
-  return all_defines
-
-
-def WriteIncludePaths(out, eclipse_langs, include_dirs):
-  """Write the includes section of a CDT settings export file."""
-
-  out.write('  <section name="org.eclipse.cdt.internal.ui.wizards.' \
-            'settingswizards.IncludePaths">\n')
-  out.write('    <language name="holder for library settings"></language>\n')
-  for lang in eclipse_langs:
-    out.write('    <language name="%s">\n' % lang)
-    for include_dir in include_dirs:
-      out.write('      <includepath workspace_path="false">%s</includepath>\n' %
-                include_dir)
-    out.write('    </language>\n')
-  out.write('  </section>\n')
-
-
-def WriteMacros(out, eclipse_langs, defines):
-  """Write the macros section of a CDT settings export file."""
-
-  out.write('  <section name="org.eclipse.cdt.internal.ui.wizards.' \
-            'settingswizards.Macros">\n')
-  out.write('    <language name="holder for library settings"></language>\n')
-  for lang in eclipse_langs:
-    out.write('    <language name="%s">\n' % lang)
-    for key in sorted(defines.iterkeys()):
-      out.write('      <macro><name>%s</name><value>%s</value></macro>\n' %
-                (escape(key), escape(defines[key])))
-    out.write('    </language>\n')
-  out.write('  </section>\n')
-
-
-def GenerateOutputForConfig(target_list, target_dicts, data, params,
-                            config_name):
-  options = params['options']
-  generator_flags = params.get('generator_flags', {})
-
-  # build_dir: relative path from source root to our output files.
-  # e.g. "out/Debug"
-  build_dir = os.path.join(generator_flags.get('output_dir', 'out'),
-                           config_name)
-
-  toplevel_build = os.path.join(options.toplevel_dir, build_dir)
-  # Ninja uses out/Debug/gen while make uses out/Debug/obj/gen as the
-  # SHARED_INTERMEDIATE_DIR. Include both possible locations.
-  shared_intermediate_dirs = [os.path.join(toplevel_build, 'obj', 'gen'),
-                              os.path.join(toplevel_build, 'gen')]
-
-  if not os.path.exists(toplevel_build):
-    os.makedirs(toplevel_build)
-  out = open(os.path.join(toplevel_build, 'eclipse-cdt-settings.xml'), 'w')
-
-  out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
-  out.write('<cdtprojectproperties>\n')
-
-  eclipse_langs = ['C++ Source File', 'C Source File', 'Assembly Source File',
-                   'GNU C++', 'GNU C', 'Assembly']
-  include_dirs = GetAllIncludeDirectories(target_list, target_dicts,
-                                          shared_intermediate_dirs, config_name)
-  WriteIncludePaths(out, eclipse_langs, include_dirs)
-  defines = GetAllDefines(target_list, target_dicts, data, config_name)
-  WriteMacros(out, eclipse_langs, defines)
-
-  out.write('</cdtprojectproperties>\n')
-  out.close()
-
-
-def GenerateOutput(target_list, target_dicts, data, params):
-  """Generate an XML settings file that can be imported into a CDT project."""
-
-  if params['options'].generator_output:
-    raise NotImplementedError, "--generator_output not implemented for eclipse"
-
-  user_config = params.get('generator_flags', {}).get('config', None)
-  if user_config:
-    GenerateOutputForConfig(target_list, target_dicts, data, params,
-                            user_config)
-  else:
-    config_names = target_dicts[target_list[0]]['configurations'].keys()
-    for config_name in config_names:
-      GenerateOutputForConfig(target_list, target_dicts, data, params,
-                              config_name)
-
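The `GetAllDefines` loop deleted at the top of this hunk parses `cpp -dM`-style output (`#define NAME VALUE` lines) into a dict. A standalone sketch of that parsing, with a hypothetical helper name that is not part of gyp:

```python
def parse_cpp_defines(cpp_output):
    """Parse preprocessor "#define NAME VALUE" lines into a dict.

    Mirrors the deleted loop: token 1 is the macro name, the rest of the
    line is the value, and a bare "#define NAME" defaults to '1'.
    """
    all_defines = {}
    for line in cpp_output.split('\n'):
        if not line.strip():
            continue
        parts = line.split(' ', 2)  # ['#define', NAME, VALUE...]
        key = parts[1]
        all_defines[key] = parts[2] if len(parts) >= 3 else '1'
    return all_defines
```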
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/gypd.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,87 +0,0 @@
-# Copyright (c) 2011 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""gypd output module
-
-This module produces gyp input as its output.  Output files are given the
-.gypd extension to avoid overwriting the .gyp files that they are generated
-from.  Internal references to .gyp files (such as those found in
-"dependencies" sections) are not adjusted to point to .gypd files instead;
-unlike other paths, which are relative to the .gyp or .gypd file, such paths
-are relative to the directory from which gyp was run to create the .gypd file.
-
-This generator module is intended to be a sample and a debugging aid, hence
-the "d" for "debug" in .gypd.  It is useful to inspect the results of the
-various merges, expansions, and conditional evaluations performed by gyp
-and to see a representation of what would be fed to a generator module.
-
-It's not advisable to rename .gypd files produced by this module to .gyp,
-because they will have all merges, expansions, and evaluations already
-performed and the relevant constructs not present in the output; paths to
-dependencies may be wrong; and various sections that do not belong in .gyp
-files, such as "included_files" and "*_excluded", will be present.
-Output will also be stripped of comments.  This is not intended to be a
-general-purpose gyp pretty-printer; for that, you probably just want to
-run "pprint.pprint(eval(open('source.gyp').read()))", which will still strip
-comments but won't do all of the other things done to this module's output.
-
-The specific formatting of the output generated by this module is subject
-to change.
-"""
-
-
-import gyp.common
-import errno
-import os
-import pprint
-
-
-# These variables should just be spit back out as variable references.
-_generator_identity_variables = [
-  'EXECUTABLE_PREFIX',
-  'EXECUTABLE_SUFFIX',
-  'INTERMEDIATE_DIR',
-  'PRODUCT_DIR',
-  'RULE_INPUT_ROOT',
-  'RULE_INPUT_DIRNAME',
-  'RULE_INPUT_EXT',
-  'RULE_INPUT_NAME',
-  'RULE_INPUT_PATH',
-  'SHARED_INTERMEDIATE_DIR',
-]
-
-# gypd doesn't define a default value for OS like many other generator
-# modules.  Specify "-D OS=whatever" on the command line to provide a value.
-generator_default_variables = {
-}
-
-# gypd supports multiple toolsets
-generator_supports_multiple_toolsets = True
-
-# TODO(mark): This always uses <, which isn't right.  The input module should
-# notify the generator to tell it which phase it is operating in, and this
-# module should use < for the early phase and then switch to > for the late
-# phase.  Bonus points for carrying @ back into the output too.
-for v in _generator_identity_variables:
-  generator_default_variables[v] = '<(%s)' % v
-
-
-def GenerateOutput(target_list, target_dicts, data, params):
-  output_files = {}
-  for qualified_target in target_list:
-    [input_file, target] = \
-        gyp.common.ParseQualifiedTarget(qualified_target)[0:2]
-
-    if input_file[-4:] != '.gyp':
-      continue
-    input_file_stem = input_file[:-4]
-    output_file = input_file_stem + params['options'].suffix + '.gypd'
-
-    if not output_file in output_files:
-      output_files[output_file] = input_file
-
-  for output_file, input_file in output_files.iteritems():
-    output = open(output_file, 'w')
-    pprint.pprint(data[input_file], output)
-    output.close()
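Because gypd simply `pprint`s the fully-merged dict for each input file (as its docstring notes), the output stays a Python literal and can be read back. A minimal round-trip check on hypothetical data:

```python
import ast
import io
import pprint

# A stand-in for the merged per-.gyp dict that gypd would write out.
data = {'targets': [{'target_name': 'demo', 'type': 'none'}]}

buf = io.StringIO()
pprint.pprint(data, buf)                     # what gypd does per output file
restored = ast.literal_eval(buf.getvalue())  # safe analogue of eval(open(...).read())
assert restored == data
```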
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/gypsh.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,56 +0,0 @@
-# Copyright (c) 2011 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""gypsh output module
-
-gypsh is a GYP shell.  It's not really a generator per se.  All it does is
-fire up an interactive Python session with a few local variables set to the
-variables passed to the generator.  Like gypd, it's intended as a debugging
-aid, to facilitate the exploration of .gyp structures after being processed
-by the input module.
-
-The expected usage is "gyp -f gypsh -D OS=desired_os".
-"""
-
-
-import code
-import sys
-
-
-# All of this stuff about generator variables was lovingly ripped from gypd.py.
-# That module has a much better description of what's going on and why.
-_generator_identity_variables = [
-  'EXECUTABLE_PREFIX',
-  'EXECUTABLE_SUFFIX',
-  'INTERMEDIATE_DIR',
-  'PRODUCT_DIR',
-  'RULE_INPUT_ROOT',
-  'RULE_INPUT_DIRNAME',
-  'RULE_INPUT_EXT',
-  'RULE_INPUT_NAME',
-  'RULE_INPUT_PATH',
-  'SHARED_INTERMEDIATE_DIR',
-]
-
-generator_default_variables = {
-}
-
-for v in _generator_identity_variables:
-  generator_default_variables[v] = '<(%s)' % v
-
-
-def GenerateOutput(target_list, target_dicts, data, params):
-  locals = {
-        'target_list':  target_list,
-        'target_dicts': target_dicts,
-        'data':         data,
-      }
-
-  # Use a banner that looks like the stock Python one and like what
-  # code.interact uses by default, but tack on something to indicate what
-  # locals are available, and identify gypsh.
-  banner='Python %s on %s\nlocals.keys() = %s\ngypsh' % \
-         (sys.version, sys.platform, repr(sorted(locals.keys())))
-
-  code.interact(banner, local=locals)
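The banner construction in `GenerateOutput` above can be exercised without actually dropping into the REPL; in this sketch the `code.interact` call itself is commented out because it blocks on stdin:

```python
import code  # noqa: F401  (interact is what gypsh ultimately calls)
import sys

local_vars = {'target_list': [], 'target_dicts': {}, 'data': {}}

banner = 'Python %s on %s\nlocals.keys() = %s\ngypsh' % (
    sys.version, sys.platform, repr(sorted(local_vars.keys())))

# code.interact(banner, local=local_vars)  # would start the interactive shell
```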
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/make.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2151 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# Notes:
-#
-# This is all roughly based on the Makefile system used by the Linux
-# kernel, but is a non-recursive make -- we put the entire dependency
-# graph in front of make and let it figure it out.
-#
-# The code below generates a separate .mk file for each target, but
-# all are sourced by the top-level Makefile.  This means that all
-# variables in .mk-files clobber one another.  Be careful to use :=
-# where appropriate for immediate evaluation, and similarly to watch
-# that you're not relying on a variable value to last between different
-# .mk files.
-#
-# TODOs:
-#
-# Global settings and utility functions are currently stuffed in the
-# toplevel Makefile.  It may make sense to generate some .mk files on
-# the side to keep the files readable.
-
-import os
-import re
-import sys
-import subprocess
-import gyp
-import gyp.common
-import gyp.xcode_emulation
-from gyp.common import GetEnvironFallback
-
-generator_default_variables = {
-  'EXECUTABLE_PREFIX': '',
-  'EXECUTABLE_SUFFIX': '',
-  'STATIC_LIB_PREFIX': 'lib',
-  'SHARED_LIB_PREFIX': 'lib',
-  'STATIC_LIB_SUFFIX': '.a',
-  'INTERMEDIATE_DIR': '$(obj).$(TOOLSET)/$(TARGET)/geni',
-  'SHARED_INTERMEDIATE_DIR': '$(obj)/gen',
-  'PRODUCT_DIR': '$(builddir)',
-  'RULE_INPUT_ROOT': '%(INPUT_ROOT)s',  # This gets expanded by Python.
-  'RULE_INPUT_DIRNAME': '%(INPUT_DIRNAME)s',  # This gets expanded by Python.
-  'RULE_INPUT_PATH': '$(abspath $<)',
-  'RULE_INPUT_EXT': '$(suffix $<)',
-  'RULE_INPUT_NAME': '$(notdir $<)',
-  'CONFIGURATION_NAME': '$(BUILDTYPE)',
-}
-
-# Make supports multiple toolsets
-generator_supports_multiple_toolsets = True
-
-# Request sorted dependencies in the order from dependents to dependencies.
-generator_wants_sorted_dependencies = False
-
-# Placates pylint.
-generator_additional_non_configuration_keys = []
-generator_additional_path_sections = []
-generator_extra_sources_for_rules = []
-
-
-def CalculateVariables(default_variables, params):
-  """Calculate additional variables for use in the build (called by gyp)."""
-  flavor = gyp.common.GetFlavor(params)
-  if flavor == 'mac':
-    default_variables.setdefault('OS', 'mac')
-    default_variables.setdefault('SHARED_LIB_SUFFIX', '.dylib')
-    default_variables.setdefault('SHARED_LIB_DIR',
-                                 generator_default_variables['PRODUCT_DIR'])
-    default_variables.setdefault('LIB_DIR',
-                                 generator_default_variables['PRODUCT_DIR'])
-
-    # Copy additional generator configuration data from Xcode, which is shared
-    # by the Mac Make generator.
-    import gyp.generator.xcode as xcode_generator
-    global generator_additional_non_configuration_keys
-    generator_additional_non_configuration_keys = getattr(xcode_generator,
-        'generator_additional_non_configuration_keys', [])
-    global generator_additional_path_sections
-    generator_additional_path_sections = getattr(xcode_generator,
-        'generator_additional_path_sections', [])
-    global generator_extra_sources_for_rules
-    generator_extra_sources_for_rules = getattr(xcode_generator,
-        'generator_extra_sources_for_rules', [])
-    COMPILABLE_EXTENSIONS.update({'.m': 'objc', '.mm' : 'objcxx'})
-  else:
-    operating_system = flavor
-    if flavor == 'android':
-      operating_system = 'linux'  # Keep this legacy behavior for now.
-    default_variables.setdefault('OS', operating_system)
-    default_variables.setdefault('SHARED_LIB_SUFFIX', '.so')
-    default_variables.setdefault('SHARED_LIB_DIR','$(builddir)/lib.$(TOOLSET)')
-    default_variables.setdefault('LIB_DIR', '$(obj).$(TOOLSET)')
-
-
-def CalculateGeneratorInputInfo(params):
-  """Calculate the generator specific info that gets fed to input (called by
-  gyp)."""
-  generator_flags = params.get('generator_flags', {})
-  android_ndk_version = generator_flags.get('android_ndk_version', None)
-  # Android NDK requires a strict link order.
-  if android_ndk_version:
-    global generator_wants_sorted_dependencies
-    generator_wants_sorted_dependencies = True
-
-
-def ensure_directory_exists(path):
-  dir = os.path.dirname(path)
-  if dir and not os.path.exists(dir):
-    os.makedirs(dir)
-
-
-# The .d checking code below uses these functions:
-# wildcard, sort, foreach, shell, wordlist
-# wildcard can handle spaces, the rest can't.
-# Since I could find no way to make foreach work with spaces in filenames
-# correctly, the .d files have spaces replaced with another character. The .d
-# file for
-#     Chromium\ Framework.framework/foo
-# is for example
-#     out/Release/.deps/out/Release/Chromium?Framework.framework/foo
-# This is the replacement character.
-SPACE_REPLACEMENT = '?'
-
-
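The space-replacement scheme described above (spaces in .d-file paths become `?`) is implemented later in the generated Makefile as `replace_spaces`/`unreplace_spaces`. The same transform in Python, purely for illustration:

```python
SPACE_REPLACEMENT = '?'

def replace_spaces(path):
    """Encode spaces so Make's foreach/sort/shell functions can handle the path."""
    return path.replace(' ', SPACE_REPLACEMENT)

def unreplace_spaces(path):
    """Decode a path read back out of a .d file."""
    return path.replace(SPACE_REPLACEMENT, ' ')

assert (replace_spaces('out/Release/Chromium Framework.framework/foo')
        == 'out/Release/Chromium?Framework.framework/foo')
```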
-LINK_COMMANDS_LINUX = """\
-quiet_cmd_alink = AR($(TOOLSET)) $@
-cmd_alink = rm -f $@ && $(AR.$(TOOLSET)) crs $@ $(filter %.o,$^)
-
-quiet_cmd_alink_thin = AR($(TOOLSET)) $@
-cmd_alink_thin = rm -f $@ && $(AR.$(TOOLSET)) crsT $@ $(filter %.o,$^)
-
-# Due to circular dependencies between libraries :(, we wrap the
-# special "figure out circular dependencies" flags around the entire
-# input list during linking.
-quiet_cmd_link = LINK($(TOOLSET)) $@
-cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ -Wl,--start-group $(LD_INPUTS) -Wl,--end-group $(LIBS)
-
-# We support two kinds of shared objects (.so):
-# 1) shared_library, which is just bundling together many dependent libraries
-# into a link line.
-# 2) loadable_module, which is generating a module intended for dlopen().
-#
-# They differ only slightly:
-# In the former case, we want to package all dependent code into the .so.
-# In the latter case, we want to package just the API exposed by the
-# outermost module.
-# This means shared_library uses --whole-archive, while loadable_module doesn't.
-# (Note that --whole-archive is incompatible with the --start-group used in
-# normal linking.)
-
-# Other shared-object link notes:
-# - Set SONAME to the library filename so our binaries don't reference
-# the local, absolute paths used on the link command-line.
-quiet_cmd_solink = SOLINK($(TOOLSET)) $@
-cmd_solink = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ -Wl,--whole-archive $(LD_INPUTS) -Wl,--no-whole-archive $(LIBS)
-
-quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@
-cmd_solink_module = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ -Wl,--start-group $(filter-out FORCE_DO_CMD, $^) -Wl,--end-group $(LIBS)
-"""
-
-LINK_COMMANDS_MAC = """\
-quiet_cmd_alink = LIBTOOL-STATIC $@
-cmd_alink = rm -f $@ && ./gyp-mac-tool filter-libtool libtool $(GYP_LIBTOOLFLAGS) -static -o $@ $(filter %.o,$^)
-
-quiet_cmd_link = LINK($(TOOLSET)) $@
-cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o "$@" $(LD_INPUTS) $(LIBS)
-
-# TODO(thakis): Find out and document the difference between shared_library and
-# loadable_module on mac.
-quiet_cmd_solink = SOLINK($(TOOLSET)) $@
-cmd_solink = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o "$@" $(LD_INPUTS) $(LIBS)
-
-# TODO(thakis): The solink_module rule is likely wrong. Xcode seems to pass
-# -bundle -single_module here (for osmesa.so).
-quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@
-cmd_solink_module = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(filter-out FORCE_DO_CMD, $^) $(LIBS)
-"""
-
-LINK_COMMANDS_ANDROID = """\
-quiet_cmd_alink = AR($(TOOLSET)) $@
-cmd_alink = rm -f $@ && $(AR.$(TOOLSET)) crs $@ $(filter %.o,$^)
-
-quiet_cmd_alink_thin = AR($(TOOLSET)) $@
-cmd_alink_thin = rm -f $@ && $(AR.$(TOOLSET)) crsT $@ $(filter %.o,$^)
-
-# Due to circular dependencies between libraries :(, we wrap the
-# special "figure out circular dependencies" flags around the entire
-# input list during linking.
-quiet_cmd_link = LINK($(TOOLSET)) $@
-quiet_cmd_link_host = LINK($(TOOLSET)) $@
-cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ -Wl,--start-group $(LD_INPUTS) -Wl,--end-group $(LIBS)
-cmd_link_host = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(LD_INPUTS) $(LIBS)
-
-# Other shared-object link notes:
-# - Set SONAME to the library filename so our binaries don't reference
-# the local, absolute paths used on the link command-line.
-quiet_cmd_solink = SOLINK($(TOOLSET)) $@
-cmd_solink = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ -Wl,--whole-archive $(LD_INPUTS) -Wl,--no-whole-archive $(LIBS)
-
-quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@
-cmd_solink_module = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ -Wl,--start-group $(filter-out FORCE_DO_CMD, $^) -Wl,--end-group $(LIBS)
-quiet_cmd_solink_module_host = SOLINK_MODULE($(TOOLSET)) $@
-cmd_solink_module_host = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ $(filter-out FORCE_DO_CMD, $^) $(LIBS)
-"""
-
-
-# Header of toplevel Makefile.
-# This should go into the build tree, but it's easier to keep it here for now.
-SHARED_HEADER = ("""\
-# We borrow heavily from the kernel build setup, though we are simpler since
-# we don't have Kconfig tweaking settings on us.
-
-# The implicit make rules have it looking for RCS files, among other things.
-# We instead explicitly write all the rules we care about.
-# It's even quicker (saves ~200ms) to pass -r on the command line.
-MAKEFLAGS=-r
-
-# The source directory tree.
-srcdir := %(srcdir)s
-abs_srcdir := $(abspath $(srcdir))
-
-# The name of the builddir.
-builddir_name ?= %(builddir)s
-
-# The V=1 flag on command line makes us verbosely print command lines.
-ifdef V
-  quiet=
-else
-  quiet=quiet_
-endif
-
-# Specify BUILDTYPE=Release on the command line for a release build.
-BUILDTYPE ?= %(default_configuration)s
-
-# Directory all our build output goes into.
-# Note that this must be two directories beneath src/ for unit tests to pass,
-# as they reach into the src/ directory for data with relative paths.
-builddir ?= $(builddir_name)/$(BUILDTYPE)
-abs_builddir := $(abspath $(builddir))
-depsdir := $(builddir)/.deps
-
-# Object output directory.
-obj := $(builddir)/obj
-abs_obj := $(abspath $(obj))
-
-# We build up a list of every single one of the targets so we can slurp in the
-# generated dependency rule Makefiles in one pass.
-all_deps :=
-
-%(make_global_settings)s
-
-# C++ apps need to be linked with g++.
-#
-# Note: flock is used to serialize linking. Linking is a memory-intensive
-# process so running parallel links can often lead to thrashing.  To disable
-# the serialization, override LINK via an environment variable as follows:
-#
-#   export LINK=g++
-#
-# This will allow make to invoke N linker processes as specified in -jN.
-LINK ?= %(flock)s $(builddir)/linker.lock $(CXX.target)
-
-CC.target ?= %(CC.target)s
-CFLAGS.target ?= $(CFLAGS)
-CXX.target ?= %(CXX.target)s
-CXXFLAGS.target ?= $(CXXFLAGS)
-LINK.target ?= %(LINK.target)s
-LDFLAGS.target ?= $(LDFLAGS)
-AR.target ?= $(AR)
-
-# TODO(evan): move all cross-compilation logic to gyp-time so we don't need
-# to replicate this environment fallback in make as well.
-CC.host ?= %(CC.host)s
-CFLAGS.host ?=
-CXX.host ?= %(CXX.host)s
-CXXFLAGS.host ?=
-LINK.host ?= %(LINK.host)s
-LDFLAGS.host ?=
-AR.host ?= %(AR.host)s
-
-# Define a dir function that can handle spaces.
-# http://www.gnu.org/software/make/manual/make.html#Syntax-of-Functions
-# "leading spaces cannot appear in the text of the first argument as written.
-# These characters can be put into the argument value by variable substitution."
-empty :=
-space := $(empty) $(empty)
-
-# http://stackoverflow.com/questions/1189781/using-make-dir-or-notdir-on-a-path-with-spaces
-replace_spaces = $(subst $(space),""" + SPACE_REPLACEMENT + """,$1)
-unreplace_spaces = $(subst """ + SPACE_REPLACEMENT + """,$(space),$1)
-dirx = $(call unreplace_spaces,$(dir $(call replace_spaces,$1)))
-
-# Flags to make gcc output dependency info.  Note that you need to be
-# careful here to use the flags that ccache and distcc can understand.
-# We write to a dep file on the side first and then rename at the end
-# so we can't end up with a broken dep file.
-depfile = $(depsdir)/$(call replace_spaces,$@).d
-DEPFLAGS = -MMD -MF $(depfile).raw
-
-# We have to fixup the deps output in a few ways.
-# (1) the file output should mention the proper .o file.
-# ccache or distcc lose the path to the target, so we convert a rule of
-# the form:
-#   foobar.o: DEP1 DEP2
-# into
-#   path/to/foobar.o: DEP1 DEP2
-# (2) we want missing files not to cause us to fail to build.
-# We want to rewrite
-#   foobar.o: DEP1 DEP2 \\
-#               DEP3
-# to
-#   DEP1:
-#   DEP2:
-#   DEP3:
-# so if the files are missing, they're just considered phony rules.
-# We have to do some pretty insane escaping to get those backslashes
-# and dollar signs past make, the shell, and sed at the same time.
-# Doesn't work with spaces, but that's fine: .d files have spaces in
-# their names replaced with other characters."""
-r"""
-define fixup_dep
-# The depfile may not exist if the input file didn't have any #includes.
-touch $(depfile).raw
-# Fixup path as in (1).
-sed -e "s|^$(notdir $@)|$@|" $(depfile).raw >> $(depfile)
-# Add extra rules as in (2).
-# We remove slashes and replace spaces with new lines;
-# remove blank lines;
-# delete the first line and append a colon to the remaining lines.
-sed -e 's|\\||' -e 'y| |\n|' $(depfile).raw |\
-  grep -v '^$$'                             |\
-  sed -e 1d -e 's|$$|:|'                     \
-    >> $(depfile)
-rm $(depfile).raw
-endef
-"""
-"""
-# Command definitions:
-# - cmd_foo is the actual command to run;
-# - quiet_cmd_foo is the brief-output summary of the command.
-
-quiet_cmd_cc = CC($(TOOLSET)) $@
-cmd_cc = $(CC.$(TOOLSET)) $(GYP_CFLAGS) $(DEPFLAGS) $(CFLAGS.$(TOOLSET)) -c -o $@ $<
-
-quiet_cmd_cxx = CXX($(TOOLSET)) $@
-cmd_cxx = $(CXX.$(TOOLSET)) $(GYP_CXXFLAGS) $(DEPFLAGS) $(CXXFLAGS.$(TOOLSET)) -c -o $@ $<
-%(extra_commands)s
-quiet_cmd_touch = TOUCH $@
-cmd_touch = touch $@
-
-quiet_cmd_copy = COPY $@
-# send stderr to /dev/null to ignore messages when linking directories.
-cmd_copy = rm -rf "$@" && cp -af "$<" "$@"
-
-%(link_commands)s
-"""
-
-r"""
-# Define an escape_quotes function to escape single quotes.
-# This allows us to handle quotes properly as long as we always use
-# single quotes and escape_quotes.
-escape_quotes = $(subst ','\'',$(1))
-# This comment is here just to include a ' to unconfuse syntax highlighting.
-# Define an escape_vars function to escape '$' variable syntax.
-# This allows us to read/write command lines with shell variables (e.g.
-# $LD_LIBRARY_PATH), without triggering make substitution.
-escape_vars = $(subst $$,$$$$,$(1))
-# Helper that expands to a shell command to echo a string exactly as it is in
-# make. This uses printf instead of echo because printf's behaviour with respect
-# to escape sequences is more portable than echo's across different shells
-# (e.g., dash, bash).
-exact_echo = printf '%%s\n' '$(call escape_quotes,$(1))'
-"""
-"""
-# Helper to compare the command we're about to run against the command
-# we logged the last time we ran the command.  Produces an empty
-# string (false) when the commands match.
-# Tricky point: Make has no string-equality test function.
-# The kernel uses the following, but it seems like it would have false
-# positives, where one string reordered its arguments.
-#   arg_check = $(strip $(filter-out $(cmd_$(1)), $(cmd_$@)) \\
-#                       $(filter-out $(cmd_$@), $(cmd_$(1))))
-# We instead substitute each for the empty string into the other, and
-# say they're equal if both substitutions produce the empty string.
-# .d files contain """ + SPACE_REPLACEMENT + \
-                   """ instead of spaces, take that into account.
-command_changed = $(or $(subst $(cmd_$(1)),,$(cmd_$(call replace_spaces,$@))),\\
-                       $(subst $(cmd_$(call replace_spaces,$@)),,$(cmd_$(1))))
-
-# Helper that is non-empty when a prerequisite changes.
-# Normally make does this implicitly, but we force rules to always run
-# so we can check their command lines.
-#   $? -- new prerequisites
-#   $| -- order-only dependencies
-prereq_changed = $(filter-out FORCE_DO_CMD,$(filter-out $|,$?))
-
-# Helper that executes all postbuilds until one fails.
-define do_postbuilds
-  @E=0;\\
-  for p in $(POSTBUILDS); do\\
-    eval $$p;\\
-    E=$$?;\\
-    if [ $$E -ne 0 ]; then\\
-      break;\\
-    fi;\\
-  done;\\
-  if [ $$E -ne 0 ]; then\\
-    rm -rf "$@";\\
-    exit $$E;\\
-  fi
-endef
-
-# do_cmd: run a command via the above cmd_foo names, if necessary.
-# Should always run for a given target to handle command-line changes.
-# Second argument, if non-zero, makes it do asm/C/C++ dependency munging.
-# Third argument, if non-zero, makes it do POSTBUILDS processing.
-# Note: We intentionally do NOT call dirx for depfile, since it contains """ + \
-                                                     SPACE_REPLACEMENT + """ for
-# spaces already and dirx strips the """ + SPACE_REPLACEMENT + \
-                                     """ characters.
-define do_cmd
-$(if $(or $(command_changed),$(prereq_changed)),
-  @$(call exact_echo,  $($(quiet)cmd_$(1)))
-  @mkdir -p "$(call dirx,$@)" "$(dir $(depfile))"
-  $(if $(findstring flock,$(word %(flock_index)d,$(cmd_$1))),
-    @$(cmd_$(1))
-    @echo "  $(quiet_cmd_$(1)): Finished",
-    @$(cmd_$(1))
-  )
-  @$(call exact_echo,$(call escape_vars,cmd_$(call replace_spaces,$@) := $(cmd_$(1)))) > $(depfile)
-  @$(if $(2),$(fixup_dep))
-  $(if $(and $(3), $(POSTBUILDS)),
-    $(call do_postbuilds)
-  )
-)
-endef
-
-# Declare the "%(default_target)s" target first so it is the default,
-# even though we don't have the deps yet.
-.PHONY: %(default_target)s
-%(default_target)s:
-
-# make looks for ways to re-generate included makefiles, but in our case, we
-# don't have a direct way. Explicitly telling make that it has nothing to do
-# for them makes it go faster.
-%%.d: ;
-
-# Use FORCE_DO_CMD to force a target to run.  Should be coupled with
-# do_cmd.
-.PHONY: FORCE_DO_CMD
-FORCE_DO_CMD:
-
-""")
-
-SHARED_HEADER_MAC_COMMANDS = """
-quiet_cmd_objc = CXX($(TOOLSET)) $@
-cmd_objc = $(CC.$(TOOLSET)) $(GYP_OBJCFLAGS) $(DEPFLAGS) -c -o $@ $<
-
-quiet_cmd_objcxx = CXX($(TOOLSET)) $@
-cmd_objcxx = $(CXX.$(TOOLSET)) $(GYP_OBJCXXFLAGS) $(DEPFLAGS) -c -o $@ $<
-
-# Commands for precompiled header files.
-quiet_cmd_pch_c = CXX($(TOOLSET)) $@
-cmd_pch_c = $(CC.$(TOOLSET)) $(GYP_PCH_CFLAGS) $(DEPFLAGS) $(CXXFLAGS.$(TOOLSET)) -c -o $@ $<
-quiet_cmd_pch_cc = CXX($(TOOLSET)) $@
-cmd_pch_cc = $(CC.$(TOOLSET)) $(GYP_PCH_CXXFLAGS) $(DEPFLAGS) $(CXXFLAGS.$(TOOLSET)) -c -o $@ $<
-quiet_cmd_pch_m = CXX($(TOOLSET)) $@
-cmd_pch_m = $(CC.$(TOOLSET)) $(GYP_PCH_OBJCFLAGS) $(DEPFLAGS) -c -o $@ $<
-quiet_cmd_pch_mm = CXX($(TOOLSET)) $@
-cmd_pch_mm = $(CC.$(TOOLSET)) $(GYP_PCH_OBJCXXFLAGS) $(DEPFLAGS) -c -o $@ $<
-
-# gyp-mac-tool is written next to the root Makefile by gyp.
-# Use $(4) for the command, since $(2) and $(3) are used as flags by do_cmd
-# already.
-quiet_cmd_mac_tool = MACTOOL $(4) $<
-cmd_mac_tool = ./gyp-mac-tool $(4) $< "$@"
-
-quiet_cmd_mac_package_framework = PACKAGE FRAMEWORK $@
-cmd_mac_package_framework = ./gyp-mac-tool package-framework "$@" $(4)
-
-quiet_cmd_infoplist = INFOPLIST $@
-cmd_infoplist = $(CC.$(TOOLSET)) -E -P -Wno-trigraphs -x c $(INFOPLIST_DEFINES) "$<" -o "$@"
-"""
-
-SHARED_HEADER_SUN_COMMANDS = """
-# gyp-sun-tool is written next to the root Makefile by gyp.
-# Use $(4) for the command, since $(2) and $(3) are used as flags by do_cmd
-# already.
-quiet_cmd_sun_tool = SUNTOOL $(4) $<
-cmd_sun_tool = ./gyp-sun-tool $(4) $< "$@"
-"""
-
-
-def WriteRootHeaderSuffixRules(writer):
-  extensions = sorted(COMPILABLE_EXTENSIONS.keys(), key=str.lower)
-
-  writer.write('# Suffix rules, putting all outputs into $(obj).\n')
-  for ext in extensions:
-    writer.write('$(obj).$(TOOLSET)/%%.o: $(srcdir)/%%%s FORCE_DO_CMD\n' % ext)
-    writer.write('\t@$(call do_cmd,%s,1)\n' % COMPILABLE_EXTENSIONS[ext])
-
-  writer.write('\n# Try building from generated source, too.\n')
-  for ext in extensions:
-    writer.write(
-        '$(obj).$(TOOLSET)/%%.o: $(obj).$(TOOLSET)/%%%s FORCE_DO_CMD\n' % ext)
-    writer.write('\t@$(call do_cmd,%s,1)\n' % COMPILABLE_EXTENSIONS[ext])
-  writer.write('\n')
-  for ext in extensions:
-    writer.write('$(obj).$(TOOLSET)/%%.o: $(obj)/%%%s FORCE_DO_CMD\n' % ext)
-    writer.write('\t@$(call do_cmd,%s,1)\n' % COMPILABLE_EXTENSIONS[ext])
-  writer.write('\n')
-
-
-SHARED_HEADER_SUFFIX_RULES_COMMENT1 = ("""\
-# Suffix rules, putting all outputs into $(obj).
-""")
-
-
-SHARED_HEADER_SUFFIX_RULES_COMMENT2 = ("""\
-# Try building from generated source, too.
-""")
-
-
-SHARED_FOOTER = """\
-# "all" is a concatenation of the "all" targets from all the included
-# sub-makefiles. This is just here to clarify.
-all:
-
-# Add in dependency-tracking rules.  $(all_deps) is the list of every single
-# target in our tree. Only consider the ones with .d (dependency) info:
-d_files := $(wildcard $(foreach f,$(all_deps),$(depsdir)/$(f).d))
-ifneq ($(d_files),)
-  include $(d_files)
-endif
-"""
-
-header = """\
-# This file is generated by gyp; do not edit.
-
-"""
-
-# Maps every compilable file extension to the do_cmd that compiles it.
-COMPILABLE_EXTENSIONS = {
-  '.c': 'cc',
-  '.cc': 'cxx',
-  '.cpp': 'cxx',
-  '.cxx': 'cxx',
-  '.s': 'cc',
-  '.S': 'cc',
-}
-
-def Compilable(filename):
-  """Return true if the file is compilable (should be in OBJS)."""
-  for res in (filename.endswith(e) for e in COMPILABLE_EXTENSIONS):
-    if res:
-      return True
-  return False
-
-
-def Linkable(filename):
-  """Return true if the file is linkable (should be on the link line)."""
-  return filename.endswith('.o')
-
-
-def Target(filename):
-  """Translate a compilable filename to its .o target."""
-  return os.path.splitext(filename)[0] + '.o'
-
-
-def EscapeShellArgument(s):
-  """Quotes an argument so that it will be interpreted literally by a POSIX
-     shell. Taken from
-     http://stackoverflow.com/questions/35817/whats-the-best-way-to-escape-ossystem-calls-in-python
-     """
-  return "'" + s.replace("'", "'\\''") + "'"
-
-
-def EscapeMakeVariableExpansion(s):
-  """Make has its own variable expansion syntax using $. We must escape it for
-     string to be interpreted literally."""
-  return s.replace('$', '$$')
-
-
-def EscapeCppDefine(s):
-  """Escapes a CPP define so that it will reach the compiler unaltered."""
-  s = EscapeShellArgument(s)
-  s = EscapeMakeVariableExpansion(s)
-  # '#' characters must be escaped even embedded in a string, else Make will
-  # treat it as the start of a comment.
-  return s.replace('#', r'\#')
-
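To see how the three escaping layers above compose, here is the deleted `EscapeCppDefine` pipeline applied to a define containing shell-, Make-, and comment-significant characters (functions renamed to snake_case for this sketch):

```python
def escape_shell_argument(s):
    """Single-quote s so a POSIX shell treats it literally."""
    return "'" + s.replace("'", "'\\''") + "'"

def escape_make_variable_expansion(s):
    """Double every $ so Make does not expand it as a variable."""
    return s.replace('$', '$$')

def escape_cpp_define(s):
    s = escape_shell_argument(s)           # literal for the POSIX shell
    s = escape_make_variable_expansion(s)  # keep Make from expanding $
    return s.replace('#', r'\#')           # keep Make from starting a comment

assert escape_cpp_define('PATH=$(HOME)') == "'PATH=$$(HOME)'"
```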
-
-def QuoteIfNecessary(string):
-  """TODO: Should this ideally be replaced with one or more of the above
-     functions?"""
-  if '"' in string:
-    string = '"' + string.replace('"', '\\"') + '"'
-  return string
-
-
-def StringToMakefileVariable(string):
-  """Convert a string to a value that is acceptable as a make variable name."""
-  return re.sub('[^a-zA-Z0-9_]', '_', string)
-
-
-srcdir_prefix = ''
-def Sourceify(path):
-  """Convert a path to its source directory form."""
-  if '$(' in path:
-    return path
-  if os.path.isabs(path):
-    return path
-  return srcdir_prefix + path
-
-
-def QuoteSpaces(s, quote=r'\ '):
-  return s.replace(' ', quote)
-
-
-# Map from qualified target to path to output.
-target_outputs = {}
-# Map from qualified target to any linkable output.  A subset
-# of target_outputs.  E.g. when mybinary depends on liba, we want to
-# include liba in the linker line; when otherbinary depends on
-# mybinary, we just want to build mybinary first.
-target_link_deps = {}
-
-
-class MakefileWriter:
-  """MakefileWriter packages up the writing of one target-specific foobar.mk.
-
-  Its only real entry point is Write(), and is mostly used for namespacing.
-  """
-
-  def __init__(self, generator_flags, flavor):
-    self.generator_flags = generator_flags
-    self.flavor = flavor
-
-    self.suffix_rules_srcdir = {}
-    self.suffix_rules_objdir1 = {}
-    self.suffix_rules_objdir2 = {}
-
-    # Generate suffix rules for all compilable extensions.
-    for ext in COMPILABLE_EXTENSIONS.keys():
-      # Suffix rules for source folder.
-      self.suffix_rules_srcdir.update({ext: ("""\
-$(obj).$(TOOLSET)/$(TARGET)/%%.o: $(srcdir)/%%%s FORCE_DO_CMD
-	@$(call do_cmd,%s,1)
-""" % (ext, COMPILABLE_EXTENSIONS[ext]))})
-
-      # Suffix rules for generated source files.
-      self.suffix_rules_objdir1.update({ext: ("""\
-$(obj).$(TOOLSET)/$(TARGET)/%%.o: $(obj).$(TOOLSET)/%%%s FORCE_DO_CMD
-	@$(call do_cmd,%s,1)
-""" % (ext, COMPILABLE_EXTENSIONS[ext]))})
-      self.suffix_rules_objdir2.update({ext: ("""\
-$(obj).$(TOOLSET)/$(TARGET)/%%.o: $(obj)/%%%s FORCE_DO_CMD
-	@$(call do_cmd,%s,1)
-""" % (ext, COMPILABLE_EXTENSIONS[ext]))})
-
-
-  def Write(self, qualified_target, base_path, output_filename, spec, configs,
-            part_of_all):
-    """The main entry point: writes a .mk file for a single target.
-
-    Arguments:
-      qualified_target: target we're generating
-      base_path: path relative to source root we're building in, used to resolve
-                 target-relative paths
-      output_filename: output .mk file name to write
-      spec, configs: gyp info
-      part_of_all: flag indicating this target is part of 'all'
-    """
-    ensure_directory_exists(output_filename)
-
-    self.fp = open(output_filename, 'w')
-
-    self.fp.write(header)
-
-    self.qualified_target = qualified_target
-    self.path = base_path
-    self.target = spec['target_name']
-    self.type = spec['type']
-    self.toolset = spec['toolset']
-
-    self.is_mac_bundle = gyp.xcode_emulation.IsMacBundle(self.flavor, spec)
-    if self.flavor == 'mac':
-      self.xcode_settings = gyp.xcode_emulation.XcodeSettings(spec)
-    else:
-      self.xcode_settings = None
-
-    deps, link_deps = self.ComputeDeps(spec)
-
-    # Some of the generation below can add extra output, sources, or
-    # link dependencies.  All of the out params of the functions that
-    # follow use names like extra_foo.
-    extra_outputs = []
-    extra_sources = []
-    extra_link_deps = []
-    extra_mac_bundle_resources = []
-    mac_bundle_deps = []
-
-    if self.is_mac_bundle:
-      self.output = self.ComputeMacBundleOutput(spec)
-      self.output_binary = self.ComputeMacBundleBinaryOutput(spec)
-    else:
-      self.output = self.output_binary = self.ComputeOutput(spec)
-
-    self.is_standalone_static_library = bool(
-        spec.get('standalone_static_library', 0))
-    self._INSTALLABLE_TARGETS = ('executable', 'loadable_module',
-                                 'shared_library')
-    if (self.is_standalone_static_library or
-        self.type in self._INSTALLABLE_TARGETS):
-      self.alias = os.path.basename(self.output)
-      install_path = self._InstallableTargetInstallPath()
-    else:
-      self.alias = self.output
-      install_path = self.output
-
-    self.WriteLn("TOOLSET := " + self.toolset)
-    self.WriteLn("TARGET := " + self.target)
-
-    # Actions must come first, since they can generate more OBJs for use below.
-    if 'actions' in spec:
-      self.WriteActions(spec['actions'], extra_sources, extra_outputs,
-                        extra_mac_bundle_resources, part_of_all)
-
-    # Rules must be early like actions.
-    if 'rules' in spec:
-      self.WriteRules(spec['rules'], extra_sources, extra_outputs,
-                      extra_mac_bundle_resources, part_of_all)
-
-    if 'copies' in spec:
-      self.WriteCopies(spec['copies'], extra_outputs, part_of_all)
-
-    # Bundle resources.
-    if self.is_mac_bundle:
-      all_mac_bundle_resources = (
-          spec.get('mac_bundle_resources', []) + extra_mac_bundle_resources)
-      self.WriteMacBundleResources(all_mac_bundle_resources, mac_bundle_deps)
-      self.WriteMacInfoPlist(mac_bundle_deps)
-
-    # Sources.
-    all_sources = spec.get('sources', []) + extra_sources
-    if all_sources:
-      self.WriteSources(
-          configs, deps, all_sources, extra_outputs,
-          extra_link_deps, part_of_all,
-          gyp.xcode_emulation.MacPrefixHeader(
-              self.xcode_settings, lambda p: Sourceify(self.Absolutify(p)),
-              self.Pchify))
-      sources = filter(Compilable, all_sources)
-      if sources:
-        self.WriteLn(SHARED_HEADER_SUFFIX_RULES_COMMENT1)
-        extensions = set([os.path.splitext(s)[1] for s in sources])
-        for ext in extensions:
-          if ext in self.suffix_rules_srcdir:
-            self.WriteLn(self.suffix_rules_srcdir[ext])
-        self.WriteLn(SHARED_HEADER_SUFFIX_RULES_COMMENT2)
-        for ext in extensions:
-          if ext in self.suffix_rules_objdir1:
-            self.WriteLn(self.suffix_rules_objdir1[ext])
-        for ext in extensions:
-          if ext in self.suffix_rules_objdir2:
-            self.WriteLn(self.suffix_rules_objdir2[ext])
-        self.WriteLn('# End of this set of suffix rules')
-
-        # Add dependency from bundle to bundle binary.
-        if self.is_mac_bundle:
-          mac_bundle_deps.append(self.output_binary)
-
-    self.WriteTarget(spec, configs, deps, extra_link_deps + link_deps,
-                     mac_bundle_deps, extra_outputs, part_of_all)
-
-    # Update global list of target outputs, used in dependency tracking.
-    target_outputs[qualified_target] = install_path
-
-    # Update global list of link dependencies.
-    if self.type in ('static_library', 'shared_library'):
-      target_link_deps[qualified_target] = self.output_binary
-
-    # Currently all versions have the same effect, but in the future the
-    # behavior could differ.
-    if self.generator_flags.get('android_ndk_version', None):
-      self.WriteAndroidNdkModuleRule(self.target, all_sources, link_deps)
-
-    self.fp.close()
-
-
-  def WriteSubMake(self, output_filename, makefile_path, targets, build_dir):
-    """Write a "sub-project" Makefile.
-
-    This is a small, wrapper Makefile that calls the top-level Makefile to build
-    the targets from a single gyp file (i.e. a sub-project).
-
-    Arguments:
-      output_filename: sub-project Makefile name to write
-      makefile_path: path to the top-level Makefile
-      targets: list of "all" targets for this sub-project
-      build_dir: build output directory, relative to the sub-project
-    """
-    ensure_directory_exists(output_filename)
-    self.fp = open(output_filename, 'w')
-    self.fp.write(header)
-    # For consistency with other builders, put sub-project build output in the
-    # sub-project dir (see test/subdirectory/gyptest-subdir-all.py).
-    self.WriteLn('export builddir_name ?= %s' %
-                 os.path.join(os.path.dirname(output_filename), build_dir))
-    self.WriteLn('.PHONY: all')
-    self.WriteLn('all:')
-    if makefile_path:
-      makefile_path = ' -C ' + makefile_path
-    self.WriteLn('\t$(MAKE)%s %s' % (makefile_path, ' '.join(targets)))
-    self.fp.close()
-
-
-  def WriteActions(self, actions, extra_sources, extra_outputs,
-                   extra_mac_bundle_resources, part_of_all):
-    """Write Makefile code for any 'actions' from the gyp input.
-
-    extra_sources: a list that will be filled in with newly generated source
-                   files, if any
-    extra_outputs: a list that will be filled in with any outputs of these
-                   actions (used to make other pieces dependent on these
-                   actions)
-    part_of_all: flag indicating this target is part of 'all'
-    """
-    env = self.GetSortedXcodeEnv()
-    for action in actions:
-      name = StringToMakefileVariable('%s_%s' % (self.qualified_target,
-                                                 action['action_name']))
-      self.WriteLn('### Rules for action "%s":' % action['action_name'])
-      inputs = action['inputs']
-      outputs = action['outputs']
-
-      # Build up a list of outputs.
-      # Collect the output dirs we'll need.
-      dirs = set()
-      for out in outputs:
-        dir = os.path.split(out)[0]
-        if dir:
-          dirs.add(dir)
-      if int(action.get('process_outputs_as_sources', False)):
-        extra_sources += outputs
-      if int(action.get('process_outputs_as_mac_bundle_resources', False)):
-        extra_mac_bundle_resources += outputs
-
-      # Write the actual command.
-      action_commands = action['action']
-      if self.flavor == 'mac':
-        action_commands = [gyp.xcode_emulation.ExpandEnvVars(command, env)
-                          for command in action_commands]
-      command = gyp.common.EncodePOSIXShellList(action_commands)
-      if 'message' in action:
-        self.WriteLn('quiet_cmd_%s = ACTION %s $@' % (name, action['message']))
-      else:
-        self.WriteLn('quiet_cmd_%s = ACTION %s $@' % (name, name))
-      if len(dirs) > 0:
-        command = 'mkdir -p %s' % ' '.join(dirs) + '; ' + command
-
-      cd_action = 'cd %s; ' % Sourceify(self.path or '.')
-
-      # command and cd_action get written to a toplevel variable called
-      # cmd_foo. Toplevel variables can't handle things that change per
-      # makefile like $(TARGET), so hardcode the target.
-      command = command.replace('$(TARGET)', self.target)
-      cd_action = cd_action.replace('$(TARGET)', self.target)
-
-      # Set LD_LIBRARY_PATH in case the action runs an executable from this
-      # build which links to shared libs from this build.
-      # actions run on the host, so they should in theory only use host
-      # libraries, but until everything is made cross-compile safe, also use
-      # target libraries.
-      # TODO(piman): when everything is cross-compile safe, remove lib.target
-      self.WriteLn('cmd_%s = LD_LIBRARY_PATH=$(builddir)/lib.host:'
-                   '$(builddir)/lib.target:$$LD_LIBRARY_PATH; '
-                   'export LD_LIBRARY_PATH; '
-                   '%s%s'
-                   % (name, cd_action, command))
-      self.WriteLn()
-      outputs = map(self.Absolutify, outputs)
-      # The makefile rules are all relative to the top dir, but the gyp actions
-      # are defined relative to their containing dir.  This replaces the obj
-      # variable for the action rule with an absolute version so that the output
-      # goes in the right place.
-      # Only write the 'obj' and 'builddir' rules for the "primary" output (:1);
-      # it's superfluous for the "extra outputs", and this avoids accidentally
-      # writing duplicate dummy rules for those outputs.
-      # Same for environment.
-      self.WriteLn("%s: obj := $(abs_obj)" % QuoteSpaces(outputs[0]))
-      self.WriteLn("%s: builddir := $(abs_builddir)" % QuoteSpaces(outputs[0]))
-      self.WriteSortedXcodeEnv(outputs[0], self.GetSortedXcodeEnv())
-
-      for input in inputs:
-        assert ' ' not in input, (
-            "Spaces in action input filenames not supported (%s)"  % input)
-      for output in outputs:
-        assert ' ' not in output, (
-            "Spaces in action output filenames not supported (%s)"  % output)
-
-      # See the comment in WriteCopies about expanding env vars.
-      outputs = [gyp.xcode_emulation.ExpandEnvVars(o, env) for o in outputs]
-      inputs = [gyp.xcode_emulation.ExpandEnvVars(i, env) for i in inputs]
-
-      self.WriteDoCmd(outputs, map(Sourceify, map(self.Absolutify, inputs)),
-                      part_of_all=part_of_all, command=name)
-
-      # Stuff the outputs in a variable so we can refer to them later.
-      outputs_variable = 'action_%s_outputs' % name
-      self.WriteLn('%s := %s' % (outputs_variable, ' '.join(outputs)))
-      extra_outputs.append('$(%s)' % outputs_variable)
-      self.WriteLn()
-
-    self.WriteLn()
-
-
-  def WriteRules(self, rules, extra_sources, extra_outputs,
-                 extra_mac_bundle_resources, part_of_all):
-    """Write Makefile code for any 'rules' from the gyp input.
-
-    extra_sources: a list that will be filled in with newly generated source
-                   files, if any
-    extra_outputs: a list that will be filled in with any outputs of these
-                   rules (used to make other pieces dependent on these rules)
-    part_of_all: flag indicating this target is part of 'all'
-    """
-    env = self.GetSortedXcodeEnv()
-    for rule in rules:
-      name = StringToMakefileVariable('%s_%s' % (self.qualified_target,
-                                                 rule['rule_name']))
-      count = 0
-      self.WriteLn('### Generated for rule %s:' % name)
-
-      all_outputs = []
-
-      for rule_source in rule.get('rule_sources', []):
-        dirs = set()
-        (rule_source_dirname, rule_source_basename) = os.path.split(rule_source)
-        (rule_source_root, rule_source_ext) = \
-            os.path.splitext(rule_source_basename)
-
-        outputs = [self.ExpandInputRoot(out, rule_source_root,
-                                        rule_source_dirname)
-                   for out in rule['outputs']]
-
-        for out in outputs:
-          dir = os.path.dirname(out)
-          if dir:
-            dirs.add(dir)
-        if int(rule.get('process_outputs_as_sources', False)):
-          extra_sources += outputs
-        if int(rule.get('process_outputs_as_mac_bundle_resources', False)):
-          extra_mac_bundle_resources += outputs
-        inputs = map(Sourceify, map(self.Absolutify, [rule_source] +
-                                    rule.get('inputs', [])))
-        actions = ['$(call do_cmd,%s_%d)' % (name, count)]
-
-        if name == 'resources_grit':
-          # HACK: This is ugly.  Grit intentionally doesn't touch the
-          # timestamp of its output file when the file doesn't change,
-          # which is fine in hash-based dependency systems like scons
-          # and forge, but not kosher in the make world.  After some
-          # discussion, hacking around it here seems like the least
-          # amount of pain.
-          actions += ['@touch --no-create $@']
-
-        # See the comment in WriteCopies about expanding env vars.
-        outputs = [gyp.xcode_emulation.ExpandEnvVars(o, env) for o in outputs]
-        inputs = [gyp.xcode_emulation.ExpandEnvVars(i, env) for i in inputs]
-
-        outputs = map(self.Absolutify, outputs)
-        all_outputs += outputs
-        # Only write the 'obj' and 'builddir' rules for the "primary" output
-        # (:1); it's superfluous for the "extra outputs", and this avoids
-        # accidentally writing duplicate dummy rules for those outputs.
-        self.WriteLn('%s: obj := $(abs_obj)' % outputs[0])
-        self.WriteLn('%s: builddir := $(abs_builddir)' % outputs[0])
-        self.WriteMakeRule(outputs, inputs + ['FORCE_DO_CMD'], actions)
-        for output in outputs:
-          assert ' ' not in output, (
-              "Spaces in rule filenames not yet supported (%s)"  % output)
-        self.WriteLn('all_deps += %s' % ' '.join(outputs))
-
-        action = [self.ExpandInputRoot(ac, rule_source_root,
-                                       rule_source_dirname)
-                  for ac in rule['action']]
-        mkdirs = ''
-        if len(dirs) > 0:
-          mkdirs = 'mkdir -p %s; ' % ' '.join(dirs)
-        cd_action = 'cd %s; ' % Sourceify(self.path or '.')
-
-        # action, cd_action, and mkdirs get written to a toplevel variable
-        # called cmd_foo. Toplevel variables can't handle things that change
-        # per makefile like $(TARGET), so hardcode the target.
-        if self.flavor == 'mac':
-          action = [gyp.xcode_emulation.ExpandEnvVars(command, env)
-                    for command in action]
-        action = gyp.common.EncodePOSIXShellList(action)
-        action = action.replace('$(TARGET)', self.target)
-        cd_action = cd_action.replace('$(TARGET)', self.target)
-        mkdirs = mkdirs.replace('$(TARGET)', self.target)
-
-        # Set LD_LIBRARY_PATH in case the rule runs an executable from this
-        # build which links to shared libs from this build.
-        # rules run on the host, so they should in theory only use host
-        # libraries, but until everything is made cross-compile safe, also use
-        # target libraries.
-        # TODO(piman): when everything is cross-compile safe, remove lib.target
-        self.WriteLn(
-            "cmd_%(name)s_%(count)d = LD_LIBRARY_PATH="
-              "$(builddir)/lib.host:$(builddir)/lib.target:$$LD_LIBRARY_PATH; "
-              "export LD_LIBRARY_PATH; "
-              "%(cd_action)s%(mkdirs)s%(action)s" % {
-          'action': action,
-          'cd_action': cd_action,
-          'count': count,
-          'mkdirs': mkdirs,
-          'name': name,
-        })
-        self.WriteLn(
-            'quiet_cmd_%(name)s_%(count)d = RULE %(name)s_%(count)d $@' % {
-          'count': count,
-          'name': name,
-        })
-        self.WriteLn()
-        count += 1
-
-      outputs_variable = 'rule_%s_outputs' % name
-      self.WriteList(all_outputs, outputs_variable)
-      extra_outputs.append('$(%s)' % outputs_variable)
-
-      self.WriteLn('### Finished generating for rule: %s' % name)
-      self.WriteLn()
-    self.WriteLn('### Finished generating for all rules')
-    self.WriteLn('')
-
-
-  def WriteCopies(self, copies, extra_outputs, part_of_all):
-    """Write Makefile code for any 'copies' from the gyp input.
-
-    extra_outputs: a list that will be filled in with any outputs of this action
-                   (used to make other pieces dependent on this action)
-    part_of_all: flag indicating this target is part of 'all'
-    """
-    self.WriteLn('### Generated for copy rule.')
-
-    variable = StringToMakefileVariable(self.qualified_target + '_copies')
-    outputs = []
-    for copy in copies:
-      for path in copy['files']:
-        # Absolutify() may call normpath, and will strip trailing slashes.
-        path = Sourceify(self.Absolutify(path))
-        filename = os.path.split(path)[1]
-        output = Sourceify(self.Absolutify(os.path.join(copy['destination'],
-                                                        filename)))
-
-        # If the output path has variables in it, which happens in practice for
-        # 'copies', writing the environment as target-local doesn't work,
-        # because the variables are already needed for the target name.
-        # Copying the environment variables into global make variables doesn't
-        # work either, because then the .d files will potentially contain spaces
-        # after variable expansion, and .d file handling cannot handle spaces.
-        # As a workaround, manually expand variables at gyp time. Since 'copies'
-        # can't run scripts, there's no need to write the env then.
-        # WriteDoCmd() will escape spaces for .d files.
-        env = self.GetSortedXcodeEnv()
-        output = gyp.xcode_emulation.ExpandEnvVars(output, env)
-        path = gyp.xcode_emulation.ExpandEnvVars(path, env)
-        self.WriteDoCmd([output], [path], 'copy', part_of_all)
-        outputs.append(output)
-    self.WriteLn('%s = %s' % (variable, ' '.join(map(QuoteSpaces, outputs))))
-    extra_outputs.append('$(%s)' % variable)
-    self.WriteLn()
-
-
-  def WriteMacBundleResources(self, resources, bundle_deps):
-    """Writes Makefile code for 'mac_bundle_resources'."""
-    self.WriteLn('### Generated for mac_bundle_resources')
-
-    for output, res in gyp.xcode_emulation.GetMacBundleResources(
-        generator_default_variables['PRODUCT_DIR'], self.xcode_settings,
-        map(Sourceify, map(self.Absolutify, resources))):
-      self.WriteDoCmd([output], [res], 'mac_tool,,,copy-bundle-resource',
-                      part_of_all=True)
-      bundle_deps.append(output)
-
-
-  def WriteMacInfoPlist(self, bundle_deps):
-    """Write Makefile code for bundle Info.plist files."""
-    info_plist, out, defines, extra_env = gyp.xcode_emulation.GetMacInfoPlist(
-        generator_default_variables['PRODUCT_DIR'], self.xcode_settings,
-        lambda p: Sourceify(self.Absolutify(p)))
-    if not info_plist:
-      return
-    if defines:
-      # Create an intermediate file to store preprocessed results.
-      intermediate_plist = ('$(obj).$(TOOLSET)/$(TARGET)/' +
-          os.path.basename(info_plist))
-      self.WriteList(defines, intermediate_plist + ': INFOPLIST_DEFINES', '-D',
-          quoter=EscapeCppDefine)
-      self.WriteMakeRule([intermediate_plist], [info_plist],
-          ['$(call do_cmd,infoplist)',
-           # "Convert" the plist so that any weird whitespace changes from the
-           # preprocessor do not affect the XML parser in mac_tool.
-           '@plutil -convert xml1 $@ $@'])
-      info_plist = intermediate_plist
-    # plists can contain env vars; expand them into the copied file.
-    self.WriteSortedXcodeEnv(
-        out, self.GetSortedXcodeEnv(additional_settings=extra_env))
-    self.WriteDoCmd([out], [info_plist], 'mac_tool,,,copy-info-plist',
-                    part_of_all=True)
-    bundle_deps.append(out)
-
-
-  def WriteSources(self, configs, deps, sources,
-                   extra_outputs, extra_link_deps,
-                   part_of_all, precompiled_header):
-    """Write Makefile code for any 'sources' from the gyp input.
-    These are source files necessary to build the current target.
-
-    configs, deps, sources: input from gyp.
-    extra_outputs: a list of extra outputs this action should be dependent on;
-                   used to serialize action/rules before compilation
-    extra_link_deps: a list that will be filled in with any outputs of
-                     compilation (to be used in link lines)
-    part_of_all: flag indicating this target is part of 'all'
-    """
-
-    # Write configuration-specific variables for CFLAGS, etc.
-    for configname in sorted(configs.keys()):
-      config = configs[configname]
-      self.WriteList(config.get('defines'), 'DEFS_%s' % configname, prefix='-D',
-          quoter=EscapeCppDefine)
-
-      if self.flavor == 'mac':
-        cflags = self.xcode_settings.GetCflags(configname)
-        cflags_c = self.xcode_settings.GetCflagsC(configname)
-        cflags_cc = self.xcode_settings.GetCflagsCC(configname)
-        cflags_objc = self.xcode_settings.GetCflagsObjC(configname)
-        cflags_objcc = self.xcode_settings.GetCflagsObjCC(configname)
-      else:
-        cflags = config.get('cflags')
-        cflags_c = config.get('cflags_c')
-        cflags_cc = config.get('cflags_cc')
-
-      self.WriteLn("# Flags passed to all source files.")
-      self.WriteList(cflags, 'CFLAGS_%s' % configname)
-      self.WriteLn("# Flags passed to only C files.")
-      self.WriteList(cflags_c, 'CFLAGS_C_%s' % configname)
-      self.WriteLn("# Flags passed to only C++ files.")
-      self.WriteList(cflags_cc, 'CFLAGS_CC_%s' % configname)
-      if self.flavor == 'mac':
-        self.WriteLn("# Flags passed to only ObjC files.")
-        self.WriteList(cflags_objc, 'CFLAGS_OBJC_%s' % configname)
-        self.WriteLn("# Flags passed to only ObjC++ files.")
-        self.WriteList(cflags_objcc, 'CFLAGS_OBJCC_%s' % configname)
-      includes = config.get('include_dirs')
-      if includes:
-        includes = map(Sourceify, map(self.Absolutify, includes))
-      self.WriteList(includes, 'INCS_%s' % configname, prefix='-I')
-
-    compilable = filter(Compilable, sources)
-    objs = map(self.Objectify, map(self.Absolutify, map(Target, compilable)))
-    self.WriteList(objs, 'OBJS')
-
-    for obj in objs:
-      assert ' ' not in obj, (
-          "Spaces in object filenames not supported (%s)"  % obj)
-    self.WriteLn('# Add to the list of files we specially track '
-                 'dependencies for.')
-    self.WriteLn('all_deps += $(OBJS)')
-    self.WriteLn()
-
-    # Make sure our dependencies are built first.
-    if deps:
-      self.WriteMakeRule(['$(OBJS)'], deps,
-                         comment = 'Make sure our dependencies are built '
-                                   'before any of us.',
-                         order_only = True)
-
-    # Make sure the actions and rules run first.
-    # If they generate any extra headers etc., the per-.o file dep tracking
-    # will catch the proper rebuilds, so order only is still ok here.
-    if extra_outputs:
-      self.WriteMakeRule(['$(OBJS)'], extra_outputs,
-                         comment = 'Make sure our actions/rules run '
-                                   'before any of us.',
-                         order_only = True)
-
-    pchdeps = precompiled_header.GetObjDependencies(compilable, objs)
-    if pchdeps:
-      self.WriteLn('# Dependencies from obj files to their precompiled headers')
-      for source, obj, gch in pchdeps:
-        self.WriteLn('%s: %s' % (obj, gch))
-      self.WriteLn('# End precompiled header dependencies')
-
-    if objs:
-      extra_link_deps.append('$(OBJS)')
-      self.WriteLn("""\
-# CFLAGS et al overrides must be target-local.
-# See "Target-specific Variable Values" in the GNU Make manual.""")
-      self.WriteLn("$(OBJS): TOOLSET := $(TOOLSET)")
-      self.WriteLn("$(OBJS): GYP_CFLAGS := "
-                   "$(DEFS_$(BUILDTYPE)) "
-                   "$(INCS_$(BUILDTYPE)) "
-                   "%s " % precompiled_header.GetInclude('c') +
-                   "$(CFLAGS_$(BUILDTYPE)) "
-                   "$(CFLAGS_C_$(BUILDTYPE))")
-      self.WriteLn("$(OBJS): GYP_CXXFLAGS := "
-                   "$(DEFS_$(BUILDTYPE)) "
-                   "$(INCS_$(BUILDTYPE)) "
-                   "%s " % precompiled_header.GetInclude('cc') +
-                   "$(CFLAGS_$(BUILDTYPE)) "
-                   "$(CFLAGS_CC_$(BUILDTYPE))")
-      if self.flavor == 'mac':
-        self.WriteLn("$(OBJS): GYP_OBJCFLAGS := "
-                     "$(DEFS_$(BUILDTYPE)) "
-                     "$(INCS_$(BUILDTYPE)) "
-                     "%s " % precompiled_header.GetInclude('m') +
-                     "$(CFLAGS_$(BUILDTYPE)) "
-                     "$(CFLAGS_C_$(BUILDTYPE)) "
-                     "$(CFLAGS_OBJC_$(BUILDTYPE))")
-        self.WriteLn("$(OBJS): GYP_OBJCXXFLAGS := "
-                     "$(DEFS_$(BUILDTYPE)) "
-                     "$(INCS_$(BUILDTYPE)) "
-                     "%s " % precompiled_header.GetInclude('mm') +
-                     "$(CFLAGS_$(BUILDTYPE)) "
-                     "$(CFLAGS_CC_$(BUILDTYPE)) "
-                     "$(CFLAGS_OBJCC_$(BUILDTYPE))")
-
-    self.WritePchTargets(precompiled_header.GetPchBuildCommands())
-
-    # If there are any object files in our input file list, link them into our
-    # output.
-    extra_link_deps += filter(Linkable, sources)
-
-    self.WriteLn()
-
-  def WritePchTargets(self, pch_commands):
-    """Writes make rules to compile prefix headers."""
-    if not pch_commands:
-      return
-
-    for gch, lang_flag, lang, input in pch_commands:
-      extra_flags = {
-        'c': '$(CFLAGS_C_$(BUILDTYPE))',
-        'cc': '$(CFLAGS_CC_$(BUILDTYPE))',
-        'm': '$(CFLAGS_C_$(BUILDTYPE)) $(CFLAGS_OBJC_$(BUILDTYPE))',
-        'mm': '$(CFLAGS_CC_$(BUILDTYPE)) $(CFLAGS_OBJCC_$(BUILDTYPE))',
-      }[lang]
-      var_name = {
-        'c': 'GYP_PCH_CFLAGS',
-        'cc': 'GYP_PCH_CXXFLAGS',
-        'm': 'GYP_PCH_OBJCFLAGS',
-        'mm': 'GYP_PCH_OBJCXXFLAGS',
-      }[lang]
-      self.WriteLn("%s: %s := %s " % (gch, var_name, lang_flag) +
-                   "$(DEFS_$(BUILDTYPE)) "
-                   "$(INCS_$(BUILDTYPE)) "
-                   "$(CFLAGS_$(BUILDTYPE)) " +
-                   extra_flags)
-
-      self.WriteLn('%s: %s FORCE_DO_CMD' % (gch, input))
-      self.WriteLn('\t@$(call do_cmd,pch_%s,1)' % lang)
-      self.WriteLn('')
-      assert ' ' not in gch, (
-          "Spaces in gch filenames not supported (%s)"  % gch)
-      self.WriteLn('all_deps += %s' % gch)
-      self.WriteLn('')
-
-
-  def ComputeOutputBasename(self, spec):
-    """Return the 'output basename' of a gyp spec.
-
-    E.g., the loadable module 'foobar' in directory 'baz' will produce
-      'libfoobar.so'
-    """
-    assert not self.is_mac_bundle
-
-    if self.flavor == 'mac' and self.type in (
-        'static_library', 'executable', 'shared_library', 'loadable_module'):
-      return self.xcode_settings.GetExecutablePath()
-
-    target = spec['target_name']
-    target_prefix = ''
-    target_ext = ''
-    if self.type == 'static_library':
-      if target[:3] == 'lib':
-        target = target[3:]
-      target_prefix = 'lib'
-      target_ext = '.a'
-    elif self.type in ('loadable_module', 'shared_library'):
-      if target[:3] == 'lib':
-        target = target[3:]
-      target_prefix = 'lib'
-      target_ext = '.so'
-    elif self.type == 'none':
-      target = '%s.stamp' % target
-    elif self.type != 'executable':
-      print ('ERROR: What output file should be generated? '
-             'type %s, target %s' % (self.type, target))
-
-    target_prefix = spec.get('product_prefix', target_prefix)
-    target = spec.get('product_name', target)
-    product_ext = spec.get('product_extension')
-    if product_ext:
-      target_ext = '.' + product_ext
-
-    return target_prefix + target + target_ext
-
-
-  def _InstallImmediately(self):
-    return self.toolset == 'target' and self.flavor == 'mac' and self.type in (
-          'static_library', 'executable', 'shared_library', 'loadable_module')
-
-
-  def ComputeOutput(self, spec):
-    """Return the 'output' (full output path) of a gyp spec.
-
-    E.g., the loadable module 'foobar' in directory 'baz' will produce
-      '$(obj)/baz/libfoobar.so'
-    """
-    assert not self.is_mac_bundle
-
-    path = os.path.join('$(obj).' + self.toolset, self.path)
-    if self.type == 'executable' or self._InstallImmediately():
-      path = '$(builddir)'
-    path = spec.get('product_dir', path)
-    return os.path.join(path, self.ComputeOutputBasename(spec))
-
-
-  def ComputeMacBundleOutput(self, spec):
-    """Return the 'output' (full output path) to a bundle output directory."""
-    assert self.is_mac_bundle
-    path = generator_default_variables['PRODUCT_DIR']
-    return os.path.join(path, self.xcode_settings.GetWrapperName())
-
-
-  def ComputeMacBundleBinaryOutput(self, spec):
-    """Return the 'output' (full output path) to the binary in a bundle."""
-    path = generator_default_variables['PRODUCT_DIR']
-    return os.path.join(path, self.xcode_settings.GetExecutablePath())
-
-
-  def ComputeDeps(self, spec):
-    """Compute the dependencies of a gyp spec.
-
-    Returns a tuple (deps, link_deps), where each is a list of
-    filenames that will need to be put in front of make for either
-    building (deps) or linking (link_deps).
-    """
-    deps = []
-    link_deps = []
-    if 'dependencies' in spec:
-      deps.extend([target_outputs[dep] for dep in spec['dependencies']
-                   if target_outputs[dep]])
-      for dep in spec['dependencies']:
-        if dep in target_link_deps:
-          link_deps.append(target_link_deps[dep])
-      deps.extend(link_deps)
-      # TODO: It seems we need to transitively link in libraries (e.g. -lfoo)?
-      # This hack makes it work:
-      # link_deps.extend(spec.get('libraries', []))
-    return (gyp.common.uniquer(deps), gyp.common.uniquer(link_deps))
-
-
-  def WriteDependencyOnExtraOutputs(self, target, extra_outputs):
-    self.WriteMakeRule([self.output_binary], extra_outputs,
-                       comment = 'Build our special outputs first.',
-                       order_only = True)
-
-
-  def WriteTarget(self, spec, configs, deps, link_deps, bundle_deps,
-                  extra_outputs, part_of_all):
-    """Write Makefile code to produce the final target of the gyp spec.
-
-    spec, configs: input from gyp.
-    deps, link_deps: dependency lists; see ComputeDeps()
-    extra_outputs: any extra outputs that our target should depend on
-    part_of_all: flag indicating this target is part of 'all'
-    """
-
-    self.WriteLn('### Rules for final target.')
-
-    if extra_outputs:
-      self.WriteDependencyOnExtraOutputs(self.output_binary, extra_outputs)
-      self.WriteMakeRule(extra_outputs, deps,
-                         comment=('Preserve order dependency of '
-                                  'special output on deps.'),
-                         order_only = True)
-
-    target_postbuilds = {}
-    if self.type != 'none':
-      for configname in sorted(configs.keys()):
-        config = configs[configname]
-        if self.flavor == 'mac':
-          ldflags = self.xcode_settings.GetLdflags(configname,
-              generator_default_variables['PRODUCT_DIR'],
-              lambda p: Sourceify(self.Absolutify(p)))
-
-          # TARGET_POSTBUILDS_$(BUILDTYPE) is added to postbuilds later on.
-          gyp_to_build = gyp.common.InvertRelativePath(self.path)
-          target_postbuild = self.xcode_settings.GetTargetPostbuilds(
-              configname,
-              QuoteSpaces(os.path.normpath(os.path.join(gyp_to_build,
-                                                        self.output))),
-              QuoteSpaces(os.path.normpath(os.path.join(gyp_to_build,
-                                                        self.output_binary))))
-          if target_postbuild:
-            target_postbuilds[configname] = target_postbuild
-        else:
-          ldflags = config.get('ldflags', [])
-          # Compute an rpath for this output if needed.
-          if any(dep.endswith('.so') for dep in deps):
-            # We want to get the literal string "$ORIGIN" into the link command,
-            # so we need lots of escaping.
-            ldflags.append(r'-Wl,-rpath=\$$ORIGIN/lib.%s/' % self.toolset)
-            ldflags.append(r'-Wl,-rpath-link=\$(builddir)/lib.%s/' %
-                           self.toolset)
-        self.WriteList(ldflags, 'LDFLAGS_%s' % configname)
-        if self.flavor == 'mac':
-          self.WriteList(self.xcode_settings.GetLibtoolflags(configname),
-                         'LIBTOOLFLAGS_%s' % configname)
-      libraries = spec.get('libraries')
-      if libraries:
-        # Remove duplicate entries
-        libraries = gyp.common.uniquer(libraries)
-        if self.flavor == 'mac':
-          libraries = self.xcode_settings.AdjustLibraries(libraries)
-      self.WriteList(libraries, 'LIBS')
-      self.WriteLn('%s: GYP_LDFLAGS := $(LDFLAGS_$(BUILDTYPE))' %
-          QuoteSpaces(self.output_binary))
-      self.WriteLn('%s: LIBS := $(LIBS)' % QuoteSpaces(self.output_binary))
-
-      if self.flavor == 'mac':
-        self.WriteLn('%s: GYP_LIBTOOLFLAGS := $(LIBTOOLFLAGS_$(BUILDTYPE))' %
-            QuoteSpaces(self.output_binary))
-
-    # Postbuild actions. Like actions, but implicitly depend on the target's
-    # output.
-    postbuilds = []
-    if self.flavor == 'mac':
-      if target_postbuilds:
-        postbuilds.append('$(TARGET_POSTBUILDS_$(BUILDTYPE))')
-      postbuilds.extend(
-          gyp.xcode_emulation.GetSpecPostbuildCommands(spec))
-
-    if postbuilds:
-      # Envvars may be referenced by TARGET_POSTBUILDS_$(BUILDTYPE),
-      # so we must output its definition first, since we declare variables
-      # using ":=".
-      self.WriteSortedXcodeEnv(self.output, self.GetSortedXcodePostbuildEnv())
-
-      for configname in target_postbuilds:
-        self.WriteLn('%s: TARGET_POSTBUILDS_%s := %s' %
-            (QuoteSpaces(self.output),
-             configname,
-             gyp.common.EncodePOSIXShellList(target_postbuilds[configname])))
-
-      # Postbuilds expect to be run in the gyp file's directory, so insert an
-      # implicit postbuild to cd to there.
-      postbuilds.insert(0, gyp.common.EncodePOSIXShellList(['cd', self.path]))
-      for i in xrange(len(postbuilds)):
-        if not postbuilds[i].startswith('$'):
-          postbuilds[i] = EscapeShellArgument(postbuilds[i])
-      self.WriteLn('%s: builddir := $(abs_builddir)' % QuoteSpaces(self.output))
-      self.WriteLn('%s: POSTBUILDS := %s' % (
-          QuoteSpaces(self.output), ' '.join(postbuilds)))
-
-    # A bundle directory depends on its dependencies such as bundle resources
-    # and bundle binary. When all dependencies have been built, the bundle
-    # needs to be packaged.
-    if self.is_mac_bundle:
-      # If the framework doesn't contain a binary, then nothing depends
-      # on the actions -- make the framework depend on them directly too.
-      self.WriteDependencyOnExtraOutputs(self.output, extra_outputs)
-
-      # Bundle dependencies. Note that the code below adds actions to this
-      # target, so if you move these two lines, move the lines below as well.
-      self.WriteList(map(QuoteSpaces, bundle_deps), 'BUNDLE_DEPS')
-      self.WriteLn('%s: $(BUNDLE_DEPS)' % QuoteSpaces(self.output))
-
-      # After the framework is built, package it. Needs to happen before
-      # postbuilds, since postbuilds depend on this.
-      if self.type in ('shared_library', 'loadable_module'):
-        self.WriteLn('\t@$(call do_cmd,mac_package_framework,,,%s)' %
-            self.xcode_settings.GetFrameworkVersion())
-
-      # Bundle postbuilds can depend on the whole bundle, so run them after
-      # the bundle is packaged, not merely after the bundle binary is done.
-      if postbuilds:
-        self.WriteLn('\t@$(call do_postbuilds)')
-      postbuilds = []  # Don't write postbuilds for target's output.
-
-      # Needed by test/mac/gyptest-rebuild.py.
-      self.WriteLn('\t@true  # No-op, used by tests')
-
-      # Since this target depends on binary and resources which are in
-      # nested subfolders, the framework directory will usually be older
-      # than its dependencies. To prevent this rule from executing
-      # on every build (expensive, especially with postbuilds), explicitly
-      # update the time on the framework directory.
-      self.WriteLn('\t@touch -c %s' % QuoteSpaces(self.output))
-
-    if postbuilds:
-      assert not self.is_mac_bundle, ('Postbuilds for bundles should be done '
-          'on the bundle, not the binary (target \'%s\')' % self.target)
-      assert 'product_dir' not in spec, ('Postbuilds do not work with '
-          'custom product_dir')
-
-    if self.type == 'executable':
-      self.WriteLn('%s: LD_INPUTS := %s' % (
-          QuoteSpaces(self.output_binary),
-          ' '.join(map(QuoteSpaces, link_deps))))
-      if self.toolset == 'host' and self.flavor == 'android':
-        self.WriteDoCmd([self.output_binary], link_deps, 'link_host',
-                        part_of_all, postbuilds=postbuilds)
-      else:
-        self.WriteDoCmd([self.output_binary], link_deps, 'link', part_of_all,
-                        postbuilds=postbuilds)
-
-    elif self.type == 'static_library':
-      for link_dep in link_deps:
-        assert ' ' not in link_dep, (
-            "Spaces in alink input filenames not supported (%s)"  % link_dep)
-      if (self.flavor not in ('mac', 'openbsd', 'win') and not
-          self.is_standalone_static_library):
-        self.WriteDoCmd([self.output_binary], link_deps, 'alink_thin',
-                        part_of_all, postbuilds=postbuilds)
-      else:
-        self.WriteDoCmd([self.output_binary], link_deps, 'alink', part_of_all,
-                        postbuilds=postbuilds)
-    elif self.type == 'shared_library':
-      self.WriteLn('%s: LD_INPUTS := %s' % (
-            QuoteSpaces(self.output_binary),
-            ' '.join(map(QuoteSpaces, link_deps))))
-      self.WriteDoCmd([self.output_binary], link_deps, 'solink', part_of_all,
-                      postbuilds=postbuilds)
-    elif self.type == 'loadable_module':
-      for link_dep in link_deps:
-        assert ' ' not in link_dep, (
-            "Spaces in module input filenames not supported (%s)"  % link_dep)
-      if self.toolset == 'host' and self.flavor == 'android':
-        self.WriteDoCmd([self.output_binary], link_deps, 'solink_module_host',
-                        part_of_all, postbuilds=postbuilds)
-      else:
-        self.WriteDoCmd(
-            [self.output_binary], link_deps, 'solink_module', part_of_all,
-            postbuilds=postbuilds)
-    elif self.type == 'none':
-      # Write a stamp line.
-      self.WriteDoCmd([self.output_binary], deps, 'touch', part_of_all,
-                      postbuilds=postbuilds)
-    else:
-      print "WARNING: no output for", self.type, target
-
-    # Add an alias for each target (if there are any outputs).
-    # Installable target aliases are created below.
-    if ((self.output and self.output != self.target) and
-        (self.type not in self._INSTALLABLE_TARGETS)):
-      self.WriteMakeRule([self.target], [self.output],
-                         comment='Add target alias', phony = True)
-      if part_of_all:
-        self.WriteMakeRule(['all'], [self.target],
-                           comment = 'Add target alias to "all" target.',
-                           phony = True)
-
-    # Add special-case rules for our installable targets.
-    # 1) They need to install to the build dir or "product" dir.
-    # 2) They get shortcuts for building (e.g. "make chrome").
-    # 3) They are part of "make all".
-    if (self.type in self._INSTALLABLE_TARGETS or
-        self.is_standalone_static_library):
-      if self.type == 'shared_library':
-        file_desc = 'shared library'
-      elif self.type == 'static_library':
-        file_desc = 'static library'
-      else:
-        file_desc = 'executable'
-      install_path = self._InstallableTargetInstallPath()
-      installable_deps = [self.output]
-      if (self.flavor == 'mac' and not 'product_dir' in spec and
-          self.toolset == 'target'):
-        # On mac, products are created in install_path immediately.
-        assert install_path == self.output, '%s != %s' % (
-            install_path, self.output)
-
-      # Point the target alias to the final binary output.
-      self.WriteMakeRule([self.target], [install_path],
-                         comment='Add target alias', phony = True)
-      if install_path != self.output:
-        assert not self.is_mac_bundle  # See comment a few lines above.
-        self.WriteDoCmd([install_path], [self.output], 'copy',
-                        comment = 'Copy this to the %s output path.' %
-                        file_desc, part_of_all=part_of_all)
-        installable_deps.append(install_path)
-      if self.output != self.alias and self.alias != self.target:
-        self.WriteMakeRule([self.alias], installable_deps,
-                           comment = 'Short alias for building this %s.' %
-                           file_desc, phony = True)
-      if part_of_all:
-        self.WriteMakeRule(['all'], [install_path],
-                           comment = 'Add %s to "all" target.' % file_desc,
-                           phony = True)
-
-
-  def WriteList(self, value_list, variable=None, prefix='',
-                quoter=QuoteIfNecessary):
-    """Write a variable definition that is a list of values.
-
-    E.g. WriteList(['a','b'], 'foo', prefix='blah') writes out
-         foo = blaha blahb
-    but in a pretty-printed style.
-    """
-    values = ''
-    if value_list:
-      value_list = [quoter(prefix + l) for l in value_list]
-      values = ' \\\n\t' + ' \\\n\t'.join(value_list)
-    self.fp.write('%s :=%s\n\n' % (variable, values))
-
-
-  def WriteDoCmd(self, outputs, inputs, command, part_of_all, comment=None,
-                 postbuilds=False):
-    """Write a Makefile rule that uses do_cmd.
-
-    This makes the outputs dependent on the command line that was run,
-    as well as supporting the V= make command line flag.
-    """
-    suffix = ''
-    if postbuilds:
-      assert ',' not in command
-      suffix = ',,1'  # Tell do_cmd to honor $POSTBUILDS
-    self.WriteMakeRule(outputs, inputs,
-                       actions = ['$(call do_cmd,%s%s)' % (command, suffix)],
-                       comment = comment,
-                       force = True)
-    # Add our outputs to the list of targets we read depfiles from.
-    # all_deps is only used for deps file reading, and for deps files we replace
-    # spaces with ? because escaping doesn't work with make's $(sort) and
-    # other functions.
-    outputs = [QuoteSpaces(o, SPACE_REPLACEMENT) for o in outputs]
-    self.WriteLn('all_deps += %s' % ' '.join(outputs))
-
-
-  def WriteMakeRule(self, outputs, inputs, actions=None, comment=None,
-                    order_only=False, force=False, phony=False):
-    """Write a Makefile rule, with some extra tricks.
-
-    outputs: a list of outputs for the rule (note: this is not directly
-             supported by make; see comments below)
-    inputs: a list of inputs for the rule
-    actions: a list of shell commands to run for the rule
-    comment: a comment to put in the Makefile above the rule (also useful
-             for making this Python script's code self-documenting)
-    order_only: if true, makes the dependency order-only
-    force: if true, include FORCE_DO_CMD as an order-only dep
-    phony: if true, the rule does not actually generate the named output, the
-           output is just a name to run the rule
-    """
-    outputs = map(QuoteSpaces, outputs)
-    inputs = map(QuoteSpaces, inputs)
-
-    if comment:
-      self.WriteLn('# ' + comment)
-    if phony:
-      self.WriteLn('.PHONY: ' + ' '.join(outputs))
-    # TODO(evanm): just make order_only a list of deps instead of these hacks.
-    if order_only:
-      order_insert = '| '
-      pick_output = ' '.join(outputs)
-    else:
-      order_insert = ''
-      pick_output = outputs[0]
-    if force:
-      force_append = ' FORCE_DO_CMD'
-    else:
-      force_append = ''
-    if actions:
-      self.WriteLn("%s: TOOLSET := $(TOOLSET)" % outputs[0])
-    self.WriteLn('%s: %s%s%s' % (pick_output, order_insert, ' '.join(inputs),
-                                 force_append))
-    if actions:
-      for action in actions:
-        self.WriteLn('\t%s' % action)
-    if not order_only and len(outputs) > 1:
-      # If we have more than one output, a rule like
-      #   foo bar: baz
-      # means that for *each* output we must run the action, potentially
-      # in parallel.  That is not what we're trying to write -- what
-      # we want is that we run the action once and it generates all
-      # the files.
-      # http://www.gnu.org/software/hello/manual/automake/Multiple-Outputs.html
-      # discusses this problem and has this solution:
-      # 1) Write the naive rule that would produce parallel runs of
-      # the action.
-      # 2) Make the outputs serialized on each other, so we won't start
-      # a parallel run until the first run finishes, at which point
-      # we'll have generated all the outputs and we're done.
-      self.WriteLn('%s: %s' % (' '.join(outputs[1:]), outputs[0]))
-      # Add a dummy command to the "extra outputs" rule, otherwise make seems to
-      # think these outputs haven't (couldn't have?) changed, and thus doesn't
-      # flag them as changed (i.e. include in '$?') when evaluating dependent
-      # rules, which in turn causes do_cmd() to skip running dependent commands.
-      self.WriteLn('%s: ;' % (' '.join(outputs[1:])))
-    self.WriteLn()
-
-
-  def WriteAndroidNdkModuleRule(self, module_name, all_sources, link_deps):
-    """Write a set of LOCAL_XXX definitions for Android NDK.
-
-    These variable definitions will be used by Android NDK but do nothing for
-    non-Android applications.
-
-    Arguments:
-      module_name: Android NDK module name, which must be unique among all
-          module names.
-      all_sources: A list of source files (will be filtered by Compilable).
-      link_deps: A list of link dependencies, which must be sorted in
-          the order from dependencies to dependents.
-    """
-    if self.type not in ('executable', 'shared_library', 'static_library'):
-      return
-
-    self.WriteLn('# Variable definitions for Android applications')
-    self.WriteLn('include $(CLEAR_VARS)')
-    self.WriteLn('LOCAL_MODULE := ' + module_name)
-    self.WriteLn('LOCAL_CFLAGS := $(CFLAGS_$(BUILDTYPE)) '
-                 '$(DEFS_$(BUILDTYPE)) '
-                 # LOCAL_CFLAGS is applied to both C and C++.  There is
-                 # no way to specify $(CFLAGS_C_$(BUILDTYPE)) only for C
-                 # sources.
-                 '$(CFLAGS_C_$(BUILDTYPE)) '
-                 # $(INCS_$(BUILDTYPE)) includes the prefix '-I' while
-                 # LOCAL_C_INCLUDES does not expect it.  So put it in
-                 # LOCAL_CFLAGS.
-                 '$(INCS_$(BUILDTYPE))')
-    # LOCAL_CXXFLAGS is obsolete and LOCAL_CPPFLAGS is preferred.
-    self.WriteLn('LOCAL_CPPFLAGS := $(CFLAGS_CC_$(BUILDTYPE))')
-    self.WriteLn('LOCAL_C_INCLUDES :=')
-    self.WriteLn('LOCAL_LDLIBS := $(LDFLAGS_$(BUILDTYPE)) $(LIBS)')
-
-    # Detect the C++ extension.
-    cpp_ext = {'.cc': 0, '.cpp': 0, '.cxx': 0}
-    default_cpp_ext = '.cpp'
-    for filename in all_sources:
-      ext = os.path.splitext(filename)[1]
-      if ext in cpp_ext:
-        cpp_ext[ext] += 1
-        if cpp_ext[ext] > cpp_ext[default_cpp_ext]:
-          default_cpp_ext = ext
-    self.WriteLn('LOCAL_CPP_EXTENSION := ' + default_cpp_ext)
-
-    self.WriteList(map(self.Absolutify, filter(Compilable, all_sources)),
-                   'LOCAL_SRC_FILES')
-
-    # Filter out those which do not match prefix and suffix and produce
-    # the resulting list without prefix and suffix.
-    def DepsToModules(deps, prefix, suffix):
-      modules = []
-      for filepath in deps:
-        filename = os.path.basename(filepath)
-        if filename.startswith(prefix) and filename.endswith(suffix):
-          modules.append(filename[len(prefix):-len(suffix)])
-      return modules
-
-    # Retrieve the default value of 'SHARED_LIB_SUFFIX'
-    params = {'flavor': 'linux'}
-    default_variables = {}
-    CalculateVariables(default_variables, params)
-
-    self.WriteList(
-        DepsToModules(link_deps,
-                      generator_default_variables['SHARED_LIB_PREFIX'],
-                      default_variables['SHARED_LIB_SUFFIX']),
-        'LOCAL_SHARED_LIBRARIES')
-    self.WriteList(
-        DepsToModules(link_deps,
-                      generator_default_variables['STATIC_LIB_PREFIX'],
-                      generator_default_variables['STATIC_LIB_SUFFIX']),
-        'LOCAL_STATIC_LIBRARIES')
-
-    if self.type == 'executable':
-      self.WriteLn('include $(BUILD_EXECUTABLE)')
-    elif self.type == 'shared_library':
-      self.WriteLn('include $(BUILD_SHARED_LIBRARY)')
-    elif self.type == 'static_library':
-      self.WriteLn('include $(BUILD_STATIC_LIBRARY)')
-    self.WriteLn()
-
-
-  def WriteLn(self, text=''):
-    self.fp.write(text + '\n')
-
-
-  def GetSortedXcodeEnv(self, additional_settings=None):
-    return gyp.xcode_emulation.GetSortedXcodeEnv(
-        self.xcode_settings, "$(abs_builddir)",
-        os.path.join("$(abs_srcdir)", self.path), "$(BUILDTYPE)",
-        additional_settings)
-
-
-  def GetSortedXcodePostbuildEnv(self):
-    # CHROMIUM_STRIP_SAVE_FILE is a chromium-specific hack.
-    # TODO(thakis): It would be nice to have some general mechanism instead.
-    strip_save_file = self.xcode_settings.GetPerTargetSetting(
-        'CHROMIUM_STRIP_SAVE_FILE', '')
-    # Even if strip_save_file is empty, explicitly write it. Else a postbuild
-    # might pick up an export from an earlier target.
-    return self.GetSortedXcodeEnv(
-        additional_settings={'CHROMIUM_STRIP_SAVE_FILE': strip_save_file})
-
-
-  def WriteSortedXcodeEnv(self, target, env):
-    for k, v in env:
-      # For
-      #  foo := a\ b
-      # the escaped space does the right thing. For
-      #  export foo := a\ b
-      # it does not -- the backslash is written to the env as literal character.
-      # So don't escape spaces in |env[k]|.
-      self.WriteLn('%s: export %s := %s' % (QuoteSpaces(target), k, v))
-
-
-  def Objectify(self, path):
-    """Convert a path to its output directory form."""
-    if '$(' in path:
-      path = path.replace('$(obj)/', '$(obj).%s/$(TARGET)/' % self.toolset)
-    if not '$(obj)' in path:
-      path = '$(obj).%s/$(TARGET)/%s' % (self.toolset, path)
-    return path
-
-
-  def Pchify(self, path, lang):
-    """Convert a prefix header path to its output directory form."""
-    path = self.Absolutify(path)
-    if '$(' in path:
-      path = path.replace('$(obj)/', '$(obj).%s/$(TARGET)/pch-%s' %
-                          (self.toolset, lang))
-      return path
-    return '$(obj).%s/$(TARGET)/pch-%s/%s' % (self.toolset, lang, path)
-
-
-  def Absolutify(self, path):
-    """Convert a subdirectory-relative path into a base-relative path.
-    Skips over paths that contain variables."""
-    if '$(' in path:
-      # Don't call normpath in this case, as it might collapse the
-      # path too aggressively if it features '..'. However it's still
-      # important to strip trailing slashes.
-      return path.rstrip('/')
-    return os.path.normpath(os.path.join(self.path, path))
-
-
-  def ExpandInputRoot(self, template, expansion, dirname):
-    if '%(INPUT_ROOT)s' not in template and '%(INPUT_DIRNAME)s' not in template:
-      return template
-    path = template % {
-        'INPUT_ROOT': expansion,
-        'INPUT_DIRNAME': dirname,
-        }
-    return path
-
-
-  def _InstallableTargetInstallPath(self):
-    """Returns the location of the final output for an installable target."""
-    # Xcode puts shared_library results into PRODUCT_DIR, and some gyp files
-    # rely on this. Emulate this behavior for mac.
-
-    # XXX(TooTallNate): disabling this code since we don't want this behavior...
-    #if (self.type == 'shared_library' and
-    #    (self.flavor != 'mac' or self.toolset != 'target')):
-    #  # Install all shared libs into a common directory (per toolset) for
-    #  # convenient access with LD_LIBRARY_PATH.
-    #  return '$(builddir)/lib.%s/%s' % (self.toolset, self.alias)
-    return '$(builddir)/' + self.alias
-
-
-def WriteAutoRegenerationRule(params, root_makefile, makefile_name,
-                              build_files):
-  """Write the target to regenerate the Makefile."""
-  options = params['options']
-  build_files_args = [gyp.common.RelativePath(filename, options.toplevel_dir)
-                      for filename in params['build_files_arg']]
-  gyp_binary = gyp.common.FixIfRelativePath(params['gyp_binary'],
-                                            options.toplevel_dir)
-  if not gyp_binary.startswith(os.sep):
-    gyp_binary = os.path.join('.', gyp_binary)
-  root_makefile.write(
-      "quiet_cmd_regen_makefile = ACTION Regenerating $@\n"
-      "cmd_regen_makefile = %(cmd)s\n"
-      "%(makefile_name)s: %(deps)s\n"
-      "\t$(call do_cmd,regen_makefile)\n\n" % {
-          'makefile_name': makefile_name,
-          'deps': ' '.join(map(Sourceify, build_files)),
-          'cmd': gyp.common.EncodePOSIXShellList(
-                     [gyp_binary, '-fmake'] +
-                     gyp.RegenerateFlags(options) +
-                     build_files_args)})
-
-
-def PerformBuild(data, configurations, params):
-  options = params['options']
-  for config in configurations:
-    arguments = ['make']
-    if options.toplevel_dir and options.toplevel_dir != '.':
-      arguments += '-C', options.toplevel_dir
-    arguments.append('BUILDTYPE=' + config)
-    print 'Building [%s]: %s' % (config, arguments)
-    subprocess.check_call(arguments)
-
-
-def GenerateOutput(target_list, target_dicts, data, params):
-  options = params['options']
-  flavor = gyp.common.GetFlavor(params)
-  generator_flags = params.get('generator_flags', {})
-  builddir_name = generator_flags.get('output_dir', 'out')
-  android_ndk_version = generator_flags.get('android_ndk_version', None)
-  default_target = generator_flags.get('default_target', 'all')
-
-  def CalculateMakefilePath(build_file, base_name):
-    """Determine where to write a Makefile for a given gyp file."""
-    # Paths in gyp files are relative to the .gyp file, but we want
-    # paths relative to the source root for the master makefile.  Grab
-    # the path of the .gyp file as the base to relativize against.
-    # E.g. "foo/bar" when we're constructing targets for "foo/bar/baz.gyp".
-    base_path = gyp.common.RelativePath(os.path.dirname(build_file),
-                                        options.depth)
-    # We write the file in the base_path directory.
-    output_file = os.path.join(options.depth, base_path, base_name)
-    if options.generator_output:
-      output_file = os.path.join(options.generator_output, output_file)
-    base_path = gyp.common.RelativePath(os.path.dirname(build_file),
-                                        options.toplevel_dir)
-    return base_path, output_file
-
-  # TODO:  search for the first non-'Default' target.  This can go
-  # away when we add verification that all targets have the
-  # necessary configurations.
-  default_configuration = None
-  toolsets = set([target_dicts[target]['toolset'] for target in target_list])
-  for target in target_list:
-    spec = target_dicts[target]
-    if spec['default_configuration'] != 'Default':
-      default_configuration = spec['default_configuration']
-      break
-  if not default_configuration:
-    default_configuration = 'Default'
-
-  srcdir = '.'
-  makefile_name = 'Makefile' + options.suffix
-  makefile_path = os.path.join(options.toplevel_dir, makefile_name)
-  if options.generator_output:
-    global srcdir_prefix
-    makefile_path = os.path.join(options.generator_output, makefile_path)
-    srcdir = gyp.common.RelativePath(srcdir, options.generator_output)
-    srcdir_prefix = '$(srcdir)/'
-
-  flock_command = 'flock'
-  header_params = {
-      'default_target': default_target,
-      'builddir': builddir_name,
-      'default_configuration': default_configuration,
-      'flock': flock_command,
-      'flock_index': 1,
-      'link_commands': LINK_COMMANDS_LINUX,
-      'extra_commands': '',
-      'srcdir': srcdir,
-    }
-  if flavor == 'mac':
-    flock_command = './gyp-mac-tool flock'
-    header_params.update({
-        'flock': flock_command,
-        'flock_index': 2,
-        'link_commands': LINK_COMMANDS_MAC,
-        'extra_commands': SHARED_HEADER_MAC_COMMANDS,
-    })
-  elif flavor == 'android':
-    header_params.update({
-        'link_commands': LINK_COMMANDS_ANDROID,
-    })
-  elif flavor == 'solaris':
-    header_params.update({
-        'flock': './gyp-sun-tool flock',
-        'flock_index': 2,
-        'extra_commands': SHARED_HEADER_SUN_COMMANDS,
-    })
-  elif flavor == 'freebsd':
-    # Note: OpenBSD has sysutils/flock. lockf seems to be FreeBSD specific.
-    header_params.update({
-        'flock': 'lockf',
-    })
-
-  header_params.update({
-    'CC.target':   GetEnvironFallback(('CC_target', 'CC'), '$(CC)'),
-    'AR.target':   GetEnvironFallback(('AR_target', 'AR'), '$(AR)'),
-    'CXX.target':  GetEnvironFallback(('CXX_target', 'CXX'), '$(CXX)'),
-    'LINK.target': GetEnvironFallback(('LD_target', 'LD'), '$(LINK)'),
-    'CC.host':     GetEnvironFallback(('CC_host',), 'gcc'),
-    'AR.host':     GetEnvironFallback(('AR_host',), 'ar'),
-    'CXX.host':    GetEnvironFallback(('CXX_host',), 'g++'),
-    'LINK.host':   GetEnvironFallback(('LD_host',), 'g++'),
-  })
-
-  build_file, _, _ = gyp.common.ParseQualifiedTarget(target_list[0])
-  make_global_settings_array = data[build_file].get('make_global_settings', [])
-  wrappers = {}
-  wrappers['LINK'] = '%s $(builddir)/linker.lock' % flock_command
-  for key, value in make_global_settings_array:
-    if key.endswith('_wrapper'):
-      wrappers[key[:-len('_wrapper')]] = '$(abspath %s)' % value
-  make_global_settings = ''
-  for key, value in make_global_settings_array:
-    if re.match('.*_wrapper', key):
-      continue
-    if value[0] != '$':
-      value = '$(abspath %s)' % value
-    wrapper = wrappers.get(key)
-    if wrapper:
-      value = '%s %s' % (wrapper, value)
-      del wrappers[key]
-    if key in ('CC', 'CC.host', 'CXX', 'CXX.host'):
-      make_global_settings += (
-          'ifneq (,$(filter $(origin %s), undefined default))\n' % key)
-      # Let gyp-time envvars win over global settings.
-      if key in os.environ:
-        value = os.environ[key]
-      make_global_settings += '  %s = %s\n' % (key, value)
-      make_global_settings += 'endif\n'
-    else:
-      make_global_settings += '%s ?= %s\n' % (key, value)
-  # TODO(ukai): define cmd when only wrapper is specified in
-  # make_global_settings.
-
-  header_params['make_global_settings'] = make_global_settings
-
-  ensure_directory_exists(makefile_path)
-  root_makefile = open(makefile_path, 'w')
-  root_makefile.write(SHARED_HEADER % header_params)
-  # Currently all versions have the same effect, but in the future the
-  # behavior could be different.
-  if android_ndk_version:
-    root_makefile.write(
-        '# Define LOCAL_PATH for build of Android applications.\n'
-        'LOCAL_PATH := $(call my-dir)\n'
-        '\n')
-  for toolset in toolsets:
-    root_makefile.write('TOOLSET := %s\n' % toolset)
-    WriteRootHeaderSuffixRules(root_makefile)
-
-  # Put build-time support tools next to the root Makefile.
-  dest_path = os.path.dirname(makefile_path)
-  gyp.common.CopyTool(flavor, dest_path)
-
-  # Find the list of targets that derive from the gyp file(s) being built.
-  needed_targets = set()
-  for build_file in params['build_files']:
-    for target in gyp.common.AllTargets(target_list, target_dicts, build_file):
-      needed_targets.add(target)
-
-  build_files = set()
-  include_list = set()
-  for qualified_target in target_list:
-    build_file, target, toolset = gyp.common.ParseQualifiedTarget(
-        qualified_target)
-
-    this_make_global_settings = data[build_file].get('make_global_settings', [])
-    assert make_global_settings_array == this_make_global_settings, (
-        "make_global_settings needs to be the same for all targets.")
-
-    build_files.add(gyp.common.RelativePath(build_file, options.toplevel_dir))
-    included_files = data[build_file]['included_files']
-    for included_file in included_files:
-      # The included_files entries are relative to the dir of the build file
-      # that included them, so we have to undo that and then make them relative
-      # to the root dir.
-      relative_include_file = gyp.common.RelativePath(
-          gyp.common.UnrelativePath(included_file, build_file),
-          options.toplevel_dir)
-      abs_include_file = os.path.abspath(relative_include_file)
-      # If the include file is from the ~/.gyp dir, we should use absolute path
-      # so that relocating the src dir doesn't break the path.
-      if (params['home_dot_gyp'] and
-          abs_include_file.startswith(params['home_dot_gyp'])):
-        build_files.add(abs_include_file)
-      else:
-        build_files.add(relative_include_file)
-
-    base_path, output_file = CalculateMakefilePath(build_file,
-        target + '.' + toolset + options.suffix + '.mk')
-
-    spec = target_dicts[qualified_target]
-    configs = spec['configurations']
-
-    if flavor == 'mac':
-      gyp.xcode_emulation.MergeGlobalXcodeSettingsToSpec(data[build_file], spec)
-
-    writer = MakefileWriter(generator_flags, flavor)
-    writer.Write(qualified_target, base_path, output_file, spec, configs,
-                 part_of_all=qualified_target in needed_targets)
-
-    # Our root_makefile lives at the source root.  Compute the relative path
-    # from there to the output_file for including.
-    mkfile_rel_path = gyp.common.RelativePath(output_file,
-                                              os.path.dirname(makefile_path))
-    include_list.add(mkfile_rel_path)
-
-  # Write out per-gyp (sub-project) Makefiles.
-  depth_rel_path = gyp.common.RelativePath(options.depth, os.getcwd())
-  for build_file in build_files:
-    # The paths in build_files were relativized above, so undo that before
-    # testing against the non-relativized items in target_list and before
-    # calculating the Makefile path.
-    build_file = os.path.join(depth_rel_path, build_file)
-    gyp_targets = [target_dicts[target]['target_name'] for target in target_list
-                   if target.startswith(build_file) and
-                   target in needed_targets]
-    # Only generate Makefiles for gyp files with targets.
-    if not gyp_targets:
-      continue
-    base_path, output_file = CalculateMakefilePath(build_file,
-        os.path.splitext(os.path.basename(build_file))[0] + '.Makefile')
-    makefile_rel_path = gyp.common.RelativePath(os.path.dirname(makefile_path),
-                                                os.path.dirname(output_file))
-    writer.WriteSubMake(output_file, makefile_rel_path, gyp_targets,
-                        builddir_name)
-
-
-  # Write out the sorted list of includes.
-  root_makefile.write('\n')
-  for include_file in sorted(include_list):
-    # We wrap each .mk include in an if statement so users can tell make to
-    # not load a file by setting NO_LOAD.  The below make code says, only
-    # load the .mk file if the .mk filename doesn't start with a token in
-    # NO_LOAD.
-    root_makefile.write(
-        "ifeq ($(strip $(foreach prefix,$(NO_LOAD),\\\n"
-        "    $(findstring $(join ^,$(prefix)),\\\n"
-        "                 $(join ^," + include_file + ")))),)\n")
-    root_makefile.write("  include " + include_file + "\n")
-    root_makefile.write("endif\n")
-  root_makefile.write('\n')
-
-  if (not generator_flags.get('standalone')
-      and generator_flags.get('auto_regeneration', True)):
-    WriteAutoRegenerationRule(params, root_makefile, makefile_name, build_files)
-
-  root_makefile.write(SHARED_FOOTER)
-
-  root_makefile.close()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/msvs.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,3123 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import copy
-import ntpath
-import os
-import posixpath
-import re
-import subprocess
-import sys
-
-import gyp.common
-import gyp.easy_xml as easy_xml
-import gyp.MSVSNew as MSVSNew
-import gyp.MSVSProject as MSVSProject
-import gyp.MSVSSettings as MSVSSettings
-import gyp.MSVSToolFile as MSVSToolFile
-import gyp.MSVSUserFile as MSVSUserFile
-import gyp.MSVSUtil as MSVSUtil
-import gyp.MSVSVersion as MSVSVersion
-from gyp.common import GypError
-
-
-# Regular expression for validating Visual Studio GUIDs.  If the GUID
-# contains lowercase hex letters, MSVS will be fine. However,
-# IncrediBuild BuildConsole will parse the solution file, but then
-# silently skip building the target causing hard to track down errors.
-# Note that this only happens with the BuildConsole, and does not occur
-# if IncrediBuild is executed from inside Visual Studio.  This regex
-# validates that the string looks like a GUID with all uppercase hex
-# letters.
-VALID_MSVS_GUID_CHARS = re.compile('^[A-F0-9\-]+$')
-
-
-generator_default_variables = {
-    'EXECUTABLE_PREFIX': '',
-    'EXECUTABLE_SUFFIX': '.exe',
-    'STATIC_LIB_PREFIX': '',
-    'SHARED_LIB_PREFIX': '',
-    'STATIC_LIB_SUFFIX': '.lib',
-    'SHARED_LIB_SUFFIX': '.dll',
-    'INTERMEDIATE_DIR': '$(IntDir)',
-    'SHARED_INTERMEDIATE_DIR': '$(OutDir)obj/global_intermediate',
-    'OS': 'win',
-    'PRODUCT_DIR': '$(OutDir)',
-    'LIB_DIR': '$(OutDir)lib',
-    'RULE_INPUT_ROOT': '$(InputName)',
-    'RULE_INPUT_DIRNAME': '$(InputDir)',
-    'RULE_INPUT_EXT': '$(InputExt)',
-    'RULE_INPUT_NAME': '$(InputFileName)',
-    'RULE_INPUT_PATH': '$(InputPath)',
-    'CONFIGURATION_NAME': '$(ConfigurationName)',
-}
-
-
-# The msvs specific sections that hold paths
-generator_additional_path_sections = [
-    'msvs_cygwin_dirs',
-    'msvs_props',
-]
-
-
-generator_additional_non_configuration_keys = [
-    'msvs_cygwin_dirs',
-    'msvs_cygwin_shell',
-    'msvs_large_pdb',
-    'msvs_shard',
-]
-
-
-# List of precompiled header related keys.
-precomp_keys = [
-    'msvs_precompiled_header',
-    'msvs_precompiled_source',
-]
-
-
-cached_username = None
-
-
-cached_domain = None
-
-
-# TODO(gspencer): Switch the os.environ calls to be
-# win32api.GetDomainName() and win32api.GetUserName() once the
-# python version in depot_tools has been updated to work on Vista
-# 64-bit.
-def _GetDomainAndUserName():
-  if sys.platform not in ('win32', 'cygwin'):
-    return ('DOMAIN', 'USERNAME')
-  global cached_username
-  global cached_domain
-  if not cached_domain or not cached_username:
-    domain = os.environ.get('USERDOMAIN')
-    username = os.environ.get('USERNAME')
-    if not domain or not username:
-      call = subprocess.Popen(['net', 'config', 'Workstation'],
-                              stdout=subprocess.PIPE)
-      config = call.communicate()[0]
-      username_re = re.compile('^User name\s+(\S+)', re.MULTILINE)
-      username_match = username_re.search(config)
-      if username_match:
-        username = username_match.group(1)
-      domain_re = re.compile('^Logon domain\s+(\S+)', re.MULTILINE)
-      domain_match = domain_re.search(config)
-      if domain_match:
-        domain = domain_match.group(1)
-    cached_domain = domain
-    cached_username = username
-  return (cached_domain, cached_username)
-
-fixpath_prefix = None
-
-
-def _NormalizedSource(source):
-  """Normalize the path.
-
-  But not if that gets rid of a variable, as this may expand to something
-  larger than one directory.
-
-  Arguments:
-      source: The path to be normalized.
-
-  Returns:
-      The normalized path.
-  """
-  normalized = os.path.normpath(source)
-  if source.count('$') == normalized.count('$'):
-    source = normalized
-  return source
-
-
-def _FixPath(path):
-  """Convert paths to a form that will make sense in a vcproj file.
-
-  Arguments:
-    path: The path to convert; may contain forward slashes, etc.
-  Returns:
-    The path with all slashes made into backslashes.
-  """
-  if fixpath_prefix and path and not os.path.isabs(path) and not path[0] == '$':
-    path = os.path.join(fixpath_prefix, path)
-  path = path.replace('/', '\\')
-  path = _NormalizedSource(path)
-  if path and path[-1] == '\\':
-    path = path[:-1]
-  return path
-
-
-def _FixPaths(paths):
-  """Fix each of the paths of the list."""
-  return [_FixPath(i) for i in paths]
-
-
-def _ConvertSourcesToFilterHierarchy(sources, prefix=None, excluded=None,
-                                     list_excluded=True):
-  """Converts a list of split source file paths into a vcproj folder hierarchy.
-
-  Arguments:
-    sources: A list of source file paths, each split into path components.
-    prefix: A list of path components to prepend to each of the sources.
-    excluded: A set of excluded files.
-
-  Returns:
-    A hierarchy of filenames and MSVSProject.Filter objects that matches the
-    layout of the source tree.
-    For example:
-    _ConvertSourcesToFilterHierarchy([['a', 'bob1.c'], ['b', 'bob2.c']],
-                                     prefix=['joe'])
-    -->
-    [MSVSProject.Filter('a', contents=['joe\\a\\bob1.c']),
-     MSVSProject.Filter('b', contents=['joe\\b\\bob2.c'])]
-  """
-  if not prefix: prefix = []
-  result = []
-  excluded_result = []
-  folders = dict()
-  # Gather files into the final result, excluded, or folders.
-  for s in sources:
-    if len(s) == 1:
-      filename = _NormalizedSource('\\'.join(prefix + s))
-      if filename in excluded:
-        excluded_result.append(filename)
-      else:
-        result.append(filename)
-    else:
-      if not folders.get(s[0]):
-        folders[s[0]] = []
-      folders[s[0]].append(s[1:])
-  # Add a folder for excluded files.
-  if excluded_result and list_excluded:
-    excluded_folder = MSVSProject.Filter('_excluded_files',
-                                         contents=excluded_result)
-    result.append(excluded_folder)
-  # Populate all the folders.
-  for f in folders:
-    contents = _ConvertSourcesToFilterHierarchy(folders[f], prefix=prefix + [f],
-                                                excluded=excluded,
-                                                list_excluded=list_excluded)
-    contents = MSVSProject.Filter(f, contents=contents)
-    result.append(contents)
-
-  return result
-
-
-def _ToolAppend(tools, tool_name, setting, value, only_if_unset=False):
-  if not value: return
-  _ToolSetOrAppend(tools, tool_name, setting, value, only_if_unset)
-
-
-def _ToolSetOrAppend(tools, tool_name, setting, value, only_if_unset=False):
-  # TODO(bradnelson): ugly hack, fix this more generally!!!
-  if 'Directories' in setting or 'Dependencies' in setting:
-    if type(value) == str:
-      value = value.replace('/', '\\')
-    else:
-      value = [i.replace('/', '\\') for i in value]
-  if not tools.get(tool_name):
-    tools[tool_name] = dict()
-  tool = tools[tool_name]
-  if tool.get(setting):
-    if only_if_unset: return
-    if type(tool[setting]) == list:
-      tool[setting] += value
-    else:
-      raise TypeError(
-          'Appending "%s" to a non-list setting "%s" for tool "%s" is '
-          'not allowed, previous value: %s' % (
-              value, setting, tool_name, str(tool[setting])))
-  else:
-    tool[setting] = value
-
-
-def _ConfigPlatform(config_data):
-  return config_data.get('msvs_configuration_platform', 'Win32')
-
-
-def _ConfigBaseName(config_name, platform_name):
-  if config_name.endswith('_' + platform_name):
-    return config_name[0:-len(platform_name) - 1]
-  else:
-    return config_name
-
-
-def _ConfigFullName(config_name, config_data):
-  platform_name = _ConfigPlatform(config_data)
-  return '%s|%s' % (_ConfigBaseName(config_name, platform_name), platform_name)
-
-
-def _BuildCommandLineForRuleRaw(spec, cmd, cygwin_shell, has_input_path,
-                                quote_cmd, do_setup_env):
-
-  if [x for x in cmd if '$(InputDir)' in x]:
-    input_dir_preamble = (
-      'set INPUTDIR=$(InputDir)\n'
-      'set INPUTDIR=%INPUTDIR:$(ProjectDir)=%\n'
-      'set INPUTDIR=%INPUTDIR:~0,-1%\n'
-      )
-  else:
-    input_dir_preamble = ''
-
-  if cygwin_shell:
-    # Find path to cygwin.
-    cygwin_dir = _FixPath(spec.get('msvs_cygwin_dirs', ['.'])[0])
-    # Prepare command.
-    direct_cmd = cmd
-    direct_cmd = [i.replace('$(IntDir)',
-                            '`cygpath -m "${INTDIR}"`') for i in direct_cmd]
-    direct_cmd = [i.replace('$(OutDir)',
-                            '`cygpath -m "${OUTDIR}"`') for i in direct_cmd]
-    direct_cmd = [i.replace('$(InputDir)',
-                            '`cygpath -m "${INPUTDIR}"`') for i in direct_cmd]
-    if has_input_path:
-      direct_cmd = [i.replace('$(InputPath)',
-                              '`cygpath -m "${INPUTPATH}"`')
-                    for i in direct_cmd]
-    direct_cmd = ['\\"%s\\"' % i.replace('"', '\\\\\\"') for i in direct_cmd]
-    # direct_cmd = gyp.common.EncodePOSIXShellList(direct_cmd)
-    direct_cmd = ' '.join(direct_cmd)
-    # TODO(quote):  regularize quoting path names throughout the module
-    cmd = ''
-    if do_setup_env:
-      cmd += 'call "$(ProjectDir)%(cygwin_dir)s\\setup_env.bat" && '
-    cmd += 'set CYGWIN=nontsec&& '
-    if direct_cmd.find('NUMBER_OF_PROCESSORS') >= 0:
-      cmd += 'set /a NUMBER_OF_PROCESSORS_PLUS_1=%%NUMBER_OF_PROCESSORS%%+1&& '
-    if direct_cmd.find('INTDIR') >= 0:
-      cmd += 'set INTDIR=$(IntDir)&& '
-    if direct_cmd.find('OUTDIR') >= 0:
-      cmd += 'set OUTDIR=$(OutDir)&& '
-    if has_input_path and direct_cmd.find('INPUTPATH') >= 0:
-      cmd += 'set INPUTPATH=$(InputPath) && '
-    cmd += 'bash -c "%(cmd)s"'
-    cmd = cmd % {'cygwin_dir': cygwin_dir,
-                 'cmd': direct_cmd}
-    return input_dir_preamble + cmd
-  else:
-    # Convert cat --> type to mimic unix.
-    if cmd[0] == 'cat':
-      command = ['type']
-    else:
-      command = [cmd[0].replace('/', '\\')]
-    # Add call before command to ensure that commands can be tied together one
-    # after the other without aborting in Incredibuild, since IB makes a bat
-    # file out of the raw command string, and some commands (like python) are
-    # actually batch files themselves.
-    command.insert(0, 'call')
-    # Fix the paths
-    # TODO(quote): This is a really ugly heuristic, and will miss path fixing
-    #              for arguments like "--arg=path" or "/opt:path".
-    # If the argument starts with a slash or dash, it's probably a command line
-    # switch
-    arguments = [i if (i[:1] in "/-") else _FixPath(i) for i in cmd[1:]]
-    arguments = [i.replace('$(InputDir)', '%INPUTDIR%') for i in arguments]
-    arguments = [MSVSSettings.FixVCMacroSlashes(i) for i in arguments]
-    if quote_cmd:
-      # Support a mode for using cmd directly.
-      # Convert any paths to native form (first element is used directly).
-      # TODO(quote):  regularize quoting path names throughout the module
-      arguments = ['"%s"' % i for i in arguments]
-    # Collapse into a single command.
-    return input_dir_preamble + ' '.join(command + arguments)
-
-
-def _BuildCommandLineForRule(spec, rule, has_input_path, do_setup_env):
-  # Currently this weird argument munging is used to duplicate the way a
-  # python script would need to be run as part of the chrome tree.
-  # Eventually we should add some sort of rule_default option to set this
-  # per project. For now the behavior chrome needs is the default.
-  mcs = rule.get('msvs_cygwin_shell')
-  if mcs is None:
-    mcs = int(spec.get('msvs_cygwin_shell', 1))
-  elif isinstance(mcs, str):
-    mcs = int(mcs)
-  quote_cmd = int(rule.get('msvs_quote_cmd', 1))
-  return _BuildCommandLineForRuleRaw(spec, rule['action'], mcs, has_input_path,
-                                     quote_cmd, do_setup_env=do_setup_env)
-
-
-def _AddActionStep(actions_dict, inputs, outputs, description, command):
-  """Merge action into an existing list of actions.
-
-  Care must be taken so that actions which have overlapping inputs either don't
-  get assigned to the same input, or get collapsed into one.
-
-  Arguments:
-    actions_dict: dictionary keyed on input name, which maps to a list of
-      dicts describing the actions attached to that input file.
-    inputs: list of inputs
-    outputs: list of outputs
-    description: description of the action
-    command: command line to execute
-  """
-  # Require there to be at least one input (call sites will ensure this).
-  assert inputs
-
-  action = {
-      'inputs': inputs,
-      'outputs': outputs,
-      'description': description,
-      'command': command,
-  }
-
-  # Pick where to stick this action.
-  # While less than optimal in terms of build time, attach them to the first
-  # input for now.
-  chosen_input = inputs[0]
-
-  # Add it there.
-  if chosen_input not in actions_dict:
-    actions_dict[chosen_input] = []
-  actions_dict[chosen_input].append(action)
-
-
-def _AddCustomBuildToolForMSVS(p, spec, primary_input,
-                               inputs, outputs, description, cmd):
-  """Add a custom build tool to execute something.
-
-  Arguments:
-    p: the target project
-    spec: the target project dict
-    primary_input: input file to attach the build tool to
-    inputs: list of inputs
-    outputs: list of outputs
-    description: description of the action
-    cmd: command line to execute
-  """
-  inputs = _FixPaths(inputs)
-  outputs = _FixPaths(outputs)
-  tool = MSVSProject.Tool(
-      'VCCustomBuildTool',
-      {'Description': description,
-       'AdditionalDependencies': ';'.join(inputs),
-       'Outputs': ';'.join(outputs),
-       'CommandLine': cmd,
-      })
-  # Add to the properties of primary input for each config.
-  for config_name, c_data in spec['configurations'].iteritems():
-    p.AddFileConfig(_FixPath(primary_input),
-                    _ConfigFullName(config_name, c_data), tools=[tool])
-
-
-def _AddAccumulatedActionsToMSVS(p, spec, actions_dict):
-  """Add actions accumulated into an actions_dict, merging as needed.
-
-  Arguments:
-    p: the target project
-    spec: the target project dict
-    actions_dict: dictionary keyed on input name, which maps to a list of
-        dicts describing the actions attached to that input file.
-  """
-  for primary_input in actions_dict:
-    inputs = set()
-    outputs = set()
-    descriptions = []
-    commands = []
-    for action in actions_dict[primary_input]:
-      inputs.update(set(action['inputs']))
-      outputs.update(set(action['outputs']))
-      descriptions.append(action['description'])
-      commands.append(action['command'])
-    # Add the custom build step for one input file.
-    description = ', and also '.join(descriptions)
-    command = '\r\n'.join(commands)
-    _AddCustomBuildToolForMSVS(p, spec,
-                               primary_input=primary_input,
-                               inputs=inputs,
-                               outputs=outputs,
-                               description=description,
-                               cmd=command)
-
-
-def _RuleExpandPath(path, input_file):
-  """Given the input file to which a rule applied, string-substitute a path.
-
-  Arguments:
-    path: the path to string-expand
-    input_file: the file to which the rule applied.
-  Returns:
-    The string substituted path.
-  """
-  path = path.replace('$(InputName)',
-                      os.path.splitext(os.path.split(input_file)[1])[0])
-  path = path.replace('$(InputDir)', os.path.dirname(input_file))
-  path = path.replace('$(InputExt)',
-                      os.path.splitext(os.path.split(input_file)[1])[1])
-  path = path.replace('$(InputFileName)', os.path.split(input_file)[1])
-  path = path.replace('$(InputPath)', input_file)
-  return path
-
-
-def _FindRuleTriggerFiles(rule, sources):
-  """Find the list of files which a particular rule applies to.
-
-  Arguments:
-    rule: the rule in question
-    sources: the set of all known source files for this project
-  Returns:
-    The list of sources that trigger a particular rule.
-  """
-  rule_ext = rule['extension']
-  return [s for s in sources if s.endswith('.' + rule_ext)]
-
-
-def _RuleInputsAndOutputs(rule, trigger_file):
-  """Find the inputs and outputs generated by a rule.
-
-  Arguments:
-    rule: the rule in question.
-    trigger_file: the main trigger for this rule.
-  Returns:
-    The pair of (inputs, outputs) involved in this rule.
-  """
-  raw_inputs = _FixPaths(rule.get('inputs', []))
-  raw_outputs = _FixPaths(rule.get('outputs', []))
-  inputs = set()
-  outputs = set()
-  inputs.add(trigger_file)
-  for i in raw_inputs:
-    inputs.add(_RuleExpandPath(i, trigger_file))
-  for o in raw_outputs:
-    outputs.add(_RuleExpandPath(o, trigger_file))
-  return (inputs, outputs)
-
-
-def _GenerateNativeRulesForMSVS(p, rules, output_dir, spec, options):
-  """Generate a native rules file.
-
-  Arguments:
-    p: the target project
-    rules: the set of rules to include
-    output_dir: the directory in which the project/gyp resides
-    spec: the project dict
-    options: global generator options
-  """
-  rules_filename = '%s%s.rules' % (spec['target_name'],
-                                   options.suffix)
-  rules_file = MSVSToolFile.Writer(os.path.join(output_dir, rules_filename),
-                                   spec['target_name'])
-  # Add each rule.
-  for r in rules:
-    rule_name = r['rule_name']
-    rule_ext = r['extension']
-    inputs = _FixPaths(r.get('inputs', []))
-    outputs = _FixPaths(r.get('outputs', []))
-    # Skip a rule with no action and no inputs.
-    if 'action' not in r and not r.get('rule_sources', []):
-      continue
-    cmd = _BuildCommandLineForRule(spec, r, has_input_path=True,
-                                   do_setup_env=True)
-    rules_file.AddCustomBuildRule(name=rule_name,
-                                  description=r.get('message', rule_name),
-                                  extensions=[rule_ext],
-                                  additional_dependencies=inputs,
-                                  outputs=outputs,
-                                  cmd=cmd)
-  # Write out rules file.
-  rules_file.WriteIfChanged()
-
-  # Add rules file to project.
-  p.AddToolFile(rules_filename)
-
-
-def _Cygwinify(path):
-  path = path.replace('$(OutDir)', '$(OutDirCygwin)')
-  path = path.replace('$(IntDir)', '$(IntDirCygwin)')
-  return path
-
-
-def _GenerateExternalRules(rules, output_dir, spec,
-                           sources, options, actions_to_add):
-  """Generate an external makefile to do a set of rules.
-
-  Arguments:
-    rules: the list of rules to include
-    output_dir: path containing project and gyp files
-    spec: project specification data
-    sources: set of sources known
-    options: global generator options
-    actions_to_add: The list of actions we will add to.
-  """
-  filename = '%s_rules%s.mk' % (spec['target_name'], options.suffix)
-  mk_file = gyp.common.WriteOnDiff(os.path.join(output_dir, filename))
-  # Find cygwin style versions of some paths.
-  mk_file.write('OutDirCygwin:=$(shell cygpath -u "$(OutDir)")\n')
-  mk_file.write('IntDirCygwin:=$(shell cygpath -u "$(IntDir)")\n')
-  # Gather stuff needed to emit all: target.
-  all_inputs = set()
-  all_outputs = set()
-  all_output_dirs = set()
-  first_outputs = []
-  for rule in rules:
-    trigger_files = _FindRuleTriggerFiles(rule, sources)
-    for tf in trigger_files:
-      inputs, outputs = _RuleInputsAndOutputs(rule, tf)
-      all_inputs.update(set(inputs))
-      all_outputs.update(set(outputs))
-      # Only use one target from each rule as the dependency for
-      # 'all' so we don't try to build each rule multiple times.
-      first_outputs.append(list(outputs)[0])
-      # Get the unique output directories for this rule.
-      output_dirs = [os.path.split(i)[0] for i in outputs]
-      for od in output_dirs:
-        all_output_dirs.add(od)
-  first_outputs_cyg = [_Cygwinify(i) for i in first_outputs]
-  # Write out all: target, including mkdir for each output directory.
-  mk_file.write('all: %s\n' % ' '.join(first_outputs_cyg))
-  for od in all_output_dirs:
-    if od:
-      mk_file.write('\tmkdir -p `cygpath -u "%s"`\n' % od)
-  mk_file.write('\n')
-  # Define how each output is generated.
-  for rule in rules:
-    trigger_files = _FindRuleTriggerFiles(rule, sources)
-    for tf in trigger_files:
-      # Get all the inputs and outputs for this rule for this trigger file.
-      inputs, outputs = _RuleInputsAndOutputs(rule, tf)
-      inputs = [_Cygwinify(i) for i in inputs]
-      outputs = [_Cygwinify(i) for i in outputs]
-      # Prepare the command line for this rule.
-      cmd = [_RuleExpandPath(c, tf) for c in rule['action']]
-      cmd = ['"%s"' % i for i in cmd]
-      cmd = ' '.join(cmd)
-      # Add it to the makefile.
-      mk_file.write('%s: %s\n' % (' '.join(outputs), ' '.join(inputs)))
-      mk_file.write('\t%s\n\n' % cmd)
-  # Close up the file.
-  mk_file.close()
-
-  # Add makefile to list of sources.
-  sources.add(filename)
-  # Add a build action to call makefile.
-  cmd = ['make',
-         'OutDir=$(OutDir)',
-         'IntDir=$(IntDir)',
-         '-j', '${NUMBER_OF_PROCESSORS_PLUS_1}',
-         '-f', filename]
-  cmd = _BuildCommandLineForRuleRaw(spec, cmd, True, False, True, True)
-  # Insert makefile as 0'th input, so it gets the action attached there,
-  # as this is easier to understand from in the IDE.
-  all_inputs = list(all_inputs)
-  all_inputs.insert(0, filename)
-  _AddActionStep(actions_to_add,
-                 inputs=_FixPaths(all_inputs),
-                 outputs=_FixPaths(all_outputs),
-                 description='Running external rules for %s' %
-                     spec['target_name'],
-                 command=cmd)
-
-
-def _EscapeEnvironmentVariableExpansion(s):
-  """Escapes % characters.
-
-  Escapes any % characters so that Windows-style environment variable
-  expansions will leave them alone.
-  See http://connect.microsoft.com/VisualStudio/feedback/details/106127/cl-d-name-text-containing-percentage-characters-doesnt-compile
-  to understand why we have to do this.
-
-  Args:
-      s: The string to be escaped.
-
-  Returns:
-      The escaped string.
-  """
-  s = s.replace('%', '%%')
-  return s
-
-
-quote_replacer_regex = re.compile(r'(\\*)"')
-
-
-def _EscapeCommandLineArgumentForMSVS(s):
-  """Escapes a Windows command-line argument.
-
-  So that the Win32 CommandLineToArgv function will turn the escaped result back
-  into the original string.
-  See http://msdn.microsoft.com/en-us/library/17w5ykft.aspx
-  ("Parsing C++ Command-Line Arguments") to understand why we have to do
-  this.
-
-  Args:
-      s: the string to be escaped.
-  Returns:
-      the escaped string.
-  """
-
-  def _Replace(match):
-    # For a literal quote, CommandLineToArgv requires an odd number of
-    # backslashes preceding it, and it produces half as many literal backslashes
-    # (rounded down). So we need to produce 2n+1 backslashes.
-    return 2 * match.group(1) + '\\"'
-
-  # Escape all quotes so that they are interpreted literally.
-  s = quote_replacer_regex.sub(_Replace, s)
-  # Now add unescaped quotes so that any whitespace is interpreted literally.
-  s = '"' + s + '"'
-  return s
-
-
-delimiters_replacer_regex = re.compile(r'(\\*)([,;]+)')
-
-
-def _EscapeVCProjCommandLineArgListItem(s):
-  """Escapes command line arguments for MSVS.
-
-  The VCProj format stores string lists in a single string using commas and
-  semi-colons as separators, which must be quoted if they are to be
-  interpreted literally. However, command-line arguments may already have
-  quotes, and the VCProj parser is ignorant of the backslash escaping
-  convention used by CommandLineToArgv, so the command-line quotes and the
-  VCProj quotes may not be the same quotes. So to store a general
-  command-line argument in a VCProj list, we need to parse the existing
-  quoting according to VCProj's convention and quote any delimiters that are
-  not already quoted by that convention. The quotes that we add will also be
-  seen by CommandLineToArgv, so if backslashes precede them then we also have
-  to escape those backslashes according to the CommandLineToArgv
-  convention.
-
-  Args:
-      s: the string to be escaped.
-  Returns:
-      the escaped string.
-  """
-
-  def _Replace(match):
-    # For a non-literal quote, CommandLineToArgv requires an even number of
-    # backslashes preceding it, and it produces half as many literal
-    # backslashes. So we need to produce 2n backslashes.
-    return 2 * match.group(1) + '"' + match.group(2) + '"'
-
-  segments = s.split('"')
-  # The unquoted segments are at the even-numbered indices.
-  for i in range(0, len(segments), 2):
-    segments[i] = delimiters_replacer_regex.sub(_Replace, segments[i])
-  # Concatenate back into a single string
-  s = '"'.join(segments)
-  if len(segments) % 2 == 0:
-    # String ends while still quoted according to VCProj's convention. This
-    # means the delimiter and the next list item that follow this one in the
-    # .vcproj file will be misinterpreted as part of this item. There is nothing
-    # we can do about this. Adding an extra quote would correct the problem in
-    # the VCProj but cause the same problem on the final command-line. Moving
-    # the item to the end of the list does work, but that's only possible if
-    # there's only one such item. Let's just warn the user.
-    print >> sys.stderr, ('Warning: MSVS may misinterpret the odd number of ' +
-                          'quotes in ' + s)
-  return s
-
-
-def _EscapeCppDefineForMSVS(s):
-  """Escapes a CPP define so that it will reach the compiler unaltered."""
-  s = _EscapeEnvironmentVariableExpansion(s)
-  s = _EscapeCommandLineArgumentForMSVS(s)
-  s = _EscapeVCProjCommandLineArgListItem(s)
-  # cl.exe replaces literal # characters with = in preprocessor definitions for
-  # some reason. Octal-encode to work around that.
-  s = s.replace('#', '\\%03o' % ord('#'))
-  return s
-
-
-quote_replacer_regex2 = re.compile(r'(\\+)"')
-
-
-def _EscapeCommandLineArgumentForMSBuild(s):
-  """Escapes a Windows command-line argument for use by MSBuild."""
-
-  def _Replace(match):
-    return (len(match.group(1)) / 2 * 4) * '\\' + '\\"'
-
-  # Escape all quotes so that they are interpreted literally.
-  s = quote_replacer_regex2.sub(_Replace, s)
-  return s
-
-
-def _EscapeMSBuildSpecialCharacters(s):
-  escape_dictionary = {
-      '%': '%25',
-      '$': '%24',
-      '@': '%40',
-      "'": '%27',
-      ';': '%3B',
-      '?': '%3F',
-      '*': '%2A'
-      }
-  result = ''.join([escape_dictionary.get(c, c) for c in s])
-  return result
-
-
-def _EscapeCppDefineForMSBuild(s):
-  """Escapes a CPP define so that it will reach the compiler unaltered."""
-  s = _EscapeEnvironmentVariableExpansion(s)
-  s = _EscapeCommandLineArgumentForMSBuild(s)
-  s = _EscapeMSBuildSpecialCharacters(s)
-  # cl.exe replaces literal # characters with = in preprocessor definitions for
-  # some reason. Octal-encode to work around that.
-  s = s.replace('#', '\\%03o' % ord('#'))
-  return s
-
-
-def _GenerateRulesForMSVS(p, output_dir, options, spec,
-                          sources, excluded_sources,
-                          actions_to_add):
-  """Generate all the rules for a particular project.
-
-  Arguments:
-    p: the project
-    output_dir: directory to emit rules to
-    options: global options passed to the generator
-    spec: the specification for this project
-    sources: the set of all known source files in this project
-    excluded_sources: the set of sources excluded from normal processing
-    actions_to_add: deferred list of actions to add in
-  """
-  rules = spec.get('rules', [])
-  rules_native = [r for r in rules if not int(r.get('msvs_external_rule', 0))]
-  rules_external = [r for r in rules if int(r.get('msvs_external_rule', 0))]
-
-  # Handle rules that use a native rules file.
-  if rules_native:
-    _GenerateNativeRulesForMSVS(p, rules_native, output_dir, spec, options)
-
-  # Handle external rules (non-native rules).
-  if rules_external:
-    _GenerateExternalRules(rules_external, output_dir, spec,
-                           sources, options, actions_to_add)
-  _AdjustSourcesForRules(rules, sources, excluded_sources)
-
-
-def _AdjustSourcesForRules(rules, sources, excluded_sources):
-  # Add outputs generated by each rule (if applicable).
-  for rule in rules:
-    # Done if not processing outputs as sources.
-    if int(rule.get('process_outputs_as_sources', False)):
-      # Add in the outputs from this rule.
-      trigger_files = _FindRuleTriggerFiles(rule, sources)
-      for trigger_file in trigger_files:
-        inputs, outputs = _RuleInputsAndOutputs(rule, trigger_file)
-        inputs = set(_FixPaths(inputs))
-        outputs = set(_FixPaths(outputs))
-        inputs.remove(_FixPath(trigger_file))
-        sources.update(inputs)
-        excluded_sources.update(inputs)
-        sources.update(outputs)
-
-
-def _FilterActionsFromExcluded(excluded_sources, actions_to_add):
-  """Take inputs with actions attached out of the list of exclusions.
-
-  Arguments:
-    excluded_sources: list of source files not to be built.
-    actions_to_add: dict of actions keyed on source file they're attached to.
-  Returns:
-    excluded_sources with files that have actions attached removed.
-  """
-  must_keep = set(_FixPaths(actions_to_add.keys()))
-  return [s for s in excluded_sources if s not in must_keep]
-
-
-def _GetDefaultConfiguration(spec):
-  return spec['configurations'][spec['default_configuration']]
-
-
-def _GetGuidOfProject(proj_path, spec):
-  """Get the guid for the project.
-
-  Arguments:
-    proj_path: Path of the vcproj or vcxproj file to generate.
-    spec: The target dictionary containing the properties of the target.
-  Returns:
-    the guid.
-  Raises:
-    ValueError: if the specified GUID is invalid.
-  """
-  # Pluck out the default configuration.
-  default_config = _GetDefaultConfiguration(spec)
-  # Decide the guid of the project.
-  guid = default_config.get('msvs_guid')
-  if guid:
-    if VALID_MSVS_GUID_CHARS.match(guid) is None:
-      raise ValueError('Invalid MSVS guid: "%s".  Must match regex: "%s".' %
-                       (guid, VALID_MSVS_GUID_CHARS.pattern))
-    guid = '{%s}' % guid
-  guid = guid or MSVSNew.MakeGuid(proj_path)
-  return guid
-
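The GUID check above can be sketched standalone. The pattern below is an assumption: gyp's actual `VALID_MSVS_GUID_CHARS` is defined elsewhere in this file, but the 8-4-4-4-12 hex-group shape is what MSVS expects:

```python
import re

# Assumed shape of the GUID pattern; gyp's real regex lives
# elsewhere in this module.
GUID_RE = re.compile(r'^[A-F0-9]{8}-[A-F0-9]{4}-[A-F0-9]{4}-'
                     r'[A-F0-9]{4}-[A-F0-9]{12}$')

def validate_guid(guid):
    """Reject malformed GUIDs, then wrap the value in braces as
    MSVS project files require."""
    if GUID_RE.match(guid) is None:
        raise ValueError('Invalid MSVS guid: "%s".' % guid)
    return '{%s}' % guid
```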
-
-def _GetMsbuildToolsetOfProject(proj_path, spec, version):
-  """Get the platform toolset for the project.
-
-  Arguments:
-    proj_path: Path of the vcproj or vcxproj file to generate.
-    spec: The target dictionary containing the properties of the target.
-    version: The MSVSVersion object.
-  Returns:
-    the platform toolset string or None.
-  """
-  # Pluck out the default configuration.
-  default_config = _GetDefaultConfiguration(spec)
-  toolset = default_config.get('msbuild_toolset')
-  if not toolset and version.DefaultToolset():
-    toolset = version.DefaultToolset()
-  return toolset
-
-
-def _GenerateProject(project, options, version, generator_flags):
-  """Generates a vcproj file.
-
-  Arguments:
-    project: the MSVSProject object.
-    options: global generator options.
-    version: the MSVSVersion object.
-    generator_flags: dict of generator-specific flags.
-  Returns:
-    A list of source files that cannot be found on disk.
-  """
-  default_config = _GetDefaultConfiguration(project.spec)
-
-  # Skip emitting anything if told to with msvs_existing_vcproj option.
-  if default_config.get('msvs_existing_vcproj'):
-    return []
-
-  if version.UsesVcxproj():
-    return _GenerateMSBuildProject(project, options, version, generator_flags)
-  else:
-    return _GenerateMSVSProject(project, options, version, generator_flags)
-
-
-def _GenerateMSVSProject(project, options, version, generator_flags):
-  """Generates a .vcproj file.  It may create .rules and .user files too.
-
-  Arguments:
-    project: The project object we will generate the file for.
-    options: Global options passed to the generator.
-    version: The VisualStudioVersion object.
-    generator_flags: dict of generator-specific flags.
-  """
-  spec = project.spec
-  vcproj_dir = os.path.dirname(project.path)
-  if vcproj_dir and not os.path.exists(vcproj_dir):
-    os.makedirs(vcproj_dir)
-
-  platforms = _GetUniquePlatforms(spec)
-  p = MSVSProject.Writer(project.path, version, spec['target_name'],
-                         project.guid, platforms)
-
-  # Get directory project file is in.
-  project_dir = os.path.split(project.path)[0]
-  gyp_path = _NormalizedSource(project.build_file)
-  relative_path_of_gyp_file = gyp.common.RelativePath(gyp_path, project_dir)
-
-  config_type = _GetMSVSConfigurationType(spec, project.build_file)
-  for config_name, config in spec['configurations'].iteritems():
-    _AddConfigurationToMSVSProject(p, spec, config_type, config_name, config)
-
-  # Prepare list of sources and excluded sources.
-  gyp_file = os.path.split(project.build_file)[1]
-  sources, excluded_sources = _PrepareListOfSources(spec, generator_flags,
-                                                    gyp_file)
-
-  # Add rules.
-  actions_to_add = {}
-  _GenerateRulesForMSVS(p, project_dir, options, spec,
-                        sources, excluded_sources,
-                        actions_to_add)
-  list_excluded = generator_flags.get('msvs_list_excluded_files', True)
-  sources, excluded_sources, excluded_idl = (
-      _AdjustSourcesAndConvertToFilterHierarchy(
-          spec, options, project_dir, sources, excluded_sources, list_excluded))
-
-  # Add in files.
-  missing_sources = _VerifySourcesExist(sources, project_dir)
-  p.AddFiles(sources)
-
-  _AddToolFilesToMSVS(p, spec)
-  _HandlePreCompiledHeaders(p, sources, spec)
-  _AddActions(actions_to_add, spec, relative_path_of_gyp_file)
-  _AddCopies(actions_to_add, spec)
-  _WriteMSVSUserFile(project.path, version, spec)
-
-  # NOTE: this stanza must appear after all actions have been decided.
-  # Don't exclude sources with actions attached, or they won't run.
-  excluded_sources = _FilterActionsFromExcluded(
-      excluded_sources, actions_to_add)
-  _ExcludeFilesFromBeingBuilt(p, spec, excluded_sources, excluded_idl,
-                              list_excluded)
-  _AddAccumulatedActionsToMSVS(p, spec, actions_to_add)
-
-  # Write it out.
-  p.WriteIfChanged()
-
-  return missing_sources
-
-
-def _GetUniquePlatforms(spec):
-  """Returns the list of unique platforms for this spec, e.g. ['win32', ...].
-
-  Arguments:
-    spec: The target dictionary containing the properties of the target.
-  Returns:
-    The list of unique platform strings.
-  """
-  # Gather list of unique platforms.
-  platforms = set()
-  for configuration in spec['configurations']:
-    platforms.add(_ConfigPlatform(spec['configurations'][configuration]))
-  platforms = list(platforms)
-  return platforms
-
-
-def _CreateMSVSUserFile(proj_path, version, spec):
-  """Generates a .user file for the user running this Gyp program.
-
-  Arguments:
-    proj_path: The path of the project file being created.  The .user file
-               shares the same path (with an appropriate suffix).
-    version: The VisualStudioVersion object.
-    spec: The target dictionary containing the properties of the target.
-  Returns:
-    The MSVSUserFile object created.
-  """
-  (domain, username) = _GetDomainAndUserName()
-  vcuser_filename = '.'.join([proj_path, domain, username, 'user'])
-  user_file = MSVSUserFile.Writer(vcuser_filename, version,
-                                  spec['target_name'])
-  return user_file
-
-
-def _GetMSVSConfigurationType(spec, build_file):
-  """Returns the configuration type for this project.
-
-  It's a number defined by Microsoft.  May raise an exception.
-
-  Args:
-      spec: The target dictionary containing the properties of the target.
-      build_file: The path of the gyp file.
-  Returns:
-      A string containing the configuration type number defined by Microsoft.
-  """
-  try:
-    config_type = {
-        'executable': '1',  # .exe
-        'shared_library': '2',  # .dll
-        'loadable_module': '2',  # .dll
-        'static_library': '4',  # .lib
-        'none': '10',  # Utility type
-        }[spec['type']]
-  except KeyError:
-    if spec.get('type'):
-      raise GypError('Target type %s is not a valid target type for '
-                     'target %s in %s.' %
-                     (spec['type'], spec['target_name'], build_file))
-    else:
-      raise GypError('Missing type field for target %s in %s.' %
-                     (spec['target_name'], build_file))
-  return config_type
-
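The type mapping above can be sketched standalone (illustrative helper, not gyp API); note the values are strings, not ints, since they are written verbatim into the vcproj XML:

```python
def msvs_config_type(target_type):
    """Map a gyp target type to the MSVS ConfigurationType string,
    raising for unknown types as the original does."""
    mapping = {
        'executable': '1',       # .exe
        'shared_library': '2',   # .dll
        'loadable_module': '2',  # .dll
        'static_library': '4',   # .lib
        'none': '10',            # utility type
    }
    try:
        return mapping[target_type]
    except KeyError:
        raise ValueError('unknown target type: %r' % target_type)
```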
-
-def _AddConfigurationToMSVSProject(p, spec, config_type, config_name, config):
-  """Adds a configuration to the MSVS project.
-
-  Many settings in a vcproj file are specific to a configuration.  This
-  function generates the main part of the vcproj file that is configuration
-  specific.
-
-  Arguments:
-    p: The target project being generated.
-    spec: The target dictionary containing the properties of the target.
-    config_type: The configuration type, a number as defined by Microsoft.
-    config_name: The name of the configuration.
-    config: The dictionary that defines the special processing to be done
-            for this configuration.
-  """
-  # Get the information for this configuration
-  include_dirs, resource_include_dirs = _GetIncludeDirs(config)
-  libraries = _GetLibraries(spec)
-  out_file, vc_tool, _ = _GetOutputFilePathAndTool(spec, msbuild=False)
-  defines = _GetDefines(config)
-  defines = [_EscapeCppDefineForMSVS(d) for d in defines]
-  disabled_warnings = _GetDisabledWarnings(config)
-  prebuild = config.get('msvs_prebuild')
-  postbuild = config.get('msvs_postbuild')
-  def_file = _GetModuleDefinition(spec)
-  precompiled_header = config.get('msvs_precompiled_header')
-
-  # Prepare the list of tools as a dictionary.
-  tools = dict()
-  # Add in user specified msvs_settings.
-  msvs_settings = config.get('msvs_settings', {})
-  MSVSSettings.ValidateMSVSSettings(msvs_settings)
-
-  # Prevent default library inheritance from the environment.
-  _ToolAppend(tools, 'VCLinkerTool', 'AdditionalDependencies', ['$(NOINHERIT)'])
-
-  for tool in msvs_settings:
-    settings = config['msvs_settings'][tool]
-    for setting in settings:
-      _ToolAppend(tools, tool, setting, settings[setting])
-  # Add the information to the appropriate tool
-  _ToolAppend(tools, 'VCCLCompilerTool',
-              'AdditionalIncludeDirectories', include_dirs)
-  _ToolAppend(tools, 'VCResourceCompilerTool',
-              'AdditionalIncludeDirectories', resource_include_dirs)
-  # Add in libraries.
-  _ToolAppend(tools, 'VCLinkerTool', 'AdditionalDependencies', libraries)
-  if out_file:
-    _ToolAppend(tools, vc_tool, 'OutputFile', out_file, only_if_unset=True)
-  # Add defines.
-  _ToolAppend(tools, 'VCCLCompilerTool', 'PreprocessorDefinitions', defines)
-  _ToolAppend(tools, 'VCResourceCompilerTool', 'PreprocessorDefinitions',
-              defines)
-  # Change program database directory to prevent collisions.
-  _ToolAppend(tools, 'VCCLCompilerTool', 'ProgramDataBaseFileName',
-              '$(IntDir)$(ProjectName)\\vc80.pdb', only_if_unset=True)
-  # Add disabled warnings.
-  _ToolAppend(tools, 'VCCLCompilerTool',
-              'DisableSpecificWarnings', disabled_warnings)
-  # Add Pre-build.
-  _ToolAppend(tools, 'VCPreBuildEventTool', 'CommandLine', prebuild)
-  # Add Post-build.
-  _ToolAppend(tools, 'VCPostBuildEventTool', 'CommandLine', postbuild)
-  # Turn on precompiled headers if appropriate.
-  if precompiled_header:
-    precompiled_header = os.path.split(precompiled_header)[1]
-    _ToolAppend(tools, 'VCCLCompilerTool', 'UsePrecompiledHeader', '2')
-    _ToolAppend(tools, 'VCCLCompilerTool',
-                'PrecompiledHeaderThrough', precompiled_header)
-    _ToolAppend(tools, 'VCCLCompilerTool',
-                'ForcedIncludeFiles', precompiled_header)
-  # Loadable modules don't generate import libraries;
-  # tell dependent projects to not expect one.
-  if spec['type'] == 'loadable_module':
-    _ToolAppend(tools, 'VCLinkerTool', 'IgnoreImportLibrary', 'true')
-  # Set the module definition file if any.
-  if def_file:
-    _ToolAppend(tools, 'VCLinkerTool', 'ModuleDefinitionFile', def_file)
-
-  _AddConfigurationToMSVS(p, spec, tools, config, config_type, config_name)
-
-
-def _GetIncludeDirs(config):
-  """Returns the list of directories to be used for #include directives.
-
-  Arguments:
-    config: The dictionary that defines the special processing to be done
-            for this configuration.
-  Returns:
-    The list of directory paths.
-  """
-  # TODO(bradnelson): include_dirs should really be flexible enough not to
-  #                   require this sort of thing.
-  include_dirs = (
-      config.get('include_dirs', []) +
-      config.get('msvs_system_include_dirs', []))
-  resource_include_dirs = config.get('resource_include_dirs', include_dirs)
-  include_dirs = _FixPaths(include_dirs)
-  resource_include_dirs = _FixPaths(resource_include_dirs)
-  return include_dirs, resource_include_dirs
-
-
-def _GetLibraries(spec):
-  """Returns the list of libraries for this configuration.
-
-  Arguments:
-    spec: The target dictionary containing the properties of the target.
-  Returns:
-    The list of library file names to link against.
-  """
-  libraries = spec.get('libraries', [])
-  # Strip out -l, as it is not used on windows (but is needed so we can pass
-  # in libraries that are assumed to be in the default library path).
-  # Also remove duplicate entries, leaving only the last duplicate, while
-  # preserving order.
-  found = set()
-  unique_libraries_list = []
-  for entry in reversed(libraries):
-    library = re.sub(r'^-l', '', entry)
-    if not os.path.splitext(library)[1]:
-      library += '.lib'
-    if library not in found:
-      found.add(library)
-      unique_libraries_list.append(library)
-  unique_libraries_list.reverse()
-  return unique_libraries_list
-
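The dedup-keeping-last behaviour above can be sketched standalone (the helper name is illustrative, not gyp API):

```python
import os
import re

def msvs_libraries(libraries):
    """Normalize a gyp 'libraries' list for MSVS: strip a leading
    '-l', append '.lib' when no extension is present, and drop
    duplicates keeping only the LAST copy while preserving order."""
    found = set()
    unique = []
    for entry in reversed(libraries):
        library = re.sub(r'^-l', '', entry)
        if not os.path.splitext(library)[1]:
            library += '.lib'
        if library not in found:
            found.add(library)
            unique.append(library)
    unique.reverse()
    return unique
```

Here `-lws2_32` and a later `ws2_32.lib` collapse to one entry, and the later position wins.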
-
-def _GetOutputFilePathAndTool(spec, msbuild):
-  """Returns the path and tool to use for this target.
-
-  Figures out the path of the file this spec will create and the name of
-  the VC tool that will create it.
-
-  Arguments:
-    spec: The target dictionary containing the properties of the target.
-    msbuild: True if generating for MSBuild (.vcxproj); affects the
-             default suffix.
-  Returns:
-    A triple of (file path, name of the vc tool, name of the msbuild tool)
-  """
-  # Select a name for the output file.
-  out_file = ''
-  vc_tool = ''
-  msbuild_tool = ''
-  output_file_map = {
-      'executable': ('VCLinkerTool', 'Link', '$(OutDir)', '.exe'),
-      'shared_library': ('VCLinkerTool', 'Link', '$(OutDir)', '.dll'),
-      'loadable_module': ('VCLinkerTool', 'Link', '$(OutDir)', '.dll'),
-      'static_library': ('VCLibrarianTool', 'Lib', '$(OutDir)lib\\', '.lib'),
-  }
-  output_file_props = output_file_map.get(spec['type'])
-  if output_file_props and int(spec.get('msvs_auto_output_file', 1)):
-    vc_tool, msbuild_tool, out_dir, suffix = output_file_props
-    if spec.get('standalone_static_library', 0):
-      out_dir = '$(OutDir)'
-    out_dir = spec.get('product_dir', out_dir)
-    product_extension = spec.get('product_extension')
-    if product_extension:
-      suffix = '.' + product_extension
-    elif msbuild:
-      suffix = '$(TargetExt)'
-    prefix = spec.get('product_prefix', '')
-    product_name = spec.get('product_name', '$(ProjectName)')
-    out_file = ntpath.join(out_dir, prefix + product_name + suffix)
-  return out_file, vc_tool, msbuild_tool
-
-
-def _GetDefines(config):
-  """Returns the list of preprocessor definitions for this configuration.
-
-  Arguments:
-    config: The dictionary that defines the special processing to be done
-            for this configuration.
-  Returns:
-    The list of preprocessor definitions.
-  """
-  defines = []
-  for d in config.get('defines', []):
-    if type(d) == list:
-      fd = '='.join([str(dpart) for dpart in d])
-    else:
-      fd = str(d)
-    defines.append(fd)
-  return defines
-
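The define-flattening rule above can be sketched standalone (illustrative helper, not gyp API): a two-element list like `['VERSION', 2]` becomes `'VERSION=2'`, a bare string passes through unchanged.

```python
def format_defines(defines):
    """Flatten gyp 'defines' entries into NAME or NAME=VALUE strings."""
    out = []
    for d in defines:
        if type(d) == list:
            out.append('='.join(str(part) for part in d))
        else:
            out.append(str(d))
    return out
```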
-
-def _GetDisabledWarnings(config):
-  return [str(i) for i in config.get('msvs_disabled_warnings', [])]
-
-
-def _GetModuleDefinition(spec):
-  def_file = ''
-  if spec['type'] in ['shared_library', 'loadable_module', 'executable']:
-    def_files = [s for s in spec.get('sources', []) if s.endswith('.def')]
-    if len(def_files) == 1:
-      def_file = _FixPath(def_files[0])
-    elif def_files:
-      raise ValueError(
-          'Multiple module definition files in one target, target %s lists '
-          'multiple .def files: %s' % (
-              spec['target_name'], ' '.join(def_files)))
-  return def_file
-
-
-def _ConvertToolsToExpectedForm(tools):
-  """Convert tools to a form expected by Visual Studio.
-
-  Arguments:
-    tools: A dictionary of settings; the tool name is the key.
-  Returns:
-    A list of Tool objects.
-  """
-  tool_list = []
-  for tool, settings in tools.iteritems():
-    # Collapse settings with lists.
-    settings_fixed = {}
-    for setting, value in settings.iteritems():
-      if type(value) == list:
-        if ((tool == 'VCLinkerTool' and
-             setting == 'AdditionalDependencies') or
-            setting == 'AdditionalOptions'):
-          settings_fixed[setting] = ' '.join(value)
-        else:
-          settings_fixed[setting] = ';'.join(value)
-      else:
-        settings_fixed[setting] = value
-    # Add in this tool.
-    tool_list.append(MSVSProject.Tool(tool, settings_fixed))
-  return tool_list
-
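The list-collapsing rule above can be sketched per setting (illustrative helper, not gyp API): most list settings are ';'-joined, but linker `AdditionalDependencies` and any `AdditionalOptions` are ' '-joined.

```python
def collapse_setting(tool, setting, value):
    """Collapse a list-valued tool setting into the single string
    form Visual Studio expects; non-list values pass through."""
    if not isinstance(value, list):
        return value
    if ((tool == 'VCLinkerTool' and setting == 'AdditionalDependencies')
            or setting == 'AdditionalOptions'):
        return ' '.join(value)
    return ';'.join(value)
```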
-
-def _AddConfigurationToMSVS(p, spec, tools, config, config_type, config_name):
-  """Add to the project file the configuration specified by config.
-
-  Arguments:
-    p: The target project being generated.
-    spec: the target project dict.
-    tools: A dictionary of settings; the tool name is the key.
-    config: The dictionary that defines the special processing to be done
-            for this configuration.
-    config_type: The configuration type, a number as defined by Microsoft.
-    config_name: The name of the configuration.
-  """
-  attributes = _GetMSVSAttributes(spec, config, config_type)
-  # Add in this configuration.
-  tool_list = _ConvertToolsToExpectedForm(tools)
-  p.AddConfig(_ConfigFullName(config_name, config),
-              attrs=attributes, tools=tool_list)
-
-
-def _GetMSVSAttributes(spec, config, config_type):
-  # Prepare configuration attributes.
-  prepared_attrs = {}
-  source_attrs = config.get('msvs_configuration_attributes', {})
-  for a in source_attrs:
-    prepared_attrs[a] = source_attrs[a]
-  # Add props files.
-  vsprops_dirs = config.get('msvs_props', [])
-  vsprops_dirs = _FixPaths(vsprops_dirs)
-  if vsprops_dirs:
-    prepared_attrs['InheritedPropertySheets'] = ';'.join(vsprops_dirs)
-  # Set configuration type.
-  prepared_attrs['ConfigurationType'] = config_type
-  output_dir = prepared_attrs.get('OutputDirectory',
-                                  '$(SolutionDir)$(ConfigurationName)')
-  prepared_attrs['OutputDirectory'] = _FixPath(output_dir) + '\\'
-  if 'IntermediateDirectory' not in prepared_attrs:
-    intermediate = '$(ConfigurationName)\\obj\\$(ProjectName)'
-    prepared_attrs['IntermediateDirectory'] = _FixPath(intermediate) + '\\'
-  else:
-    intermediate = _FixPath(prepared_attrs['IntermediateDirectory']) + '\\'
-    intermediate = MSVSSettings.FixVCMacroSlashes(intermediate)
-    prepared_attrs['IntermediateDirectory'] = intermediate
-  return prepared_attrs
-
-
-def _AddNormalizedSources(sources_set, sources_array):
-  sources = [_NormalizedSource(s) for s in sources_array]
-  sources_set.update(set(sources))
-
-
-def _PrepareListOfSources(spec, generator_flags, gyp_file):
-  """Prepare list of sources and excluded sources.
-
-  Besides the sources specified directly in the spec, adds the gyp file so
-  that a change to it will cause a re-compile. Also adds appropriate sources
-  for actions and copies. Assumes later stage will un-exclude files which
-  have custom build steps attached.
-
-  Arguments:
-    spec: The target dictionary containing the properties of the target.
-    generator_flags: dict of generator-specific flags.
-    gyp_file: The name of the gyp file.
-  Returns:
-    A pair of (list of sources, list of excluded sources).
-    The sources will be relative to the gyp file.
-  """
-  sources = set()
-  _AddNormalizedSources(sources, spec.get('sources', []))
-  excluded_sources = set()
-  # Add in the gyp file.
-  if not generator_flags.get('standalone'):
-    sources.add(gyp_file)
-
-  # Add in 'action' inputs and outputs.
-  for a in spec.get('actions', []):
-    inputs = a['inputs']
-    inputs = [_NormalizedSource(i) for i in inputs]
-    # Add all inputs to sources and excluded sources.
-    inputs = set(inputs)
-    sources.update(inputs)
-    excluded_sources.update(inputs)
-    if int(a.get('process_outputs_as_sources', False)):
-      _AddNormalizedSources(sources, a.get('outputs', []))
-  # Add in 'copies' inputs and outputs.
-  for cpy in spec.get('copies', []):
-    _AddNormalizedSources(sources, cpy.get('files', []))
-  return (sources, excluded_sources)
-
-
-def _AdjustSourcesAndConvertToFilterHierarchy(
-    spec, options, gyp_dir, sources, excluded_sources, list_excluded):
-  """Adjusts the list of sources and excluded sources.
-
-  Also converts the sets to lists.
-
-  Arguments:
-    spec: The target dictionary containing the properties of the target.
-    options: Global generator options.
-    gyp_dir: The path of the directory containing the gyp file.
-    sources: A set of sources to be included for this project.
-    excluded_sources: A set of sources to be excluded for this project.
-  Returns:
-    A triple of (list of sources, list of excluded sources,
-                 list of excluded IDL files).
-  """
-  # Exclude excluded sources coming into the generator.
-  excluded_sources.update(set(spec.get('sources_excluded', [])))
-  # Add excluded sources into sources for good measure.
-  sources.update(excluded_sources)
-  # Convert to proper windows form.
-  # NOTE: sources goes from being a set to a list here.
-  # NOTE: excluded_sources goes from being a set to a list here.
-  sources = _FixPaths(sources)
-  # Convert to proper windows form.
-  excluded_sources = _FixPaths(excluded_sources)
-
-  excluded_idl = _IdlFilesHandledNonNatively(spec, sources)
-
-  precompiled_related = _GetPrecompileRelatedFiles(spec)
-  # Find the excluded ones, minus the precompiled header related ones.
-  fully_excluded = [i for i in excluded_sources if i not in precompiled_related]
-
-  # Convert to folders and the right slashes.
-  sources = [i.split('\\') for i in sources]
-  sources = _ConvertSourcesToFilterHierarchy(sources, excluded=fully_excluded,
-                                             list_excluded=list_excluded)
-
-  return sources, excluded_sources, excluded_idl
-
-
-def _IdlFilesHandledNonNatively(spec, sources):
-  # If any non-native rules use 'idl' as an extension, exclude idl files.
-  # Gather a list here to use later.
-  using_idl = False
-  for rule in spec.get('rules', []):
-    if rule['extension'] == 'idl' and int(rule.get('msvs_external_rule', 0)):
-      using_idl = True
-      break
-  if using_idl:
-    excluded_idl = [i for i in sources if i.endswith('.idl')]
-  else:
-    excluded_idl = []
-  return excluded_idl
-
-
-def _GetPrecompileRelatedFiles(spec):
-  # Gather a list of precompiled header related sources.
-  precompiled_related = []
-  for _, config in spec['configurations'].iteritems():
-    for k in precomp_keys:
-      f = config.get(k)
-      if f:
-        precompiled_related.append(_FixPath(f))
-  return precompiled_related
-
-
-def _ExcludeFilesFromBeingBuilt(p, spec, excluded_sources, excluded_idl,
-                                list_excluded):
-  exclusions = _GetExcludedFilesFromBuild(spec, excluded_sources, excluded_idl)
-  for file_name, excluded_configs in exclusions.iteritems():
-    if (not list_excluded and
-            len(excluded_configs) == len(spec['configurations'])):
-      # If we're not listing excluded files, then they won't appear in the
-      # project, so don't try to configure them to be excluded.
-      pass
-    else:
-      for config_name, config in excluded_configs:
-        p.AddFileConfig(file_name, _ConfigFullName(config_name, config),
-                        {'ExcludedFromBuild': 'true'})
-
-
-def _GetExcludedFilesFromBuild(spec, excluded_sources, excluded_idl):
-  exclusions = {}
-  # Exclude excluded sources from being built.
-  for f in excluded_sources:
-    excluded_configs = []
-    for config_name, config in spec['configurations'].iteritems():
-      precomped = [_FixPath(config.get(i, '')) for i in precomp_keys]
-      # Don't do this for ones that are precompiled header related.
-      if f not in precomped:
-        excluded_configs.append((config_name, config))
-    exclusions[f] = excluded_configs
-  # If any non-native rules use 'idl' as an extension, exclude idl files.
-  # Exclude them now.
-  for f in excluded_idl:
-    excluded_configs = []
-    for config_name, config in spec['configurations'].iteritems():
-      excluded_configs.append((config_name, config))
-    exclusions[f] = excluded_configs
-  return exclusions
-
-
-def _AddToolFilesToMSVS(p, spec):
-  # Add in tool files (rules).
-  tool_files = set()
-  for _, config in spec['configurations'].iteritems():
-    for f in config.get('msvs_tool_files', []):
-      tool_files.add(f)
-  for f in tool_files:
-    p.AddToolFile(f)
-
-
-def _HandlePreCompiledHeaders(p, sources, spec):
-  # Pre-compiled header source stubs need a different compiler flag
-  # (generate precompiled header) and any source file not of the same
-  # kind (i.e. C vs. C++) as the precompiled header source stub needs
-  # to have use of precompiled headers disabled.
-  extensions_excluded_from_precompile = []
-  for config_name, config in spec['configurations'].iteritems():
-    source = config.get('msvs_precompiled_source')
-    if source:
-      source = _FixPath(source)
-      # UsePrecompiledHeader=1 if using precompiled headers.
-      tool = MSVSProject.Tool('VCCLCompilerTool',
-                              {'UsePrecompiledHeader': '1'})
-      p.AddFileConfig(source, _ConfigFullName(config_name, config),
-                      {}, tools=[tool])
-      basename, extension = os.path.splitext(source)
-      if extension == '.c':
-        extensions_excluded_from_precompile = ['.cc', '.cpp', '.cxx']
-      else:
-        extensions_excluded_from_precompile = ['.c']
-  def DisableForSourceTree(source_tree):
-    for source in source_tree:
-      if isinstance(source, MSVSProject.Filter):
-        DisableForSourceTree(source.contents)
-      else:
-        basename, extension = os.path.splitext(source)
-        if extension in extensions_excluded_from_precompile:
-          for config_name, config in spec['configurations'].iteritems():
-            tool = MSVSProject.Tool('VCCLCompilerTool',
-                                    {'UsePrecompiledHeader': '0',
-                                     'ForcedIncludeFiles': '$(NOINHERIT)'})
-            p.AddFileConfig(_FixPath(source),
-                            _ConfigFullName(config_name, config),
-                            {}, tools=[tool])
-  # Do nothing if there was no precompiled source.
-  if extensions_excluded_from_precompile:
-    DisableForSourceTree(sources)
-
-
-def _AddActions(actions_to_add, spec, relative_path_of_gyp_file):
-  # Add actions.
-  actions = spec.get('actions', [])
-  # Don't setup_env every time. When all the actions are run together in one
-  # batch file in VS, the PATH will grow too long.
-  # Membership in this set means that the cygwin environment has been set up,
-  # and does not need to be set up again.
-  have_setup_env = set()
-  for a in actions:
-    # Attach actions to the gyp file if nothing else is there.
-    inputs = a.get('inputs') or [relative_path_of_gyp_file]
-    attached_to = inputs[0]
-    need_setup_env = attached_to not in have_setup_env
-    cmd = _BuildCommandLineForRule(spec, a, has_input_path=False,
-                                   do_setup_env=need_setup_env)
-    have_setup_env.add(attached_to)
-    # Add the action.
-    _AddActionStep(actions_to_add,
-                   inputs=inputs,
-                   outputs=a.get('outputs', []),
-                   description=a.get('message', a['action_name']),
-                   command=cmd)
-
-
-def _WriteMSVSUserFile(project_path, version, spec):
-  # Add run_as and test targets.
-  if 'run_as' in spec:
-    run_as = spec['run_as']
-    action = run_as.get('action', [])
-    environment = run_as.get('environment', [])
-    working_directory = run_as.get('working_directory', '.')
-  elif int(spec.get('test', 0)):
-    action = ['$(TargetPath)', '--gtest_print_time']
-    environment = []
-    working_directory = '.'
-  else:
-    return  # Nothing to add
-  # Write out the user file.
-  user_file = _CreateMSVSUserFile(project_path, version, spec)
-  for config_name, c_data in spec['configurations'].iteritems():
-    user_file.AddDebugSettings(_ConfigFullName(config_name, c_data),
-                               action, environment, working_directory)
-  user_file.WriteIfChanged()
-
-
-def _AddCopies(actions_to_add, spec):
-  copies = _GetCopies(spec)
-  for inputs, outputs, cmd, description in copies:
-    _AddActionStep(actions_to_add, inputs=inputs, outputs=outputs,
-                   description=description, command=cmd)
-
-
-def _GetCopies(spec):
-  copies = []
-  # Add copies.
-  for cpy in spec.get('copies', []):
-    for src in cpy.get('files', []):
-      dst = os.path.join(cpy['destination'], os.path.basename(src))
-      # _AddCustomBuildToolForMSVS() will call _FixPath() on the inputs and
-      # outputs, so do the same for our generated command line.
-      if src.endswith('/'):
-        src_bare = src[:-1]
-        base_dir = posixpath.split(src_bare)[0]
-        outer_dir = posixpath.split(src_bare)[1]
-        cmd = 'cd "%s" && xcopy /e /f /y "%s" "%s\\%s\\"' % (
-            _FixPath(base_dir), outer_dir, _FixPath(dst), outer_dir)
-        copies.append(([src], ['dummy_copies', dst], cmd,
-                       'Copying %s to %s' % (src, dst)))
-      else:
-        cmd = 'mkdir "%s" 2>nul & set ERRORLEVEL=0 & copy /Y "%s" "%s"' % (
-            _FixPath(cpy['destination']), _FixPath(src), _FixPath(dst))
-        copies.append(([src], [dst], cmd, 'Copying %s to %s' % (src, dst)))
-  return copies
-
-
-def _GetPathDict(root, path):
-  # |path| will eventually be empty (in the recursive calls) if it was initially
-  # relative; otherwise it will eventually end up as '\', 'D:\', etc.
-  if not path or path.endswith(os.sep):
-    return root
-  parent, folder = os.path.split(path)
-  parent_dict = _GetPathDict(root, parent)
-  if folder not in parent_dict:
-    parent_dict[folder] = dict()
-  return parent_dict[folder]
-
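The recursive path-to-nested-dict walk above can be sketched standalone; pinning to `ntpath` (an assumption for the demo) keeps Windows separators regardless of the host:

```python
import ntpath

def get_path_dict(root, path):
    """Walk (and create) nested dicts mirroring a path's folders,
    returning the innermost dict. An empty path means 'root'."""
    if not path or path.endswith('\\'):
        return root
    parent, folder = ntpath.split(path)
    parent_dict = get_path_dict(root, parent)
    return parent_dict.setdefault(folder, {})

root = {}
get_path_dict(root, 'src\\base')['base.vcproj'] = 'proj'
```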
-
-def _DictsToFolders(base_path, bucket, flat):
-  # Convert to folders recursively.
-  children = []
-  for folder, contents in bucket.iteritems():
-    if type(contents) == dict:
-      folder_children = _DictsToFolders(os.path.join(base_path, folder),
-                                        contents, flat)
-      if flat:
-        children += folder_children
-      else:
-        folder_children = MSVSNew.MSVSFolder(os.path.join(base_path, folder),
-                                             name='(' + folder + ')',
-                                             entries=folder_children)
-        children.append(folder_children)
-    else:
-      children.append(contents)
-  return children
-
-
-def _CollapseSingles(parent, node):
-  # Recursively explore the tree of dicts looking for projects which are
-  # the sole item in a folder which has the same name as the project. Bring
-  # such projects up one level.
-  if (type(node) == dict and
-      len(node) == 1 and
-      node.keys()[0] == parent + '.vcproj'):
-    return node[node.keys()[0]]
-  if type(node) != dict:
-    return node
-  for child in node:
-    node[child] = _CollapseSingles(child, node[child])
-  return node
-
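The single-project folder collapse above can be sketched standalone. The helper name is illustrative, and unlike the Python 2 original it uses `list(node)` so it also runs on Python 3:

```python
def collapse_singles(parent, node):
    """A folder whose only entry is '<folder>.vcproj' is replaced
    by that entry, hoisting the project up one level; everything
    else is recursed into unchanged."""
    if (isinstance(node, dict) and len(node) == 1 and
            list(node)[0] == parent + '.vcproj'):
        return node[list(node)[0]]
    if not isinstance(node, dict):
        return node
    for child in node:
        node[child] = collapse_singles(child, node[child])
    return node
```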
-
-def _GatherSolutionFolders(sln_projects, project_objects, flat):
-  root = {}
-  # Convert into a tree of dicts on path.
-  for p in sln_projects:
-    gyp_file, target = gyp.common.ParseQualifiedTarget(p)[0:2]
-    gyp_dir = os.path.dirname(gyp_file)
-    path_dict = _GetPathDict(root, gyp_dir)
-    path_dict[target + '.vcproj'] = project_objects[p]
-  # Walk down from the top until we hit a folder that has more than one entry.
-  # In practice, this strips the top-level "src/" dir from the hierarchy in
-  # the solution.
-  while len(root) == 1 and type(root[root.keys()[0]]) == dict:
-    root = root[root.keys()[0]]
-  # Collapse singles.
-  root = _CollapseSingles('', root)
-  # Merge buckets until everything is a root entry.
-  return _DictsToFolders('', root, flat)
-
-
-def _GetPathOfProject(qualified_target, spec, options, msvs_version):
-  default_config = _GetDefaultConfiguration(spec)
-  proj_filename = default_config.get('msvs_existing_vcproj')
-  if not proj_filename:
-    proj_filename = (spec['target_name'] + options.suffix +
-                     msvs_version.ProjectExtension())
-
-  build_file = gyp.common.BuildFile(qualified_target)
-  proj_path = os.path.join(os.path.dirname(build_file), proj_filename)
-  fix_prefix = None
-  if options.generator_output:
-    project_dir_path = os.path.dirname(os.path.abspath(proj_path))
-    proj_path = os.path.join(options.generator_output, proj_path)
-    fix_prefix = gyp.common.RelativePath(project_dir_path,
-                                         os.path.dirname(proj_path))
-  return proj_path, fix_prefix
-
-
-def _GetPlatformOverridesOfProject(spec):
-  # Prepare a dict indicating which project configurations are used for which
-  # solution configurations for this target.
-  config_platform_overrides = {}
-  for config_name, c in spec['configurations'].iteritems():
-    config_fullname = _ConfigFullName(config_name, c)
-    platform = c.get('msvs_target_platform', _ConfigPlatform(c))
-    fixed_config_fullname = '%s|%s' % (
-        _ConfigBaseName(config_name, _ConfigPlatform(c)), platform)
-    config_platform_overrides[config_fullname] = fixed_config_fullname
-  return config_platform_overrides
-
-
-def _CreateProjectObjects(target_list, target_dicts, options, msvs_version):
-  """Create an MSVSProject object for the targets found in the target list.
-
-  Arguments:
-    target_list: the list of targets to generate project objects for.
-    target_dicts: the dictionary of specifications.
-    options: global generator options.
-    msvs_version: the MSVSVersion object.
-  Returns:
-    A set of created projects, keyed by target.
-  """
-  global fixpath_prefix
-  # Generate each project.
-  projects = {}
-  for qualified_target in target_list:
-    spec = target_dicts[qualified_target]
-    if spec['toolset'] != 'target':
-      raise GypError(
-          'Multiple toolsets not supported in msvs build (target %s)' %
-          qualified_target)
-    proj_path, fixpath_prefix = _GetPathOfProject(qualified_target, spec,
-                                                  options, msvs_version)
-    guid = _GetGuidOfProject(proj_path, spec)
-    overrides = _GetPlatformOverridesOfProject(spec)
-    build_file = gyp.common.BuildFile(qualified_target)
-    # Create object for this project.
-    obj = MSVSNew.MSVSProject(
-        proj_path,
-        name=spec['target_name'],
-        guid=guid,
-        spec=spec,
-        build_file=build_file,
-        config_platform_overrides=overrides,
-        fixpath_prefix=fixpath_prefix)
-    # Set project toolset if any (MS build only)
-    if msvs_version.UsesVcxproj():
-      obj.set_msbuild_toolset(
-          _GetMsbuildToolsetOfProject(proj_path, spec, msvs_version))
-    projects[qualified_target] = obj
-  # Set all the dependencies
-  for project in projects.values():
-    deps = project.spec.get('dependencies', [])
-    deps = [projects[d] for d in deps]
-    project.set_dependencies(deps)
-  return projects
-
-
-def CalculateVariables(default_variables, params):
-  """Generate variables that require params to be known."""
-
-  generator_flags = params.get('generator_flags', {})
-
-  # Select project file format version (if unset, default to auto detecting).
-  msvs_version = MSVSVersion.SelectVisualStudioVersion(
-      generator_flags.get('msvs_version', 'auto'))
-  # Stash msvs_version for later (so we don't have to probe the system twice).
-  params['msvs_version'] = msvs_version
-
-  # Set a variable so conditions can be based on msvs_version.
-  default_variables['MSVS_VERSION'] = msvs_version.ShortName()
-
-  # To determine processor word size on Windows, in addition to checking
-  # PROCESSOR_ARCHITECTURE (which reflects the word size of the current
-  # process), it is also necessary to check PROCESSOR_ARCHITEW6432 (which
-  # contains the actual word size of the system when running through WOW64).
-  if (os.environ.get('PROCESSOR_ARCHITECTURE', '').find('64') >= 0 or
-      os.environ.get('PROCESSOR_ARCHITEW6432', '').find('64') >= 0):
-    default_variables['MSVS_OS_BITS'] = 64
-  else:
-    default_variables['MSVS_OS_BITS'] = 32
-
-
-def PerformBuild(data, configurations, params):
-  options = params['options']
-  msvs_version = params['msvs_version']
-  devenv = os.path.join(msvs_version.path, 'Common7', 'IDE', 'devenv.com')
-
-  for build_file, build_file_dict in data.iteritems():
-    (build_file_root, build_file_ext) = os.path.splitext(build_file)
-    if build_file_ext != '.gyp':
-      continue
-    sln_path = build_file_root + options.suffix + '.sln'
-    if options.generator_output:
-      sln_path = os.path.join(options.generator_output, sln_path)
-    for config in configurations:
-      arguments = [devenv, sln_path, '/Build', config]
-      print 'Building [%s]: %s' % (config, arguments)
-      subprocess.check_call(arguments)
-
-
-def GenerateOutput(target_list, target_dicts, data, params):
-  """Generate .sln and .vcproj files.
-
-  This is the entry point for this generator.
-  Arguments:
-    target_list: List of target pairs: 'base/base.gyp:base'.
-    target_dicts: Dict of target properties keyed on target pair.
-    data: Dictionary containing per .gyp data.
-  """
-  global fixpath_prefix
-
-  options = params['options']
-
-  # Get the project file format version back out of where we stashed it in
-  # GeneratorCalculatedVariables.
-  msvs_version = params['msvs_version']
-
-  generator_flags = params.get('generator_flags', {})
-
-  # Optionally shard targets marked with 'msvs_shard': SHARD_COUNT.
-  (target_list, target_dicts) = MSVSUtil.ShardTargets(target_list, target_dicts)
-
-  # Optionally use the large PDB workaround for targets marked with
-  # 'msvs_large_pdb': 1.
-  (target_list, target_dicts) = MSVSUtil.InsertLargePdbShims(
-        target_list, target_dicts, generator_default_variables)
-
-  # Prepare the set of configurations.
-  configs = set()
-  for qualified_target in target_list:
-    spec = target_dicts[qualified_target]
-    for config_name, config in spec['configurations'].iteritems():
-      configs.add(_ConfigFullName(config_name, config))
-  configs = list(configs)
-
-  # Figure out all the projects that will be generated and their guids
-  project_objects = _CreateProjectObjects(target_list, target_dicts, options,
-                                          msvs_version)
-
-  # Generate each project.
-  missing_sources = []
-  for project in project_objects.values():
-    fixpath_prefix = project.fixpath_prefix
-    missing_sources.extend(_GenerateProject(project, options, msvs_version,
-                                            generator_flags))
-  fixpath_prefix = None
-
-  for build_file in data:
-    # Validate build_file extension
-    if not build_file.endswith('.gyp'):
-      continue
-    sln_path = os.path.splitext(build_file)[0] + options.suffix + '.sln'
-    if options.generator_output:
-      sln_path = os.path.join(options.generator_output, sln_path)
-    # Get projects in the solution, and their dependents.
-    sln_projects = gyp.common.BuildFileTargets(target_list, build_file)
-    sln_projects += gyp.common.DeepDependencyTargets(target_dicts, sln_projects)
-    # Create folder hierarchy.
-    root_entries = _GatherSolutionFolders(
-        sln_projects, project_objects, flat=msvs_version.FlatSolution())
-    # Create solution.
-    sln = MSVSNew.MSVSSolution(sln_path,
-                               entries=root_entries,
-                               variants=configs,
-                               websiteProperties=False,
-                               version=msvs_version)
-    sln.Write()
-
-  if missing_sources:
-    error_message = "Missing input files:\n" + \
-                    '\n'.join(set(missing_sources))
-    if generator_flags.get('msvs_error_on_missing_sources', False):
-      raise GypError(error_message)
-    else:
-      print >> sys.stdout, "Warning: " + error_message
-
-
-def _GenerateMSBuildFiltersFile(filters_path, source_files,
-                                extension_to_rule_name):
-  """Generate the filters file.
-
-  This file is used by Visual Studio to organize the presentation of source
-  files into folders.
-
-  Arguments:
-      filters_path: The path of the file to be created.
-      source_files: The hierarchical structure of all the sources.
-      extension_to_rule_name: A dictionary mapping file extensions to rules.
-  """
-  filter_group = []
-  source_group = []
-  _AppendFiltersForMSBuild('', source_files, extension_to_rule_name,
-                           filter_group, source_group)
-  if filter_group:
-    content = ['Project',
-               {'ToolsVersion': '4.0',
-                'xmlns': 'http://schemas.microsoft.com/developer/msbuild/2003'
-               },
-               ['ItemGroup'] + filter_group,
-               ['ItemGroup'] + source_group
-              ]
-    easy_xml.WriteXmlIfChanged(content, filters_path, pretty=True, win32=True)
-  elif os.path.exists(filters_path):
-    # We don't need this filter anymore.  Delete the old filter file.
-    os.unlink(filters_path)
-
-
-def _AppendFiltersForMSBuild(parent_filter_name, sources,
-                             extension_to_rule_name,
-                             filter_group, source_group):
-  """Creates the list of filters and sources to be added in the filter file.
-
-  Args:
-      parent_filter_name: The name of the filter under which the sources are
-          found.
-      sources: The hierarchy of filters and sources to process.
-      extension_to_rule_name: A dictionary mapping file extensions to rules.
-      filter_group: The list to which filter entries will be appended.
-      source_group: The list to which source entries will be appended.
-  """
-  for source in sources:
-    if isinstance(source, MSVSProject.Filter):
-      # We have a sub-filter.  Create the name of that sub-filter.
-      if not parent_filter_name:
-        filter_name = source.name
-      else:
-        filter_name = '%s\\%s' % (parent_filter_name, source.name)
-      # Add the filter to the group.
-      filter_group.append(
-          ['Filter', {'Include': filter_name},
-           ['UniqueIdentifier', MSVSNew.MakeGuid(source.name)]])
-      # Recurse and add its dependents.
-      _AppendFiltersForMSBuild(filter_name, source.contents,
-                               extension_to_rule_name,
-                               filter_group, source_group)
-    else:
-      # It's a source.  Create a source entry.
-      _, element = _MapFileToMsBuildSourceType(source, extension_to_rule_name)
-      source_entry = [element, {'Include': source}]
-      # Specify the filter it is part of, if any.
-      if parent_filter_name:
-        source_entry.append(['Filter', parent_filter_name])
-      source_group.append(source_entry)
-
-
-def _MapFileToMsBuildSourceType(source, extension_to_rule_name):
-  """Returns the group and element type of the source file.
-
-  Arguments:
-      source: The source file name.
-      extension_to_rule_name: A dictionary mapping file extensions to rules.
-
-  Returns:
-      A pair of (the group this file should be part of, the label of the
-      element).
-  """
-  _, ext = os.path.splitext(source)
-  if ext in extension_to_rule_name:
-    group = 'rule'
-    element = extension_to_rule_name[ext]
-  elif ext in ['.cc', '.cpp', '.c', '.cxx']:
-    group = 'compile'
-    element = 'ClCompile'
-  elif ext in ['.h', '.hxx']:
-    group = 'include'
-    element = 'ClInclude'
-  elif ext == '.rc':
-    group = 'resource'
-    element = 'ResourceCompile'
-  elif ext == '.idl':
-    group = 'midl'
-    element = 'Midl'
-  else:
-    group = 'none'
-    element = 'None'
-  return (group, element)
-
-
-def _GenerateRulesForMSBuild(output_dir, options, spec,
-                             sources, excluded_sources,
-                             props_files_of_rules, targets_files_of_rules,
-                             actions_to_add, extension_to_rule_name):
-  # MSBuild rules are implemented using three files: an XML file, a .targets
-  # file and a .props file.
-  # See http://blogs.msdn.com/b/vcblog/archive/2010/04/21/quick-help-on-vs2010-custom-build-rule.aspx
-  # for more details.
-  rules = spec.get('rules', [])
-  rules_native = [r for r in rules if not int(r.get('msvs_external_rule', 0))]
-  rules_external = [r for r in rules if int(r.get('msvs_external_rule', 0))]
-
-  msbuild_rules = []
-  for rule in rules_native:
-    # Skip a rule with no action and no inputs.
-    if 'action' not in rule and not rule.get('rule_sources', []):
-      continue
-    msbuild_rule = MSBuildRule(rule, spec)
-    msbuild_rules.append(msbuild_rule)
-    extension_to_rule_name[msbuild_rule.extension] = msbuild_rule.rule_name
-  if msbuild_rules:
-    base = spec['target_name'] + options.suffix
-    props_name = base + '.props'
-    targets_name = base + '.targets'
-    xml_name = base + '.xml'
-
-    props_files_of_rules.add(props_name)
-    targets_files_of_rules.add(targets_name)
-
-    props_path = os.path.join(output_dir, props_name)
-    targets_path = os.path.join(output_dir, targets_name)
-    xml_path = os.path.join(output_dir, xml_name)
-
-    _GenerateMSBuildRulePropsFile(props_path, msbuild_rules)
-    _GenerateMSBuildRuleTargetsFile(targets_path, msbuild_rules)
-    _GenerateMSBuildRuleXmlFile(xml_path, msbuild_rules)
-
-  if rules_external:
-    _GenerateExternalRules(rules_external, output_dir, spec,
-                           sources, options, actions_to_add)
-  _AdjustSourcesForRules(rules, sources, excluded_sources)
-
-
-class MSBuildRule(object):
-  """Used to store information used to generate an MSBuild rule.
-
-  Attributes:
-    rule_name: The rule name, sanitized to use in XML.
-    target_name: The name of the target.
-    after_targets: The name of the AfterTargets element.
-    before_targets: The name of the BeforeTargets element.
-    depends_on: The name of the DependsOn element.
-    compute_output: The name of the ComputeOutput element.
-    dirs_to_make: The name of the DirsToMake element.
-    inputs: The name of the _inputs element.
-    tlog: The name of the _tlog element.
-    extension: The extension this rule applies to.
-    description: The message displayed when this rule is invoked.
-    additional_dependencies: A string listing additional dependencies.
-    outputs: The outputs of this rule.
-    command: The command used to run the rule.
-  """
-
-  def __init__(self, rule, spec):
-    self.display_name = rule['rule_name']
-    # Ensure the rule name contains only word characters (letters, digits
-    # and underscores).
-    self.rule_name = re.sub(r'\W', '_', self.display_name)
-    # Create the various element names, following the example set by the
-    # Visual Studio 2008 to 2010 conversion.  I don't know if VS2010
-    # is sensitive to the exact names.
-    self.target_name = '_' + self.rule_name
-    self.after_targets = self.rule_name + 'AfterTargets'
-    self.before_targets = self.rule_name + 'BeforeTargets'
-    self.depends_on = self.rule_name + 'DependsOn'
-    self.compute_output = 'Compute%sOutput' % self.rule_name
-    self.dirs_to_make = self.rule_name + 'DirsToMake'
-    self.inputs = self.rule_name + '_inputs'
-    self.tlog = self.rule_name + '_tlog'
-    self.extension = rule['extension']
-    if not self.extension.startswith('.'):
-      self.extension = '.' + self.extension
-
-    self.description = MSVSSettings.ConvertVCMacrosToMSBuild(
-        rule.get('message', self.rule_name))
-    old_additional_dependencies = _FixPaths(rule.get('inputs', []))
-    self.additional_dependencies = (
-        ';'.join([MSVSSettings.ConvertVCMacrosToMSBuild(i)
-                  for i in old_additional_dependencies]))
-    old_outputs = _FixPaths(rule.get('outputs', []))
-    self.outputs = ';'.join([MSVSSettings.ConvertVCMacrosToMSBuild(i)
-                             for i in old_outputs])
-    old_command = _BuildCommandLineForRule(spec, rule, has_input_path=True,
-                                           do_setup_env=True)
-    self.command = MSVSSettings.ConvertVCMacrosToMSBuild(old_command)
-
-
-def _GenerateMSBuildRulePropsFile(props_path, msbuild_rules):
-  """Generate the .props file."""
-  content = ['Project',
-             {'xmlns': 'http://schemas.microsoft.com/developer/msbuild/2003'}]
-  for rule in msbuild_rules:
-    content.extend([
-        ['PropertyGroup',
-         {'Condition': "'$(%s)' == '' and '$(%s)' == '' and "
-          "'$(ConfigurationType)' != 'Makefile'" % (rule.before_targets,
-                                                    rule.after_targets)
-         },
-         [rule.before_targets, 'Midl'],
-         [rule.after_targets, 'CustomBuild'],
-        ],
-        ['PropertyGroup',
-         [rule.depends_on,
-          {'Condition': "'$(ConfigurationType)' != 'Makefile'"},
-          '_SelectedFiles;$(%s)' % rule.depends_on
-         ],
-        ],
-        ['ItemDefinitionGroup',
-         [rule.rule_name,
-          ['CommandLineTemplate', rule.command],
-          ['Outputs', rule.outputs],
-          ['ExecutionDescription', rule.description],
-          ['AdditionalDependencies', rule.additional_dependencies],
-         ],
-        ]
-    ])
-  easy_xml.WriteXmlIfChanged(content, props_path, pretty=True, win32=True)
-
-
-def _GenerateMSBuildRuleTargetsFile(targets_path, msbuild_rules):
-  """Generate the .targets file."""
-  content = ['Project',
-             {'xmlns': 'http://schemas.microsoft.com/developer/msbuild/2003'
-             }
-            ]
-  item_group = [
-      'ItemGroup',
-      ['PropertyPageSchema',
-       {'Include': '$(MSBuildThisFileDirectory)$(MSBuildThisFileName).xml'}
-      ]
-    ]
-  for rule in msbuild_rules:
-    item_group.append(
-        ['AvailableItemName',
-         {'Include': rule.rule_name},
-         ['Targets', rule.target_name],
-        ])
-  content.append(item_group)
-
-  for rule in msbuild_rules:
-    content.append(
-        ['UsingTask',
-         {'TaskName': rule.rule_name,
-          'TaskFactory': 'XamlTaskFactory',
-          'AssemblyName': 'Microsoft.Build.Tasks.v4.0'
-         },
-         ['Task', '$(MSBuildThisFileDirectory)$(MSBuildThisFileName).xml'],
-        ])
-  for rule in msbuild_rules:
-    rule_name = rule.rule_name
-    target_outputs = '%%(%s.Outputs)' % rule_name
-    target_inputs = ('%%(%s.Identity);%%(%s.AdditionalDependencies);'
-                     '$(MSBuildProjectFile)') % (rule_name, rule_name)
-    rule_inputs = '%%(%s.Identity)' % rule_name
-    extension_condition = ("'%(Extension)'=='.obj' or "
-                           "'%(Extension)'=='.res' or "
-                           "'%(Extension)'=='.rsc' or "
-                           "'%(Extension)'=='.lib'")
-    remove_section = [
-        'ItemGroup',
-        {'Condition': "'@(SelectedFiles)' != ''"},
-        [rule_name,
-         {'Remove': '@(%s)' % rule_name,
-          'Condition': "'%(Identity)' != '@(SelectedFiles)'"
-         }
-        ]
-    ]
-    inputs_section = [
-        'ItemGroup',
-        [rule.inputs, {'Include': '%%(%s.AdditionalDependencies)' % rule_name}]
-    ]
-    logging_section = [
-        'ItemGroup',
-        [rule.tlog,
-         {'Include': '%%(%s.Outputs)' % rule_name,
-          'Condition': ("'%%(%s.Outputs)' != '' and "
-                        "'%%(%s.ExcludedFromBuild)' != 'true'" %
-                        (rule_name, rule_name))
-         },
-         ['Source', "@(%s, '|')" % rule_name],
-         ['Inputs', "@(%s -> '%%(Fullpath)', ';')" % rule.inputs],
-        ],
-    ]
-    message_section = [
-        'Message',
-        {'Importance': 'High',
-         'Text': '%%(%s.ExecutionDescription)' % rule_name
-        }
-    ]
-    write_tlog_section = [
-        'WriteLinesToFile',
-        {'Condition': "'@(%s)' != '' and '%%(%s.ExcludedFromBuild)' != "
-         "'true'" % (rule.tlog, rule.tlog),
-         'File': '$(IntDir)$(ProjectName).write.1.tlog',
-         'Lines': "^%%(%s.Source);@(%s->'%%(Fullpath)')" % (rule.tlog,
-                                                            rule.tlog)
-        }
-    ]
-    read_tlog_section = [
-        'WriteLinesToFile',
-        {'Condition': "'@(%s)' != '' and '%%(%s.ExcludedFromBuild)' != "
-         "'true'" % (rule.tlog, rule.tlog),
-         'File': '$(IntDir)$(ProjectName).read.1.tlog',
-         'Lines': "^%%(%s.Source);%%(%s.Inputs)" % (rule.tlog, rule.tlog)
-        }
-    ]
-    command_and_input_section = [
-        rule_name,
-        {'Condition': "'@(%s)' != '' and '%%(%s.ExcludedFromBuild)' != "
-         "'true'" % (rule_name, rule_name),
-         'CommandLineTemplate': '%%(%s.CommandLineTemplate)' % rule_name,
-         'AdditionalOptions': '%%(%s.AdditionalOptions)' % rule_name,
-         'Inputs': rule_inputs
-        }
-    ]
-    content.extend([
-        ['Target',
-         {'Name': rule.target_name,
-          'BeforeTargets': '$(%s)' % rule.before_targets,
-          'AfterTargets': '$(%s)' % rule.after_targets,
-          'Condition': "'@(%s)' != ''" % rule_name,
-          'DependsOnTargets': '$(%s);%s' % (rule.depends_on,
-                                            rule.compute_output),
-          'Outputs': target_outputs,
-          'Inputs': target_inputs
-         },
-         remove_section,
-         inputs_section,
-         logging_section,
-         message_section,
-         write_tlog_section,
-         read_tlog_section,
-         command_and_input_section,
-        ],
-        ['PropertyGroup',
-         ['ComputeLinkInputsTargets',
-          '$(ComputeLinkInputsTargets);',
-          '%s;' % rule.compute_output
-         ],
-         ['ComputeLibInputsTargets',
-          '$(ComputeLibInputsTargets);',
-          '%s;' % rule.compute_output
-         ],
-        ],
-        ['Target',
-         {'Name': rule.compute_output,
-          'Condition': "'@(%s)' != ''" % rule_name
-         },
-         ['ItemGroup',
-          [rule.dirs_to_make,
-           {'Condition': "'@(%s)' != '' and "
-            "'%%(%s.ExcludedFromBuild)' != 'true'" % (rule_name, rule_name),
-            'Include': '%%(%s.Outputs)' % rule_name
-           }
-          ],
-          ['Link',
-           {'Include': '%%(%s.Identity)' % rule.dirs_to_make,
-            'Condition': extension_condition
-           }
-          ],
-          ['Lib',
-           {'Include': '%%(%s.Identity)' % rule.dirs_to_make,
-            'Condition': extension_condition
-           }
-          ],
-          ['ImpLib',
-           {'Include': '%%(%s.Identity)' % rule.dirs_to_make,
-            'Condition': extension_condition
-           }
-          ],
-         ],
-         ['MakeDir',
-          {'Directories': ("@(%s->'%%(RootDir)%%(Directory)')" %
-                           rule.dirs_to_make)
-          }
-         ]
-        ],
-    ])
-  easy_xml.WriteXmlIfChanged(content, targets_path, pretty=True, win32=True)
-
-
-def _GenerateMSBuildRuleXmlFile(xml_path, msbuild_rules):
-  # Generate the .xml file
-  content = [
-      'ProjectSchemaDefinitions',
-      {'xmlns': ('clr-namespace:Microsoft.Build.Framework.XamlTypes;'
-                 'assembly=Microsoft.Build.Framework'),
-       'xmlns:x': 'http://schemas.microsoft.com/winfx/2006/xaml',
-       'xmlns:sys': 'clr-namespace:System;assembly=mscorlib',
-       'xmlns:transformCallback':
-       'Microsoft.Cpp.Dev10.ConvertPropertyCallback'
-      }
-  ]
-  for rule in msbuild_rules:
-    content.extend([
-        ['Rule',
-         {'Name': rule.rule_name,
-          'PageTemplate': 'tool',
-          'DisplayName': rule.display_name,
-          'Order': '200'
-         },
-         ['Rule.DataSource',
-          ['DataSource',
-           {'Persistence': 'ProjectFile',
-            'ItemType': rule.rule_name
-           }
-          ]
-         ],
-         ['Rule.Categories',
-          ['Category',
-           {'Name': 'General'},
-           ['Category.DisplayName',
-            ['sys:String', 'General'],
-           ],
-          ],
-          ['Category',
-           {'Name': 'Command Line',
-            'Subtype': 'CommandLine'
-           },
-           ['Category.DisplayName',
-            ['sys:String', 'Command Line'],
-           ],
-          ],
-         ],
-         ['StringListProperty',
-          {'Name': 'Inputs',
-           'Category': 'Command Line',
-           'IsRequired': 'true',
-           'Switch': ' '
-          },
-          ['StringListProperty.DataSource',
-           ['DataSource',
-            {'Persistence': 'ProjectFile',
-             'ItemType': rule.rule_name,
-             'SourceType': 'Item'
-            }
-           ]
-          ],
-         ],
-         ['StringProperty',
-          {'Name': 'CommandLineTemplate',
-           'DisplayName': 'Command Line',
-           'Visible': 'False',
-           'IncludeInCommandLine': 'False'
-          }
-         ],
-         ['DynamicEnumProperty',
-          {'Name': rule.before_targets,
-           'Category': 'General',
-           'EnumProvider': 'Targets',
-           'IncludeInCommandLine': 'False'
-          },
-          ['DynamicEnumProperty.DisplayName',
-           ['sys:String', 'Execute Before'],
-          ],
-          ['DynamicEnumProperty.Description',
-           ['sys:String', 'Specifies the targets for the build customization'
-            ' to run before.'
-           ],
-          ],
-          ['DynamicEnumProperty.ProviderSettings',
-           ['NameValuePair',
-            {'Name': 'Exclude',
-             'Value': '^%s|^Compute' % rule.before_targets
-            }
-           ]
-          ],
-          ['DynamicEnumProperty.DataSource',
-           ['DataSource',
-            {'Persistence': 'ProjectFile',
-             'HasConfigurationCondition': 'true'
-            }
-           ]
-          ],
-         ],
-         ['DynamicEnumProperty',
-          {'Name': rule.after_targets,
-           'Category': 'General',
-           'EnumProvider': 'Targets',
-           'IncludeInCommandLine': 'False'
-          },
-          ['DynamicEnumProperty.DisplayName',
-           ['sys:String', 'Execute After'],
-          ],
-          ['DynamicEnumProperty.Description',
-           ['sys:String', ('Specifies the targets for the build customization'
-                           ' to run after.')
-           ],
-          ],
-          ['DynamicEnumProperty.ProviderSettings',
-           ['NameValuePair',
-            {'Name': 'Exclude',
-             'Value': '^%s|^Compute' % rule.after_targets
-            }
-           ]
-          ],
-          ['DynamicEnumProperty.DataSource',
-           ['DataSource',
-            {'Persistence': 'ProjectFile',
-             'ItemType': '',
-             'HasConfigurationCondition': 'true'
-            }
-           ]
-          ],
-         ],
-         ['StringListProperty',
-          {'Name': 'Outputs',
-           'DisplayName': 'Outputs',
-           'Visible': 'False',
-           'IncludeInCommandLine': 'False'
-          }
-         ],
-         ['StringProperty',
-          {'Name': 'ExecutionDescription',
-           'DisplayName': 'Execution Description',
-           'Visible': 'False',
-           'IncludeInCommandLine': 'False'
-          }
-         ],
-         ['StringListProperty',
-          {'Name': 'AdditionalDependencies',
-           'DisplayName': 'Additional Dependencies',
-           'IncludeInCommandLine': 'False',
-           'Visible': 'false'
-          }
-         ],
-         ['StringProperty',
-          {'Subtype': 'AdditionalOptions',
-           'Name': 'AdditionalOptions',
-           'Category': 'Command Line'
-          },
-          ['StringProperty.DisplayName',
-           ['sys:String', 'Additional Options'],
-          ],
-          ['StringProperty.Description',
-           ['sys:String', 'Additional Options'],
-          ],
-         ],
-        ],
-        ['ItemType',
-         {'Name': rule.rule_name,
-          'DisplayName': rule.display_name
-         }
-        ],
-        ['FileExtension',
-         {'Name': '*' + rule.extension,
-          'ContentType': rule.rule_name
-         }
-        ],
-        ['ContentType',
-         {'Name': rule.rule_name,
-          'DisplayName': '',
-          'ItemType': rule.rule_name
-         }
-        ]
-    ])
-  easy_xml.WriteXmlIfChanged(content, xml_path, pretty=True, win32=True)
-
-
-def _GetConfigurationAndPlatform(name, settings):
-  configuration = name.rsplit('_', 1)[0]
-  platform = settings.get('msvs_configuration_platform', 'Win32')
-  return (configuration, platform)
-
-
-def _GetConfigurationCondition(name, settings):
-  return (r"'$(Configuration)|$(Platform)'=='%s|%s'" %
-          _GetConfigurationAndPlatform(name, settings))
-
-
-def _GetMSBuildProjectConfigurations(configurations):
-  group = ['ItemGroup', {'Label': 'ProjectConfigurations'}]
-  for (name, settings) in sorted(configurations.iteritems()):
-    configuration, platform = _GetConfigurationAndPlatform(name, settings)
-    designation = '%s|%s' % (configuration, platform)
-    group.append(
-        ['ProjectConfiguration', {'Include': designation},
-         ['Configuration', configuration],
-         ['Platform', platform]])
-  return [group]
-
-
-def _GetMSBuildGlobalProperties(spec, guid, gyp_file_name):
-  namespace = os.path.splitext(gyp_file_name)[0]
-  return [
-      ['PropertyGroup', {'Label': 'Globals'},
-       ['ProjectGuid', guid],
-       ['Keyword', 'Win32Proj'],
-       ['RootNamespace', namespace],
-      ]
-  ]
-
-
-def _GetMSBuildConfigurationDetails(spec, build_file):
-  properties = {}
-  for name, settings in spec['configurations'].iteritems():
-    msbuild_attributes = _GetMSBuildAttributes(spec, settings, build_file)
-    condition = _GetConfigurationCondition(name, settings)
-    character_set = msbuild_attributes.get('CharacterSet')
-    _AddConditionalProperty(properties, condition, 'ConfigurationType',
-                            msbuild_attributes['ConfigurationType'])
-    if character_set:
-      _AddConditionalProperty(properties, condition, 'CharacterSet',
-                              character_set)
-  return _GetMSBuildPropertyGroup(spec, 'Configuration', properties)
-
-
-def _GetMSBuildLocalProperties(msbuild_toolset):
-  # Currently the only local property we support is PlatformToolset
-  properties = {}
-  if msbuild_toolset:
-    properties = [
-        ['PropertyGroup', {'Label': 'Locals'},
-          ['PlatformToolset', msbuild_toolset],
-        ]
-      ]
-  return properties
-
-
-def _GetMSBuildPropertySheets(configurations):
-  user_props = r'$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props'
-  additional_props = {}
-  props_specified = False
-  for name, settings in sorted(configurations.iteritems()):
-    configuration = _GetConfigurationCondition(name, settings)
-    if 'msbuild_props' in settings:
-      additional_props[configuration] = _FixPaths(settings['msbuild_props'])
-      props_specified = True
-    else:
-      additional_props[configuration] = ''
-
-  if not props_specified:
-    return [
-        ['ImportGroup',
-         {'Label': 'PropertySheets'},
-         ['Import',
-          {'Project': user_props,
-           'Condition': "exists('%s')" % user_props,
-           'Label': 'LocalAppDataPlatform'
-          }
-         ]
-        ]
-    ]
-  else:
-    sheets = []
-    for condition, props in additional_props.iteritems():
-      import_group = [
-        'ImportGroup',
-        {'Label': 'PropertySheets',
-         'Condition': condition
-        },
-        ['Import',
-         {'Project': user_props,
-          'Condition': "exists('%s')" % user_props,
-          'Label': 'LocalAppDataPlatform'
-         }
-        ]
-      ]
-      for props_file in props:
-        import_group.append(['Import', {'Project':props_file}])
-      sheets.append(import_group)
-    return sheets
-
-
-def _ConvertMSVSBuildAttributes(spec, config, build_file):
-  config_type = _GetMSVSConfigurationType(spec, build_file)
-  msvs_attributes = _GetMSVSAttributes(spec, config, config_type)
-  msbuild_attributes = {}
-  for a in msvs_attributes:
-    if a in ['IntermediateDirectory', 'OutputDirectory']:
-      directory = MSVSSettings.ConvertVCMacrosToMSBuild(msvs_attributes[a])
-      if not directory.endswith('\\'):
-        directory += '\\'
-      msbuild_attributes[a] = directory
-    elif a == 'CharacterSet':
-      msbuild_attributes[a] = _ConvertMSVSCharacterSet(msvs_attributes[a])
-    elif a == 'ConfigurationType':
-      msbuild_attributes[a] = _ConvertMSVSConfigurationType(msvs_attributes[a])
-    else:
-      print 'Warning: Do not know how to convert MSVS attribute ' + a
-  return msbuild_attributes
-
-
-def _ConvertMSVSCharacterSet(char_set):
-  if char_set.isdigit():
-    char_set = {
-        '0': 'MultiByte',
-        '1': 'Unicode',
-        '2': 'MultiByte',
-    }[char_set]
-  return char_set
-
-
-def _ConvertMSVSConfigurationType(config_type):
-  if config_type.isdigit():
-    config_type = {
-        '1': 'Application',
-        '2': 'DynamicLibrary',
-        '4': 'StaticLibrary',
-        '10': 'Utility'
-    }[config_type]
-  return config_type
-
-
-def _GetMSBuildAttributes(spec, config, build_file):
-  if 'msbuild_configuration_attributes' not in config:
-    msbuild_attributes = _ConvertMSVSBuildAttributes(spec, config, build_file)
-
-  else:
-    config_type = _GetMSVSConfigurationType(spec, build_file)
-    config_type = _ConvertMSVSConfigurationType(config_type)
-    msbuild_attributes = config.get('msbuild_configuration_attributes', {})
-    msbuild_attributes.setdefault('ConfigurationType', config_type)
-    output_dir = msbuild_attributes.get('OutputDirectory',
-                                      '$(SolutionDir)$(Configuration)')
-    msbuild_attributes['OutputDirectory'] = _FixPath(output_dir) + '\\'
-    if 'IntermediateDirectory' not in msbuild_attributes:
-      intermediate = _FixPath('$(Configuration)') + '\\'
-      msbuild_attributes['IntermediateDirectory'] = intermediate
-    if 'CharacterSet' in msbuild_attributes:
-      msbuild_attributes['CharacterSet'] = _ConvertMSVSCharacterSet(
-          msbuild_attributes['CharacterSet'])
-  if 'TargetName' not in msbuild_attributes:
-    prefix = spec.get('product_prefix', '')
-    product_name = spec.get('product_name', '$(ProjectName)')
-    target_name = prefix + product_name
-    msbuild_attributes['TargetName'] = target_name
-  if 'TargetExt' not in msbuild_attributes and 'product_extension' in spec:
-    ext = spec.get('product_extension')
-    msbuild_attributes['TargetExt'] = '.' + ext
-
-  # Make sure that 'TargetPath' matches 'Lib.OutputFile' or 'Link.OutputFile'
-  # (depending on the tool used) to avoid MSB8012 warning.
-  msbuild_tool_map = {
-      'executable': 'Link',
-      'shared_library': 'Link',
-      'loadable_module': 'Link',
-      'static_library': 'Lib',
-  }
-  msbuild_tool = msbuild_tool_map.get(spec['type'])
-  if msbuild_tool:
-    msbuild_settings = config['finalized_msbuild_settings']
-    out_file = msbuild_settings[msbuild_tool].get('OutputFile')
-    if out_file:
-      msbuild_attributes['TargetPath'] = _FixPath(out_file)
-
-  return msbuild_attributes
-
-
-def _GetMSBuildConfigurationGlobalProperties(spec, configurations, build_file):
-  # TODO(jeanluc) We could optimize out the following and do it only if
-  # there are actions.
-  # TODO(jeanluc) Handle the equivalent of setting 'CYGWIN=nontsec'.
-  new_paths = []
-  cygwin_dirs = spec.get('msvs_cygwin_dirs', ['.'])[0]
-  if cygwin_dirs:
-    cyg_path = '$(MSBuildProjectDirectory)\\%s\\bin\\' % _FixPath(cygwin_dirs)
-    new_paths.append(cyg_path)
-    # TODO(jeanluc) Change the convention to have both a cygwin_dir and a
-    # python_dir.
-    python_path = cyg_path.replace('cygwin\\bin', 'python_26')
-    new_paths.append(python_path)
-    if new_paths:
-      new_paths = '$(ExecutablePath);' + ';'.join(new_paths)
-
-  properties = {}
-  for (name, configuration) in sorted(configurations.iteritems()):
-    condition = _GetConfigurationCondition(name, configuration)
-    attributes = _GetMSBuildAttributes(spec, configuration, build_file)
-    msbuild_settings = configuration['finalized_msbuild_settings']
-    _AddConditionalProperty(properties, condition, 'IntDir',
-                            attributes['IntermediateDirectory'])
-    _AddConditionalProperty(properties, condition, 'OutDir',
-                            attributes['OutputDirectory'])
-    _AddConditionalProperty(properties, condition, 'TargetName',
-                            attributes['TargetName'])
-    if 'TargetExt' in attributes:
-      _AddConditionalProperty(properties, condition, 'TargetExt',
-                              attributes['TargetExt'])
-
-    if attributes.get('TargetPath'):
-      _AddConditionalProperty(properties, condition, 'TargetPath',
-                              attributes['TargetPath'])
-
-    if new_paths:
-      _AddConditionalProperty(properties, condition, 'ExecutablePath',
-                              new_paths)
-    tool_settings = msbuild_settings.get('', {})
-    for name, value in sorted(tool_settings.iteritems()):
-      formatted_value = _GetValueFormattedForMSBuild('', name, value)
-      _AddConditionalProperty(properties, condition, name, formatted_value)
-  return _GetMSBuildPropertyGroup(spec, None, properties)
-
-
-def _AddConditionalProperty(properties, condition, name, value):
-  """Adds a property / conditional value pair to a dictionary.
-
-  Arguments:
-    properties: The dictionary to be modified.  The key is the name of the
-        property.  The value is itself a dictionary; its key is the value and
-        the value a list of conditions for which this value is true.
-    condition: The condition under which the named property has the value.
-    name: The name of the property.
-    value: The value of the property.
-  """
-  if name not in properties:
-    properties[name] = {}
-  values = properties[name]
-  if value not in values:
-    values[value] = []
-  conditions = values[value]
-  conditions.append(condition)
-
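The nested structure `_AddConditionalProperty` builds (property name → value → list of conditions) is easy to misread from the docstring alone. A minimal standalone sketch, using hypothetical MSBuild condition strings:

```python
def add_conditional_property(properties, condition, name, value):
    # properties maps: property name -> {value -> [conditions for that value]}
    properties.setdefault(name, {}).setdefault(value, []).append(condition)

props = {}
# Hypothetical condition strings for two configurations.
add_conditional_property(props, "'$(Configuration)'=='Debug'", 'OutDir', 'Debug\\')
add_conditional_property(props, "'$(Configuration)'=='Release'", 'OutDir', 'Debug\\')
# Both configurations share one value, so one value maps to two conditions.
print(props)
```

Grouping by value this way is what later lets `_GetMSBuildPropertyGroup` emit a single unconditional entry when every configuration agrees.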
-
-# Regex for msvs variable references ( i.e. $(FOO) ).
-MSVS_VARIABLE_REFERENCE = re.compile('\$\(([a-zA-Z_][a-zA-Z0-9_]*)\)')
-
-
-def _GetMSBuildPropertyGroup(spec, label, properties):
-  """Returns a PropertyGroup definition for the specified properties.
-
-  Arguments:
-    spec: The target project dict.
-    label: An optional label for the PropertyGroup.
-    properties: The dictionary to be converted.  The key is the name of the
-        property.  The value is itself a dictionary; its key is the value and
-        the value a list of condition for which this value is true.
-  """
-  group = ['PropertyGroup']
-  if label:
-    group.append({'Label': label})
-  num_configurations = len(spec['configurations'])
-  def GetEdges(node):
-    # Use a definition of edges such that user_of_variable -> used_variable.
-    # This happens to be easier in this case, since a variable's
-    # definition contains all variables it references in a single string.
-    edges = set()
-    for value in sorted(properties[node].keys()):
-      # Add to edges all $(...) references to variables.
-      #
-      # Variable references that refer to names not in properties are
-      # excluded; these exist, for instance, to refer to built-in
-      # definitions like $(SolutionDir).
-      #
-      # Self-references are ignored; they are used in a few places to
-      # append to the default value, e.g. PATH=$(PATH);other_path.
-      edges.update(set([v for v in MSVS_VARIABLE_REFERENCE.findall(value)
-                        if v in properties and v != node]))
-    return edges
-  properties_ordered = gyp.common.TopologicallySorted(
-      properties.keys(), GetEdges)
-  # Walk properties in the reverse of a topological sort on
-  # user_of_variable -> used_variable as this ensures variables are
-  # defined before they are used.
-  # NOTE: reverse(topsort(DAG)) = topsort(reverse_edges(DAG))
-  for name in reversed(properties_ordered):
-    values = properties[name]
-    for value, conditions in sorted(values.iteritems()):
-      if len(conditions) == num_configurations:
-        # If the value is the same for all configurations,
-        # just add one unconditional entry.
-        group.append([name, value])
-      else:
-        for condition in conditions:
-          group.append([name, {'Condition': condition}, value])
-  return [group]
-
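The ordering trick noted in `_GetMSBuildPropertyGroup` (`reverse(topsort(DAG)) = topsort(reverse_edges(DAG))`) can be sketched without gyp's `TopologicallySorted` helper: a post-order DFS over user → used edges already emits used variables first. The property values below are hypothetical:

```python
import re

MSVS_VARIABLE_REFERENCE = re.compile(r'\$\(([a-zA-Z_][a-zA-Z0-9_]*)\)')

def topo_sorted(nodes, get_edges):
    # Post-order DFS: a node is appended only after everything it
    # references, so used variables come out before their users.
    visited, order = set(), []
    def visit(node):
        if node in visited:
            return
        visited.add(node)
        for dep in sorted(get_edges(node)):
            visit(dep)
        order.append(node)
    for node in sorted(nodes):
        visit(node)
    return order

# Hypothetical property values: PATH uses BINDIR, BINDIR uses OUTDIR.
properties = {
    'PATH': '$(BINDIR);$(PATH)',   # self-reference, dropped below
    'BINDIR': '$(OUTDIR)\\bin',
    'OUTDIR': 'C:\\out',
}

def get_edges(node):
    # Keep only $(...) references to other known properties; ignore
    # self-references and built-ins like $(SolutionDir).
    return {v for v in MSVS_VARIABLE_REFERENCE.findall(properties[node])
            if v in properties and v != node}

print(topo_sorted(properties, get_edges))
```

OUTDIR is listed before BINDIR, and BINDIR before PATH, so each variable is defined before it is used.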
-
-def _GetMSBuildToolSettingsSections(spec, configurations):
-  groups = []
-  for (name, configuration) in sorted(configurations.iteritems()):
-    msbuild_settings = configuration['finalized_msbuild_settings']
-    group = ['ItemDefinitionGroup',
-             {'Condition': _GetConfigurationCondition(name, configuration)}
-            ]
-    for tool_name, tool_settings in sorted(msbuild_settings.iteritems()):
-      # Skip the tool named '' which is a holder of global settings handled
-      # by _GetMSBuildConfigurationGlobalProperties.
-      if tool_name:
-        if tool_settings:
-          tool = [tool_name]
-          for name, value in sorted(tool_settings.iteritems()):
-            formatted_value = _GetValueFormattedForMSBuild(tool_name, name,
-                                                           value)
-            tool.append([name, formatted_value])
-          group.append(tool)
-    groups.append(group)
-  return groups
-
-
-def _FinalizeMSBuildSettings(spec, configuration):
-  if 'msbuild_settings' in configuration:
-    converted = False
-    msbuild_settings = configuration['msbuild_settings']
-    MSVSSettings.ValidateMSBuildSettings(msbuild_settings)
-  else:
-    converted = True
-    msvs_settings = configuration.get('msvs_settings', {})
-    msbuild_settings = MSVSSettings.ConvertToMSBuildSettings(msvs_settings)
-  include_dirs, resource_include_dirs = _GetIncludeDirs(configuration)
-  libraries = _GetLibraries(spec)
-  out_file, _, msbuild_tool = _GetOutputFilePathAndTool(spec, msbuild=True)
-  defines = _GetDefines(configuration)
-  if converted:
-    # Visual Studio 2010 has TR1
-    defines = [d for d in defines if d != '_HAS_TR1=0']
-    # Warn of ignored settings
-    ignored_settings = ['msvs_prebuild', 'msvs_postbuild', 'msvs_tool_files']
-    for ignored_setting in ignored_settings:
-      value = configuration.get(ignored_setting)
-      if value:
-        print ('Warning: The automatic conversion to MSBuild does not handle '
-               '%s.  Ignoring setting of %s' % (ignored_setting, str(value)))
-
-  defines = [_EscapeCppDefineForMSBuild(d) for d in defines]
-  disabled_warnings = _GetDisabledWarnings(configuration)
-  # TODO(jeanluc) Validate & warn that we don't translate
-  # prebuild = configuration.get('msvs_prebuild')
-  # postbuild = configuration.get('msvs_postbuild')
-  def_file = _GetModuleDefinition(spec)
-  precompiled_header = configuration.get('msvs_precompiled_header')
-
-  # Add the information to the appropriate tool
-  # TODO(jeanluc) We could optimize and generate these settings only if
-  # the corresponding files are found, e.g. don't generate ResourceCompile
-  # if you don't have any resources.
-  _ToolAppend(msbuild_settings, 'ClCompile',
-              'AdditionalIncludeDirectories', include_dirs)
-  _ToolAppend(msbuild_settings, 'ResourceCompile',
-              'AdditionalIncludeDirectories', resource_include_dirs)
-  # Add in libraries; note that even for empty libraries, we want this
-  # set, to prevent inheriting default libraries from the environment.
-  _ToolSetOrAppend(msbuild_settings, 'Link', 'AdditionalDependencies',
-                  libraries)
-  if out_file:
-    _ToolAppend(msbuild_settings, msbuild_tool, 'OutputFile', out_file,
-                only_if_unset=True)
-  # Add defines.
-  _ToolAppend(msbuild_settings, 'ClCompile',
-              'PreprocessorDefinitions', defines)
-  _ToolAppend(msbuild_settings, 'ResourceCompile',
-              'PreprocessorDefinitions', defines)
-  # Add disabled warnings.
-  _ToolAppend(msbuild_settings, 'ClCompile',
-              'DisableSpecificWarnings', disabled_warnings)
-  # Turn on precompiled headers if appropriate.
-  if precompiled_header:
-    precompiled_header = os.path.split(precompiled_header)[1]
-    _ToolAppend(msbuild_settings, 'ClCompile', 'PrecompiledHeader', 'Use')
-    _ToolAppend(msbuild_settings, 'ClCompile',
-                'PrecompiledHeaderFile', precompiled_header)
-    _ToolAppend(msbuild_settings, 'ClCompile',
-                'ForcedIncludeFiles', precompiled_header)
-  # Loadable modules don't generate import libraries;
-  # tell dependent projects to not expect one.
-  if spec['type'] == 'loadable_module':
-    _ToolAppend(msbuild_settings, '', 'IgnoreImportLibrary', 'true')
-  # Set the module definition file if any.
-  if def_file:
-    _ToolAppend(msbuild_settings, 'Link', 'ModuleDefinitionFile', def_file)
-  configuration['finalized_msbuild_settings'] = msbuild_settings
-
-
-def _GetValueFormattedForMSBuild(tool_name, name, value):
-  if type(value) == list:
-    # For some settings, VS2010 does not automatically extend the settings.
-    # TODO(jeanluc) Is this what we want?
-    if name in ['AdditionalIncludeDirectories',
-                'AdditionalLibraryDirectories',
-                'AdditionalOptions',
-                'DelayLoadDLLs',
-                'DisableSpecificWarnings',
-                'PreprocessorDefinitions']:
-      value.append('%%(%s)' % name)
-    # For most tools, entries in a list should be separated with ';' but some
-    # settings use a space.  Check for those first.
-    exceptions = {
-        'ClCompile': ['AdditionalOptions'],
-        'Link': ['AdditionalOptions'],
-        'Lib': ['AdditionalOptions']}
-    if tool_name in exceptions and name in exceptions[tool_name]:
-      char = ' '
-    else:
-      char = ';'
-    formatted_value = char.join(
-        [MSVSSettings.ConvertVCMacrosToMSBuild(i) for i in value])
-  else:
-    formatted_value = MSVSSettings.ConvertVCMacrosToMSBuild(value)
-  return formatted_value
-
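The two formatting rules in `_GetValueFormattedForMSBuild` can be sketched standalone: list settings that inherit parent values get a `%(Name)` back-reference appended, and `AdditionalOptions` is space-separated while most list settings use `;`. This is a simplification (the original also keys the space-separated exceptions by tool and converts VC macros), with a subset of the setting names:

```python
def format_for_msbuild(name, value):
    # Settings that should also pick up inherited/default values get an
    # %(Name) item-metadata back-reference appended.
    inheriting = {'AdditionalIncludeDirectories', 'AdditionalOptions',
                  'PreprocessorDefinitions'}
    if isinstance(value, list):
        value = list(value)
        if name in inheriting:
            value.append('%%(%s)' % name)
        # AdditionalOptions is a raw command-line fragment, so its entries
        # are space-separated; other list settings are ';'-separated.
        sep = ' ' if name == 'AdditionalOptions' else ';'
        return sep.join(value)
    return value

print(format_for_msbuild('PreprocessorDefinitions', ['DEBUG', 'WIN32']))
# DEBUG;WIN32;%(PreprocessorDefinitions)
print(format_for_msbuild('AdditionalOptions', ['/DEBUG', '/MAP']))
# /DEBUG /MAP %(AdditionalOptions)
```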
-
-def _VerifySourcesExist(sources, root_dir):
-  """Verifies that all source files exist on disk.
-
-  Checks that all regular source files, i.e. not created at run time,
-  exist on disk.  Missing files cause needless recompilation but no otherwise
-  visible errors.
-
-  Arguments:
-    sources: A recursive list of Filter/file names.
-    root_dir: The root directory for the relative path names.
-  Returns:
-    A list of source files that cannot be found on disk.
-  """
-  missing_sources = []
-  for source in sources:
-    if isinstance(source, MSVSProject.Filter):
-      missing_sources.extend(_VerifySourcesExist(source.contents, root_dir))
-    else:
-      if '$' not in source:
-        full_path = os.path.join(root_dir, source)
-        if not os.path.exists(full_path):
-          missing_sources.append(full_path)
-  return missing_sources
-
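`_VerifySourcesExist`'s recursive walk over the filter tree can be tried in isolation. `Filter` below is a minimal stand-in for `MSVSProject.Filter`, and the file names are hypothetical:

```python
import os
from collections import namedtuple

# Minimal stand-in for MSVSProject.Filter: a named group of sources.
Filter = namedtuple('Filter', ['name', 'contents'])

def verify_sources_exist(sources, root_dir):
    """Return full paths of plain (non-generated) sources missing on disk."""
    missing = []
    for source in sources:
        if isinstance(source, Filter):
            missing.extend(verify_sources_exist(source.contents, root_dir))
        elif '$' not in source:  # skip generated paths like $(IntDir)/foo.cc
            full_path = os.path.join(root_dir, source)
            if not os.path.exists(full_path):
                missing.append(full_path)
    return missing

tree = [Filter('src', ['no_such_file.cc']), '$(IntDir)/generated.cc']
print(verify_sources_exist(tree, '.'))
```

The `'$' not in source` test mirrors the original's way of skipping run-time-generated paths, which can never be checked at gyp time.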
-
-def _GetMSBuildSources(spec, sources, exclusions, extension_to_rule_name,
-                       actions_spec, sources_handled_by_action, list_excluded):
-  groups = ['none', 'midl', 'include', 'compile', 'resource', 'rule']
-  grouped_sources = {}
-  for g in groups:
-    grouped_sources[g] = []
-
-  _AddSources2(spec, sources, exclusions, grouped_sources,
-               extension_to_rule_name, sources_handled_by_action, list_excluded)
-  sources = []
-  for g in groups:
-    if grouped_sources[g]:
-      sources.append(['ItemGroup'] + grouped_sources[g])
-  if actions_spec:
-    sources.append(['ItemGroup'] + actions_spec)
-  return sources
-
-
-def _AddSources2(spec, sources, exclusions, grouped_sources,
-                 extension_to_rule_name, sources_handled_by_action,
-                 list_excluded):
-  extensions_excluded_from_precompile = []
-  for source in sources:
-    if isinstance(source, MSVSProject.Filter):
-      _AddSources2(spec, source.contents, exclusions, grouped_sources,
-                   extension_to_rule_name, sources_handled_by_action,
-                   list_excluded)
-    else:
-      if not source in sources_handled_by_action:
-        detail = []
-        excluded_configurations = exclusions.get(source, [])
-        if len(excluded_configurations) == len(spec['configurations']):
-          detail.append(['ExcludedFromBuild', 'true'])
-        else:
-          for config_name, configuration in sorted(excluded_configurations):
-            condition = _GetConfigurationCondition(config_name, configuration)
-            detail.append(['ExcludedFromBuild',
-                           {'Condition': condition},
-                           'true'])
-        # Add precompile if needed
-        for config_name, configuration in spec['configurations'].iteritems():
-          precompiled_source = configuration.get('msvs_precompiled_source', '')
-          if precompiled_source != '':
-            precompiled_source = _FixPath(precompiled_source)
-            if not extensions_excluded_from_precompile:
-              # If the precompiled header is generated by a C source, we must
-              # not try to use it for C++ sources, and vice versa.
-              basename, extension = os.path.splitext(precompiled_source)
-              if extension == '.c':
-                extensions_excluded_from_precompile = ['.cc', '.cpp', '.cxx']
-              else:
-                extensions_excluded_from_precompile = ['.c']
-
-          if precompiled_source == source:
-            condition = _GetConfigurationCondition(config_name, configuration)
-            detail.append(['PrecompiledHeader',
-                           {'Condition': condition},
-                           'Create'
-                          ])
-          else:
-            # Turn off precompiled header usage for source files of a
-            # different type than the file that generated the
-            # precompiled header.
-            for extension in extensions_excluded_from_precompile:
-              if source.endswith(extension):
-                detail.append(['PrecompiledHeader', ''])
-                detail.append(['ForcedIncludeFiles', ''])
-
-        group, element = _MapFileToMsBuildSourceType(source,
-                                                     extension_to_rule_name)
-        grouped_sources[group].append([element, {'Include': source}] + detail)
-
-
-def _GetMSBuildProjectReferences(project):
-  references = []
-  if project.dependencies:
-    group = ['ItemGroup']
-    for dependency in project.dependencies:
-      guid = dependency.guid
-      project_dir = os.path.split(project.path)[0]
-      relative_path = gyp.common.RelativePath(dependency.path, project_dir)
-      project_ref = ['ProjectReference',
-          {'Include': relative_path},
-          ['Project', guid],
-          ['ReferenceOutputAssembly', 'false']
-          ]
-      for config in dependency.spec.get('configurations', {}).itervalues():
-        # If it's disabled in any config, turn it off in the reference.
-        if config.get('msvs_2010_disable_uldi_when_referenced', 0):
-          project_ref.append(['UseLibraryDependencyInputs', 'false'])
-          break
-      group.append(project_ref)
-    references.append(group)
-  return references
-
-
-def _GenerateMSBuildProject(project, options, version, generator_flags):
-  spec = project.spec
-  configurations = spec['configurations']
-  project_dir, project_file_name = os.path.split(project.path)
-  msbuildproj_dir = os.path.dirname(project.path)
-  if msbuildproj_dir and not os.path.exists(msbuildproj_dir):
-    os.makedirs(msbuildproj_dir)
-  # Prepare list of sources and excluded sources.
-  gyp_path = _NormalizedSource(project.build_file)
-  relative_path_of_gyp_file = gyp.common.RelativePath(gyp_path, project_dir)
-
-  gyp_file = os.path.split(project.build_file)[1]
-  sources, excluded_sources = _PrepareListOfSources(spec, generator_flags,
-                                                    gyp_file)
-  # Add rules.
-  actions_to_add = {}
-  props_files_of_rules = set()
-  targets_files_of_rules = set()
-  extension_to_rule_name = {}
-  list_excluded = generator_flags.get('msvs_list_excluded_files', True)
-  _GenerateRulesForMSBuild(project_dir, options, spec,
-                           sources, excluded_sources,
-                           props_files_of_rules, targets_files_of_rules,
-                           actions_to_add, extension_to_rule_name)
-  sources, excluded_sources, excluded_idl = (
-      _AdjustSourcesAndConvertToFilterHierarchy(spec, options,
-                                                project_dir, sources,
-                                                excluded_sources,
-                                                list_excluded))
-  _AddActions(actions_to_add, spec, project.build_file)
-  _AddCopies(actions_to_add, spec)
-
-  # NOTE: this stanza must appear after all actions have been decided.
-  # Don't exclude sources with actions attached, or they won't run.
-  excluded_sources = _FilterActionsFromExcluded(
-      excluded_sources, actions_to_add)
-
-  exclusions = _GetExcludedFilesFromBuild(spec, excluded_sources, excluded_idl)
-  actions_spec, sources_handled_by_action = _GenerateActionsForMSBuild(
-      spec, actions_to_add)
-
-  _GenerateMSBuildFiltersFile(project.path + '.filters', sources,
-                              extension_to_rule_name)
-  missing_sources = _VerifySourcesExist(sources, project_dir)
-
-  for configuration in configurations.itervalues():
-    _FinalizeMSBuildSettings(spec, configuration)
-
-  # Add attributes to root element
-
-  import_default_section = [
-      ['Import', {'Project': r'$(VCTargetsPath)\Microsoft.Cpp.Default.props'}]]
-  import_cpp_props_section = [
-      ['Import', {'Project': r'$(VCTargetsPath)\Microsoft.Cpp.props'}]]
-  import_cpp_targets_section = [
-      ['Import', {'Project': r'$(VCTargetsPath)\Microsoft.Cpp.targets'}]]
-  macro_section = [['PropertyGroup', {'Label': 'UserMacros'}]]
-
-  content = [
-      'Project',
-      {'xmlns': 'http://schemas.microsoft.com/developer/msbuild/2003',
-       'ToolsVersion': version.ProjectVersion(),
-       'DefaultTargets': 'Build'
-      }]
-
-  content += _GetMSBuildProjectConfigurations(configurations)
-  content += _GetMSBuildGlobalProperties(spec, project.guid, project_file_name)
-  content += import_default_section
-  content += _GetMSBuildConfigurationDetails(spec, project.build_file)
-  content += _GetMSBuildLocalProperties(project.msbuild_toolset)
-  content += import_cpp_props_section
-  content += _GetMSBuildExtensions(props_files_of_rules)
-  content += _GetMSBuildPropertySheets(configurations)
-  content += macro_section
-  content += _GetMSBuildConfigurationGlobalProperties(spec, configurations,
-                                                      project.build_file)
-  content += _GetMSBuildToolSettingsSections(spec, configurations)
-  content += _GetMSBuildSources(
-      spec, sources, exclusions, extension_to_rule_name, actions_spec,
-      sources_handled_by_action, list_excluded)
-  content += _GetMSBuildProjectReferences(project)
-  content += import_cpp_targets_section
-  content += _GetMSBuildExtensionTargets(targets_files_of_rules)
-
-  # TODO(jeanluc) File a bug to get rid of runas.  We had in MSVS:
-  # has_run_as = _WriteMSVSUserFile(project.path, version, spec)
-
-  easy_xml.WriteXmlIfChanged(content, project.path, pretty=True, win32=True)
-
-  return missing_sources
-
-
-def _GetMSBuildExtensions(props_files_of_rules):
-  extensions = ['ImportGroup', {'Label': 'ExtensionSettings'}]
-  for props_file in props_files_of_rules:
-    extensions.append(['Import', {'Project': props_file}])
-  return [extensions]
-
-
-def _GetMSBuildExtensionTargets(targets_files_of_rules):
-  targets_node = ['ImportGroup', {'Label': 'ExtensionTargets'}]
-  for targets_file in sorted(targets_files_of_rules):
-    targets_node.append(['Import', {'Project': targets_file}])
-  return [targets_node]
-
-
-def _GenerateActionsForMSBuild(spec, actions_to_add):
-  """Add actions accumulated into an actions_to_add, merging as needed.
-
-  Arguments:
-    spec: the target project dict
-    actions_to_add: dictionary keyed on input name, which maps to a list of
-        dicts describing the actions attached to that input file.
-
-  Returns:
-    A pair of (action specification, the sources handled by this action).
-  """
-  sources_handled_by_action = set()
-  actions_spec = []
-  for primary_input, actions in actions_to_add.iteritems():
-    inputs = set()
-    outputs = set()
-    descriptions = []
-    commands = []
-    for action in actions:
-      inputs.update(set(action['inputs']))
-      outputs.update(set(action['outputs']))
-      descriptions.append(action['description'])
-      cmd = action['command']
-      # For most actions, add 'call' so that actions that invoke batch files
-      # return and continue executing.  msbuild_use_call provides a way to
-      # disable this but I have not seen any adverse effect from doing that
-      # for everything.
-      if action.get('msbuild_use_call', True):
-        cmd = 'call ' + cmd
-      commands.append(cmd)
-    # Add the custom build action for one input file.
-    description = ', and also '.join(descriptions)
-
-    # We can't join the commands simply with && because the command line will
-    # get too long. See also _AddActions: cygwin's setup_env mustn't be called
-    # for every invocation or the command that sets the PATH will grow too
-    # long.
-    command = (
-        '\r\nif %errorlevel% neq 0 exit /b %errorlevel%\r\n'.join(commands))
-    _AddMSBuildAction(spec,
-                      primary_input,
-                      inputs,
-                      outputs,
-                      command,
-                      description,
-                      sources_handled_by_action,
-                      actions_spec)
-  return actions_spec, sources_handled_by_action
-
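The errorlevel-joined command string built in `_GenerateActionsForMSBuild` can be sketched on its own; the batch file names are hypothetical:

```python
def join_batch_commands(commands):
    # Joining with '&&' would produce one very long line and eventually hit
    # cmd.exe's command-length limit; an explicit errorlevel check after
    # each command keeps fail-fast behaviour across separate lines.
    return '\r\nif %errorlevel% neq 0 exit /b %errorlevel%\r\n'.join(commands)

print(repr(join_batch_commands(['call gen.bat', 'call build.bat'])))
```

Each command is prefixed with `call` upstream so that invoking a batch file returns control to the joined script instead of ending it.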
-
-def _AddMSBuildAction(spec, primary_input, inputs, outputs, cmd, description,
-                      sources_handled_by_action, actions_spec):
-  command = MSVSSettings.ConvertVCMacrosToMSBuild(cmd)
-  primary_input = _FixPath(primary_input)
-  inputs_array = _FixPaths(inputs)
-  outputs_array = _FixPaths(outputs)
-  additional_inputs = ';'.join([i for i in inputs_array
-                                if i != primary_input])
-  outputs = ';'.join(outputs_array)
-  sources_handled_by_action.add(primary_input)
-  action_spec = ['CustomBuild', {'Include': primary_input}]
-  action_spec.extend(
-      # TODO(jeanluc) 'Document' for all or just if as_sources?
-      [['FileType', 'Document'],
-       ['Command', command],
-       ['Message', description],
-       ['Outputs', outputs]
-      ])
-  if additional_inputs:
-    action_spec.append(['AdditionalInputs', additional_inputs])
-  actions_spec.append(action_spec)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/msvs_test.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,37 +0,0 @@
-#!/usr/bin/env python
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-""" Unit tests for the msvs.py file. """
-
-import gyp.generator.msvs as msvs
-import unittest
-import StringIO
-
-
-class TestSequenceFunctions(unittest.TestCase):
-
-  def setUp(self):
-    self.stderr = StringIO.StringIO()
-
-  def test_GetLibraries(self):
-    self.assertEqual(
-      msvs._GetLibraries({}),
-      [])
-    self.assertEqual(
-      msvs._GetLibraries({'libraries': []}),
-      [])
-    self.assertEqual(
-      msvs._GetLibraries({'other':'foo', 'libraries': ['a.lib']}),
-      ['a.lib'])
-    self.assertEqual(
-      msvs._GetLibraries({'libraries': ['-la']}),
-      ['a.lib'])
-    self.assertEqual(
-      msvs._GetLibraries({'libraries': ['a.lib', 'b.lib', 'c.lib', '-lb.lib',
-                                   '-lb.lib', 'd.lib', 'a.lib']}),
-      ['c.lib', 'b.lib', 'd.lib', 'a.lib'])
-
-if __name__ == '__main__':
-  unittest.main()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/ninja.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1809 +0,0 @@
-# Copyright (c) 2013 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import copy
-import hashlib
-import multiprocessing
-import os.path
-import re
-import signal
-import subprocess
-import sys
-import gyp
-import gyp.common
-import gyp.msvs_emulation
-import gyp.MSVSUtil as MSVSUtil
-import gyp.xcode_emulation
-
-from gyp.common import GetEnvironFallback
-import gyp.ninja_syntax as ninja_syntax
-
-generator_default_variables = {
-  'EXECUTABLE_PREFIX': '',
-  'EXECUTABLE_SUFFIX': '',
-  'STATIC_LIB_PREFIX': 'lib',
-  'STATIC_LIB_SUFFIX': '.a',
-  'SHARED_LIB_PREFIX': 'lib',
-
-  # Gyp expects the following variables to be expandable by the build
-  # system to the appropriate locations.  Ninja prefers paths to be
-  # known at gyp time.  To resolve this, introduce special
-  # variables starting with $! and $| (which begin with a $ so gyp knows it
-  # should be treated specially, but is otherwise an invalid
-  # ninja/shell variable) that are passed to gyp here but expanded
-  # before writing out into the target .ninja files; see
-  # ExpandSpecial.
-  # $! is used for variables that represent a path and that can only appear at
-  # the start of a string, while $| is used for variables that can appear
-  # anywhere in a string.
-  'INTERMEDIATE_DIR': '$!INTERMEDIATE_DIR',
-  'SHARED_INTERMEDIATE_DIR': '$!PRODUCT_DIR/gen',
-  'PRODUCT_DIR': '$!PRODUCT_DIR',
-  'CONFIGURATION_NAME': '$|CONFIGURATION_NAME',
-
-  # Special variables that may be used by gyp 'rule' targets.
-  # We generate definitions for these variables on the fly when processing a
-  # rule.
-  'RULE_INPUT_ROOT': '${root}',
-  'RULE_INPUT_DIRNAME': '${dirname}',
-  'RULE_INPUT_PATH': '${source}',
-  'RULE_INPUT_EXT': '${ext}',
-  'RULE_INPUT_NAME': '${name}',
-}
-
-# Placates pylint.
-generator_additional_non_configuration_keys = []
-generator_additional_path_sections = []
-generator_extra_sources_for_rules = []
-
-# TODO: figure out how to not build extra host objects in the non-cross-compile
-# case when this is enabled, and enable unconditionally.
-generator_supports_multiple_toolsets = (
-  os.environ.get('GYP_CROSSCOMPILE') or
-  os.environ.get('AR_host') or
-  os.environ.get('CC_host') or
-  os.environ.get('CXX_host') or
-  os.environ.get('AR_target') or
-  os.environ.get('CC_target') or
-  os.environ.get('CXX_target'))
-
-
-def StripPrefix(arg, prefix):
-  if arg.startswith(prefix):
-    return arg[len(prefix):]
-  return arg
-
-
-def QuoteShellArgument(arg, flavor):
-  """Quote a string such that it will be interpreted as a single argument
-  by the shell."""
-  # Rather than attempting to enumerate the bad shell characters, just
-  # whitelist common OK ones and quote anything else.
-  if re.match(r'^[a-zA-Z0-9_=.\\/-]+$', arg):
-    return arg  # No quoting necessary.
-  if flavor == 'win':
-    return gyp.msvs_emulation.QuoteForRspFile(arg)
-  return "'" + arg.replace("'", "'" + '"\'"' + "'")  + "'"
-
-
-def Define(d, flavor):
-  """Takes a preprocessor define and returns a -D parameter that's ninja- and
-  shell-escaped."""
-  if flavor == 'win':
-    # cl.exe replaces literal # characters with = in preprocessor definitions
-    # for some reason. Octal-encode to work around that.
-    d = d.replace('#', '\\%03o' % ord('#'))
-  return QuoteShellArgument(ninja_syntax.escape('-D' + d), flavor)
-
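The whitelist-then-quote approach of `QuoteShellArgument` (the POSIX branch, without the Windows `QuoteForRspFile` path) can be exercised standalone:

```python
import re

def quote_shell_argument(arg):
    # Whitelist characters that are safe unquoted; single-quote the rest,
    # closing and reopening the quote around any embedded apostrophe.
    if re.match(r'^[a-zA-Z0-9_=.\\/-]+$', arg):
        return arg
    return "'" + arg.replace("'", "'\"'\"'") + "'"

print(quote_shell_argument('-DNAME=value'))  # safe, returned unchanged
print(quote_shell_argument('two words'))
```

Whitelisting known-safe characters and quoting everything else is simpler and safer than trying to enumerate every character the shell treats specially.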
-
-class Target:
-  """Target represents the paths used within a single gyp target.
-
-  Conceptually, building a single target A is a series of steps:
-
-  1) actions/rules/copies  generates source/resources/etc.
-  2) compiles              generates .o files
-  3) link                  generates a binary (library/executable)
-  4) bundle                merges the above in a mac bundle
-
-  (Any of these steps can be optional.)
-
-  From a build ordering perspective, a dependent target B could just
-  depend on the last output of this series of steps.
-
-  But some dependent commands sometimes need to reach inside the box.
-  For example, when linking B it needs to get the path to the static
-  library generated by A.
-
-  This object stores those paths.  To keep things simple, member
-  variables only store concrete paths to single files, while methods
-  compute derived values like "the last output of the target".
-  """
-  def __init__(self, type):
-    # Gyp type ("static_library", etc.) of this target.
-    self.type = type
-    # File representing whether any input dependencies necessary for
-    # dependent actions have completed.
-    self.preaction_stamp = None
-    # File representing whether any input dependencies necessary for
-    # dependent compiles have completed.
-    self.precompile_stamp = None
-    # File representing the completion of actions/rules/copies, if any.
-    self.actions_stamp = None
-    # Path to the output of the link step, if any.
-    self.binary = None
-    # Path to the file representing the completion of building the bundle,
-    # if any.
-    self.bundle = None
-    # On Windows, incremental linking requires linking against all the .objs
-    # that compose a .lib (rather than the .lib itself). That list is stored
-    # here.
-    self.component_objs = None
-    # Windows only. The import .lib is the output of a build step, but
-    # because dependents only link against the lib (not both the lib and the
-    # dll) we keep track of the import library here.
-    self.import_lib = None
-
-  def Linkable(self):
-    """Return true if this is a target that can be linked against."""
-    return self.type in ('static_library', 'shared_library')
-
-  def UsesToc(self, flavor):
-    """Return true if the target should produce a restat rule based on a TOC
-    file."""
-    # For bundles, the .TOC should be produced for the binary, not for
-    # FinalOutput(). But the naive approach would put the TOC file into the
-    # bundle, so don't do this for bundles for now.
-    if flavor == 'win' or self.bundle:
-      return False
-    return self.type in ('shared_library', 'loadable_module')
-
-  def PreActionInput(self, flavor):
-    """Return the path, if any, that should be used as a dependency of
-    any dependent action step."""
-    if self.UsesToc(flavor):
-      return self.FinalOutput() + '.TOC'
-    return self.FinalOutput() or self.preaction_stamp
-
-  def PreCompileInput(self):
-    """Return the path, if any, that should be used as a dependency of
-    any dependent compile step."""
-    return self.actions_stamp or self.precompile_stamp
-
-  def FinalOutput(self):
-    """Return the last output of the target, which depends on all prior
-    steps."""
-    return self.bundle or self.binary or self.actions_stamp
-
-
-# A small discourse on paths as used within the Ninja build:
-# All files we produce (both at gyp and at build time) appear in the
-# build directory (e.g. out/Debug).
-#
-# Paths within a given .gyp file are always relative to the directory
-# containing the .gyp file.  Call these "gyp paths".  This includes
-# sources as well as the starting directory a given gyp rule/action
-# expects to be run from.  We call the path from the source root to
-# the gyp file the "base directory" within the per-.gyp-file
-# NinjaWriter code.
-#
-# All paths as written into the .ninja files are relative to the build
-# directory.  Call these paths "ninja paths".
-#
-# We translate between these two notions of paths with two helper
-# functions:
-#
-# - GypPathToNinja translates a gyp path (i.e. relative to the .gyp file)
-#   into the equivalent ninja path.
-#
-# - GypPathToUniqueOutput translates a gyp path into a ninja path to write
-#   an output file; the result can be namespaced such that it is unique
-#   to the input file name as well as the output target name.
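The gyp-path to ninja-path translation described above boils down to prepending the build-dir-to-gyp-dir relative path and normalizing. A hedged sketch under made-up directory names (build dir `out/Debug`, gyp file in `src/foo`), not the real `GypPathToNinja`:

```python
import os

# Relative path from the build dir (out/Debug) back to the gyp file's
# directory (src/foo); the real code computes this via InvertRelativePath.
build_to_base = os.path.join('..', '..', 'src', 'foo')

def gyp_path_to_ninja(path):
    # gyp paths are relative to the .gyp file; ninja paths are relative
    # to the build dir, so rebase and normalize.
    return os.path.normpath(os.path.join(build_to_base, path))

print(gyp_path_to_ninja('bar/baz.cc'))  # ../../src/foo/bar/baz.cc (on POSIX)
```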
-
-class NinjaWriter:
-  def __init__(self, qualified_target, target_outputs, base_dir, build_dir,
-               output_file, flavor, toplevel_dir=None):
-    """
-    base_dir: path from source root to directory containing this gyp file,
-              by gyp semantics, all input paths are relative to this
-    build_dir: path from source root to build output
-    toplevel_dir: path to the toplevel directory
-    """
-
-    self.qualified_target = qualified_target
-    self.target_outputs = target_outputs
-    self.base_dir = base_dir
-    self.build_dir = build_dir
-    self.ninja = ninja_syntax.Writer(output_file)
-    self.flavor = flavor
-    self.abs_build_dir = None
-    if toplevel_dir is not None:
-      self.abs_build_dir = os.path.abspath(os.path.join(toplevel_dir,
-                                                        build_dir))
-    self.obj_ext = '.obj' if flavor == 'win' else '.o'
-    if flavor == 'win':
-      # See docstring of msvs_emulation.GenerateEnvironmentFiles().
-      self.win_env = {}
-      for arch in ('x86', 'x64'):
-        self.win_env[arch] = 'environment.' + arch
-
-    # Relative path from build output dir to base dir.
-    build_to_top = gyp.common.InvertRelativePath(build_dir, toplevel_dir)
-    self.build_to_base = os.path.join(build_to_top, base_dir)
-    # Relative path from base dir to build dir.
-    base_to_top = gyp.common.InvertRelativePath(base_dir, toplevel_dir)
-    self.base_to_build = os.path.join(base_to_top, build_dir)
-
-  def ExpandSpecial(self, path, product_dir=None):
-    """Expand specials like $!PRODUCT_DIR in |path|.
-
-    If |product_dir| is None, assumes the cwd is already the product
-    dir.  Otherwise, |product_dir| is the relative path to the product
-    dir.
-    """
-
-    PRODUCT_DIR = '$!PRODUCT_DIR'
-    if PRODUCT_DIR in path:
-      if product_dir:
-        path = path.replace(PRODUCT_DIR, product_dir)
-      else:
-        path = path.replace(PRODUCT_DIR + '/', '')
-        path = path.replace(PRODUCT_DIR + '\\', '')
-        path = path.replace(PRODUCT_DIR, '.')
-
-    INTERMEDIATE_DIR = '$!INTERMEDIATE_DIR'
-    if INTERMEDIATE_DIR in path:
-      int_dir = self.GypPathToUniqueOutput('gen')
-      # GypPathToUniqueOutput generates a path relative to the product dir,
-      # so insert product_dir in front if it is provided.
-      path = path.replace(INTERMEDIATE_DIR,
-                          os.path.join(product_dir or '', int_dir))
-
-    CONFIGURATION_NAME = '$|CONFIGURATION_NAME'
-    path = path.replace(CONFIGURATION_NAME, self.config_name)
-
-    return path
-
-  def ExpandRuleVariables(self, path, root, dirname, source, ext, name):
-    if self.flavor == 'win':
-      path = self.msvs_settings.ConvertVSMacros(
-          path, config=self.config_name)
-    path = path.replace(generator_default_variables['RULE_INPUT_ROOT'], root)
-    path = path.replace(generator_default_variables['RULE_INPUT_DIRNAME'],
-                        dirname)
-    path = path.replace(generator_default_variables['RULE_INPUT_PATH'], source)
-    path = path.replace(generator_default_variables['RULE_INPUT_EXT'], ext)
-    path = path.replace(generator_default_variables['RULE_INPUT_NAME'], name)
-    return path
-
-  def GypPathToNinja(self, path, env=None):
-    """Translate a gyp path to a ninja path, optionally expanding environment
-    variable references in |path| with |env|.
-
-    See the above discourse on path conversions."""
-    if env:
-      if self.flavor == 'mac':
-        path = gyp.xcode_emulation.ExpandEnvVars(path, env)
-      elif self.flavor == 'win':
-        path = gyp.msvs_emulation.ExpandMacros(path, env)
-    if path.startswith('$!'):
-      expanded = self.ExpandSpecial(path)
-      if self.flavor == 'win':
-        expanded = os.path.normpath(expanded)
-      return expanded
-    if '$|' in path:
-      path = self.ExpandSpecial(path)
-    assert '$' not in path, path
-    return os.path.normpath(os.path.join(self.build_to_base, path))
-
-  def GypPathToUniqueOutput(self, path, qualified=True):
-    """Translate a gyp path to a ninja path for writing output.
-
-    If qualified is True, qualify the resulting filename with the name
-    of the target.  This is necessary when e.g. compiling the same
-    path twice for two separate output targets.
-
-    See the above discourse on path conversions."""
-
-    path = self.ExpandSpecial(path)
-    assert not path.startswith('$'), path
-
-    # Translate the path following this scheme:
-    #   Input: foo/bar.gyp, target targ, references baz/out.o
-    #   Output: obj/foo/baz/targ.out.o (if qualified)
-    #           obj/foo/baz/out.o (otherwise)
-    #     (and obj.host instead of obj for cross-compiles)
-    #
-    # Why this scheme and not some other one?
-    # 1) for a given input, you can compute all derived outputs by matching
-    #    its path, even if the input is brought via a gyp file with '..'.
-    # 2) simple files like libraries and stamps have a simple filename.
-
-    obj = 'obj'
-    if self.toolset != 'target':
-      obj += '.' + self.toolset
-
-    path_dir, path_basename = os.path.split(path)
-    if qualified:
-      path_basename = self.name + '.' + path_basename
-    return os.path.normpath(os.path.join(obj, self.base_dir, path_dir,
-                                         path_basename))
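The naming scheme in the comment block above can be exercised directly. A self-contained sketch of the same logic (taking `base_dir`, target `name`, and `toolset` as plain parameters rather than instance state):

```python
import os

def gyp_path_to_unique_output(base_dir, name, path, qualified=True,
                              toolset='target'):
    # Cross-compiles get obj.host / obj.<toolset> instead of obj.
    obj = 'obj' if toolset == 'target' else 'obj.' + toolset
    path_dir, path_basename = os.path.split(path)
    if qualified:
        # Prefix with the target name so two targets compiling the same
        # source do not collide on the output path.
        path_basename = name + '.' + path_basename
    return os.path.normpath(os.path.join(obj, base_dir, path_dir,
                                         path_basename))

# The example from the comment: foo/bar.gyp, target targ, references baz/out.o
print(gyp_path_to_unique_output('foo', 'targ', 'baz/out.o'))         # obj/foo/baz/targ.out.o
print(gyp_path_to_unique_output('foo', 'targ', 'baz/out.o', False))  # obj/foo/baz/out.o
```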
-
-  def WriteCollapsedDependencies(self, name, targets):
-    """Given a list of targets, return a path for a single file
-    representing the result of building all the targets or None.
-
-    Uses a stamp file if necessary."""
-
-    assert targets == filter(None, targets), targets
-    if len(targets) == 0:
-      return None
-    if len(targets) > 1:
-      stamp = self.GypPathToUniqueOutput(name + '.stamp')
-      targets = self.ninja.build(stamp, 'stamp', targets)
-      self.ninja.newline()
-    return targets[0]
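The collapsing decision above (none → `None`, one → use it directly, many → one stamp file) can be sketched without a real ninja writer; `emit_stamp` below stands in for `self.ninja.build(..., 'stamp', ...)` and is purely illustrative:

```python
def write_collapsed_dependencies(name, targets, emit_stamp):
    targets = [t for t in targets if t]  # the real code asserts no empties
    if not targets:
        return None                      # nothing to depend on
    if len(targets) > 1:
        # Many outputs: dependents need only the single stamp file.
        return emit_stamp(name + '.stamp', targets)
    return targets[0]                    # single output: depend on it directly

print(write_collapsed_dependencies('deps', ['a.o', 'b.o'],
                                   lambda stamp, ts: stamp))  # deps.stamp
print(write_collapsed_dependencies('deps', ['a.o'], None))    # a.o
```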
-
-  def WriteSpec(self, spec, config_name, generator_flags,
-      case_sensitive_filesystem):
-    """The main entry point for NinjaWriter: write the build rules for a spec.
-
-    Returns a Target object, which represents the output paths for this spec.
-    Returns None if there are no outputs (e.g. a settings-only 'none' type
-    target)."""
-
-    self.config_name = config_name
-    self.name = spec['target_name']
-    self.toolset = spec['toolset']
-    config = spec['configurations'][config_name]
-    self.target = Target(spec['type'])
-    self.is_standalone_static_library = bool(
-        spec.get('standalone_static_library', 0))
-
-    self.is_mac_bundle = gyp.xcode_emulation.IsMacBundle(self.flavor, spec)
-    self.xcode_settings = self.msvs_settings = None
-    if self.flavor == 'mac':
-      self.xcode_settings = gyp.xcode_emulation.XcodeSettings(spec)
-    if self.flavor == 'win':
-      self.msvs_settings = gyp.msvs_emulation.MsvsSettings(spec,
-                                                           generator_flags)
-      arch = self.msvs_settings.GetArch(config_name)
-      self.ninja.variable('arch', self.win_env[arch])
-
-    # Compute predepends for all rules.
-    # actions_depends is the dependencies this target depends on before running
-    # any of its action/rule/copy steps.
-    # compile_depends is the dependencies this target depends on before running
-    # any of its compile steps.
-    actions_depends = []
-    compile_depends = []
-    # TODO(evan): it is rather confusing which things are lists and which
-    # are strings.  Fix these.
-    if 'dependencies' in spec:
-      for dep in spec['dependencies']:
-        if dep in self.target_outputs:
-          target = self.target_outputs[dep]
-          actions_depends.append(target.PreActionInput(self.flavor))
-          compile_depends.append(target.PreCompileInput())
-      actions_depends = filter(None, actions_depends)
-      compile_depends = filter(None, compile_depends)
-      actions_depends = self.WriteCollapsedDependencies('actions_depends',
-                                                        actions_depends)
-      compile_depends = self.WriteCollapsedDependencies('compile_depends',
-                                                        compile_depends)
-      self.target.preaction_stamp = actions_depends
-      self.target.precompile_stamp = compile_depends
-
-    # Write out actions, rules, and copies.  These must happen before we
-    # compile any sources, so compute a list of predependencies for sources
-    # while we do it.
-    extra_sources = []
-    mac_bundle_depends = []
-    self.target.actions_stamp = self.WriteActionsRulesCopies(
-        spec, extra_sources, actions_depends, mac_bundle_depends)
-
-    # If we have actions/rules/copies, we depend directly on those, but
-    # otherwise we depend on dependent target's actions/rules/copies etc.
-    # We never need to explicitly depend on previous target's link steps,
-    # because no compile ever depends on them.
-    compile_depends_stamp = (self.target.actions_stamp or compile_depends)
-
-    # Write out the compilation steps, if any.
-    link_deps = []
-    sources = spec.get('sources', []) + extra_sources
-    if sources:
-      pch = None
-      if self.flavor == 'win':
-        gyp.msvs_emulation.VerifyMissingSources(
-            sources, self.abs_build_dir, generator_flags, self.GypPathToNinja)
-        pch = gyp.msvs_emulation.PrecompiledHeader(
-            self.msvs_settings, config_name, self.GypPathToNinja,
-            self.GypPathToUniqueOutput, self.obj_ext)
-      else:
-        pch = gyp.xcode_emulation.MacPrefixHeader(
-            self.xcode_settings, self.GypPathToNinja,
-            lambda path, lang: self.GypPathToUniqueOutput(path + '-' + lang))
-      link_deps = self.WriteSources(
-          config_name, config, sources, compile_depends_stamp, pch,
-          case_sensitive_filesystem, spec)
-      # Some actions/rules output 'sources' that are already object files.
-      link_deps += [self.GypPathToNinja(f)
-          for f in sources if f.endswith(self.obj_ext)]
-
-    if self.flavor == 'win' and self.target.type == 'static_library':
-      self.target.component_objs = link_deps
-
-    # Write out a link step, if needed.
-    output = None
-    if link_deps or self.target.actions_stamp or actions_depends:
-      output = self.WriteTarget(spec, config_name, config, link_deps,
-                                self.target.actions_stamp or actions_depends)
-      if self.is_mac_bundle:
-        mac_bundle_depends.append(output)
-
-    # Bundle all of the above together, if needed.
-    if self.is_mac_bundle:
-      output = self.WriteMacBundle(spec, mac_bundle_depends)
-
-    if not output:
-      return None
-
-    assert self.target.FinalOutput(), output
-    return self.target
-
-  def _WinIdlRule(self, source, prebuild, outputs):
-    """Handle the implicit VS .idl rule for one source file. Fills |outputs|
-    with files that are generated."""
-    outdir, output, vars, flags = self.msvs_settings.GetIdlBuildData(
-        source, self.config_name)
-    outdir = self.GypPathToNinja(outdir)
-    def fix_path(path, rel=None):
-      path = os.path.join(outdir, path)
-      dirname, basename = os.path.split(source)
-      root, ext = os.path.splitext(basename)
-      path = self.ExpandRuleVariables(
-          path, root, dirname, source, ext, basename)
-      if rel:
-        path = os.path.relpath(path, rel)
-      return path
-    vars = [(name, fix_path(value, outdir)) for name, value in vars]
-    output = [fix_path(p) for p in output]
-    vars.append(('outdir', outdir))
-    vars.append(('idlflags', flags))
-    input = self.GypPathToNinja(source)
-    self.ninja.build(output, 'idl', input,
-        variables=vars, order_only=prebuild)
-    outputs.extend(output)
-
-  def WriteWinIdlFiles(self, spec, prebuild):
-    """Writes rules to match MSVS's implicit idl handling."""
-    assert self.flavor == 'win'
-    if self.msvs_settings.HasExplicitIdlRules(spec):
-      return []
-    outputs = []
-    for source in filter(lambda x: x.endswith('.idl'), spec['sources']):
-      self._WinIdlRule(source, prebuild, outputs)
-    return outputs
-
-  def WriteActionsRulesCopies(self, spec, extra_sources, prebuild,
-                              mac_bundle_depends):
-    """Write out the Actions, Rules, and Copies steps.  Return a path
-    representing the outputs of these steps."""
-    outputs = []
-    extra_mac_bundle_resources = []
-
-    if 'actions' in spec:
-      outputs += self.WriteActions(spec['actions'], extra_sources, prebuild,
-                                   extra_mac_bundle_resources)
-    if 'rules' in spec:
-      outputs += self.WriteRules(spec['rules'], extra_sources, prebuild,
-                                 extra_mac_bundle_resources)
-    if 'copies' in spec:
-      outputs += self.WriteCopies(spec['copies'], prebuild, mac_bundle_depends)
-
-    if 'sources' in spec and self.flavor == 'win':
-      outputs += self.WriteWinIdlFiles(spec, prebuild)
-
-    stamp = self.WriteCollapsedDependencies('actions_rules_copies', outputs)
-
-    if self.is_mac_bundle:
-      mac_bundle_resources = spec.get('mac_bundle_resources', []) + \
-                             extra_mac_bundle_resources
-      self.WriteMacBundleResources(mac_bundle_resources, mac_bundle_depends)
-      self.WriteMacInfoPlist(mac_bundle_depends)
-
-    return stamp
-
-  def GenerateDescription(self, verb, message, fallback):
-    """Generate and return a description of a build step.
-
-    |verb| is the short summary, e.g. ACTION or RULE.
-    |message| is a hand-written description, or None if not available.
-    |fallback| is the gyp-level name of the step, usable as a fallback.
-    """
-    if self.toolset != 'target':
-      verb += '(%s)' % self.toolset
-    if message:
-      return '%s %s' % (verb, self.ExpandSpecial(message))
-    else:
-      return '%s %s: %s' % (verb, self.name, fallback)
-
-  def WriteActions(self, actions, extra_sources, prebuild,
-                   extra_mac_bundle_resources):
-    # Actions cd into the base directory.
-    env = self.GetSortedXcodeEnv()
-    if self.flavor == 'win':
-      env = self.msvs_settings.GetVSMacroEnv(
-          '$!PRODUCT_DIR', config=self.config_name)
-    all_outputs = []
-    for action in actions:
-      # First write out a rule for the action.
-      name = '%s_%s' % (action['action_name'],
-                        hashlib.md5(self.qualified_target).hexdigest())
-      description = self.GenerateDescription('ACTION',
-                                             action.get('message', None),
-                                             name)
-      is_cygwin = (self.msvs_settings.IsRuleRunUnderCygwin(action)
-                   if self.flavor == 'win' else False)
-      args = action['action']
-      rule_name, _ = self.WriteNewNinjaRule(name, args, description,
-                                            is_cygwin, env=env)
-
-      inputs = [self.GypPathToNinja(i, env) for i in action['inputs']]
-      if int(action.get('process_outputs_as_sources', False)):
-        extra_sources += action['outputs']
-      if int(action.get('process_outputs_as_mac_bundle_resources', False)):
-        extra_mac_bundle_resources += action['outputs']
-      outputs = [self.GypPathToNinja(o, env) for o in action['outputs']]
-
-      # Then write out an edge using the rule.
-      self.ninja.build(outputs, rule_name, inputs,
-                       order_only=prebuild)
-      all_outputs += outputs
-
-      self.ninja.newline()
-
-    return all_outputs
-
-  def WriteRules(self, rules, extra_sources, prebuild,
-                 extra_mac_bundle_resources):
-    env = self.GetSortedXcodeEnv()
-    all_outputs = []
-    for rule in rules:
-      # First write out a rule for the rule action.
-      name = '%s_%s' % (rule['rule_name'],
-                        hashlib.md5(self.qualified_target).hexdigest())
-      # Skip a rule with no action and no inputs.
-      if 'action' not in rule and not rule.get('rule_sources', []):
-        continue
-      args = rule['action']
-      description = self.GenerateDescription(
-          'RULE',
-          rule.get('message', None),
-          ('%s ' + generator_default_variables['RULE_INPUT_PATH']) % name)
-      is_cygwin = (self.msvs_settings.IsRuleRunUnderCygwin(rule)
-                   if self.flavor == 'win' else False)
-      rule_name, args = self.WriteNewNinjaRule(
-          name, args, description, is_cygwin, env=env)
-
-      # TODO: if the command references the outputs directly, we should
-      # simplify it to just use $out.
-
-      # Rules can potentially make use of some special variables which
-      # must vary per source file.
-      # Compute the list of variables we'll need to provide.
-      special_locals = ('source', 'root', 'dirname', 'ext', 'name')
-      needed_variables = set(['source'])
-      for argument in args:
-        for var in special_locals:
-          if ('${%s}' % var) in argument:
-            needed_variables.add(var)
-
-      def cygwin_munge(path):
-        if is_cygwin:
-          return path.replace('\\', '/')
-        return path
-
-      # For each source file, write an edge that generates all the outputs.
-      for source in rule.get('rule_sources', []):
-        dirname, basename = os.path.split(source)
-        root, ext = os.path.splitext(basename)
-
-        # Gather the list of inputs and outputs, expanding $vars if possible.
-        outputs = [self.ExpandRuleVariables(o, root, dirname,
-                                            source, ext, basename)
-                   for o in rule['outputs']]
-        inputs = [self.ExpandRuleVariables(i, root, dirname,
-                                           source, ext, basename)
-                  for i in rule.get('inputs', [])]
-
-        if int(rule.get('process_outputs_as_sources', False)):
-          extra_sources += outputs
-        if int(rule.get('process_outputs_as_mac_bundle_resources', False)):
-          extra_mac_bundle_resources += outputs
-
-        extra_bindings = []
-        for var in needed_variables:
-          if var == 'root':
-            extra_bindings.append(('root', cygwin_munge(root)))
-          elif var == 'dirname':
-            extra_bindings.append(('dirname', cygwin_munge(dirname)))
-          elif var == 'source':
-            # '$source' is a parameter to the rule action, which means
-            # it shouldn't be converted to a Ninja path.  But we don't
-            # want $!PRODUCT_DIR in there either.
-            source_expanded = self.ExpandSpecial(source, self.base_to_build)
-            extra_bindings.append(('source', cygwin_munge(source_expanded)))
-          elif var == 'ext':
-            extra_bindings.append(('ext', ext))
-          elif var == 'name':
-            extra_bindings.append(('name', cygwin_munge(basename)))
-          else:
-            assert var is None, repr(var)
-
-        inputs = [self.GypPathToNinja(i, env) for i in inputs]
-        outputs = [self.GypPathToNinja(o, env) for o in outputs]
-        extra_bindings.append(('unique_name',
-            hashlib.md5(outputs[0]).hexdigest()))
-        self.ninja.build(outputs, rule_name, self.GypPathToNinja(source),
-                         implicit=inputs,
-                         order_only=prebuild,
-                         variables=extra_bindings)
-
-        all_outputs.extend(outputs)
-
-    return all_outputs
-
-  def WriteCopies(self, copies, prebuild, mac_bundle_depends):
-    outputs = []
-    env = self.GetSortedXcodeEnv()
-    for copy in copies:
-      for path in copy['files']:
-        # Normalize the path so trailing slashes don't confuse us.
-        path = os.path.normpath(path)
-        basename = os.path.split(path)[1]
-        src = self.GypPathToNinja(path, env)
-        dst = self.GypPathToNinja(os.path.join(copy['destination'], basename),
-                                  env)
-        outputs += self.ninja.build(dst, 'copy', src, order_only=prebuild)
-        if self.is_mac_bundle:
-          # gyp has mac_bundle_resources to copy things into a bundle's
-          # Resources folder, but there's no built-in way to copy files to other
-          # places in the bundle. Hence, some targets use copies for this. Check
-          # if this file is copied into the current bundle, and if so add it to
-          # the bundle depends so that dependent targets get rebuilt if the copy
-          # input changes.
-          if dst.startswith(self.xcode_settings.GetBundleContentsFolderPath()):
-            mac_bundle_depends.append(dst)
-
-    return outputs
-
-  def WriteMacBundleResources(self, resources, bundle_depends):
-    """Writes ninja edges for 'mac_bundle_resources'."""
-    for output, res in gyp.xcode_emulation.GetMacBundleResources(
-        self.ExpandSpecial(generator_default_variables['PRODUCT_DIR']),
-        self.xcode_settings, map(self.GypPathToNinja, resources)):
-      self.ninja.build(output, 'mac_tool', res,
-                       variables=[('mactool_cmd', 'copy-bundle-resource')])
-      bundle_depends.append(output)
-
-  def WriteMacInfoPlist(self, bundle_depends):
-    """Write build rules for bundle Info.plist files."""
-    info_plist, out, defines, extra_env = gyp.xcode_emulation.GetMacInfoPlist(
-        self.ExpandSpecial(generator_default_variables['PRODUCT_DIR']),
-        self.xcode_settings, self.GypPathToNinja)
-    if not info_plist:
-      return
-    if defines:
-      # Create an intermediate file to store preprocessed results.
-      intermediate_plist = self.GypPathToUniqueOutput(
-          os.path.basename(info_plist))
-      defines = ' '.join([Define(d, self.flavor) for d in defines])
-      info_plist = self.ninja.build(intermediate_plist, 'infoplist', info_plist,
-                                    variables=[('defines', defines)])
-
-    env = self.GetSortedXcodeEnv(additional_settings=extra_env)
-    env = self.ComputeExportEnvString(env)
-
-    self.ninja.build(out, 'mac_tool', info_plist,
-                     variables=[('mactool_cmd', 'copy-info-plist'),
-                                ('env', env)])
-    bundle_depends.append(out)
-
-  def WriteSources(self, config_name, config, sources, predepends,
-                   precompiled_header, case_sensitive_filesystem, spec):
-    """Write build rules to compile all of |sources|."""
-    if self.toolset == 'host':
-      self.ninja.variable('ar', '$ar_host')
-      self.ninja.variable('cc', '$cc_host')
-      self.ninja.variable('cxx', '$cxx_host')
-      self.ninja.variable('ld', '$ld_host')
-
-    extra_defines = []
-    if self.flavor == 'mac':
-      cflags = self.xcode_settings.GetCflags(config_name)
-      cflags_c = self.xcode_settings.GetCflagsC(config_name)
-      cflags_cc = self.xcode_settings.GetCflagsCC(config_name)
-      cflags_objc = ['$cflags_c'] + \
-                    self.xcode_settings.GetCflagsObjC(config_name)
-      cflags_objcc = ['$cflags_cc'] + \
-                     self.xcode_settings.GetCflagsObjCC(config_name)
-    elif self.flavor == 'win':
-      cflags = self.msvs_settings.GetCflags(config_name)
-      cflags_c = self.msvs_settings.GetCflagsC(config_name)
-      cflags_cc = self.msvs_settings.GetCflagsCC(config_name)
-      extra_defines = self.msvs_settings.GetComputedDefines(config_name)
-      pdbpath = self.msvs_settings.GetCompilerPdbName(
-          config_name, self.ExpandSpecial)
-      if not pdbpath:
-        obj = 'obj'
-        if self.toolset != 'target':
-          obj += '.' + self.toolset
-        pdbpath = os.path.normpath(os.path.join(obj, self.base_dir,
-                                                self.name + '.pdb'))
-      self.WriteVariableList('pdbname', [pdbpath])
-      self.WriteVariableList('pchprefix', [self.name])
-    else:
-      cflags = config.get('cflags', [])
-      cflags_c = config.get('cflags_c', [])
-      cflags_cc = config.get('cflags_cc', [])
-
-    defines = config.get('defines', []) + extra_defines
-    self.WriteVariableList('defines', [Define(d, self.flavor) for d in defines])
-    if self.flavor == 'win':
-      self.WriteVariableList('rcflags',
-          [QuoteShellArgument(self.ExpandSpecial(f), self.flavor)
-           for f in self.msvs_settings.GetRcflags(config_name,
-                                                  self.GypPathToNinja)])
-
-    include_dirs = config.get('include_dirs', [])
-    if self.flavor == 'win':
-      include_dirs = self.msvs_settings.AdjustIncludeDirs(include_dirs,
-                                                          config_name)
-    self.WriteVariableList('includes',
-        [QuoteShellArgument('-I' + self.GypPathToNinja(i), self.flavor)
-         for i in include_dirs])
-
-    pch_commands = precompiled_header.GetPchBuildCommands()
-    if self.flavor == 'mac':
-      self.WriteVariableList('cflags_pch_c',
-                             [precompiled_header.GetInclude('c')])
-      self.WriteVariableList('cflags_pch_cc',
-                             [precompiled_header.GetInclude('cc')])
-      self.WriteVariableList('cflags_pch_objc',
-                             [precompiled_header.GetInclude('m')])
-      self.WriteVariableList('cflags_pch_objcc',
-                             [precompiled_header.GetInclude('mm')])
-
-    self.WriteVariableList('cflags', map(self.ExpandSpecial, cflags))
-    self.WriteVariableList('cflags_c', map(self.ExpandSpecial, cflags_c))
-    self.WriteVariableList('cflags_cc', map(self.ExpandSpecial, cflags_cc))
-    if self.flavor == 'mac':
-      self.WriteVariableList('cflags_objc', map(self.ExpandSpecial,
-                                                cflags_objc))
-      self.WriteVariableList('cflags_objcc', map(self.ExpandSpecial,
-                                                 cflags_objcc))
-    self.ninja.newline()
-    outputs = []
-    for source in sources:
-      filename, ext = os.path.splitext(source)
-      ext = ext[1:]
-      obj_ext = self.obj_ext
-      if ext in ('cc', 'cpp', 'cxx'):
-        command = 'cxx'
-      elif ext == 'c' or (ext == 'S' and self.flavor != 'win'):
-        command = 'cc'
-      elif ext == 's' and self.flavor != 'win':  # Doesn't generate .o.d files.
-        command = 'cc_s'
-      elif (self.flavor == 'win' and ext == 'asm' and
-            self.msvs_settings.GetArch(config_name) == 'x86' and
-            not self.msvs_settings.HasExplicitAsmRules(spec)):
-        # Asm files only get auto assembled for x86 (not x64).
-        command = 'asm'
-        # Add the _asm suffix as msvs is capable of handling .cc and
-        # .asm files of the same name without collision.
-        obj_ext = '_asm.obj'
-      elif self.flavor == 'mac' and ext == 'm':
-        command = 'objc'
-      elif self.flavor == 'mac' and ext == 'mm':
-        command = 'objcxx'
-      elif self.flavor == 'win' and ext == 'rc':
-        command = 'rc'
-        obj_ext = '.res'
-      else:
-        # Ignore unhandled extensions.
-        continue
-      input = self.GypPathToNinja(source)
-      output = self.GypPathToUniqueOutput(filename + obj_ext)
-      # Ninja's depfile handling gets confused when the case of a filename
-      # changes on a case-insensitive file system. To work around that, always
-      # convert .o filenames to lowercase on such file systems. See
-      # https://github.com/martine/ninja/issues/402 for details.
-      if not case_sensitive_filesystem:
-        output = output.lower()
-      implicit = precompiled_header.GetObjDependencies([input], [output])
-      variables = []
-      if self.flavor == 'win':
-        variables, output, implicit = precompiled_header.GetFlagsModifications(
-            input, output, implicit, command, cflags_c, cflags_cc,
-            self.ExpandSpecial)
-      self.ninja.build(output, command, input,
-                       implicit=[gch for _, _, gch in implicit],
-                       order_only=predepends, variables=variables)
-      outputs.append(output)
-
-    self.WritePchTargets(pch_commands)
-
-    self.ninja.newline()
-    return outputs
-
-  def WritePchTargets(self, pch_commands):
-    """Writes ninja rules to compile prefix headers."""
-    if not pch_commands:
-      return
-
-    for gch, lang_flag, lang, input in pch_commands:
-      var_name = {
-        'c': 'cflags_pch_c',
-        'cc': 'cflags_pch_cc',
-        'm': 'cflags_pch_objc',
-        'mm': 'cflags_pch_objcc',
-      }[lang]
-
-      map = { 'c': 'cc', 'cc': 'cxx', 'm': 'objc', 'mm': 'objcxx', }
-      cmd = map.get(lang)
-      self.ninja.build(gch, cmd, input, variables=[(var_name, lang_flag)])
-
-  def WriteLink(self, spec, config_name, config, link_deps):
-    """Write out a link step. Fills out target.binary. """
-
-    command = {
-      'executable':      'link',
-      'loadable_module': 'solink_module',
-      'shared_library':  'solink',
-    }[spec['type']]
-
-    implicit_deps = set()
-    solibs = set()
-
-    if 'dependencies' in spec:
-      # Two kinds of dependencies:
-      # - Linkable dependencies (like a .a or a .so): add them to the link line.
-      # - Non-linkable dependencies (like a rule that generates a file
-      #   and writes a stamp file): add them to implicit_deps
-      extra_link_deps = set()
-      for dep in spec['dependencies']:
-        target = self.target_outputs.get(dep)
-        if not target:
-          continue
-        linkable = target.Linkable()
-        if linkable:
-          if (self.flavor == 'win' and
-              target.component_objs and
-              self.msvs_settings.IsUseLibraryDependencyInputs(config_name)):
-            extra_link_deps |= set(target.component_objs)
-          elif self.flavor == 'win' and target.import_lib:
-            extra_link_deps.add(target.import_lib)
-          elif target.UsesToc(self.flavor):
-            solibs.add(target.binary)
-            implicit_deps.add(target.binary + '.TOC')
-          else:
-            extra_link_deps.add(target.binary)
-
-        final_output = target.FinalOutput()
-        if not linkable or final_output != target.binary:
-          implicit_deps.add(final_output)
-
-      link_deps.extend(list(extra_link_deps))
-
-    extra_bindings = []
-    if self.is_mac_bundle:
-      output = self.ComputeMacBundleBinaryOutput()
-    else:
-      output = self.ComputeOutput(spec)
-      extra_bindings.append(('postbuilds',
-                             self.GetPostbuildCommand(spec, output, output)))
-
-    is_executable = spec['type'] == 'executable'
-    if self.flavor == 'mac':
-      ldflags = self.xcode_settings.GetLdflags(config_name,
-          self.ExpandSpecial(generator_default_variables['PRODUCT_DIR']),
-          self.GypPathToNinja)
-    elif self.flavor == 'win':
-      manifest_name = self.GypPathToUniqueOutput(
-          self.ComputeOutputFileName(spec))
-      ldflags, manifest_files = self.msvs_settings.GetLdflags(config_name,
-          self.GypPathToNinja, self.ExpandSpecial, manifest_name, is_executable)
-      self.WriteVariableList('manifests', manifest_files)
-    else:
-      ldflags = config.get('ldflags', [])
-      if is_executable and len(solibs):
-        ldflags.append('-Wl,-rpath=\$$ORIGIN/lib/')
-        ldflags.append('-Wl,-rpath-link=lib/')
-    self.WriteVariableList('ldflags',
-                           gyp.common.uniquer(map(self.ExpandSpecial,
-                                                  ldflags)))
-
-    libraries = gyp.common.uniquer(map(self.ExpandSpecial,
-                                       spec.get('libraries', [])))
-    if self.flavor == 'mac':
-      libraries = self.xcode_settings.AdjustLibraries(libraries)
-    elif self.flavor == 'win':
-      libraries = self.msvs_settings.AdjustLibraries(libraries)
-    self.WriteVariableList('libs', libraries)
-
-    self.target.binary = output
-
-    if command in ('solink', 'solink_module'):
-      extra_bindings.append(('soname', os.path.split(output)[1]))
-      extra_bindings.append(('lib',
-                            gyp.common.EncodePOSIXShellArgument(output)))
-      if self.flavor == 'win':
-        extra_bindings.append(('dll', output))
-        if '/NOENTRY' not in ldflags:
-          self.target.import_lib = output + '.lib'
-          extra_bindings.append(('implibflag',
-                                 '/IMPLIB:%s' % self.target.import_lib))
-          output = [output, self.target.import_lib]
-      else:
-        output = [output, output + '.TOC']
-
-    if len(solibs):
-      extra_bindings.append(('solibs', gyp.common.EncodePOSIXShellList(solibs)))
-
-    self.ninja.build(output, command, link_deps,
-                     implicit=list(implicit_deps),
-                     variables=extra_bindings)
-
-  def WriteTarget(self, spec, config_name, config, link_deps, compile_deps):
-    if spec['type'] == 'none':
-      # TODO(evan): don't call this function for 'none' target types, as
-      # it doesn't do anything, and we fake out a 'binary' with a stamp file.
-      self.target.binary = compile_deps
-    elif spec['type'] == 'static_library':
-      self.target.binary = self.ComputeOutput(spec)
-      variables = []
-      postbuild = self.GetPostbuildCommand(
-          spec, self.target.binary, self.target.binary)
-      if postbuild:
-        variables.append(('postbuilds', postbuild))
-      if self.xcode_settings:
-        variables.append(('libtool_flags',
-                          self.xcode_settings.GetLibtoolflags(config_name)))
-      if (self.flavor not in ('mac', 'win') and not
-          self.is_standalone_static_library):
-        self.ninja.build(self.target.binary, 'alink_thin', link_deps,
-                         order_only=compile_deps, variables=variables)
-      else:
-        if self.msvs_settings:
-          libflags = self.msvs_settings.GetLibFlags(config_name,
-                                                    self.GypPathToNinja)
-          variables.append(('libflags', libflags))
-        self.ninja.build(self.target.binary, 'alink', link_deps,
-                         order_only=compile_deps, variables=variables)
-    else:
-      self.WriteLink(spec, config_name, config, link_deps)
-    return self.target.binary
-
-  def WriteMacBundle(self, spec, mac_bundle_depends):
-    assert self.is_mac_bundle
-    package_framework = spec['type'] in ('shared_library', 'loadable_module')
-    output = self.ComputeMacBundleOutput()
-    postbuild = self.GetPostbuildCommand(spec, output, self.target.binary,
-                                         is_command_start=not package_framework)
-    variables = []
-    if postbuild:
-      variables.append(('postbuilds', postbuild))
-    if package_framework:
-      variables.append(('version', self.xcode_settings.GetFrameworkVersion()))
-      self.ninja.build(output, 'package_framework', mac_bundle_depends,
-                       variables=variables)
-    else:
-      self.ninja.build(output, 'stamp', mac_bundle_depends,
-                       variables=variables)
-    self.target.bundle = output
-    return output
-
-  def GetSortedXcodeEnv(self, additional_settings=None):
-    """Returns the variables Xcode would set for build steps."""
-    assert self.abs_build_dir
-    abs_build_dir = self.abs_build_dir
-    return gyp.xcode_emulation.GetSortedXcodeEnv(
-        self.xcode_settings, abs_build_dir,
-        os.path.join(abs_build_dir, self.build_to_base), self.config_name,
-        additional_settings)
-
-  def GetSortedXcodePostbuildEnv(self):
-    """Returns the variables Xcode would set for postbuild steps."""
-    postbuild_settings = {}
-    # CHROMIUM_STRIP_SAVE_FILE is a chromium-specific hack.
-    # TODO(thakis): It would be nice to have some general mechanism instead.
-    strip_save_file = self.xcode_settings.GetPerTargetSetting(
-        'CHROMIUM_STRIP_SAVE_FILE')
-    if strip_save_file:
-      postbuild_settings['CHROMIUM_STRIP_SAVE_FILE'] = strip_save_file
-    return self.GetSortedXcodeEnv(additional_settings=postbuild_settings)
-
-  def GetPostbuildCommand(self, spec, output, output_binary,
-                          is_command_start=False):
-    """Returns a shell command that runs all the postbuilds, and removes
-    |output| if any of them fails. If |is_command_start| is False, then the
-    returned string will start with ' && '."""
-    if not self.xcode_settings or spec['type'] == 'none' or not output:
-      return ''
-    output = QuoteShellArgument(output, self.flavor)
-    target_postbuilds = self.xcode_settings.GetTargetPostbuilds(
-        self.config_name,
-        os.path.normpath(os.path.join(self.base_to_build, output)),
-        QuoteShellArgument(
-            os.path.normpath(os.path.join(self.base_to_build, output_binary)),
-            self.flavor),
-        quiet=True)
-    postbuilds = gyp.xcode_emulation.GetSpecPostbuildCommands(spec, quiet=True)
-    postbuilds = target_postbuilds + postbuilds
-    if not postbuilds:
-      return ''
-    # Postbuilds expect to be run in the gyp file's directory, so insert an
-    # implicit postbuild to cd to there.
-    postbuilds.insert(0, gyp.common.EncodePOSIXShellList(
-        ['cd', self.build_to_base]))
-    env = self.ComputeExportEnvString(self.GetSortedXcodePostbuildEnv())
-    # G will be non-null if any postbuild fails. Run all postbuilds in a
-    # subshell.
-    commands = env + ' (' + \
-        ' && '.join([ninja_syntax.escape(command) for command in postbuilds])
-    command_string = (commands + '); G=$$?; '
-                      # Remove the final output if any postbuild failed.
-                      '((exit $$G) || rm -rf %s) ' % output + '&& exit $$G)')
-    if is_command_start:
-      return '(' + command_string + ' && '
-    else:
-      return '$ && (' + command_string
-
-  def ComputeExportEnvString(self, env):
-    """Given an environment, returns a string looking like
-        'export FOO=foo; export BAR="${FOO} bar;'
-    that exports |env| to the shell."""
-    export_str = []
-    for k, v in env:
-      export_str.append('export %s=%s;' %
-          (k, ninja_syntax.escape(gyp.common.EncodePOSIXShellArgument(v))))
-    return ' '.join(export_str)
-
-  def ComputeMacBundleOutput(self):
-    """Return the 'output' (full output path) to a bundle output directory."""
-    assert self.is_mac_bundle
-    path = self.ExpandSpecial(generator_default_variables['PRODUCT_DIR'])
-    return os.path.join(path, self.xcode_settings.GetWrapperName())
-
-  def ComputeMacBundleBinaryOutput(self):
-    """Return the 'output' (full output path) to the binary in a bundle."""
-    assert self.is_mac_bundle
-    path = self.ExpandSpecial(generator_default_variables['PRODUCT_DIR'])
-    return os.path.join(path, self.xcode_settings.GetExecutablePath())
-
-  def ComputeOutputFileName(self, spec, type=None):
-    """Compute the filename of the final output for the current target."""
-    if not type:
-      type = spec['type']
-
-    default_variables = copy.copy(generator_default_variables)
-    CalculateVariables(default_variables, {'flavor': self.flavor})
-
-    # Compute filename prefix: the product prefix, or a default for
-    # the product type.
-    DEFAULT_PREFIX = {
-      'loadable_module': default_variables['SHARED_LIB_PREFIX'],
-      'shared_library': default_variables['SHARED_LIB_PREFIX'],
-      'static_library': default_variables['STATIC_LIB_PREFIX'],
-      'executable': default_variables['EXECUTABLE_PREFIX'],
-      }
-    prefix = spec.get('product_prefix', DEFAULT_PREFIX.get(type, ''))
-
-    # Compute filename extension: the product extension, or a default
-    # for the product type.
-    DEFAULT_EXTENSION = {
-        'loadable_module': default_variables['SHARED_LIB_SUFFIX'],
-        'shared_library': default_variables['SHARED_LIB_SUFFIX'],
-        'static_library': default_variables['STATIC_LIB_SUFFIX'],
-        'executable': default_variables['EXECUTABLE_SUFFIX'],
-      }
-    extension = spec.get('product_extension')
-    if extension:
-      extension = '.' + extension
-    else:
-      extension = DEFAULT_EXTENSION.get(type, '')
-
-    if 'product_name' in spec:
-      # If we were given an explicit name, use that.
-      target = spec['product_name']
-    else:
-      # Otherwise, derive a name from the target name.
-      target = spec['target_name']
-      if prefix == 'lib':
-        # Snip out an extra 'lib' from libs if appropriate.
-        target = StripPrefix(target, 'lib')
-
-    if type in ('static_library', 'loadable_module', 'shared_library',
-                        'executable'):
-      return '%s%s%s' % (prefix, target, extension)
-    elif type == 'none':
-      return '%s.stamp' % target
-    else:
-      raise Exception('Unhandled output type %s' % type)
-
-  def ComputeOutput(self, spec, type=None):
-    """Compute the path for the final output of the spec."""
-    assert not self.is_mac_bundle or type
-
-    if not type:
-      type = spec['type']
-
-    if self.flavor == 'win':
-      override = self.msvs_settings.GetOutputName(self.config_name,
-                                                  self.ExpandSpecial)
-      if override:
-        return override
-
-    if self.flavor == 'mac' and type in (
-        'static_library', 'executable', 'shared_library', 'loadable_module'):
-      filename = self.xcode_settings.GetExecutablePath()
-    else:
-      filename = self.ComputeOutputFileName(spec, type)
-
-    if 'product_dir' in spec:
-      path = os.path.join(spec['product_dir'], filename)
-      return self.ExpandSpecial(path)
-
-    # Some products go into the output root, libraries go into shared library
-    # dir, and everything else goes into the normal place.
-    type_in_output_root = ['executable', 'loadable_module']
-    if self.flavor == 'mac' and self.toolset == 'target':
-      type_in_output_root += ['shared_library', 'static_library']
-    elif self.flavor == 'win' and self.toolset == 'target':
-      type_in_output_root += ['shared_library']
-
-    if type in type_in_output_root or self.is_standalone_static_library:
-      return filename
-    elif type == 'shared_library':
-      libdir = 'lib'
-      if self.toolset != 'target':
-        libdir = os.path.join('lib', '%s' % self.toolset)
-      return os.path.join(libdir, filename)
-    else:
-      return self.GypPathToUniqueOutput(filename, qualified=False)
-
-  def WriteVariableList(self, var, values):
-    assert not isinstance(values, str)
-    if values is None:
-      values = []
-    self.ninja.variable(var, ' '.join(values))
-
-  def WriteNewNinjaRule(self, name, args, description, is_cygwin, env):
-    """Write out a new ninja "rule" statement for a given command.
-
-    Returns the name of the new rule, and a copy of |args| with variables
-    expanded."""
-
-    if self.flavor == 'win':
-      args = [self.msvs_settings.ConvertVSMacros(
-                  arg, self.base_to_build, config=self.config_name)
-              for arg in args]
-      description = self.msvs_settings.ConvertVSMacros(
-          description, config=self.config_name)
-    elif self.flavor == 'mac':
-      # |env| is an empty list on non-mac.
-      args = [gyp.xcode_emulation.ExpandEnvVars(arg, env) for arg in args]
-      description = gyp.xcode_emulation.ExpandEnvVars(description, env)
-
-    # TODO: we shouldn't need to qualify names; we do it because
-    # currently the ninja rule namespace is global, but it really
-    # should be scoped to the subninja.
-    rule_name = self.name
-    if self.toolset == 'target':
-      rule_name += '.' + self.toolset
-    rule_name += '.' + name
-    rule_name = re.sub('[^a-zA-Z0-9_]', '_', rule_name)
-
-    # Remove variable references, but not if they refer to the magic rule
-    # variables.  This is not quite right, as it also protects these for
-    # actions, not just for rules where they are valid. Good enough.
-    protect = [ '${root}', '${dirname}', '${source}', '${ext}', '${name}' ]
-    protect = '(?!' + '|'.join(map(re.escape, protect)) + ')'
-    description = re.sub(protect + r'\$', '_', description)
-
-    # gyp dictates that commands are run from the base directory.
-    # cd into the directory before running, and adjust paths in
-    # the arguments to point to the proper locations.
-    rspfile = None
-    rspfile_content = None
-    args = [self.ExpandSpecial(arg, self.base_to_build) for arg in args]
-    if self.flavor == 'win':
-      rspfile = rule_name + '.$unique_name.rsp'
-      # The cygwin case handles this inside the bash sub-shell.
-      run_in = '' if is_cygwin else ' ' + self.build_to_base
-      if is_cygwin:
-        rspfile_content = self.msvs_settings.BuildCygwinBashCommandLine(
-            args, self.build_to_base)
-      else:
-        rspfile_content = gyp.msvs_emulation.EncodeRspFileList(args)
-      command = ('%s gyp-win-tool action-wrapper $arch ' % sys.executable +
-                 rspfile + run_in)
-    else:
-      env = self.ComputeExportEnvString(env)
-      command = gyp.common.EncodePOSIXShellList(args)
-      command = 'cd %s; ' % self.build_to_base + env + command
-
-    # GYP rules/actions express being no-ops by not touching their outputs.
-    # Avoid executing downstream dependencies in this case by specifying
-    # restat=1 to ninja.
-    self.ninja.rule(rule_name, command, description, restat=True,
-                    rspfile=rspfile, rspfile_content=rspfile_content)
-    self.ninja.newline()
-
-    return rule_name, args
-
-
-def CalculateVariables(default_variables, params):
-  """Calculate additional variables for use in the build (called by gyp)."""
-  global generator_additional_non_configuration_keys
-  global generator_additional_path_sections
-  flavor = gyp.common.GetFlavor(params)
-  if flavor == 'mac':
-    default_variables.setdefault('OS', 'mac')
-    default_variables.setdefault('SHARED_LIB_SUFFIX', '.dylib')
-    default_variables.setdefault('SHARED_LIB_DIR',
-                                 generator_default_variables['PRODUCT_DIR'])
-    default_variables.setdefault('LIB_DIR',
-                                 generator_default_variables['PRODUCT_DIR'])
-
-    # Copy additional generator configuration data from Xcode, which is shared
-    # by the Mac Ninja generator.
-    import gyp.generator.xcode as xcode_generator
-    generator_additional_non_configuration_keys = getattr(xcode_generator,
-        'generator_additional_non_configuration_keys', [])
-    generator_additional_path_sections = getattr(xcode_generator,
-        'generator_additional_path_sections', [])
-    global generator_extra_sources_for_rules
-    generator_extra_sources_for_rules = getattr(xcode_generator,
-        'generator_extra_sources_for_rules', [])
-  elif flavor == 'win':
-    default_variables.setdefault('OS', 'win')
-    default_variables['EXECUTABLE_SUFFIX'] = '.exe'
-    default_variables['STATIC_LIB_PREFIX'] = ''
-    default_variables['STATIC_LIB_SUFFIX'] = '.lib'
-    default_variables['SHARED_LIB_PREFIX'] = ''
-    default_variables['SHARED_LIB_SUFFIX'] = '.dll'
-    generator_flags = params.get('generator_flags', {})
-
-    # Copy additional generator configuration data from VS, which is shared
-    # by the Windows Ninja generator.
-    import gyp.generator.msvs as msvs_generator
-    generator_additional_non_configuration_keys = getattr(msvs_generator,
-        'generator_additional_non_configuration_keys', [])
-    generator_additional_path_sections = getattr(msvs_generator,
-        'generator_additional_path_sections', [])
-
-    # Set a variable so conditions can be based on msvs_version.
-    msvs_version = gyp.msvs_emulation.GetVSVersion(generator_flags)
-    default_variables['MSVS_VERSION'] = msvs_version.ShortName()
-
-    # To determine processor word size on Windows, in addition to checking
-    # PROCESSOR_ARCHITECTURE (which reflects the word size of the current
-    # process), it is also necessary to check PROCESSOR_ARCHITEW6432 (which
-    # contains the actual word size of the system when running thru WOW64).
-    if ('64' in os.environ.get('PROCESSOR_ARCHITECTURE', '') or
-        '64' in os.environ.get('PROCESSOR_ARCHITEW6432', '')):
-      default_variables['MSVS_OS_BITS'] = 64
-    else:
-      default_variables['MSVS_OS_BITS'] = 32
-  else:
-    operating_system = flavor
-    if flavor == 'android':
-      operating_system = 'linux'  # Keep this legacy behavior for now.
-    default_variables.setdefault('OS', operating_system)
-    default_variables.setdefault('SHARED_LIB_SUFFIX', '.so')
-    default_variables.setdefault('SHARED_LIB_DIR',
-                                 os.path.join('$!PRODUCT_DIR', 'lib'))
-    default_variables.setdefault('LIB_DIR',
-                                 os.path.join('$!PRODUCT_DIR', 'obj'))
-
-
-def OpenOutput(path, mode='w'):
-  """Open |path| for writing, creating directories if necessary."""
-  try:
-    os.makedirs(os.path.dirname(path))
-  except OSError:
-    pass
-  return open(path, mode)
-
-
-def CommandWithWrapper(cmd, wrappers, prog):
-  wrapper = wrappers.get(cmd, '')
-  if wrapper:
-    return wrapper + ' ' + prog
-  return prog
-
-
-def GenerateOutputForConfig(target_list, target_dicts, data, params,
-                            config_name):
-  options = params['options']
-  flavor = gyp.common.GetFlavor(params)
-  generator_flags = params.get('generator_flags', {})
-
-  # generator_dir: relative path from pwd to where make puts build files.
-  # Makes migrating from make to ninja easier, ninja doesn't put anything here.
-  generator_dir = os.path.relpath(params['options'].generator_output or '.')
-
-  # output_dir: relative path from generator_dir to the build directory.
-  output_dir = generator_flags.get('output_dir', 'out')
-
-  # build_dir: relative path from source root to our output files.
-  # e.g. "out/Debug"
-  build_dir = os.path.normpath(os.path.join(generator_dir,
-                                            output_dir,
-                                            config_name))
-
-  toplevel_build = os.path.join(options.toplevel_dir, build_dir)
-
-  master_ninja = ninja_syntax.Writer(
-      OpenOutput(os.path.join(toplevel_build, 'build.ninja')),
-      width=120)
-  case_sensitive_filesystem = not os.path.exists(
-      os.path.join(toplevel_build, 'BUILD.NINJA'))
-
-  # Put build-time support tools in out/{config_name}.
-  gyp.common.CopyTool(flavor, toplevel_build)
-
-  # Grab make settings for CC/CXX.
-  # The rules are
-  # - The priority from low to high is gcc/g++, the 'make_global_settings' in
-  #   gyp, the environment variable.
-  # - If there is no 'make_global_settings' for CC.host/CXX.host or
-  #   'CC_host'/'CXX_host' enviroment variable, cc_host/cxx_host should be set
-  #   to cc/cxx.
-  if flavor == 'win':
-    cc = 'cl.exe'
-    cxx = 'cl.exe'
-    ld = 'link.exe'
-    gyp.msvs_emulation.GenerateEnvironmentFiles(
-        toplevel_build, generator_flags, OpenOutput)
-    ld_host = '$ld'
-  else:
-    cc = 'gcc'
-    cxx = 'g++'
-    ld = '$cxx'
-    ld_host = '$cxx_host'
-
-  cc_host = None
-  cxx_host = None
-  cc_host_global_setting = None
-  cxx_host_global_setting = None
-
-  build_file, _, _ = gyp.common.ParseQualifiedTarget(target_list[0])
-  make_global_settings = data[build_file].get('make_global_settings', [])
-  build_to_root = gyp.common.InvertRelativePath(build_dir,
-                                                options.toplevel_dir)
-  flock = 'flock'
-  if flavor == 'mac':
-    flock = './gyp-mac-tool flock'
-  wrappers = {}
-  if flavor != 'win':
-    wrappers['LINK'] = flock + ' linker.lock'
-  for key, value in make_global_settings:
-    if key == 'CC':
-      cc = os.path.join(build_to_root, value)
-    if key == 'CXX':
-      cxx = os.path.join(build_to_root, value)
-    if key == 'LD':
-      ld = os.path.join(build_to_root, value)
-    if key == 'CC.host':
-      cc_host = os.path.join(build_to_root, value)
-      cc_host_global_setting = value
-    if key == 'CXX.host':
-      cxx_host = os.path.join(build_to_root, value)
-      cxx_host_global_setting = value
-    if key == 'LD.host':
-      ld_host = os.path.join(build_to_root, value)
-    if key.endswith('_wrapper'):
-      wrappers[key[:-len('_wrapper')]] = os.path.join(build_to_root, value)
-
-  cc = GetEnvironFallback(['CC_target', 'CC'], cc)
-  master_ninja.variable('cc', CommandWithWrapper('CC', wrappers, cc))
-  cxx = GetEnvironFallback(['CXX_target', 'CXX'], cxx)
-  master_ninja.variable('cxx', CommandWithWrapper('CXX', wrappers, cxx))
-  ld = GetEnvironFallback(['LD_target', 'LD'], ld)
-
-  if not cc_host:
-    cc_host = cc
-  if not cxx_host:
-    cxx_host = cxx
-
-  if flavor == 'win':
-    master_ninja.variable('ld', ld)
-    master_ninja.variable('idl', 'midl.exe')
-    master_ninja.variable('ar', 'lib.exe')
-    master_ninja.variable('rc', 'rc.exe')
-    master_ninja.variable('asm', 'ml.exe')
-    master_ninja.variable('mt', 'mt.exe')
-    master_ninja.variable('use_dep_database', '1')
-  else:
-    master_ninja.variable('ld', CommandWithWrapper('LINK', wrappers, ld))
-    master_ninja.variable('ar', GetEnvironFallback(['AR_target', 'AR'], 'ar'))
-
-  master_ninja.variable('ar_host', GetEnvironFallback(['AR_host'], 'ar'))
-  cc_host = GetEnvironFallback(['CC_host'], cc_host)
-  cxx_host = GetEnvironFallback(['CXX_host'], cxx_host)
-  ld_host = GetEnvironFallback(['LD_host'], ld_host)
-
-  # The environment variable could be used in 'make_global_settings', like
-  # ['CC.host', '$(CC)'] or ['CXX.host', '$(CXX)'], transform them here.
-  if '$(CC)' in cc_host and cc_host_global_setting:
-    cc_host = cc_host_global_setting.replace('$(CC)', cc)
-  if '$(CXX)' in cxx_host and cxx_host_global_setting:
-    cxx_host = cxx_host_global_setting.replace('$(CXX)', cxx)
-  master_ninja.variable('cc_host',
-                        CommandWithWrapper('CC.host', wrappers, cc_host))
-  master_ninja.variable('cxx_host',
-                        CommandWithWrapper('CXX.host', wrappers, cxx_host))
-  if flavor == 'win':
-    master_ninja.variable('ld_host', ld_host)
-  else:
-    master_ninja.variable('ld_host', CommandWithWrapper(
-        'LINK', wrappers, ld_host))
-
-  master_ninja.newline()
-
-  if flavor != 'win':
-    master_ninja.rule(
-      'cc',
-      description='CC $out',
-      command=('$cc -MMD -MF $out.d $defines $includes $cflags $cflags_c '
-              '$cflags_pch_c -c $in -o $out'),
-      depfile='$out.d')
-    master_ninja.rule(
-      'cc_s',
-      description='CC $out',
-      command=('$cc $defines $includes $cflags $cflags_c '
-              '$cflags_pch_c -c $in -o $out'))
-    master_ninja.rule(
-      'cxx',
-      description='CXX $out',
-      command=('$cxx -MMD -MF $out.d $defines $includes $cflags $cflags_cc '
-              '$cflags_pch_cc -c $in -o $out'),
-      depfile='$out.d')
-  else:
-    cc_command = ('ninja -t msvc -o $out -e $arch '
-                  '-- '
-                  '$cc /nologo /showIncludes /FC '
-                  '@$out.rsp /c $in /Fo$out /Fd$pdbname ')
-    cxx_command = ('ninja -t msvc -o $out -e $arch '
-                   '-- '
-                   '$cxx /nologo /showIncludes /FC '
-                   '@$out.rsp /c $in /Fo$out /Fd$pdbname ')
-    master_ninja.rule(
-      'cc',
-      description='CC $out',
-      command=cc_command,
-      depfile='$out.d',
-      rspfile='$out.rsp',
-      rspfile_content='$defines $includes $cflags $cflags_c')
-    master_ninja.rule(
-      'cxx',
-      description='CXX $out',
-      command=cxx_command,
-      depfile='$out.d',
-      rspfile='$out.rsp',
-      rspfile_content='$defines $includes $cflags $cflags_cc')
-    master_ninja.rule(
-      'idl',
-      description='IDL $in',
-      command=('%s gyp-win-tool midl-wrapper $arch $outdir '
-               '$tlb $h $dlldata $iid $proxy $in '
-               '$idlflags' % sys.executable))
-    master_ninja.rule(
-      'rc',
-      description='RC $in',
-      # Note: $in must be last otherwise rc.exe complains.
-      command=('%s gyp-win-tool rc-wrapper '
-               '$arch $rc $defines $includes $rcflags /fo$out $in' %
-               sys.executable))
-    master_ninja.rule(
-      'asm',
-      description='ASM $in',
-      command=('%s gyp-win-tool asm-wrapper '
-               '$arch $asm $defines $includes /c /Fo $out $in' %
-               sys.executable))
-
-  if flavor != 'mac' and flavor != 'win':
-    master_ninja.rule(
-      'alink',
-      description='AR $out',
-      command='rm -f $out && $ar rcs $out $in')
-    master_ninja.rule(
-      'alink_thin',
-      description='AR $out',
-      command='rm -f $out && $ar rcsT $out $in')
-
-    # This allows targets that only need to depend on $lib's API to declare an
-    # order-only dependency on $lib.TOC and avoid relinking such downstream
-    # dependencies when $lib changes only in non-public ways.
-    # The resulting string leaves an uninterpolated %{suffix} which
-    # is used in the final substitution below.
-    mtime_preserving_solink_base = (
-        'if [ ! -e $lib -o ! -e ${lib}.TOC ]; then '
-        '%(solink)s && %(extract_toc)s > ${lib}.TOC; else '
-        '%(solink)s && %(extract_toc)s > ${lib}.tmp && '
-        'if ! cmp -s ${lib}.tmp ${lib}.TOC; then mv ${lib}.tmp ${lib}.TOC ; '
-        'fi; fi'
-        % { 'solink':
-              '$ld -shared $ldflags -o $lib -Wl,-soname=$soname %(suffix)s',
-            'extract_toc':
-              ('{ readelf -d ${lib} | grep SONAME ; '
-               'nm -gD -f p ${lib} | cut -f1-2 -d\' \'; }')})
-
-    master_ninja.rule(
-      'solink',
-      description='SOLINK $lib',
-      restat=True,
-      command=(mtime_preserving_solink_base % {
-          'suffix': '-Wl,--whole-archive $in $solibs -Wl,--no-whole-archive '
-          '$libs'}))
-    master_ninja.rule(
-      'solink_module',
-      description='SOLINK(module) $lib',
-      restat=True,
-      command=(mtime_preserving_solink_base % {
-          'suffix': '-Wl,--start-group $in $solibs -Wl,--end-group $libs'}))
-    master_ninja.rule(
-      'link',
-      description='LINK $out',
-      command=('$ld $ldflags -o $out '
-               '-Wl,--start-group $in $solibs -Wl,--end-group $libs'))
-  elif flavor == 'win':
-    master_ninja.rule(
-        'alink',
-        description='LIB $out',
-        command=('%s gyp-win-tool link-wrapper $arch '
-                 '$ar /nologo /ignore:4221 /OUT:$out @$out.rsp' %
-                 sys.executable),
-        rspfile='$out.rsp',
-        rspfile_content='$in_newline $libflags')
-    dlldesc = 'LINK(DLL) $dll'
-    dllcmd = ('%s gyp-win-tool link-wrapper $arch '
-              '$ld /nologo $implibflag /DLL /OUT:$dll '
-              '/PDB:$dll.pdb @$dll.rsp' % sys.executable)
-    dllcmd += (' && %s gyp-win-tool manifest-wrapper $arch '
-               'cmd /c if exist $dll.manifest del $dll.manifest' %
-               sys.executable)
-    dllcmd += (' && %s gyp-win-tool manifest-wrapper $arch '
-               '$mt -nologo -manifest $manifests -out:$dll.manifest' %
-               sys.executable)
-    master_ninja.rule('solink', description=dlldesc, command=dllcmd,
-                      rspfile='$dll.rsp',
-                      rspfile_content='$libs $in_newline $ldflags',
-                      restat=True)
-    master_ninja.rule('solink_module', description=dlldesc, command=dllcmd,
-                      rspfile='$dll.rsp',
-                      rspfile_content='$libs $in_newline $ldflags',
-                      restat=True)
-    # Note that ldflags goes at the end so that it has the option of
-    # overriding default settings earlier in the command line.
-    master_ninja.rule(
-        'link',
-        description='LINK $out',
-        command=('%s gyp-win-tool link-wrapper $arch '
-                 '$ld /nologo /OUT:$out /PDB:$out.pdb @$out.rsp && '
-                 '%s gyp-win-tool manifest-wrapper $arch '
-                 'cmd /c if exist $out.manifest del $out.manifest && '
-                 '%s gyp-win-tool manifest-wrapper $arch '
-                 '$mt -nologo -manifest $manifests -out:$out.manifest' %
-                 (sys.executable, sys.executable, sys.executable)),
-        rspfile='$out.rsp',
-        rspfile_content='$in_newline $libs $ldflags')
-  else:
-    master_ninja.rule(
-      'objc',
-      description='OBJC $out',
-      command=('$cc -MMD -MF $out.d $defines $includes $cflags $cflags_objc '
-               '$cflags_pch_objc -c $in -o $out'),
-      depfile='$out.d')
-    master_ninja.rule(
-      'objcxx',
-      description='OBJCXX $out',
-      command=('$cxx -MMD -MF $out.d $defines $includes $cflags $cflags_objcc '
-               '$cflags_pch_objcc -c $in -o $out'),
-      depfile='$out.d')
-    master_ninja.rule(
-      'alink',
-      description='LIBTOOL-STATIC $out, POSTBUILDS',
-      command='rm -f $out && '
-              './gyp-mac-tool filter-libtool libtool $libtool_flags '
-              '-static -o $out $in'
-              '$postbuilds')
-
-    # Record the public interface of $lib in $lib.TOC. See the corresponding
-    # comment in the posix section above for details.
-    mtime_preserving_solink_base = (
-        'if [ ! -e $lib -o ! -e ${lib}.TOC ] || '
-             # Always force dependent targets to relink if this library
-             # reexports something. Handling this correctly would require
-             # recursive TOC dumping but this is rare in practice, so punt.
-             'otool -l $lib | grep -q LC_REEXPORT_DYLIB ; then '
-          '%(solink)s && %(extract_toc)s > ${lib}.TOC; '
-        'else '
-          '%(solink)s && %(extract_toc)s > ${lib}.tmp && '
-          'if ! cmp -s ${lib}.tmp ${lib}.TOC; then '
-            'mv ${lib}.tmp ${lib}.TOC ; '
-          'fi; '
-        'fi'
-        % { 'solink': '$ld -shared $ldflags -o $lib %(suffix)s',
-            'extract_toc':
-              '{ otool -l $lib | grep LC_ID_DYLIB -A 5; '
-              'nm -gP $lib | cut -f1-2 -d\' \' | grep -v U$$; true; }'})
-
-    # TODO(thakis): The solink_module rule is likely wrong. Xcode seems to pass
-    # -bundle -single_module here (for osmesa.so).
-    master_ninja.rule(
-      'solink',
-      description='SOLINK $lib, POSTBUILDS',
-      restat=True,
-      command=(mtime_preserving_solink_base % {
-          'suffix': '$in $solibs $libs$postbuilds'}))
-    master_ninja.rule(
-      'solink_module',
-      description='SOLINK(module) $lib, POSTBUILDS',
-      restat=True,
-      command=(mtime_preserving_solink_base % {
-          'suffix': '$in $solibs $libs$postbuilds'}))
-
-    master_ninja.rule(
-      'link',
-      description='LINK $out, POSTBUILDS',
-      command=('$ld $ldflags -o $out '
-               '$in $solibs $libs$postbuilds'))
-    master_ninja.rule(
-      'infoplist',
-      description='INFOPLIST $out',
-      command=('$cc -E -P -Wno-trigraphs -x c $defines $in -o $out && '
-               'plutil -convert xml1 $out $out'))
-    master_ninja.rule(
-      'mac_tool',
-      description='MACTOOL $mactool_cmd $in',
-      command='$env ./gyp-mac-tool $mactool_cmd $in $out')
-    master_ninja.rule(
-      'package_framework',
-      description='PACKAGE FRAMEWORK $out, POSTBUILDS',
-      command='./gyp-mac-tool package-framework $out $version$postbuilds '
-              '&& touch $out')
-  if flavor == 'win':
-    master_ninja.rule(
-      'stamp',
-      description='STAMP $out',
-      command='%s gyp-win-tool stamp $out' % sys.executable)
-    master_ninja.rule(
-      'copy',
-      description='COPY $in $out',
-      command='%s gyp-win-tool recursive-mirror $in $out' % sys.executable)
-  else:
-    master_ninja.rule(
-      'stamp',
-      description='STAMP $out',
-      command='${postbuilds}touch $out')
-    master_ninja.rule(
-      'copy',
-      description='COPY $in $out',
-      command='rm -rf $out && cp -af $in $out')
-  master_ninja.newline()
-
-  all_targets = set()
-  for build_file in params['build_files']:
-    for target in gyp.common.AllTargets(target_list,
-                                        target_dicts,
-                                        os.path.normpath(build_file)):
-      all_targets.add(target)
-  all_outputs = set()
-
-  # target_outputs is a map from qualified target name to a Target object.
-  target_outputs = {}
-  # target_short_names is a map from target short name to a list of Target
-  # objects.
-  target_short_names = {}
-  for qualified_target in target_list:
-    # qualified_target is like: third_party/icu/icu.gyp:icui18n#target
-    build_file, name, toolset = \
-        gyp.common.ParseQualifiedTarget(qualified_target)
-
-    this_make_global_settings = data[build_file].get('make_global_settings', [])
-    assert make_global_settings == this_make_global_settings, (
-        "make_global_settings needs to be the same for all targets.")
-
-    spec = target_dicts[qualified_target]
-    if flavor == 'mac':
-      gyp.xcode_emulation.MergeGlobalXcodeSettingsToSpec(data[build_file], spec)
-
-    build_file = gyp.common.RelativePath(build_file, options.toplevel_dir)
-
-    base_path = os.path.dirname(build_file)
-    obj = 'obj'
-    if toolset != 'target':
-      obj += '.' + toolset
-    output_file = os.path.join(obj, base_path, name + '.ninja')
-
-    abs_build_dir = os.path.abspath(toplevel_build)
-    writer = NinjaWriter(qualified_target, target_outputs, base_path, build_dir,
-                         OpenOutput(os.path.join(toplevel_build, output_file)),
-                         flavor, toplevel_dir=options.toplevel_dir)
-    master_ninja.subninja(output_file)
-
-    target = writer.WriteSpec(
-        spec, config_name, generator_flags, case_sensitive_filesystem)
-    if target:
-      if name != target.FinalOutput() and spec['toolset'] == 'target':
-        target_short_names.setdefault(name, []).append(target)
-      target_outputs[qualified_target] = target
-      if qualified_target in all_targets:
-        all_outputs.add(target.FinalOutput())
-
-  if target_short_names:
-    # Write a short name to build this target.  This benefits both the
-    # "build chrome" case as well as the gyp tests, which expect to be
-    # able to run actions and build libraries by their short name.
-    master_ninja.newline()
-    master_ninja.comment('Short names for targets.')
-    for short_name in target_short_names:
-      master_ninja.build(short_name, 'phony', [x.FinalOutput() for x in
-                                               target_short_names[short_name]])
-
-  if all_outputs:
-    master_ninja.newline()
-    master_ninja.build('all', 'phony', list(all_outputs))
-    master_ninja.default(generator_flags.get('default_target', 'all'))
-
-
-def PerformBuild(data, configurations, params):
-  options = params['options']
-  for config in configurations:
-    builddir = os.path.join(options.toplevel_dir, 'out', config)
-    arguments = ['ninja', '-C', builddir]
-    print 'Building [%s]: %s' % (config, arguments)
-    subprocess.check_call(arguments)
-
-
-def CallGenerateOutputForConfig(arglist):
-  # Ignore the interrupt signal so that the parent process catches it and
-  # kills all multiprocessing children.
-  signal.signal(signal.SIGINT, signal.SIG_IGN)
-
-  (target_list, target_dicts, data, params, config_name) = arglist
-  GenerateOutputForConfig(target_list, target_dicts, data, params, config_name)
-
-
-def GenerateOutput(target_list, target_dicts, data, params):
-  user_config = params.get('generator_flags', {}).get('config', None)
-  if gyp.common.GetFlavor(params) == 'win':
-    target_list, target_dicts = MSVSUtil.ShardTargets(target_list, target_dicts)
-    target_list, target_dicts = MSVSUtil.InsertLargePdbShims(
-        target_list, target_dicts, generator_default_variables)
-
-  if user_config:
-    GenerateOutputForConfig(target_list, target_dicts, data, params,
-                            user_config)
-  else:
-    config_names = target_dicts[target_list[0]]['configurations'].keys()
-    if params['parallel']:
-      try:
-        pool = multiprocessing.Pool(len(config_names))
-        arglists = []
-        for config_name in config_names:
-          arglists.append(
-              (target_list, target_dicts, data, params, config_name))
-        pool.map(CallGenerateOutputForConfig, arglists)
-      except KeyboardInterrupt, e:
-        pool.terminate()
-        raise e
-    else:
-      for config_name in config_names:
-        GenerateOutputForConfig(target_list, target_dicts, data, params,
-                                config_name)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/ninja_test.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,44 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-""" Unit tests for the ninja.py file. """
-
-import gyp.generator.ninja as ninja
-import unittest
-import StringIO
-import sys
-import TestCommon
-
-
-class TestPrefixesAndSuffixes(unittest.TestCase):
-  if sys.platform in ('win32', 'cygwin'):
-    def test_BinaryNamesWindows(self):
-      writer = ninja.NinjaWriter('foo', 'wee', '.', '.', 'ninja.build', 'win')
-      spec = { 'target_name': 'wee' }
-      self.assertTrue(writer.ComputeOutputFileName(spec, 'executable').
-          endswith('.exe'))
-      self.assertTrue(writer.ComputeOutputFileName(spec, 'shared_library').
-          endswith('.dll'))
-      self.assertTrue(writer.ComputeOutputFileName(spec, 'static_library').
-          endswith('.lib'))
-
-  if sys.platform == 'linux2':
-    def test_BinaryNamesLinux(self):
-      writer = ninja.NinjaWriter('foo', 'wee', '.', '.', 'ninja.build', 'linux')
-      spec = { 'target_name': 'wee' }
-      self.assertTrue('.' not in writer.ComputeOutputFileName(spec,
-                                                              'executable'))
-      self.assertTrue(writer.ComputeOutputFileName(spec, 'shared_library').
-          startswith('lib'))
-      self.assertTrue(writer.ComputeOutputFileName(spec, 'static_library').
-          startswith('lib'))
-      self.assertTrue(writer.ComputeOutputFileName(spec, 'shared_library').
-          endswith('.so'))
-      self.assertTrue(writer.ComputeOutputFileName(spec, 'static_library').
-          endswith('.a'))
-
-if __name__ == '__main__':
-  unittest.main()
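The deleted tests above pin down the platform-specific naming of build outputs. The convention they assert can be sketched as a small lookup table (a hypothetical standalone helper, not gyp's actual `ComputeOutputFileName`):

```python
# Prefix/suffix table matching what the deleted tests assert:
# Windows outputs get .exe/.dll/.lib suffixes; Linux shared and
# static libraries get a 'lib' prefix and .so/.a suffixes, and
# executables get no extension at all.
_NAMING = {
    'win': {
        'executable': ('', '.exe'),
        'shared_library': ('', '.dll'),
        'static_library': ('', '.lib'),
    },
    'linux': {
        'executable': ('', ''),
        'shared_library': ('lib', '.so'),
        'static_library': ('lib', '.a'),
    },
}


def output_file_name(flavor, target_type, target_name):
    prefix, suffix = _NAMING[flavor][target_type]
    return prefix + target_name + suffix
```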
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/scons.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1072 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import gyp
-import gyp.common
-import gyp.SCons as SCons
-import os.path
-import pprint
-import re
-import subprocess
-
-
-# TODO:  remove when we delete the last WriteList() call in this module
-WriteList = SCons.WriteList
-
-
-generator_default_variables = {
-    'EXECUTABLE_PREFIX': '',
-    'EXECUTABLE_SUFFIX': '',
-    'STATIC_LIB_PREFIX': '${LIBPREFIX}',
-    'SHARED_LIB_PREFIX': '${SHLIBPREFIX}',
-    'STATIC_LIB_SUFFIX': '${LIBSUFFIX}',
-    'SHARED_LIB_SUFFIX': '${SHLIBSUFFIX}',
-    'INTERMEDIATE_DIR': '${INTERMEDIATE_DIR}',
-    'SHARED_INTERMEDIATE_DIR': '${SHARED_INTERMEDIATE_DIR}',
-    'OS': 'linux',
-    'PRODUCT_DIR': '$TOP_BUILDDIR',
-    'SHARED_LIB_DIR': '$LIB_DIR',
-    'LIB_DIR': '$LIB_DIR',
-    'RULE_INPUT_ROOT': '${SOURCE.filebase}',
-    'RULE_INPUT_DIRNAME': '${SOURCE.dir}',
-    'RULE_INPUT_EXT': '${SOURCE.suffix}',
-    'RULE_INPUT_NAME': '${SOURCE.file}',
-    'RULE_INPUT_PATH': '${SOURCE.abspath}',
-    'CONFIGURATION_NAME': '${CONFIG_NAME}',
-}
-
-# Tell GYP how to process the input for us.
-generator_handles_variants = True
-generator_wants_absolute_build_file_paths = True
-
-
-def FixPath(path, prefix):
-  if not os.path.isabs(path) and not path[0] == '$':
-    path = prefix + path
-  return path
-
-
-header = """\
-# This file is generated; do not edit.
-"""
-
-
-_alias_template = """
-if GetOption('verbose'):
-  _action = Action([%(action)s])
-else:
-  _action = Action([%(action)s], %(message)s)
-_outputs = env.Alias(
-  ['_%(target_name)s_action'],
-  %(inputs)s,
-  _action
-)
-env.AlwaysBuild(_outputs)
-"""
-
-_run_as_template = """
-if GetOption('verbose'):
-  _action = Action([%(action)s])
-else:
-  _action = Action([%(action)s], %(message)s)
-"""
-
-_run_as_template_suffix = """
-_run_as_target = env.Alias('run_%(target_name)s', target_files, _action)
-env.Requires(_run_as_target, [
-    Alias('%(target_name)s'),
-])
-env.AlwaysBuild(_run_as_target)
-"""
-
-_command_template = """
-if GetOption('verbose'):
-  _action = Action([%(action)s])
-else:
-  _action = Action([%(action)s], %(message)s)
-_outputs = env.Command(
-  %(outputs)s,
-  %(inputs)s,
-  _action
-)
-"""
-
-# This is copied from the default SCons action, updated to handle symlinks.
-_copy_action_template = """
-import shutil
-import SCons.Action
-
-def _copy_files_or_dirs_or_symlinks(dest, src):
-  SCons.Node.FS.invalidate_node_memos(dest)
-  if SCons.Util.is_List(src) and os.path.isdir(dest):
-    for file in src:
-      shutil.copy2(file, dest)
-    return 0
-  elif os.path.islink(src):
-    linkto = os.readlink(src)
-    os.symlink(linkto, dest)
-    return 0
-  elif os.path.isfile(src):
-    return shutil.copy2(src, dest)
-  else:
-    return shutil.copytree(src, dest, 1)
-
-def _copy_files_or_dirs_or_symlinks_str(dest, src):
-  return 'Copying %s to %s ...' % (src, dest)
-
-GYPCopy = SCons.Action.ActionFactory(_copy_files_or_dirs_or_symlinks,
-                                     _copy_files_or_dirs_or_symlinks_str,
-                                     convert=str)
-"""
-
-_rule_template = """
-%(name)s_additional_inputs = %(inputs)s
-%(name)s_outputs = %(outputs)s
-def %(name)s_emitter(target, source, env):
-  return (%(name)s_outputs, source + %(name)s_additional_inputs)
-if GetOption('verbose'):
-  %(name)s_action = Action([%(action)s])
-else:
-  %(name)s_action = Action([%(action)s], %(message)s)
-env['BUILDERS']['%(name)s'] = Builder(action=%(name)s_action,
-                                      emitter=%(name)s_emitter)
-
-_outputs = []
-_processed_input_files = []
-for infile in input_files:
-  if (type(infile) == type('')
-      and not os.path.isabs(infile)
-      and not infile[0] == '$'):
-    infile = %(src_dir)r + infile
-  if str(infile).endswith('.%(extension)s'):
-    _generated = env.%(name)s(infile)
-    env.Precious(_generated)
-    _outputs.append(_generated)
-    %(process_outputs_as_sources_line)s
-  else:
-    _processed_input_files.append(infile)
-prerequisites.extend(_outputs)
-input_files = _processed_input_files
-"""
-
-_spawn_hack = """
-import re
-import SCons.Platform.posix
-needs_shell = re.compile('["\\'><!^&]')
-def gyp_spawn(sh, escape, cmd, args, env):
-  def strip_scons_quotes(arg):
-    if arg[0] == '"' and arg[-1] == '"':
-      return arg[1:-1]
-    return arg
-  stripped_args = [strip_scons_quotes(a) for a in args]
-  if needs_shell.search(' '.join(stripped_args)):
-    return SCons.Platform.posix.exec_spawnvpe([sh, '-c', ' '.join(args)], env)
-  else:
-    return SCons.Platform.posix.exec_spawnvpe(stripped_args, env)
-"""
-
-
-def EscapeShellArgument(s):
-  """Quotes an argument so that it will be interpreted literally by a POSIX
-     shell. Taken from
-     http://stackoverflow.com/questions/35817/whats-the-best-way-to-escape-ossystem-calls-in-python
-     """
-  return "'" + s.replace("'", "'\\''") + "'"
-
-
-def InvertNaiveSConsQuoting(s):
-  """SCons tries to "help" with quoting by naively putting double-quotes around
-     command-line arguments containing space or tab, which is broken for all
-     but trivial cases, so we undo it. (See quote_spaces() in Subst.py)"""
-  if ' ' in s or '\t' in s:
-    # Then SCons will put double-quotes around this, so add our own quotes
-    # to close its quotes at the beginning and end.
-    s = '"' + s + '"'
-  return s
-
-
-def EscapeSConsVariableExpansion(s):
-  """SCons has its own variable expansion syntax using $. We must escape it for
-    strings to be interpreted literally. For some reason this requires four
-    dollar signs, not two, even without the shell involved."""
-  return s.replace('$', '$$$$')
-
-
-def EscapeCppDefine(s):
-  """Escapes a CPP define so that it will reach the compiler unaltered."""
-  s = EscapeShellArgument(s)
-  s = InvertNaiveSConsQuoting(s)
-  s = EscapeSConsVariableExpansion(s)
-  return s
-
-
-def GenerateConfig(fp, config, indent='', src_dir=''):
-  """
-  Generates SCons dictionary items for a gyp configuration.
-
-  This provides the main translation between the (lower-case) gyp settings
-  keywords and the (upper-case) SCons construction variables.
-  """
-  var_mapping = {
-      'ASFLAGS' : 'asflags',
-      'CCFLAGS' : 'cflags',
-      'CFLAGS' : 'cflags_c',
-      'CXXFLAGS' : 'cflags_cc',
-      'CPPDEFINES' : 'defines',
-      'CPPPATH' : 'include_dirs',
-      # Add the ldflags value to $LINKFLAGS, but not $SHLINKFLAGS.
-      # SCons defines $SHLINKFLAGS to incorporate $LINKFLAGS, so
-      # listing both here would cause 'ldflags' to be appended to
-      # both, and then have it show up twice on the command line.
-      'LINKFLAGS' : 'ldflags',
-  }
-  postamble='\n%s],\n' % indent
-  for scons_var in sorted(var_mapping.keys()):
-      gyp_var = var_mapping[scons_var]
-      value = config.get(gyp_var)
-      if value:
-        if gyp_var in ('defines',):
-          value = [EscapeCppDefine(v) for v in value]
-        if gyp_var in ('include_dirs',):
-          if src_dir and not src_dir.endswith('/'):
-            src_dir += '/'
-          result = []
-          for v in value:
-            v = FixPath(v, src_dir)
-            # Force SCons to evaluate the CPPPATH directories at
-            # SConscript-read time, so delayed evaluation of $SRC_DIR
-            # doesn't point it to the --generator-output= directory.
-            result.append('env.Dir(%r)' % v)
-          value = result
-        else:
-          value = map(repr, value)
-        WriteList(fp,
-                  value,
-                  prefix=indent,
-                  preamble='%s%s = [\n    ' % (indent, scons_var),
-                  postamble=postamble)
-
-
-def GenerateSConscript(output_filename, spec, build_file, build_file_data):
-  """
-  Generates a SConscript file for a specific target.
-
-  This generates a SConscript file suitable for building any or all of
-  the target's configurations.
-
-  A SConscript file may be called multiple times to generate targets for
-  multiple configurations.  Consequently, it needs to be ready to build
-  the target for any requested configuration, and therefore contains
-  information about the settings for all configurations (generated into
-  the SConscript file at gyp configuration time) as well as logic for
-  selecting (at SCons build time) the specific configuration being built.
-
-  The general outline of a generated SConscript file is:
-
-    --  Header
-
-    --  Import 'env'.  This contains a $CONFIG_NAME construction
-        variable that specifies what configuration to build
-        (e.g. Debug, Release).
-
-    --  Configurations.  This is a dictionary with settings for
-        the different configurations (Debug, Release) under which this
-        target can be built.  The values in the dictionary are themselves
-        dictionaries specifying what construction variables should be added
-        to the local copy of the imported construction environment
-        (Append), should be removed (FilterOut), and should outright
-        replace the imported values (Replace).
-
-    --  Clone the imported construction environment and update
-        with the proper configuration settings.
-
-    --  Initialize the lists of the targets' input files and prerequisites.
-
-    --  Target-specific actions and rules.  These come after the
-        input file and prerequisite initializations because the
-        outputs of the actions and rules may affect the input file
-        list (process_outputs_as_sources) and get added to the list of
-        prerequisites (so that they're guaranteed to be executed before
-        building the target).
-
-    --  Call the Builder for the target itself.
-
-    --  Arrange for any copies to be made into installation directories.
-
-    --  Set up the {name} Alias (phony Node) for the target as the
-        primary handle for building all of the target's pieces.
-
-    --  Use env.Requires() to make sure the prerequisites (explicitly
-        specified, but also including the actions and rules) are built
-        before the target itself.
-
-    --  Return the {name} Alias to the calling SConstruct file
-        so it can be added to the list of default targets.
-  """
-  scons_target = SCons.Target(spec)
-
-  gyp_dir = os.path.dirname(output_filename)
-  if not gyp_dir:
-      gyp_dir = '.'
-  gyp_dir = os.path.abspath(gyp_dir)
-
-  output_dir = os.path.dirname(output_filename)
-  src_dir = build_file_data['_DEPTH']
-  src_dir_rel = gyp.common.RelativePath(src_dir, output_dir)
-  subdir = gyp.common.RelativePath(os.path.dirname(build_file), src_dir)
-  src_subdir = '$SRC_DIR/' + subdir
-  src_subdir_ = src_subdir + '/'
-
-  component_name = os.path.splitext(os.path.basename(build_file))[0]
-  target_name = spec['target_name']
-
-  if not os.path.exists(gyp_dir):
-    os.makedirs(gyp_dir)
-  fp = open(output_filename, 'w')
-  fp.write(header)
-
-  fp.write('\nimport os\n')
-  fp.write('\nImport("env")\n')
-
-  #
-  fp.write('\n')
-  fp.write('env = env.Clone(COMPONENT_NAME=%s,\n' % repr(component_name))
-  fp.write('                TARGET_NAME=%s)\n' % repr(target_name))
-
-  #
-  for config in spec['configurations'].itervalues():
-    if config.get('scons_line_length'):
-      fp.write(_spawn_hack)
-      break
-
-  #
-  indent = ' ' * 12
-  fp.write('\n')
-  fp.write('configurations = {\n')
-  for config_name, config in spec['configurations'].iteritems():
-    fp.write('    \'%s\' : {\n' % config_name)
-
-    fp.write('        \'Append\' : dict(\n')
-    GenerateConfig(fp, config, indent, src_subdir)
-    libraries = spec.get('libraries')
-    if libraries:
-      WriteList(fp,
-                map(repr, libraries),
-                prefix=indent,
-                preamble='%sLIBS = [\n    ' % indent,
-                postamble='\n%s],\n' % indent)
-    fp.write('        ),\n')
-
-    fp.write('        \'FilterOut\' : dict(\n' )
-    for key, var in config.get('scons_remove', {}).iteritems():
-      fp.write('             %s = %s,\n' % (key, repr(var)))
-    fp.write('        ),\n')
-
-    fp.write('        \'Replace\' : dict(\n' )
-    scons_settings = config.get('scons_variable_settings', {})
-    for key in sorted(scons_settings.keys()):
-      val = pprint.pformat(scons_settings[key])
-      fp.write('             %s = %s,\n' % (key, val))
-    if 'c++' in spec.get('link_languages', []):
-      fp.write('             %s = %s,\n' % ('LINK', repr('$CXX')))
-    if config.get('scons_line_length'):
-      fp.write('             SPAWN = gyp_spawn,\n')
-    fp.write('        ),\n')
-
-    fp.write('        \'ImportExternal\' : [\n' )
-    for var in config.get('scons_import_variables', []):
-      fp.write('             %s,\n' % repr(var))
-    fp.write('        ],\n')
-
-    fp.write('        \'PropagateExternal\' : [\n' )
-    for var in config.get('scons_propagate_variables', []):
-      fp.write('             %s,\n' % repr(var))
-    fp.write('        ],\n')
-
-    fp.write('    },\n')
-  fp.write('}\n')
-
-  fp.write('\n'
-           'config = configurations[env[\'CONFIG_NAME\']]\n'
-           'env.Append(**config[\'Append\'])\n'
-           'env.FilterOut(**config[\'FilterOut\'])\n'
-           'env.Replace(**config[\'Replace\'])\n')
-
-  fp.write('\n'
-           '# Scons forces -fPIC for SHCCFLAGS on some platforms.\n'
-           '# Disable that so we can control it from cflags in gyp.\n'
-           '# Note that Scons itself is inconsistent with its -fPIC\n'
-           '# setting. SHCCFLAGS forces -fPIC, and SHCFLAGS does not.\n'
-           '# This will make SHCCFLAGS consistent with SHCFLAGS.\n'
-           'env[\'SHCCFLAGS\'] = [\'$CCFLAGS\']\n')
-
-  fp.write('\n'
-           'for _var in config[\'ImportExternal\']:\n'
-           '  if _var in ARGUMENTS:\n'
-           '    env[_var] = ARGUMENTS[_var]\n'
-           '  elif _var in os.environ:\n'
-           '    env[_var] = os.environ[_var]\n'
-           'for _var in config[\'PropagateExternal\']:\n'
-           '  if _var in ARGUMENTS:\n'
-           '    env[_var] = ARGUMENTS[_var]\n'
-           '  elif _var in os.environ:\n'
-           '    env[\'ENV\'][_var] = os.environ[_var]\n')
-
-  fp.write('\n'
-           "env['ENV']['LD_LIBRARY_PATH'] = env.subst('$LIB_DIR')\n")
-
-  #
-  #fp.write("\nif env.has_key('CPPPATH'):\n")
-  #fp.write("  env['CPPPATH'] = map(env.Dir, env['CPPPATH'])\n")
-
-  variants = spec.get('variants', {})
-  for setting in sorted(variants.keys()):
-    if_fmt = 'if ARGUMENTS.get(%s) not in (None, \'0\'):\n'
-    fp.write('\n')
-    fp.write(if_fmt % repr(setting.upper()))
-    fp.write('  env.AppendUnique(\n')
-    GenerateConfig(fp, variants[setting], indent, src_subdir)
-    fp.write('  )\n')
-
-  #
-  scons_target.write_input_files(fp)
-
-  fp.write('\n')
-  fp.write('target_files = []\n')
-  prerequisites = spec.get('scons_prerequisites', [])
-  fp.write('prerequisites = %s\n' % pprint.pformat(prerequisites))
-
-  actions = spec.get('actions', [])
-  for action in actions:
-    a = ['cd', src_subdir, '&&'] + action['action']
-    message = action.get('message')
-    if message:
-      message = repr(message)
-    inputs = [FixPath(f, src_subdir_) for f in action.get('inputs', [])]
-    outputs = [FixPath(f, src_subdir_) for f in action.get('outputs', [])]
-    if outputs:
-      template = _command_template
-    else:
-      template = _alias_template
-    fp.write(template % {
-                 'inputs' : pprint.pformat(inputs),
-                 'outputs' : pprint.pformat(outputs),
-                 'action' : pprint.pformat(a),
-                 'message' : message,
-                 'target_name': target_name,
-             })
-    if int(action.get('process_outputs_as_sources', 0)):
-      fp.write('input_files.extend(_outputs)\n')
-    fp.write('prerequisites.extend(_outputs)\n')
-    fp.write('target_files.extend(_outputs)\n')
-
-  rules = spec.get('rules', [])
-  for rule in rules:
-    name = re.sub('[^a-zA-Z0-9_]', '_', rule['rule_name'])
-    message = rule.get('message')
-    if message:
-        message = repr(message)
-    if int(rule.get('process_outputs_as_sources', 0)):
-      poas_line = '_processed_input_files.extend(_generated)'
-    else:
-      poas_line = '_processed_input_files.append(infile)'
-    inputs = [FixPath(f, src_subdir_) for f in rule.get('inputs', [])]
-    outputs = [FixPath(f, src_subdir_) for f in rule.get('outputs', [])]
-    # Skip a rule with no action and no inputs.
-    if 'action' not in rule and not rule.get('rule_sources', []):
-      continue
-    a = ['cd', src_subdir, '&&'] + rule['action']
-    fp.write(_rule_template % {
-                 'inputs' : pprint.pformat(inputs),
-                 'outputs' : pprint.pformat(outputs),
-                 'action' : pprint.pformat(a),
-                 'extension' : rule['extension'],
-                 'name' : name,
-                 'message' : message,
-                 'process_outputs_as_sources_line' : poas_line,
-                 'src_dir' : src_subdir_,
-             })
-
-  scons_target.write_target(fp, src_subdir)
-
-  copies = spec.get('copies', [])
-  if copies:
-    fp.write(_copy_action_template)
-  for copy in copies:
-    destdir = None
-    files = None
-    try:
-      destdir = copy['destination']
-    except KeyError, e:
-      gyp.common.ExceptionAppend(
-        e,
-        "Required 'destination' key missing for 'copies' in %s." % build_file)
-      raise
-    try:
-      files = copy['files']
-    except KeyError, e:
-      gyp.common.ExceptionAppend(
-        e, "Required 'files' key missing for 'copies' in %s." % build_file)
-      raise
-    if not files:
-      # TODO:  should probably add a (suppressible) warning;
-      # a null file list may be unintentional.
-      continue
-    if not destdir:
-      raise Exception(
-        "Required 'destination' key is empty for 'copies' in %s." % build_file)
-
-    fmt = ('\n'
-           '_outputs = env.Command(%s,\n'
-           '    %s,\n'
-           '    GYPCopy(\'$TARGET\', \'$SOURCE\'))\n')
-    for f in copy['files']:
-      # Remove trailing separators so basename() acts like Unix basename and
-      # always returns the last element, whether a file or dir. Without this,
-      # only the contents, not the directory itself, are copied (and nothing
-      # might be copied if dest already exists, since scons thinks nothing needs
-      # to be done).
-      dest = os.path.join(destdir, os.path.basename(f.rstrip(os.sep)))
-      f = FixPath(f, src_subdir_)
-      dest = FixPath(dest, src_subdir_)
-      fp.write(fmt % (repr(dest), repr(f)))
-      fp.write('target_files.extend(_outputs)\n')
-
-  run_as = spec.get('run_as')
-  if run_as:
-    action = run_as.get('action', [])
-    working_directory = run_as.get('working_directory')
-    if not working_directory:
-      working_directory = gyp_dir
-    else:
-      if not os.path.isabs(working_directory):
-        working_directory = os.path.normpath(os.path.join(gyp_dir,
-                                                          working_directory))
-    if run_as.get('environment'):
-      for (key, val) in run_as.get('environment').iteritems():
-        action = ['%s="%s"' % (key, val)] + action
-    action = ['cd', '"%s"' % working_directory, '&&'] + action
-    fp.write(_run_as_template % {
-      'action' : pprint.pformat(action),
-      'message' : run_as.get('message', ''),
-    })
-
-  fmt = "\ngyp_target = env.Alias('%s', target_files)\n"
-  fp.write(fmt % target_name)
-
-  dependencies = spec.get('scons_dependencies', [])
-  if dependencies:
-    WriteList(fp, dependencies, preamble='dependencies = [\n    ',
-                                postamble='\n]\n')
-    fp.write('env.Requires(target_files, dependencies)\n')
-    fp.write('env.Requires(gyp_target, dependencies)\n')
-    fp.write('for prerequisite in prerequisites:\n')
-    fp.write('  env.Requires(prerequisite, dependencies)\n')
-  fp.write('env.Requires(gyp_target, prerequisites)\n')
-
-  if run_as:
-    fp.write(_run_as_template_suffix % {
-      'target_name': target_name,
-    })
-
-  fp.write('Return("gyp_target")\n')
-
-  fp.close()
-
-
-#############################################################################
-# TEMPLATE BEGIN
-
-_wrapper_template = """\
-
-__doc__ = '''
-Wrapper configuration for building this entire "solution,"
-including all the specific targets in various *.scons files.
-'''
-
-import os
-import sys
-
-import SCons.Environment
-import SCons.Util
-
-def GetProcessorCount():
-  '''
-  Detects the number of CPUs on the system. Adapted from:
-  http://codeliberates.blogspot.com/2008/05/detecting-cpuscores-in-python.html
-  '''
-  # Linux, Unix and Mac OS X:
-  if hasattr(os, 'sysconf'):
-    if os.sysconf_names.has_key('SC_NPROCESSORS_ONLN'):
-      # Linux and Unix or Mac OS X with python >= 2.5:
-      return os.sysconf('SC_NPROCESSORS_ONLN')
-    else:  # Mac OS X with Python < 2.5:
-      return int(os.popen2("sysctl -n hw.ncpu")[1].read())
-  # Windows:
-  if os.environ.has_key('NUMBER_OF_PROCESSORS'):
-    return max(int(os.environ.get('NUMBER_OF_PROCESSORS', '1')), 1)
-  return 1  # Default
-
-# Support PROGRESS= to show progress in different ways.
-p = ARGUMENTS.get('PROGRESS')
-if p == 'spinner':
-  Progress(['/\\r', '|\\r', '\\\\\\r', '-\\r'],
-           interval=5,
-           file=open('/dev/tty', 'w'))
-elif p == 'name':
-  Progress('$TARGET\\r', overwrite=True, file=open('/dev/tty', 'w'))
-
-# Set the default -j value based on the number of processors.
-SetOption('num_jobs', GetProcessorCount() + 1)
-
-# Have SCons use its cached dependency information.
-SetOption('implicit_cache', 1)
-
-# Only re-calculate MD5 checksums if a timestamp has changed.
-Decider('MD5-timestamp')
-
-# Since we set the -j value by default, suppress SCons warnings about being
-# unable to support parallel build on versions of Python with no threading.
-default_warnings = ['no-no-parallel-support']
-SetOption('warn', default_warnings + GetOption('warn'))
-
-AddOption('--mode', nargs=1, dest='conf_list', default=[],
-          action='append', help='Configuration to build.')
-
-AddOption('--verbose', dest='verbose', default=False,
-          action='store_true', help='Verbose command-line output.')
-
-
-#
-sconscript_file_map = %(sconscript_files)s
-
-class LoadTarget:
-  '''
-  Class for deciding if a given target sconscript is to be included
-  based on a list of included target names, optionally prefixed with '-'
-  to exclude a target name.
-  '''
-  def __init__(self, load):
-    '''
-    Initialize a class with a list of names for possible loading.
-
-    Arguments:
-      load:  list of elements in the LOAD= specification
-    '''
-    self.included = set([c for c in load if not c.startswith('-')])
-    self.excluded = set([c[1:] for c in load if c.startswith('-')])
-
-    if not self.included:
-      self.included = set(['all'])
-
-  def __call__(self, target):
-    '''
-    Returns True if the specified target's sconscript file should be
-    loaded, based on the initialized included and excluded lists.
-    '''
-    return (target in self.included or
-            ('all' in self.included and not target in self.excluded))
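The include/exclude semantics of `LoadTarget` (names listed in `LOAD=` are included, a leading `-` excludes, and an empty include list means "all") can be exercised standalone; this is a Python 3 re-sketch of the class above:

```python
class LoadTarget:
    """Decide whether a target's sconscript should be loaded, given a
    LOAD= list where a '-' prefix excludes a name and an empty include
    list implicitly means 'all'."""

    def __init__(self, load):
        self.included = {c for c in load if not c.startswith('-')}
        self.excluded = {c[1:] for c in load if c.startswith('-')}
        if not self.included:
            self.included = {'all'}

    def __call__(self, target):
        return (target in self.included or
                ('all' in self.included and target not in self.excluded))


# LOAD=-tests loads everything except 'tests'.
lt = LoadTarget(['-tests'])
assert lt('core') and not lt('tests')

# LOAD=core,docs loads only the named targets.
lt2 = LoadTarget(['core', 'docs'])
assert lt2('core') and not lt2('misc')
```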
-
-if 'LOAD' in ARGUMENTS:
-  load = ARGUMENTS['LOAD'].split(',')
-else:
-  load = []
-load_target = LoadTarget(load)
-
-sconscript_files = []
-for target, sconscript in sconscript_file_map.iteritems():
-  if load_target(target):
-    sconscript_files.append(sconscript)
-
-
-target_alias_list = []
-
-conf_list = GetOption('conf_list')
-if conf_list:
-    # In case the same --mode= value was specified multiple times.
-    conf_list = list(set(conf_list))
-else:
-    conf_list = [%(default_configuration)r]
-
-sconsbuild_dir = Dir(%(sconsbuild_dir)s)
-
-
-def FilterOut(self, **kw):
-  kw = SCons.Environment.copy_non_reserved_keywords(kw)
-  for key, val in kw.items():
-    envval = self.get(key, None)
-    if envval is None:
-      # No existing variable in the environment, so nothing to delete.
-      continue
-
-    for vremove in val:
-      # Use 'while' rather than 'if' so we can handle duplicates.
-      while vremove in envval:
-        envval.remove(vremove)
-
-    self[key] = envval
-
-    # TODO(sgk): SCons.Environment.Append() has much more logic to deal
-    # with various types of values.  We should handle all those cases in here
-    # too.  (If variable is a dict, etc.)
-
-
-non_compilable_suffixes = {
-    'LINUX' : set([
-        '.bdic',
-        '.css',
-        '.dat',
-        '.fragment',
-        '.gperf',
-        '.h',
-        '.hh',
-        '.hpp',
-        '.html',
-        '.hxx',
-        '.idl',
-        '.in',
-        '.in0',
-        '.in1',
-        '.js',
-        '.mk',
-        '.rc',
-        '.sigs',
-        '',
-    ]),
-    'WINDOWS' : set([
-        '.h',
-        '.hh',
-        '.hpp',
-        '.dat',
-        '.idl',
-        '.in',
-        '.in0',
-        '.in1',
-    ]),
-}
-
-def compilable(env, file):
-  base, ext = os.path.splitext(str(file))
-  if ext in non_compilable_suffixes[env['TARGET_PLATFORM']]:
-    return False
-  return True
-
-def compilable_files(env, sources):
-  return [x for x in sources if compilable(env, x)]
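The suffix filter above can be tried in isolation; this minimal sketch stubs the SCons environment with a plain dict and uses a trimmed-down suffix set (the full per-platform sets are in `non_compilable_suffixes` above):

```python
import os

# Trimmed-down stand-in for the non_compilable_suffixes table above.
NON_COMPILABLE = {'LINUX': {'.h', '.hh', '.hpp', '.html', '.js', '.idl', ''}}


def compilable(env, path):
    # A file is compilable unless its extension is in the platform's set.
    _, ext = os.path.splitext(str(path))
    return ext not in NON_COMPILABLE[env['TARGET_PLATFORM']]


def compilable_files(env, sources):
    return [x for x in sources if compilable(env, x)]


env = {'TARGET_PLATFORM': 'LINUX'}
sources = ['main.cc', 'util.h', 'page.html', 'lib.c']
assert compilable_files(env, sources) == ['main.cc', 'lib.c']
```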
-
-def GypProgram(env, target, source, *args, **kw):
-  source = compilable_files(env, source)
-  result = env.Program(target, source, *args, **kw)
-  if env.get('INCREMENTAL'):
-    env.Precious(result)
-  return result
-
-def GypTestProgram(env, target, source, *args, **kw):
-  source = compilable_files(env, source)
-  result = env.Program(target, source, *args, **kw)
-  if env.get('INCREMENTAL'):
-    env.Precious(*result)
-  return result
-
-def GypLibrary(env, target, source, *args, **kw):
-  source = compilable_files(env, source)
-  result = env.Library(target, source, *args, **kw)
-  return result
-
-def GypLoadableModule(env, target, source, *args, **kw):
-  source = compilable_files(env, source)
-  result = env.LoadableModule(target, source, *args, **kw)
-  return result
-
-def GypStaticLibrary(env, target, source, *args, **kw):
-  source = compilable_files(env, source)
-  result = env.StaticLibrary(target, source, *args, **kw)
-  return result
-
-def GypSharedLibrary(env, target, source, *args, **kw):
-  source = compilable_files(env, source)
-  result = env.SharedLibrary(target, source, *args, **kw)
-  if env.get('INCREMENTAL'):
-    env.Precious(result)
-  return result
-
-def add_gyp_methods(env):
-  env.AddMethod(GypProgram)
-  env.AddMethod(GypTestProgram)
-  env.AddMethod(GypLibrary)
-  env.AddMethod(GypLoadableModule)
-  env.AddMethod(GypStaticLibrary)
-  env.AddMethod(GypSharedLibrary)
-
-  env.AddMethod(FilterOut)
-
-  env.AddMethod(compilable)
-
-
-base_env = Environment(
-    tools = %(scons_tools)s,
-    INTERMEDIATE_DIR='$OBJ_DIR/${COMPONENT_NAME}/_${TARGET_NAME}_intermediate',
-    LIB_DIR='$TOP_BUILDDIR/lib',
-    OBJ_DIR='$TOP_BUILDDIR/obj',
-    SCONSBUILD_DIR=sconsbuild_dir.abspath,
-    SHARED_INTERMEDIATE_DIR='$OBJ_DIR/_global_intermediate',
-    SRC_DIR=Dir(%(src_dir)r),
-    TARGET_PLATFORM='LINUX',
-    TOP_BUILDDIR='$SCONSBUILD_DIR/$CONFIG_NAME',
-    LIBPATH=['$LIB_DIR'],
-)
-
-if not GetOption('verbose'):
-  base_env.SetDefault(
-      ARCOMSTR='Creating library $TARGET',
-      ASCOMSTR='Assembling $TARGET',
-      CCCOMSTR='Compiling $TARGET',
-      CONCATSOURCECOMSTR='ConcatSource $TARGET',
-      CXXCOMSTR='Compiling $TARGET',
-      LDMODULECOMSTR='Building loadable module $TARGET',
-      LINKCOMSTR='Linking $TARGET',
-      MANIFESTCOMSTR='Updating manifest for $TARGET',
-      MIDLCOMSTR='Compiling IDL $TARGET',
-      PCHCOMSTR='Precompiling $TARGET',
-      RANLIBCOMSTR='Indexing $TARGET',
-      RCCOMSTR='Compiling resource $TARGET',
-      SHCCCOMSTR='Compiling $TARGET',
-      SHCXXCOMSTR='Compiling $TARGET',
-      SHLINKCOMSTR='Linking $TARGET',
-      SHMANIFESTCOMSTR='Updating manifest for $TARGET',
-  )
-
-add_gyp_methods(base_env)
-
-for conf in conf_list:
-  env = base_env.Clone(CONFIG_NAME=conf)
-  SConsignFile(env.File('$TOP_BUILDDIR/.sconsign').abspath)
-  for sconscript in sconscript_files:
-    target_alias = env.SConscript(sconscript, exports=['env'])
-    if target_alias:
-      target_alias_list.extend(target_alias)
-
-Default(Alias('all', target_alias_list))
-
-help_fmt = '''
-Usage: hammer [SCONS_OPTIONS] [VARIABLES] [TARGET] ...
-
-Local command-line build options:
-  --mode=CONFIG             Configuration to build:
-                              --mode=Debug [default]
-                              --mode=Release
-  --verbose                 Print actual executed command lines.
-
-Supported command-line build variables:
-  LOAD=[module,...]         Comma-separated list of components to load in the
-                              dependency graph ('-' prefix excludes)
-  PROGRESS=type             Display a progress indicator:
-                              name:  print each evaluated target name
-                              spinner:  print a spinner every 5 targets
-
-The following TARGET names can also be used as LOAD= module names:
-
-%%s
-'''
-
-if GetOption('help'):
-  def columnar_text(items, width=78, indent=2, sep=2):
-    result = []
-    colwidth = max(map(len, items)) + sep
-    cols = (width - indent) / colwidth
-    if cols < 1:
-      cols = 1
-    rows = (len(items) + cols - 1) / cols
-    indent = '%%*s' %% (indent, '')
-    sep = indent
-    for row in xrange(0, rows):
-      result.append(sep)
-      for i in xrange(row, len(items), rows):
-        result.append('%%-*s' %% (colwidth, items[i]))
-      sep = '\\n' + indent
-    result.append('\\n')
-    return ''.join(result)
-
-  load_list = set(sconscript_file_map.keys())
-  target_aliases = set(map(str, target_alias_list))
-
-  common = load_list & target_aliases
-  load_only = load_list - common
-  target_only = target_aliases - common
-  help_text = [help_fmt %% columnar_text(sorted(list(common)))]
-  if target_only:
-    fmt = "The following are additional TARGET names:\\n\\n%%s\\n"
-    help_text.append(fmt %% columnar_text(sorted(list(target_only))))
-  if load_only:
-    fmt = "The following are additional LOAD= module names:\\n\\n%%s\\n"
-    help_text.append(fmt %% columnar_text(sorted(list(load_only))))
-  Help(''.join(help_text))
-"""
-
-# TEMPLATE END
-#############################################################################
-
-
-def GenerateSConscriptWrapper(build_file, build_file_data, name,
-                              output_filename, sconscript_files,
-                              default_configuration):
-  """
-  Generates the "wrapper" SConscript file (analogous to the Visual Studio
-  solution) that calls all the individual target SConscript files.
-  """
-  output_dir = os.path.dirname(output_filename)
-  src_dir = build_file_data['_DEPTH']
-  src_dir_rel = gyp.common.RelativePath(src_dir, output_dir)
-  if not src_dir_rel:
-    src_dir_rel = '.'
-  scons_settings = build_file_data.get('scons_settings', {})
-  sconsbuild_dir = scons_settings.get('sconsbuild_dir', '#')
-  scons_tools = scons_settings.get('tools', ['default'])
-
-  sconscript_file_lines = ['dict(']
-  for target in sorted(sconscript_files.keys()):
-    sconscript = sconscript_files[target]
-    sconscript_file_lines.append('    %s = %r,' % (target, sconscript))
-  sconscript_file_lines.append(')')
-
-  fp = open(output_filename, 'w')
-  fp.write(header)
-  fp.write(_wrapper_template % {
-               'default_configuration' : default_configuration,
-               'name' : name,
-               'scons_tools' : repr(scons_tools),
-               'sconsbuild_dir' : repr(sconsbuild_dir),
-               'sconscript_files' : '\n'.join(sconscript_file_lines),
-               'src_dir' : src_dir_rel,
-           })
-  fp.close()
-
-  # Generate the SConstruct file that invokes the wrapper SConscript.
-  dir, fname = os.path.split(output_filename)
-  SConstruct = os.path.join(dir, 'SConstruct')
-  fp = open(SConstruct, 'w')
-  fp.write(header)
-  fp.write('SConscript(%s)\n' % repr(fname))
-  fp.close()
-
-
-def TargetFilename(target, build_file=None, output_suffix=''):
-  """Returns the .scons file name for the specified target.
-  """
-  if build_file is None:
-    build_file, target = gyp.common.ParseQualifiedTarget(target)[:2]
-  output_file = os.path.join(os.path.dirname(build_file),
-                             target + output_suffix + '.scons')
-  return output_file
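The path construction in `TargetFilename` is simple enough to illustrate directly; this sketch keeps only the non-qualified branch (the qualified-target parsing needs `gyp.common` and is omitted):

```python
import os


def target_filename(target, build_file, output_suffix=''):
    # Place the per-target .scons file next to its .gyp build file.
    return os.path.join(os.path.dirname(build_file),
                        target + output_suffix + '.scons')


# 'foo' defined in src/project.gyp gets its own src/foo.scons file.
assert target_filename('foo', 'src/project.gyp') == \
    os.path.join('src', 'foo.scons')
```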
-
-
-def PerformBuild(data, configurations, params):
-  options = params['options']
-
-  # Due to the way we test gyp on the Chromium trybots,
-  # we need to look for 'scons.py' as well as the more common 'scons'.
-  # TODO(sbc): update the trybots to have a more normal install
-  # of scons.
-  scons = 'scons'
-  paths = os.environ['PATH'].split(os.pathsep)
-  for scons_name in ['scons', 'scons.py']:
-    for path in paths:
-      test_scons = os.path.join(path, scons_name)
-      print 'looking for: %s' % test_scons
-      if os.path.exists(test_scons):
-        print "found scons: %s" % test_scons
-        scons = test_scons
-        break
-
-  for config in configurations:
-    arguments = [scons, '-C', options.toplevel_dir, '--mode=%s' % config]
-    print "Building [%s]: %s" % (config, arguments)
-    subprocess.check_call(arguments)
-
-
-def GenerateOutput(target_list, target_dicts, data, params):
-  """
-  Generates all the output files for the specified targets.
-  """
-  options = params['options']
-
-  if options.generator_output:
-    def output_path(filename):
-      return filename.replace(params['cwd'], options.generator_output)
-  else:
-    def output_path(filename):
-      return filename
-
-  default_configuration = None
-
-  for qualified_target in target_list:
-    spec = target_dicts[qualified_target]
-    if spec['toolset'] != 'target':
-      raise Exception(
-          'Multiple toolsets not supported in scons build (target %s)' %
-          qualified_target)
-    scons_target = SCons.Target(spec)
-    if scons_target.is_ignored:
-      continue
-
-    # TODO:  assumes the default_configuration of the first
-    # non-Default target is the correct default for all targets.
-    # Need a better model for handling variation between targets.
-    if (not default_configuration and
-        spec['default_configuration'] != 'Default'):
-      default_configuration = spec['default_configuration']
-
-    build_file, target = gyp.common.ParseQualifiedTarget(qualified_target)[:2]
-    output_file = TargetFilename(target, build_file, options.suffix)
-    if options.generator_output:
-      output_file = output_path(output_file)
-
-    if not spec.has_key('libraries'):
-      spec['libraries'] = []
-
-    # Add dependent static library targets to the 'libraries' value.
-    deps = spec.get('dependencies', [])
-    spec['scons_dependencies'] = []
-    for d in deps:
-      td = target_dicts[d]
-      target_name = td['target_name']
-      spec['scons_dependencies'].append("Alias('%s')" % target_name)
-      if td['type'] in ('static_library', 'shared_library'):
-        libname = td.get('product_name', target_name)
-        spec['libraries'].append('lib' + libname)
-      if td['type'] == 'loadable_module':
-        prereqs = spec.get('scons_prerequisites', [])
-        # TODO:  parameterize with <(SHARED_LIBRARY_*) variables?
-        td_target = SCons.Target(td)
-        td_target.target_prefix = '${SHLIBPREFIX}'
-        td_target.target_suffix = '${SHLIBSUFFIX}'
-
-    GenerateSConscript(output_file, spec, build_file, data[build_file])
-
-  if not default_configuration:
-    default_configuration = 'Default'
-
-  for build_file in sorted(data.keys()):
-    path, ext = os.path.splitext(build_file)
-    if ext != '.gyp':
-      continue
-    output_dir, basename = os.path.split(path)
-    output_filename = path + '_main' + options.suffix + '.scons'
-
-    all_targets = gyp.common.AllTargets(target_list, target_dicts, build_file)
-    sconscript_files = {}
-    for t in all_targets:
-      scons_target = SCons.Target(target_dicts[t])
-      if scons_target.is_ignored:
-        continue
-      bf, target = gyp.common.ParseQualifiedTarget(t)[:2]
-      target_filename = TargetFilename(target, bf, options.suffix)
-      tpath = gyp.common.RelativePath(target_filename, output_dir)
-      sconscript_files[target] = tpath
-
-    output_filename = output_path(output_filename)
-    if sconscript_files:
-      GenerateSConscriptWrapper(build_file, data[build_file], basename,
-                                output_filename, sconscript_files,
-                                default_configuration)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/xcode.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1239 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import filecmp
-import gyp.common
-import gyp.xcodeproj_file
-import errno
-import os
-import sys
-import posixpath
-import re
-import shutil
-import subprocess
-import tempfile
-
-
-# Project files generated by this module will use _intermediate_var as a
-# custom Xcode setting whose value is a DerivedSources-like directory that's
-# project-specific and configuration-specific.  The normal choice,
-# DERIVED_FILE_DIR, is target-specific, which is thought to be too restrictive
-# as it is likely that multiple targets within a single project file will want
-# to access the same set of generated files.  The other option,
-# PROJECT_DERIVED_FILE_DIR, is unsuitable because while it is project-specific,
-# it is not configuration-specific.  INTERMEDIATE_DIR is defined as
-# $(PROJECT_DERIVED_FILE_DIR)/$(CONFIGURATION).
-_intermediate_var = 'INTERMEDIATE_DIR'
-
-# SHARED_INTERMEDIATE_DIR is the same, except that it is shared among all
-# targets that share the same BUILT_PRODUCTS_DIR.
-_shared_intermediate_var = 'SHARED_INTERMEDIATE_DIR'
-
-_library_search_paths_var = 'LIBRARY_SEARCH_PATHS'
-
-generator_default_variables = {
-  'EXECUTABLE_PREFIX': '',
-  'EXECUTABLE_SUFFIX': '',
-  'STATIC_LIB_PREFIX': 'lib',
-  'SHARED_LIB_PREFIX': 'lib',
-  'STATIC_LIB_SUFFIX': '.a',
-  'SHARED_LIB_SUFFIX': '.dylib',
-  # INTERMEDIATE_DIR is a place for targets to build up intermediate products.
-  # It is specific to each build environment.  It is only guaranteed to exist
-  # and be constant within the context of a project, corresponding to a single
-  # input file.  Some build environments may allow their intermediate directory
-  # to be shared on a wider scale, but this is not guaranteed.
-  'INTERMEDIATE_DIR': '$(%s)' % _intermediate_var,
-  'OS': 'mac',
-  'PRODUCT_DIR': '$(BUILT_PRODUCTS_DIR)',
-  'LIB_DIR': '$(BUILT_PRODUCTS_DIR)',
-  'RULE_INPUT_ROOT': '$(INPUT_FILE_BASE)',
-  'RULE_INPUT_EXT': '$(INPUT_FILE_SUFFIX)',
-  'RULE_INPUT_NAME': '$(INPUT_FILE_NAME)',
-  'RULE_INPUT_PATH': '$(INPUT_FILE_PATH)',
-  'RULE_INPUT_DIRNAME': '$(INPUT_FILE_DIRNAME)',
-  'SHARED_INTERMEDIATE_DIR': '$(%s)' % _shared_intermediate_var,
-  'CONFIGURATION_NAME': '$(CONFIGURATION)',
-}
-
-# The Xcode-specific sections that hold paths.
-generator_additional_path_sections = [
-  'mac_bundle_resources',
-  'mac_framework_headers',
-  'mac_framework_private_headers',
-  # 'mac_framework_dirs', input already handles _dirs endings.
-]
-
-# The Xcode-specific keys that exist on targets and aren't moved down to
-# configurations.
-generator_additional_non_configuration_keys = [
-  'mac_bundle',
-  'mac_bundle_resources',
-  'mac_framework_headers',
-  'mac_framework_private_headers',
-  'xcode_create_dependents_test_runner',
-]
-
-# We want to let any rules apply to files that are resources also.
-generator_extra_sources_for_rules = [
-  'mac_bundle_resources',
-  'mac_framework_headers',
-  'mac_framework_private_headers',
-]
-
-# Xcode's standard set of library directories, which don't need to be duplicated
-# in LIBRARY_SEARCH_PATHS. This list is not exhaustive, but that's okay.
-xcode_standard_library_dirs = frozenset([
-  '$(SDKROOT)/usr/lib',
-  '$(SDKROOT)/usr/local/lib',
-])
-
-def CreateXCConfigurationList(configuration_names):
-  xccl = gyp.xcodeproj_file.XCConfigurationList({'buildConfigurations': []})
-  if len(configuration_names) == 0:
-    configuration_names = ['Default']
-  for configuration_name in configuration_names:
-    xcbc = gyp.xcodeproj_file.XCBuildConfiguration({
-        'name': configuration_name})
-    xccl.AppendProperty('buildConfigurations', xcbc)
-  xccl.SetProperty('defaultConfigurationName', configuration_names[0])
-  return xccl
-
-
-class XcodeProject(object):
-  def __init__(self, gyp_path, path, build_file_dict):
-    self.gyp_path = gyp_path
-    self.path = path
-    self.project = gyp.xcodeproj_file.PBXProject(path=path)
-    projectDirPath = gyp.common.RelativePath(
-                         os.path.dirname(os.path.abspath(self.gyp_path)),
-                         os.path.dirname(path) or '.')
-    self.project.SetProperty('projectDirPath', projectDirPath)
-    self.project_file = \
-        gyp.xcodeproj_file.XCProjectFile({'rootObject': self.project})
-    self.build_file_dict = build_file_dict
-
-    # TODO(mark): add destructor that cleans up self.path if created_dir is
-    # True and things didn't complete successfully.  Or do something even
-    # better with "try"?
-    self.created_dir = False
-    try:
-      os.makedirs(self.path)
-      self.created_dir = True
-    except OSError, e:
-      if e.errno != errno.EEXIST:
-        raise
-
-  def Finalize1(self, xcode_targets, serialize_all_tests):
-    # Collect a list of all of the build configuration names used by the
-    # various targets in the file.  It is strongly advised that every
-    # target in a project (even across multiple project files) use the
-    # same set of configuration names.
-    configurations = []
-    for xct in self.project.GetProperty('targets'):
-      xccl = xct.GetProperty('buildConfigurationList')
-      xcbcs = xccl.GetProperty('buildConfigurations')
-      for xcbc in xcbcs:
-        name = xcbc.GetProperty('name')
-        if name not in configurations:
-          configurations.append(name)
-
-    # Replace the XCConfigurationList attached to the PBXProject object with
-    # a new one specifying all of the configuration names used by the various
-    # targets.
-    try:
-      xccl = CreateXCConfigurationList(configurations)
-      self.project.SetProperty('buildConfigurationList', xccl)
-    except:
-      sys.stderr.write("Problem with gyp file %s\n" % self.gyp_path)
-      raise
-
-    # The need for this setting is explained above where _intermediate_var is
-    # defined.  The comments below about wanting to avoid project-wide build
-    # settings apply here too, but this needs to be set on a project-wide basis
-    # so that files relative to the _intermediate_var setting can be displayed
-    # properly in the Xcode UI.
-    #
-    # Note that for configuration-relative files such as anything relative to
-    # _intermediate_var, for the purposes of UI tree view display, Xcode will
-    # only resolve the configuration name once, when the project file is
-    # opened.  If the active build configuration is changed, the project file
-    # must be closed and reopened if it is desired for the tree view to update.
-    # This is filed as Apple radar 6588391.
-    xccl.SetBuildSetting(_intermediate_var,
-                         '$(PROJECT_DERIVED_FILE_DIR)/$(CONFIGURATION)')
-    xccl.SetBuildSetting(_shared_intermediate_var,
-                         '$(SYMROOT)/DerivedSources/$(CONFIGURATION)')
-
-    # Set user-specified project-wide build settings and config files.  This
-    # is intended to be used very sparingly.  Really, almost everything should
-    # go into target-specific build settings sections.  The project-wide
-    # settings are only intended to be used in cases where Xcode attempts to
-    # resolve variable references in a project context as opposed to a target
-    # context, such as when resolving sourceTree references while building up
-    # the tree view for UI display.
-    # Any values set globally are applied to all configurations, then any
-    # per-configuration values are applied.
-    for xck, xcv in self.build_file_dict.get('xcode_settings', {}).iteritems():
-      xccl.SetBuildSetting(xck, xcv)
-    if 'xcode_config_file' in self.build_file_dict:
-      config_ref = self.project.AddOrGetFileInRootGroup(
-          self.build_file_dict['xcode_config_file'])
-      xccl.SetBaseConfiguration(config_ref)
-    build_file_configurations = self.build_file_dict.get('configurations', {})
-    if build_file_configurations:
-      for config_name in configurations:
-        build_file_configuration_named = \
-            build_file_configurations.get(config_name, {})
-        if build_file_configuration_named:
-          xcc = xccl.ConfigurationNamed(config_name)
-          for xck, xcv in build_file_configuration_named.get('xcode_settings',
-                                                             {}).iteritems():
-            xcc.SetBuildSetting(xck, xcv)
-          if 'xcode_config_file' in build_file_configuration_named:
-            config_ref = self.project.AddOrGetFileInRootGroup(
-                build_file_configurations[config_name]['xcode_config_file'])
-            xcc.SetBaseConfiguration(config_ref)
-
-    # Sort the targets based on how they appeared in the input.
-    # TODO(mark): Like a lot of other things here, this assumes internal
-    # knowledge of PBXProject - in this case, of its "targets" property.
-
-    # ordinary_targets are ordinary targets that are already in the project
-    # file. run_test_targets are the targets that run unittests and should be
-    # used for the Run All Tests target.  support_targets are the action/rule
-    # targets used by GYP file targets, just kept for the assert check.
-    ordinary_targets = []
-    run_test_targets = []
-    support_targets = []
-
-    # targets is full list of targets in the project.
-    targets = []
-
-    # Does it define its own "all"?
-    has_custom_all = False
-
-    # targets_for_all is the list of ordinary_targets that should be listed
-    # in this project's "All" target.  It includes each ordinary target
-    # that does not have suppress_wildcard set.
-    targets_for_all = []
-
-    for target in self.build_file_dict['targets']:
-      target_name = target['target_name']
-      toolset = target['toolset']
-      qualified_target = gyp.common.QualifiedTarget(self.gyp_path, target_name,
-                                                    toolset)
-      xcode_target = xcode_targets[qualified_target]
-      # Make sure that the target being added to the sorted list is already in
-      # the unsorted list.
-      assert xcode_target in self.project._properties['targets']
-      targets.append(xcode_target)
-      ordinary_targets.append(xcode_target)
-      if xcode_target.support_target:
-        support_targets.append(xcode_target.support_target)
-        targets.append(xcode_target.support_target)
-
-      if not int(target.get('suppress_wildcard', False)):
-        targets_for_all.append(xcode_target)
-
-      if target_name.lower() == 'all':
-        has_custom_all = True
-
-      # If this target has a 'run_as' attribute, add its target to the
-      # targets, and add it to the test targets.
-      if target.get('run_as'):
-        # Make a target to run something.  It should have one
-        # dependency, the parent xcode target.
-        xccl = CreateXCConfigurationList(configurations)
-        run_target = gyp.xcodeproj_file.PBXAggregateTarget({
-              'name':                   'Run ' + target_name,
-              'productName':            xcode_target.GetProperty('productName'),
-              'buildConfigurationList': xccl,
-            },
-            parent=self.project)
-        run_target.AddDependency(xcode_target)
-
-        command = target['run_as']
-        script = ''
-        if command.get('working_directory'):
-          script = script + 'cd "%s"\n' % \
-                   gyp.xcodeproj_file.ConvertVariablesToShellSyntax(
-                       command.get('working_directory'))
-
-        if command.get('environment'):
-          script = script + "\n".join(
-            ['export %s="%s"' %
-             (key, gyp.xcodeproj_file.ConvertVariablesToShellSyntax(val))
-             for (key, val) in command.get('environment').iteritems()]) + "\n"
-
-        # Some tests end up using sockets, files on disk, etc., and can get
-        # confused if more than one test runs at a time.  The generator
-        # flag 'xcode_serialize_all_test_runs' controls forcing all tests
-        # to run serially.  It defaults to True.  To get serial runs, this
-        # little bit of Python does the same as the Linux flock utility to
-        # make sure only one runs at a time.
-        command_prefix = ''
-        if serialize_all_tests:
-          command_prefix = \
-"""python -c "import fcntl, subprocess, sys
-file = open('$TMPDIR/GYP_serialize_test_runs', 'a')
-fcntl.flock(file.fileno(), fcntl.LOCK_EX)
-sys.exit(subprocess.call(sys.argv[1:]))" """
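The flock trick the prefix performs (take an exclusive advisory lock on a shared file, then exec the test) can be sketched as a standalone helper; the `/tmp` lock path here is an assumption standing in for the `$TMPDIR` path used above:

```python
import fcntl
import subprocess
import sys


def run_serialized(argv, lock_path='/tmp/GYP_serialize_test_runs'):
    """Run argv, but only after acquiring an exclusive advisory lock,
    so concurrent callers execute one at a time."""
    with open(lock_path, 'a') as lock_file:
        # Blocks until no other process holds the lock; released
        # automatically when the file is closed.
        fcntl.flock(lock_file.fileno(), fcntl.LOCK_EX)
        return subprocess.call(argv)


# Example: run a trivial child process under the lock.
run_serialized([sys.executable, '-c', 'pass'])
```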
-
-        # If we were unable to exec for some reason, we want to exit
-        # with an error, and fixup variable references to be shell
-        # syntax instead of xcode syntax.
-        script = script + 'exec ' + command_prefix + '%s\nexit 1\n' % \
-                 gyp.xcodeproj_file.ConvertVariablesToShellSyntax(
-                     gyp.common.EncodePOSIXShellList(command.get('action')))
-
-        ssbp = gyp.xcodeproj_file.PBXShellScriptBuildPhase({
-              'shellScript':      script,
-              'showEnvVarsInLog': 0,
-            })
-        run_target.AppendProperty('buildPhases', ssbp)
-
-        # Add the run target to the project file.
-        targets.append(run_target)
-        run_test_targets.append(run_target)
-        xcode_target.test_runner = run_target
-
-
-    # Make sure that the list of targets being replaced is the same length as
-    # the one replacing it, but allow for the added test runner targets.
-    assert len(self.project._properties['targets']) == \
-      len(ordinary_targets) + len(support_targets)
-
-    self.project._properties['targets'] = targets
-
-    # Get rid of unnecessary levels of depth in groups like the Source group.
-    self.project.RootGroupsTakeOverOnlyChildren(True)
-
-    # Sort the groups nicely.  Do this after sorting the targets, because the
-    # Products group is sorted based on the order of the targets.
-    self.project.SortGroups()
-
-    # Create an "All" target if there's more than one target in this project
-    # file and the project didn't define its own "All" target.  Put a generated
-    # "All" target first so that people opening up the project for the first
-    # time will build everything by default.
-    if len(targets_for_all) > 1 and not has_custom_all:
-      xccl = CreateXCConfigurationList(configurations)
-      all_target = gyp.xcodeproj_file.PBXAggregateTarget(
-          {
-            'buildConfigurationList': xccl,
-            'name':                   'All',
-          },
-          parent=self.project)
-
-      for target in targets_for_all:
-        all_target.AddDependency(target)
-
-      # TODO(mark): This is evil because it relies on internal knowledge of
-      # PBXProject._properties.  It's important to get the "All" target first,
-      # though.
-      self.project._properties['targets'].insert(0, all_target)
-
-    # The same, but for run_test_targets.
-    if len(run_test_targets) > 1:
-      xccl = CreateXCConfigurationList(configurations)
-      run_all_tests_target = gyp.xcodeproj_file.PBXAggregateTarget(
-          {
-            'buildConfigurationList': xccl,
-            'name':                   'Run All Tests',
-          },
-          parent=self.project)
-      for run_test_target in run_test_targets:
-        run_all_tests_target.AddDependency(run_test_target)
-
-      # Insert after the "All" target, which must exist if there is more than
-      # one run_test_target.
-      self.project._properties['targets'].insert(1, run_all_tests_target)
-
-  def Finalize2(self, xcode_targets, xcode_target_to_target_dict):
-    # Finalize2 needs to happen in a separate step because the process of
-    # updating references to other projects depends on the ordering of targets
-    # within remote project files.  Finalize1 is responsible for sorting duty,
-    # and once all project files are sorted, Finalize2 can come in and update
-    # these references.
-
-    # To support making a "test runner" target that will run all the tests
-    # that are direct dependents of any given target, we look for
-    # xcode_create_dependents_test_runner being set on an Aggregate target,
-    # and generate a second target that will run the test runners found under
-    # the marked target.
-    for bf_tgt in self.build_file_dict['targets']:
-      if int(bf_tgt.get('xcode_create_dependents_test_runner', 0)):
-        tgt_name = bf_tgt['target_name']
-        toolset = bf_tgt['toolset']
-        qualified_target = gyp.common.QualifiedTarget(self.gyp_path,
-                                                      tgt_name, toolset)
-        xcode_target = xcode_targets[qualified_target]
-        if isinstance(xcode_target, gyp.xcodeproj_file.PBXAggregateTarget):
-          # Collect all the run test targets.
-          all_run_tests = []
-          pbxtds = xcode_target.GetProperty('dependencies')
-          for pbxtd in pbxtds:
-            pbxcip = pbxtd.GetProperty('targetProxy')
-            dependency_xct = pbxcip.GetProperty('remoteGlobalIDString')
-            if hasattr(dependency_xct, 'test_runner'):
-              all_run_tests.append(dependency_xct.test_runner)
-
-          # Directly depend on all the runners as they depend on the target
-          # that builds them.
-          if len(all_run_tests) > 0:
-            run_all_target = gyp.xcodeproj_file.PBXAggregateTarget({
-                  'name':        'Run %s Tests' % tgt_name,
-                  'productName': tgt_name,
-                },
-                parent=self.project)
-            for run_test_target in all_run_tests:
-              run_all_target.AddDependency(run_test_target)
-
-            # Insert the test runner after the related target.
-            idx = self.project._properties['targets'].index(xcode_target)
-            self.project._properties['targets'].insert(idx + 1, run_all_target)
-
-    # Update all references to other projects, to make sure that the lists of
-    # remote products are complete.  Otherwise, Xcode will fill them in when
-    # it opens the project file, which will result in unnecessary diffs.
-    # TODO(mark): This is evil because it relies on internal knowledge of
-    # PBXProject._other_pbxprojects.
-    for other_pbxproject in self.project._other_pbxprojects.keys():
-      self.project.AddOrGetProjectReference(other_pbxproject)
-
-    self.project.SortRemoteProductReferences()
-
-    # Give everything an ID.
-    self.project_file.ComputeIDs()
-
-    # Make sure that no two objects in the project file have the same ID.  If
-    # multiple objects wind up with the same ID, upon loading the file, Xcode
-    # will only recognize one object (the last one in the file?) and the
-    # results are unpredictable.
-    self.project_file.EnsureNoIDCollisions()
-
-  def Write(self):
-    # Write the project file to a temporary location first.  Xcode watches for
-    # changes to the project file and presents a UI sheet offering to reload
-    # the project when it does change.  However, in some cases, especially when
-    # multiple projects are open or when Xcode is busy, things don't work so
-    # seamlessly.  Sometimes, Xcode is able to detect that a project file has
-    # changed but can't unload it because something else is referencing it.
-    # To mitigate this problem, and to avoid even having Xcode present the UI
-    # sheet when an open project is rewritten for inconsequential changes, the
-    # project file is written to a temporary file in the xcodeproj directory
-    # first.  The new temporary file is then compared to the existing project
-    # file, if any.  If they differ, the new file replaces the old; otherwise,
-    # the new project file is simply deleted.  Xcode properly detects a file
-    # being renamed over an open project file as a change and so it remains
-    # able to present the "project file changed" sheet under this system.
-    # Writing to a temporary file first also avoids the possible problem of
-    # Xcode rereading an incomplete project file.
-    (output_fd, new_pbxproj_path) = \
-        tempfile.mkstemp(suffix='.tmp', prefix='project.pbxproj.gyp.',
-                         dir=self.path)
-
-    try:
-      output_file = os.fdopen(output_fd, 'wb')
-
-      self.project_file.Print(output_file)
-      output_file.close()
-
-      pbxproj_path = os.path.join(self.path, 'project.pbxproj')
-
-      same = False
-      try:
-        same = filecmp.cmp(pbxproj_path, new_pbxproj_path, False)
-      except OSError, e:
-        if e.errno != errno.ENOENT:
-          raise
-
-      if same:
-        # The new file is identical to the old one, just get rid of the new
-        # one.
-        os.unlink(new_pbxproj_path)
-      else:
-        # The new file is different from the old one, or there is no old one.
-        # Rename the new file to the permanent name.
-        #
-        # tempfile.mkstemp uses an overly restrictive mode, resulting in a
-        # file that can only be read by the owner, regardless of the umask.
-        # There's no reason to not respect the umask here, which means that
-        # an extra hoop is required to fetch it and reset the new file's mode.
-        #
-        # No way to get the umask without setting a new one?  Set a safe one
-        # and then set it back to the old value.
-        umask = os.umask(077)
-        os.umask(umask)
-
-        os.chmod(new_pbxproj_path, 0666 & ~umask)
-        os.rename(new_pbxproj_path, pbxproj_path)
-
-    except Exception:
-      # Don't leave turds behind.  In fact, if this code was responsible for
-      # creating the xcodeproj directory, get rid of that too.
-      os.unlink(new_pbxproj_path)
-      if self.created_dir:
-        shutil.rmtree(self.path, True)
-      raise
-
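The umask round-trip in `Write` above can be sketched on its own: Python's `os.umask` is the only way to read the current mask, and it always installs a new one, so the code sets a safe temporary mask and immediately restores the old value before computing the file mode.

```python
import os

# os.umask() has no read-only getter: it installs the new mask and
# returns the previous one.  Set a restrictive mask, put the old one
# back, and use it to compute a umask-respecting file mode.
umask = os.umask(0o077)
os.umask(umask)
mode = 0o666 & ~umask
```

The same two-call dance appears verbatim in the deleted code (in octal `077`/`0666` Python 2 spelling).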
-
-cached_xcode_version = None
-def InstalledXcodeVersion():
-  """Fetches the installed version of Xcode, returns empty string if it is
-  unable to figure it out."""
-
-  global cached_xcode_version
-  if cached_xcode_version is not None:
-    return cached_xcode_version
-
-  # Default to an empty string
-  cached_xcode_version = ''
-
-  # Collect the xcodebuild's version information.
-  try:
-    import subprocess
-    cmd = ['/usr/bin/xcodebuild', '-version']
-    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
-    xcodebuild_version_info = proc.communicate()[0]
-    # Any error, return empty string
-    if proc.returncode:
-      xcodebuild_version_info = ''
-  except OSError:
-    # We failed to launch the tool
-    xcodebuild_version_info = ''
-
-  # Pull out the Xcode version itself.
-  match_line = re.search('^Xcode (.*)$', xcodebuild_version_info, re.MULTILINE)
-  if match_line:
-    cached_xcode_version = match_line.group(1)
-  # Done!
-  return cached_xcode_version
-
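The version parse in `InstalledXcodeVersion` boils down to one multi-line regex over `xcodebuild -version` output. A sketch with hypothetical sample output (the real function captures this from a subprocess and falls back to an empty string on any error):

```python
import re

# Hypothetical `xcodebuild -version` output for illustration only.
xcodebuild_version_info = 'Xcode 4.6.3\nBuild version 4H1503\n'

# re.MULTILINE lets ^ and $ anchor to each line, so the "Xcode X.Y.Z"
# line is found regardless of what else xcodebuild prints.
match_line = re.search('^Xcode (.*)$', xcodebuild_version_info, re.MULTILINE)
version = match_line.group(1) if match_line else ''
```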
-
-def AddSourceToTarget(source, type, pbxp, xct):
-  # TODO(mark): Perhaps source_extensions and library_extensions can be made a
-  # little bit fancier.
-  source_extensions = ['c', 'cc', 'cpp', 'cxx', 'm', 'mm', 's']
-
-  # .o is conceptually more of a "source" than a "library," but Xcode thinks
-  # of "sources" as things to compile and "libraries" (or "frameworks") as
-  # things to link with. Adding an object file to an Xcode target's frameworks
-  # phase works properly.
-  library_extensions = ['a', 'dylib', 'framework', 'o']
-
-  basename = posixpath.basename(source)
-  (root, ext) = posixpath.splitext(basename)
-  if ext:
-    ext = ext[1:].lower()
-
-  if ext in source_extensions and type != 'none':
-    xct.SourcesPhase().AddFile(source)
-  elif ext in library_extensions and type != 'none':
-    xct.FrameworksPhase().AddFile(source)
-  else:
-    # Files that aren't added to a sources or frameworks build phase can still
-    # go into the project file, just not as part of a build phase.
-    pbxp.AddOrGetFileInRootGroup(source)
-
-
-def AddResourceToTarget(resource, pbxp, xct):
-  # TODO(mark): Combine with AddSourceToTarget above?  Or just inline this call
-  # where it's used.
-  xct.ResourcesPhase().AddFile(resource)
-
-
-def AddHeaderToTarget(header, pbxp, xct, is_public):
-  # TODO(mark): Combine with AddSourceToTarget above?  Or just inline this call
-  # where it's used.
-  settings = '{ATTRIBUTES = (%s, ); }' % ('Private', 'Public')[is_public]
-  xct.HeadersPhase().AddFile(header, settings)
-
-
-_xcode_variable_re = re.compile('(\$\((.*?)\))')
-def ExpandXcodeVariables(string, expansions):
-  """Expands Xcode-style $(VARIABLES) in string per the expansions dict.
-
-  In some rare cases, it is appropriate to expand Xcode variables when a
-  project file is generated.  For any substring $(VAR) in string, if VAR is a
-  key in the expansions dict, $(VAR) will be replaced with expansions[VAR].
-  Any $(VAR) substring in string for which VAR is not a key in the expansions
-  dict will remain in the returned string.
-  """
-
-  matches = _xcode_variable_re.findall(string)
-  if matches is None:
-    return string
-
-  matches.reverse()
-  for match in matches:
-    (to_replace, variable) = match
-    if not variable in expansions:
-      continue
-
-    replacement = expansions[variable]
-    string = re.sub(re.escape(to_replace), replacement, string)
-
-  return string
-
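The expansion rule in `ExpandXcodeVariables` can be exercised standalone. This sketch mirrors the behavior described in the docstring, including the part where a `$(VAR)` whose name is not in the expansions dict survives unchanged:

```python
import re

_var_re = re.compile(r'(\$\((.*?)\))')

def expand(string, expansions):
    # findall returns (full_match, variable_name) pairs; replace each
    # $(VAR) whose name is a key in expansions, leave the rest intact.
    for to_replace, variable in _var_re.findall(string):
        if variable in expansions:
            string = string.replace(to_replace, expansions[variable])
    return string
```

Note this sketch uses `str.replace` rather than `re.sub(re.escape(...))`; the result is the same, without the need to escape the replacement text.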
-
-def EscapeXCodeArgument(s):
-  """We must escape the arguments that we give to XCode so that it knows not to
-     split on spaces and to respect backslash and quote literals."""
-  s = s.replace('\\', '\\\\')
-  s = s.replace('"', '\\"')
-  return '"' + s + '"'
-
-
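The escaping order in `EscapeXCodeArgument` matters: backslashes must be doubled before quotes are escaped, otherwise the backslashes added for the quotes would themselves get doubled. A quick standalone check:

```python
def escape_argument(s):
    # Double backslashes first, then escape double quotes, then wrap
    # the whole argument in quotes so spaces don't split it.
    s = s.replace('\\', '\\\\')
    s = s.replace('"', '\\"')
    return '"' + s + '"'
```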
-
-def PerformBuild(data, configurations, params):
-  options = params['options']
-
-  for build_file, build_file_dict in data.iteritems():
-    (build_file_root, build_file_ext) = os.path.splitext(build_file)
-    if build_file_ext != '.gyp':
-      continue
-    xcodeproj_path = build_file_root + options.suffix + '.xcodeproj'
-    if options.generator_output:
-      xcodeproj_path = os.path.join(options.generator_output, xcodeproj_path)
-
-  for config in configurations:
-    arguments = ['xcodebuild', '-project', xcodeproj_path]
-    arguments += ['-configuration', config]
-    print "Building [%s]: %s" % (config, arguments)
-    subprocess.check_call(arguments)
-
-
-def GenerateOutput(target_list, target_dicts, data, params):
-  options = params['options']
-  generator_flags = params.get('generator_flags', {})
-  parallel_builds = generator_flags.get('xcode_parallel_builds', True)
-  serialize_all_tests = \
-      generator_flags.get('xcode_serialize_all_test_runs', True)
-  project_version = generator_flags.get('xcode_project_version', None)
-  skip_excluded_files = \
-      not generator_flags.get('xcode_list_excluded_files', True)
-  xcode_projects = {}
-  for build_file, build_file_dict in data.iteritems():
-    (build_file_root, build_file_ext) = os.path.splitext(build_file)
-    if build_file_ext != '.gyp':
-      continue
-    xcodeproj_path = build_file_root + options.suffix + '.xcodeproj'
-    if options.generator_output:
-      xcodeproj_path = os.path.join(options.generator_output, xcodeproj_path)
-    xcp = XcodeProject(build_file, xcodeproj_path, build_file_dict)
-    xcode_projects[build_file] = xcp
-    pbxp = xcp.project
-
-    if parallel_builds:
-      pbxp.SetProperty('attributes',
-                       {'BuildIndependentTargetsInParallel': 'YES'})
-    if project_version:
-      xcp.project_file.SetXcodeVersion(project_version)
-
-    # Add gyp/gypi files to project
-    if not generator_flags.get('standalone'):
-      main_group = pbxp.GetProperty('mainGroup')
-      build_group = gyp.xcodeproj_file.PBXGroup({'name': 'Build'})
-      main_group.AppendChild(build_group)
-      for included_file in build_file_dict['included_files']:
-        build_group.AddOrGetFileByPath(included_file, False)
-
-  xcode_targets = {}
-  xcode_target_to_target_dict = {}
-  for qualified_target in target_list:
-    [build_file, target_name, toolset] = \
-        gyp.common.ParseQualifiedTarget(qualified_target)
-
-    spec = target_dicts[qualified_target]
-    if spec['toolset'] != 'target':
-      raise Exception(
-          'Multiple toolsets not supported in xcode build (target %s)' %
-          qualified_target)
-    configuration_names = [spec['default_configuration']]
-    for configuration_name in sorted(spec['configurations'].keys()):
-      if configuration_name not in configuration_names:
-        configuration_names.append(configuration_name)
-    xcp = xcode_projects[build_file]
-    pbxp = xcp.project
-
-    # Set up the configurations for the target according to the list of names
-    # supplied.
-    xccl = CreateXCConfigurationList(configuration_names)
-
-    # Create an XCTarget subclass object for the target. The type with
-    # "+bundle" appended will be used if the target has "mac_bundle" set.
-    # loadable_modules not in a mac_bundle are mapped to
-    # com.googlecode.gyp.xcode.bundle, a pseudo-type that xcode.py interprets
-    # to create a single-file mh_bundle.
-    _types = {
-      'executable':             'com.apple.product-type.tool',
-      'loadable_module':        'com.googlecode.gyp.xcode.bundle',
-      'shared_library':         'com.apple.product-type.library.dynamic',
-      'static_library':         'com.apple.product-type.library.static',
-      'executable+bundle':      'com.apple.product-type.application',
-      'loadable_module+bundle': 'com.apple.product-type.bundle',
-      'shared_library+bundle':  'com.apple.product-type.framework',
-    }
-
-    target_properties = {
-      'buildConfigurationList': xccl,
-      'name':                   target_name,
-    }
-
-    type = spec['type']
-    is_bundle = int(spec.get('mac_bundle', 0))
-    if type != 'none':
-      type_bundle_key = type
-      if is_bundle:
-        type_bundle_key += '+bundle'
-      xctarget_type = gyp.xcodeproj_file.PBXNativeTarget
-      try:
-        target_properties['productType'] = _types[type_bundle_key]
-      except KeyError, e:
-        gyp.common.ExceptionAppend(e, "-- unknown product type while "
-                                   "writing target %s" % target_name)
-        raise
-    else:
-      xctarget_type = gyp.xcodeproj_file.PBXAggregateTarget
-      assert not is_bundle, (
-          'mac_bundle targets cannot have type none (target "%s")' %
-          target_name)
-
-    target_product_name = spec.get('product_name')
-    if target_product_name is not None:
-      target_properties['productName'] = target_product_name
-
-    xct = xctarget_type(target_properties, parent=pbxp,
-                        force_outdir=spec.get('product_dir'),
-                        force_prefix=spec.get('product_prefix'),
-                        force_extension=spec.get('product_extension'))
-    pbxp.AppendProperty('targets', xct)
-    xcode_targets[qualified_target] = xct
-    xcode_target_to_target_dict[xct] = spec
-
-    spec_actions = spec.get('actions', [])
-    spec_rules = spec.get('rules', [])
-
-    # Xcode has some "issues" with checking dependencies for the "Compile
-    # sources" step with any source files/headers generated by actions/rules.
-    # To work around this, if a target is building anything directly (not
-    # type "none"), then a second target is used to run the GYP actions/rules
-    # and is made a dependency of this target.  This way the work is done
-    # before the dependency checks for what should be recompiled.
-    support_xct = None
-    if type != 'none' and (spec_actions or spec_rules):
-      support_xccl = CreateXCConfigurationList(configuration_names)
-      support_target_properties = {
-        'buildConfigurationList': support_xccl,
-        'name':                   target_name + ' Support',
-      }
-      if target_product_name:
-        support_target_properties['productName'] = \
-            target_product_name + ' Support'
-      support_xct = \
-          gyp.xcodeproj_file.PBXAggregateTarget(support_target_properties,
-                                                parent=pbxp)
-      pbxp.AppendProperty('targets', support_xct)
-      xct.AddDependency(support_xct)
-    # Hang the support target off the main target so it can be tested/found
-    # by the generator during Finalize.
-    xct.support_target = support_xct
-
-    prebuild_index = 0
-
-    # Add custom shell script phases for "actions" sections.
-    for action in spec_actions:
-      # There's no need to write anything into the script to ensure that the
-      # output directories already exist, because Xcode will look at the
-      # declared outputs and automatically ensure that they exist for us.
-
-      # Do we have a message to print when this action runs?
-      message = action.get('message')
-      if message:
-        message = 'echo note: ' + gyp.common.EncodePOSIXShellArgument(message)
-      else:
-        message = ''
-
-      # Turn the list into a string that can be passed to a shell.
-      action_string = gyp.common.EncodePOSIXShellList(action['action'])
-
-      # Convert Xcode-type variable references to sh-compatible environment
-      # variable references.
-      message_sh = gyp.xcodeproj_file.ConvertVariablesToShellSyntax(message)
-      action_string_sh = gyp.xcodeproj_file.ConvertVariablesToShellSyntax(
-        action_string)
-
-      script = ''
-      # Include the optional message
-      if message_sh:
-        script += message_sh + '\n'
-      # Be sure the script runs in exec, and that if exec fails, the script
-      # exits signalling an error.
-      script += 'exec ' + action_string_sh + '\nexit 1\n'
-      ssbp = gyp.xcodeproj_file.PBXShellScriptBuildPhase({
-            'inputPaths': action['inputs'],
-            'name': 'Action "' + action['action_name'] + '"',
-            'outputPaths': action['outputs'],
-            'shellScript': script,
-            'showEnvVarsInLog': 0,
-          })
-
-      if support_xct:
-        support_xct.AppendProperty('buildPhases', ssbp)
-      else:
-        # TODO(mark): this assumes too much knowledge of the internals of
-        # xcodeproj_file; some of these smarts should move into xcodeproj_file
-        # itself.
-        xct._properties['buildPhases'].insert(prebuild_index, ssbp)
-        prebuild_index = prebuild_index + 1
-
-      # TODO(mark): Should verify that at most one of these is specified.
-      if int(action.get('process_outputs_as_sources', False)):
-        for output in action['outputs']:
-          AddSourceToTarget(output, type, pbxp, xct)
-
-      if int(action.get('process_outputs_as_mac_bundle_resources', False)):
-        for output in action['outputs']:
-          AddResourceToTarget(output, pbxp, xct)
-
-    # tgt_mac_bundle_resources holds the list of bundle resources so
-    # the rule processing can check against it.
-    if is_bundle:
-      tgt_mac_bundle_resources = spec.get('mac_bundle_resources', [])
-    else:
-      tgt_mac_bundle_resources = []
-
-    # Add custom shell script phases driving "make" for "rules" sections.
-    #
-    # Xcode's built-in rule support is almost powerful enough to use directly,
-    # but there are a few significant deficiencies that render them unusable.
-    # There are workarounds for some of its inadequacies, but in aggregate,
-    # the workarounds added complexity to the generator, and some workarounds
-    # actually require input files to be crafted more carefully than I'd like.
-    # Consequently, until Xcode rules are made more capable, "rules" input
-    # sections will be handled in Xcode output by shell script build phases
-    # performed prior to the compilation phase.
-    #
-    # The following problems with Xcode rules were found.  The numbers are
-    # Apple radar IDs.  I hope that these shortcomings are addressed, I really
-    # liked having the rules handled directly in Xcode during the period that
-    # I was prototyping this.
-    #
-    # 6588600 Xcode compiles custom script rule outputs too soon, compilation
-    #         fails.  This occurs when rule outputs from distinct inputs are
-    #         interdependent.  The only workaround is to put rules and their
-    #         inputs in a separate target from the one that compiles the rule
-    #         outputs.  This requires input file cooperation and it means that
-    #         process_outputs_as_sources is unusable.
-    # 6584932 Need to declare that custom rule outputs should be excluded from
-    #         compilation.  A possible workaround is to lie to Xcode about a
-    #         rule's output, giving it a dummy file it doesn't know how to
-    #         compile.  The rule action script would need to touch the dummy.
-    # 6584839 I need a way to declare additional inputs to a custom rule.
-    #         A possible workaround is a shell script phase prior to
-    #         compilation that touches a rule's primary input files if any
-    #         would-be additional inputs are newer than the output.  Modifying
-    #         the source tree - even just modification times - feels dirty.
-    # 6564240 Xcode "custom script" build rules always dump all environment
-      #         variables.  This is a low-priority problem and is not a
-    #         show-stopper.
-    rules_by_ext = {}
-    for rule in spec_rules:
-      rules_by_ext[rule['extension']] = rule
-
-      # First, some definitions:
-      #
-      # A "rule source" is a file that was listed in a target's "sources"
-      # list and will have a rule applied to it on the basis of matching the
-      # rule's "extensions" attribute.  Rule sources are direct inputs to
-      # rules.
-      #
-      # Rule definitions may specify additional inputs in their "inputs"
-      # attribute.  These additional inputs are used for dependency tracking
-      # purposes.
-      #
-      # A "concrete output" is a rule output with input-dependent variables
-      # resolved.  For example, given a rule with:
-      #   'extension': 'ext', 'outputs': ['$(INPUT_FILE_BASE).cc'],
-      # if the target's "sources" list contained "one.ext" and "two.ext",
-      # the "concrete output" for rule input "two.ext" would be "two.cc".  If
-      # a rule specifies multiple outputs, each input file that the rule is
-      # applied to will have the same number of concrete outputs.
-      #
-      # If any concrete outputs are outdated or missing relative to their
-      # corresponding rule_source or to any specified additional input, the
-      # rule action must be performed to generate the concrete outputs.
-
-      # concrete_outputs_by_rule_source will have an item at the same index
-      # as the rule['rule_sources'] that it corresponds to.  Each item is a
-      # list of all of the concrete outputs for the rule_source.
-      concrete_outputs_by_rule_source = []
-
-      # concrete_outputs_all is a flat list of all concrete outputs that this
-      # rule is able to produce, given the known set of input files
-      # (rule_sources) that apply to it.
-      concrete_outputs_all = []
-
-      # messages & actions are keyed by the same indices as rule['rule_sources']
-      # and concrete_outputs_by_rule_source.  They contain the message and
-      # action to perform after resolving input-dependent variables.  The
-      # message is optional, in which case None is stored for each rule source.
-      messages = []
-      actions = []
-
-      for rule_source in rule.get('rule_sources', []):
-        rule_source_dirname, rule_source_basename = \
-            posixpath.split(rule_source)
-        (rule_source_root, rule_source_ext) = \
-            posixpath.splitext(rule_source_basename)
-
-        # These are the same variable names that Xcode uses for its own native
-        # rule support.  Because Xcode's rule engine is not being used, they
-        # need to be expanded as they are written to the makefile.
-        rule_input_dict = {
-          'INPUT_FILE_BASE':   rule_source_root,
-          'INPUT_FILE_SUFFIX': rule_source_ext,
-          'INPUT_FILE_NAME':   rule_source_basename,
-          'INPUT_FILE_PATH':   rule_source,
-          'INPUT_FILE_DIRNAME': rule_source_dirname,
-        }
-
-        concrete_outputs_for_this_rule_source = []
-        for output in rule.get('outputs', []):
-          # Fortunately, Xcode and make both use $(VAR) format for their
-          # variables, so the expansion is the only transformation necessary.
-          # Any remaining $(VAR)-type variables in the string can be given
-          # directly to make, which will pick up the correct settings from
-          # what Xcode puts into the environment.
-          concrete_output = ExpandXcodeVariables(output, rule_input_dict)
-          concrete_outputs_for_this_rule_source.append(concrete_output)
-
-          # Add all concrete outputs to the project.
-          pbxp.AddOrGetFileInRootGroup(concrete_output)
-
-        concrete_outputs_by_rule_source.append( \
-            concrete_outputs_for_this_rule_source)
-        concrete_outputs_all.extend(concrete_outputs_for_this_rule_source)
-
-        # TODO(mark): Should verify that at most one of these is specified.
-        if int(rule.get('process_outputs_as_sources', False)):
-          for output in concrete_outputs_for_this_rule_source:
-            AddSourceToTarget(output, type, pbxp, xct)
-
-        # If the file came from the mac_bundle_resources list or if the rule
-        # is marked to process outputs as bundle resource, do so.
-        was_mac_bundle_resource = rule_source in tgt_mac_bundle_resources
-        if was_mac_bundle_resource or \
-            int(rule.get('process_outputs_as_mac_bundle_resources', False)):
-          for output in concrete_outputs_for_this_rule_source:
-            AddResourceToTarget(output, pbxp, xct)
-
-        # Do we have a message to print when this rule runs?
-        message = rule.get('message')
-        if message:
-          message = gyp.common.EncodePOSIXShellArgument(message)
-          message = ExpandXcodeVariables(message, rule_input_dict)
-        messages.append(message)
-
-        # Turn the list into a string that can be passed to a shell.
-        action_string = gyp.common.EncodePOSIXShellList(rule['action'])
-
-        action = ExpandXcodeVariables(action_string, rule_input_dict)
-        actions.append(action)
-
-      if len(concrete_outputs_all) > 0:
-        # TODO(mark): There's a possibility for collision here.  Consider
-        # target "t" rule "A_r" and target "t_A" rule "r".
-        makefile_name = '%s.make' % re.sub(
-            '[^a-zA-Z0-9_]', '_' , '%s_%s' % (target_name, rule['rule_name']))
-        makefile_path = os.path.join(xcode_projects[build_file].path,
-                                     makefile_name)
-        # TODO(mark): try/close?  Write to a temporary file and swap it only
-        # if it's got changes?
-        makefile = open(makefile_path, 'wb')
-
-        # make will build the first target in the makefile by default.  By
-        # convention, it's called "all".  List all (or at least one)
-        # concrete output for each rule source as a prerequisite of the "all"
-        # target.
-        makefile.write('all: \\\n')
-        for concrete_output_index in \
-            xrange(0, len(concrete_outputs_by_rule_source)):
-          # Only list the first (index [0]) concrete output of each input
-          # in the "all" target.  Otherwise, a parallel make (-j > 1) would
-          # attempt to process each input multiple times simultaneously.
-          # Otherwise, "all" could just contain the entire list of
-          # concrete_outputs_all.
-          concrete_output = \
-              concrete_outputs_by_rule_source[concrete_output_index][0]
-          if concrete_output_index == len(concrete_outputs_by_rule_source) - 1:
-            eol = ''
-          else:
-            eol = ' \\'
-          makefile.write('    %s%s\n' % (concrete_output, eol))
-
-        for (rule_source, concrete_outputs, message, action) in \
-            zip(rule['rule_sources'], concrete_outputs_by_rule_source,
-                messages, actions):
-          makefile.write('\n')
-
-          # Add a rule that declares it can build each concrete output of a
-          # rule source.  Collect the names of the directories that are
-          # required.
-          concrete_output_dirs = []
-          for concrete_output_index in xrange(0, len(concrete_outputs)):
-            concrete_output = concrete_outputs[concrete_output_index]
-            if concrete_output_index == 0:
-              bol = ''
-            else:
-              bol = '    '
-            makefile.write('%s%s \\\n' % (bol, concrete_output))
-
-            concrete_output_dir = posixpath.dirname(concrete_output)
-            if (concrete_output_dir and
-                concrete_output_dir not in concrete_output_dirs):
-              concrete_output_dirs.append(concrete_output_dir)
-
-          makefile.write('    : \\\n')
-
-          # The prerequisites for this rule are the rule source itself and
-          # the set of additional rule inputs, if any.
-          prerequisites = [rule_source]
-          prerequisites.extend(rule.get('inputs', []))
-          for prerequisite_index in xrange(0, len(prerequisites)):
-            prerequisite = prerequisites[prerequisite_index]
-            if prerequisite_index == len(prerequisites) - 1:
-              eol = ''
-            else:
-              eol = ' \\'
-            makefile.write('    %s%s\n' % (prerequisite, eol))
-
-          # Make sure that output directories exist before executing the rule
-          # action.
-          if len(concrete_output_dirs) > 0:
-            makefile.write('\t@mkdir -p "%s"\n' %
-                           '" "'.join(concrete_output_dirs))
-
-          # The rule message and action have already had the necessary variable
-          # substitutions performed.
-          if message:
-            # Mark it with note: so Xcode picks it up in build output.
-            makefile.write('\t@echo note: %s\n' % message)
-          makefile.write('\t%s\n' % action)
-
-        makefile.close()
-
-        # It might be nice to ensure that needed output directories exist
-        # here rather than in each target in the Makefile, but that wouldn't
-        # work if there ever was a concrete output that had an input-dependent
-        # variable anywhere other than in the leaf position.
-
-        # Don't declare any inputPaths or outputPaths.  If they're present,
-        # Xcode will provide a slight optimization by only running the script
-        # phase if any output is missing or outdated relative to any input.
-        # Unfortunately, it will also assume that all outputs are touched by
-        # the script, and if the outputs serve as files in a compilation
-        # phase, they will be unconditionally rebuilt.  Since make might not
-        # rebuild everything that could be declared here as an output, this
-        # extra compilation activity is unnecessary.  With inputPaths and
-        # outputPaths not supplied, make will always be called, but it knows
-        # enough to not do anything when everything is up-to-date.
-
-        # To help speed things up, pass -j COUNT to make so it does some work
-        # in parallel.  Don't use ncpus because Xcode will build ncpus targets
-        # in parallel and if each target happens to have a rules step, there
-        # would be ncpus^2 things going.  With a machine that has 2 quad-core
-        # Xeons, a build can quickly run out of processes based on
-        # scheduling/other tasks, and randomly failing builds are no good.
-        script = \
-"""JOB_COUNT="$(/usr/sbin/sysctl -n hw.ncpu)"
-if [ "${JOB_COUNT}" -gt 4 ]; then
-  JOB_COUNT=4
-fi
-exec "${DEVELOPER_BIN_DIR}/make" -f "${PROJECT_FILE_PATH}/%s" -j "${JOB_COUNT}"
-exit 1
-""" % makefile_name
-        ssbp = gyp.xcodeproj_file.PBXShellScriptBuildPhase({
-              'name': 'Rule "' + rule['rule_name'] + '"',
-              'shellScript': script,
-              'showEnvVarsInLog': 0,
-            })
-
-        if support_xct:
-          support_xct.AppendProperty('buildPhases', ssbp)
-        else:
-          # TODO(mark): this assumes too much knowledge of the internals of
-          # xcodeproj_file; some of these smarts should move into xcodeproj_file
-          # itself.
-          xct._properties['buildPhases'].insert(prebuild_index, ssbp)
-          prebuild_index = prebuild_index + 1
-
-      # Extra rule inputs also go into the project file.  Concrete outputs were
-      # already added when they were computed.
-      groups = ['inputs', 'inputs_excluded']
-      if skip_excluded_files:
-        groups = [x for x in groups if not x.endswith('_excluded')]
-      for group in groups:
-        for item in rule.get(group, []):
-          pbxp.AddOrGetFileInRootGroup(item)
-
-    # Add "sources".
-    for source in spec.get('sources', []):
-      (source_root, source_extension) = posixpath.splitext(source)
-      if source_extension[1:] not in rules_by_ext:
-        # AddSourceToTarget will add the file to a root group if it's not
-        # already there.
-        AddSourceToTarget(source, type, pbxp, xct)
-      else:
-        pbxp.AddOrGetFileInRootGroup(source)
-
-    # Add "mac_bundle_resources" and "mac_framework_private_headers" if
-    # it's a bundle of any type.
-    if is_bundle:
-      for resource in tgt_mac_bundle_resources:
-        (resource_root, resource_extension) = posixpath.splitext(resource)
-        if resource_extension[1:] not in rules_by_ext:
-          AddResourceToTarget(resource, pbxp, xct)
-        else:
-          pbxp.AddOrGetFileInRootGroup(resource)
-
-      for header in spec.get('mac_framework_private_headers', []):
-        AddHeaderToTarget(header, pbxp, xct, False)
-
-    # Add "mac_framework_headers". These can be valid for both frameworks
-    # and static libraries.
-    if is_bundle or type == 'static_library':
-      for header in spec.get('mac_framework_headers', []):
-        AddHeaderToTarget(header, pbxp, xct, True)
-
-    # Add "copies".
-    pbxcp_dict = {}
-    for copy_group in spec.get('copies', []):
-      dest = copy_group['destination']
-      if dest[0] not in ('/', '$'):
-        # Relative paths are relative to $(SRCROOT).
-        dest = '$(SRCROOT)/' + dest
-
-      # Coalesce multiple "copies" sections in the same target with the same
-      # "destination" property into the same PBXCopyFilesBuildPhase, otherwise
-      # they'll wind up with ID collisions.
-      pbxcp = pbxcp_dict.get(dest, None)
-      if pbxcp is None:
-        pbxcp = gyp.xcodeproj_file.PBXCopyFilesBuildPhase({
-              'name': 'Copy to ' + copy_group['destination']
-            },
-            parent=xct)
-        pbxcp.SetDestination(dest)
-
-        # TODO(mark): The usual comment about this knowing too much about
-        # gyp.xcodeproj_file internals applies.
-        xct._properties['buildPhases'].insert(prebuild_index, pbxcp)
-
-        pbxcp_dict[dest] = pbxcp
-
-      for file in copy_group['files']:
-        pbxcp.AddFile(file)
-
-    # Excluded files can also go into the project file.
-    if not skip_excluded_files:
-      for key in ['sources', 'mac_bundle_resources', 'mac_framework_headers',
-                  'mac_framework_private_headers']:
-        excluded_key = key + '_excluded'
-        for item in spec.get(excluded_key, []):
-          pbxp.AddOrGetFileInRootGroup(item)
-
-    # So can "inputs" and "outputs" sections of "actions" groups.
-    groups = ['inputs', 'inputs_excluded', 'outputs', 'outputs_excluded']
-    if skip_excluded_files:
-      groups = [x for x in groups if not x.endswith('_excluded')]
-    for action in spec.get('actions', []):
-      for group in groups:
-        for item in action.get(group, []):
-          # Exclude anything in BUILT_PRODUCTS_DIR.  They're products, not
-          # sources.
-          if not item.startswith('$(BUILT_PRODUCTS_DIR)/'):
-            pbxp.AddOrGetFileInRootGroup(item)
-
-    for postbuild in spec.get('postbuilds', []):
-      action_string_sh = gyp.common.EncodePOSIXShellList(postbuild['action'])
-      script = 'exec ' + action_string_sh + '\nexit 1\n'
-
-      # Make the postbuild step depend on the output of ld or ar from this
-      # target. Apparently putting the script step after the link step isn't
-      # sufficient to ensure proper ordering in all cases. With an input
-      # declared but no outputs, the script step should run every time, as
-      # desired.
-      ssbp = gyp.xcodeproj_file.PBXShellScriptBuildPhase({
-            'inputPaths': ['$(BUILT_PRODUCTS_DIR)/$(EXECUTABLE_PATH)'],
-            'name': 'Postbuild "' + postbuild['postbuild_name'] + '"',
-            'shellScript': script,
-            'showEnvVarsInLog': 0,
-          })
-      xct.AppendProperty('buildPhases', ssbp)
-
-    # Add dependencies before libraries, because adding a dependency may imply
-    # adding a library.  It's preferable to keep dependencies listed first
-    # during a link phase so that they can override symbols that would
-    # otherwise be provided by libraries, which will usually include system
-    # libraries.  On some systems, ld is finicky and even requires the
-    # libraries to be ordered in such a way that unresolved symbols in
-    # earlier-listed libraries may only be resolved by later-listed libraries.
-    # The Mac linker doesn't work that way, but other platforms do, and so
-    # their linker invocations need to be constructed in this way.  There's
-    # no compelling reason for Xcode's linker invocations to differ.
-
-    if 'dependencies' in spec:
-      for dependency in spec['dependencies']:
-        xct.AddDependency(xcode_targets[dependency])
-        # The support project also gets the dependencies (in case they are
-        # needed for the actions/rules to work).
-        if support_xct:
-          support_xct.AddDependency(xcode_targets[dependency])
-
-    if 'libraries' in spec:
-      for library in spec['libraries']:
-        xct.FrameworksPhase().AddFile(library)
-        # Add the library's directory to LIBRARY_SEARCH_PATHS if necessary.
-        # I wish Xcode handled this automatically.
-        library_dir = posixpath.dirname(library)
-        if library_dir not in xcode_standard_library_dirs and (
-            not xct.HasBuildSetting(_library_search_paths_var) or
-            library_dir not in xct.GetBuildSetting(_library_search_paths_var)):
-          xct.AppendBuildSetting(_library_search_paths_var, library_dir)
-
-    for configuration_name in configuration_names:
-      configuration = spec['configurations'][configuration_name]
-      xcbc = xct.ConfigurationNamed(configuration_name)
-      for include_dir in configuration.get('mac_framework_dirs', []):
-        xcbc.AppendBuildSetting('FRAMEWORK_SEARCH_PATHS', include_dir)
-      for include_dir in configuration.get('include_dirs', []):
-        xcbc.AppendBuildSetting('HEADER_SEARCH_PATHS', include_dir)
-      if 'defines' in configuration:
-        for define in configuration['defines']:
-          set_define = EscapeXCodeArgument(define)
-          xcbc.AppendBuildSetting('GCC_PREPROCESSOR_DEFINITIONS', set_define)
-      if 'xcode_settings' in configuration:
-        for xck, xcv in configuration['xcode_settings'].iteritems():
-          xcbc.SetBuildSetting(xck, xcv)
-      if 'xcode_config_file' in configuration:
-        config_ref = pbxp.AddOrGetFileInRootGroup(
-            configuration['xcode_config_file'])
-        xcbc.SetBaseConfiguration(config_ref)
-
-  build_files = []
-  for build_file, build_file_dict in data.iteritems():
-    if build_file.endswith('.gyp'):
-      build_files.append(build_file)
-
-  for build_file in build_files:
-    xcode_projects[build_file].Finalize1(xcode_targets, serialize_all_tests)
-
-  for build_file in build_files:
-    xcode_projects[build_file].Finalize2(xcode_targets,
-                                         xcode_target_to_target_dict)
-
-  for build_file in build_files:
-    xcode_projects[build_file].Write()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/input.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2679 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-from compiler.ast import Const
-from compiler.ast import Dict
-from compiler.ast import Discard
-from compiler.ast import List
-from compiler.ast import Module
-from compiler.ast import Node
-from compiler.ast import Stmt
-import compiler
-import copy
-import gyp.common
-import multiprocessing
-import optparse
-import os.path
-import re
-import shlex
-import signal
-import subprocess
-import sys
-import threading
-import time
-from gyp.common import GypError
-
-
-# A list of types that are treated as linkable.
-linkable_types = ['executable', 'shared_library', 'loadable_module']
-
-# A list of sections that contain links to other targets.
-dependency_sections = ['dependencies', 'export_dependent_settings']
-
-# base_path_sections is a list of sections defined by GYP that contain
-# pathnames.  The generators can provide more keys, the two lists are merged
-# into path_sections, but you should call IsPathSection instead of using either
-# list directly.
-base_path_sections = [
-  'destination',
-  'files',
-  'include_dirs',
-  'inputs',
-  'libraries',
-  'outputs',
-  'sources',
-]
-path_sections = []
-
-is_path_section_charset = set('=+?!')
-is_path_section_match_re = re.compile('_(dir|file|path)s?$')
-
-def IsPathSection(section):
-  # If section ends in one of these characters, it's applied to a section
-  # without the trailing characters.  '/' is notably absent from this list,
-  # because there's no way for a regular expression to be treated as a path.
-  while section[-1:] in is_path_section_charset:
-    section = section[:-1]
-  return section in path_sections or is_path_section_match_re.search(section)
-
-# base_non_configuration_keys is a list of key names that belong in the target
-# itself and should not be propagated into its configurations.  It is merged
-# with a list that can come from the generator to
-# create non_configuration_keys.
-base_non_configuration_keys = [
-  # Sections that must exist inside targets and not configurations.
-  'actions',
-  'configurations',
-  'copies',
-  'default_configuration',
-  'dependencies',
-  'dependencies_original',
-  'link_languages',
-  'libraries',
-  'postbuilds',
-  'product_dir',
-  'product_extension',
-  'product_name',
-  'product_prefix',
-  'rules',
-  'run_as',
-  'sources',
-  'standalone_static_library',
-  'suppress_wildcard',
-  'target_name',
-  'toolset',
-  'toolsets',
-  'type',
-  'variants',
-
-  # Sections that can be found inside targets or configurations, but that
-  # should not be propagated from targets into their configurations.
-  'variables',
-]
-non_configuration_keys = []
-
-# Keys that do not belong inside a configuration dictionary.
-invalid_configuration_keys = [
-  'actions',
-  'all_dependent_settings',
-  'configurations',
-  'dependencies',
-  'direct_dependent_settings',
-  'libraries',
-  'link_settings',
-  'sources',
-  'standalone_static_library',
-  'target_name',
-  'type',
-]
-
-# Controls how the generator wants the build file paths.
-absolute_build_file_paths = False
-
-# Controls whether or not the generator supports multiple toolsets.
-multiple_toolsets = False
-
-
-def GetIncludedBuildFiles(build_file_path, aux_data, included=None):
-  """Return a list of all build files included into build_file_path.
-
-  The returned list will contain build_file_path as well as all other files
-  that it included, either directly or indirectly.  Note that the list may
-  contain files that were included into a conditional section that evaluated
-  to false and was not merged into build_file_path's dict.
-
-  aux_data is a dict containing a key for each build file or included build
-  file.  Those keys provide access to dicts whose "included" keys contain
-  lists of all other files included by the build file.
-
-  included should be left at its default None value by external callers.  It
-  is used for recursion.
-
-  The returned list will not contain any duplicate entries.  Each build file
-  in the list will be relative to the current directory.
-  """
-
-  if included == None:
-    included = []
-
-  if build_file_path in included:
-    return included
-
-  included.append(build_file_path)
-
-  for included_build_file in aux_data[build_file_path].get('included', []):
-    GetIncludedBuildFiles(included_build_file, aux_data, included)
-
-  return included
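Outside the diff context, the traversal above can be exercised as a standalone sketch; the `aux_data` contents below are hypothetical, chosen only to show that duplicates are collected once, in discovery order:

```python
def get_included_build_files(build_file_path, aux_data, included=None):
    # Depth-first walk over the "included" lists recorded in aux_data,
    # collecting each build file at most once, in discovery order.
    if included is None:
        included = []
    if build_file_path in included:
        return included
    included.append(build_file_path)
    for included_build_file in aux_data[build_file_path].get('included', []):
        get_included_build_files(included_build_file, aux_data, included)
    return included

# Hypothetical include graph: a.gyp includes common.gypi and mac.gypi,
# and mac.gypi also includes common.gypi (the duplicate must not repeat).
aux_data = {
    'a.gyp': {'included': ['common.gypi', 'mac.gypi']},
    'common.gypi': {},
    'mac.gypi': {'included': ['common.gypi']},
}
print(get_included_build_files('a.gyp', aux_data))
# → ['a.gyp', 'common.gypi', 'mac.gypi']
```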
-
-
-def CheckedEval(file_contents):
-  """Return the eval of a gyp file.
-
-  The gyp file is restricted to dictionaries and lists only, and
-  repeated keys are not allowed.
-
-  Note that this is slower than eval() is.
-  """
-
-  ast = compiler.parse(file_contents)
-  assert isinstance(ast, Module)
-  c1 = ast.getChildren()
-  assert c1[0] is None
-  assert isinstance(c1[1], Stmt)
-  c2 = c1[1].getChildren()
-  assert isinstance(c2[0], Discard)
-  c3 = c2[0].getChildren()
-  assert len(c3) == 1
-  return CheckNode(c3[0], [])
-
-
-def CheckNode(node, keypath):
-  if isinstance(node, Dict):
-    c = node.getChildren()
-    dict = {}
-    for n in range(0, len(c), 2):
-      assert isinstance(c[n], Const)
-      key = c[n].getChildren()[0]
-      if key in dict:
-        raise GypError("Key '" + key + "' repeated at level " +
-              repr(len(keypath) + 1) + " with key path '" +
-              '.'.join(keypath) + "'")
-      kp = list(keypath)  # Make a copy of the list for descending this node.
-      kp.append(key)
-      dict[key] = CheckNode(c[n + 1], kp)
-    return dict
-  elif isinstance(node, List):
-    c = node.getChildren()
-    children = []
-    for index, child in enumerate(c):
-      kp = list(keypath)  # Copy list.
-      kp.append(repr(index))
-      children.append(CheckNode(child, kp))
-    return children
-  elif isinstance(node, Const):
-    return node.getChildren()[0]
-  else:
-    raise TypeError, "Unknown AST node at key path '" + '.'.join(keypath) + \
-         "': " + repr(node)
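The `CheckedEval`/`CheckNode` pair above relies on Python 2's long-removed `compiler` module. On Python 3 the same restricted-eval idea (dicts, lists, and constants only, with duplicate-key detection) can be sketched with the standard `ast` module; the names here are illustrative, not part of gyp:

```python
import ast

def checked_eval(file_contents):
    # Parse as a single expression and validate the tree node by node.
    tree = ast.parse(file_contents, mode='eval')
    return check_node(tree.body, [])

def check_node(node, keypath):
    if isinstance(node, ast.Dict):
        result = {}
        for key_node, value_node in zip(node.keys, node.values):
            if not isinstance(key_node, ast.Constant):
                raise TypeError("Dict key must be a constant at %r" % keypath)
            key = key_node.value
            if key in result:
                # Mirrors the repeated-key check in CheckNode above.
                raise ValueError("Key %r repeated at key path %r" % (key, keypath))
            result[key] = check_node(value_node, keypath + [str(key)])
        return result
    elif isinstance(node, ast.List):
        return [check_node(child, keypath + [str(i)])
                for i, child in enumerate(node.elts)]
    elif isinstance(node, ast.Constant):
        return node.value
    raise TypeError("Unknown AST node at key path %r: %r" % (keypath, node))
```

Note that plain `ast.literal_eval` would accept the same inputs but silently keeps the last value for a repeated key, which is exactly what the duplicate check is there to catch.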
-
-
-def LoadOneBuildFile(build_file_path, data, aux_data, variables, includes,
-                     is_target, check):
-  if build_file_path in data:
-    return data[build_file_path]
-
-  if os.path.exists(build_file_path):
-    build_file_contents = open(build_file_path).read()
-  else:
-    raise GypError("%s not found (cwd: %s)" % (build_file_path, os.getcwd()))
-
-  build_file_data = None
-  try:
-    if check:
-      build_file_data = CheckedEval(build_file_contents)
-    else:
-      build_file_data = eval(build_file_contents, {'__builtins__': None},
-                             None)
-  except SyntaxError, e:
-    e.filename = build_file_path
-    raise
-  except Exception, e:
-    gyp.common.ExceptionAppend(e, 'while reading ' + build_file_path)
-    raise
-
-  data[build_file_path] = build_file_data
-  aux_data[build_file_path] = {}
-
-  # Scan for includes and merge them in.
-  try:
-    if is_target:
-      LoadBuildFileIncludesIntoDict(build_file_data, build_file_path, data,
-                                    aux_data, variables, includes, check)
-    else:
-      LoadBuildFileIncludesIntoDict(build_file_data, build_file_path, data,
-                                    aux_data, variables, None, check)
-  except Exception, e:
-    gyp.common.ExceptionAppend(e,
-                               'while reading includes of ' + build_file_path)
-    raise
-
-  return build_file_data
-
-
-def LoadBuildFileIncludesIntoDict(subdict, subdict_path, data, aux_data,
-                                  variables, includes, check):
-  includes_list = []
-  if includes != None:
-    includes_list.extend(includes)
-  if 'includes' in subdict:
-    for include in subdict['includes']:
-      # "include" is specified relative to subdict_path, so compute the real
-      # path to include by appending the provided "include" to the directory
-      # in which subdict_path resides.
-      relative_include = \
-          os.path.normpath(os.path.join(os.path.dirname(subdict_path), include))
-      includes_list.append(relative_include)
-    # Unhook the includes list, it's no longer needed.
-    del subdict['includes']
-
-  # Merge in the included files.
-  for include in includes_list:
-    if not 'included' in aux_data[subdict_path]:
-      aux_data[subdict_path]['included'] = []
-    aux_data[subdict_path]['included'].append(include)
-
-    gyp.DebugOutput(gyp.DEBUG_INCLUDES, "Loading Included File: '%s'", include)
-
-    MergeDicts(subdict,
-               LoadOneBuildFile(include, data, aux_data, variables, None,
-                                False, check),
-               subdict_path, include)
-
-  # Recurse into subdictionaries.
-  for k, v in subdict.iteritems():
-    if v.__class__ == dict:
-      LoadBuildFileIncludesIntoDict(v, subdict_path, data, aux_data, variables,
-                                    None, check)
-    elif v.__class__ == list:
-      LoadBuildFileIncludesIntoList(v, subdict_path, data, aux_data, variables,
-                                    check)
-
-
-# This recurses into lists so that it can look for dicts.
-def LoadBuildFileIncludesIntoList(sublist, sublist_path, data, aux_data,
-                                  variables, check):
-  for item in sublist:
-    if item.__class__ == dict:
-      LoadBuildFileIncludesIntoDict(item, sublist_path, data, aux_data,
-                                    variables, None, check)
-    elif item.__class__ == list:
-      LoadBuildFileIncludesIntoList(item, sublist_path, data, aux_data,
-                                    variables, check)
-
-# Processes toolsets in all the targets. This recurses into condition entries
-# since they can contain toolsets as well.
-def ProcessToolsetsInDict(data):
-  if 'targets' in data:
-    target_list = data['targets']
-    new_target_list = []
-    for target in target_list:
-      # If this target already has an explicit 'toolset', and no 'toolsets'
-      # list, don't modify it further.
-      if 'toolset' in target and 'toolsets' not in target:
-        new_target_list.append(target)
-        continue
-      if multiple_toolsets:
-        toolsets = target.get('toolsets', ['target'])
-      else:
-        toolsets = ['target']
-      # Make sure this 'toolsets' definition is only processed once.
-      if 'toolsets' in target:
-        del target['toolsets']
-      if len(toolsets) > 0:
-        # Optimization: only do copies if more than one toolset is specified.
-        for build in toolsets[1:]:
-          new_target = copy.deepcopy(target)
-          new_target['toolset'] = build
-          new_target_list.append(new_target)
-        target['toolset'] = toolsets[0]
-        new_target_list.append(target)
-    data['targets'] = new_target_list
-  if 'conditions' in data:
-    for condition in data['conditions']:
-      if isinstance(condition, list):
-        for condition_dict in condition[1:]:
-          ProcessToolsetsInDict(condition_dict)
-
-
-# TODO(mark): I don't love this name.  It just means that it's going to load
-# a build file that contains targets and is expected to provide a targets dict
-# that contains the targets...
-def LoadTargetBuildFile(build_file_path, data, aux_data, variables, includes,
-                        depth, check, load_dependencies):
-  # If depth is set, predefine the DEPTH variable to be a relative path from
-  # this build file's directory to the directory identified by depth.
-  if depth:
-    # TODO(dglazkov) The backslash/forward-slash replacement at the end is a
-    # temporary measure. This should really be addressed by keeping all paths
-    # in POSIX until actual project generation.
-    d = gyp.common.RelativePath(depth, os.path.dirname(build_file_path))
-    if d == '':
-      variables['DEPTH'] = '.'
-    else:
-      variables['DEPTH'] = d.replace('\\', '/')
-
-  # If the generator needs absolute paths, convert the build file path now.
-  if absolute_build_file_paths:
-    build_file_path = os.path.abspath(build_file_path)
-
-  if build_file_path in data['target_build_files']:
-    # Already loaded.
-    return False
-  data['target_build_files'].add(build_file_path)
-
-  gyp.DebugOutput(gyp.DEBUG_INCLUDES,
-                  "Loading Target Build File '%s'", build_file_path)
-
-  build_file_data = LoadOneBuildFile(build_file_path, data, aux_data, variables,
-                                     includes, True, check)
-
-  # Store DEPTH for later use in generators.
-  build_file_data['_DEPTH'] = depth
-
-  # Set up the included_files key indicating which .gyp files contributed to
-  # this target dict.
-  if 'included_files' in build_file_data:
-    raise GypError(build_file_path + ' must not contain included_files key')
-
-  included = GetIncludedBuildFiles(build_file_path, aux_data)
-  build_file_data['included_files'] = []
-  for included_file in included:
-    # included_file is relative to the current directory, but it needs to
-    # be made relative to build_file_path's directory.
-    included_relative = \
-        gyp.common.RelativePath(included_file,
-                                os.path.dirname(build_file_path))
-    build_file_data['included_files'].append(included_relative)
-
-  # Do a first round of toolsets expansion so that conditions can be defined
-  # per toolset.
-  ProcessToolsetsInDict(build_file_data)
-
-  # Apply "pre"/"early" variable expansions and condition evaluations.
-  ProcessVariablesAndConditionsInDict(
-      build_file_data, PHASE_EARLY, variables, build_file_path)
-
-  # Since some toolsets might have been defined conditionally, perform
-  # a second round of toolsets expansion now.
-  ProcessToolsetsInDict(build_file_data)
-
-  # Look at each project's target_defaults dict, and merge settings into
-  # targets.
-  if 'target_defaults' in build_file_data:
-    if 'targets' not in build_file_data:
-      raise GypError("Unable to find targets in build file %s" %
-                     build_file_path)
-
-    index = 0
-    while index < len(build_file_data['targets']):
-      # This procedure needs to give the impression that target_defaults is
-      # used as defaults, and the individual targets inherit from that.
-      # The individual targets need to be merged into the defaults.  Make
-      # a deep copy of the defaults for each target, merge the target dict
-      # as found in the input file into that copy, and then hook up the
-      # copy with the target-specific data merged into it as the replacement
-      # target dict.
-      old_target_dict = build_file_data['targets'][index]
-      new_target_dict = copy.deepcopy(build_file_data['target_defaults'])
-      MergeDicts(new_target_dict, old_target_dict,
-                 build_file_path, build_file_path)
-      build_file_data['targets'][index] = new_target_dict
-      index += 1
-
-    # No longer needed.
-    del build_file_data['target_defaults']
-
-  # Look for dependencies.  This means that dependency resolution occurs
-  # after "pre" conditionals and variable expansion, but before "post" -
-  # in other words, you can't put a "dependencies" section inside a "post"
-  # conditional within a target.
-
-  dependencies = []
-  if 'targets' in build_file_data:
-    for target_dict in build_file_data['targets']:
-      if 'dependencies' not in target_dict:
-        continue
-      for dependency in target_dict['dependencies']:
-        dependencies.append(
-            gyp.common.ResolveTarget(build_file_path, dependency, None)[0])
-
-  if load_dependencies:
-    for dependency in dependencies:
-      try:
-        LoadTargetBuildFile(dependency, data, aux_data, variables,
-                            includes, depth, check, load_dependencies)
-      except Exception, e:
-        gyp.common.ExceptionAppend(
-          e, 'while loading dependencies of %s' % build_file_path)
-        raise
-  else:
-    return (build_file_path, dependencies)
-
-
-def CallLoadTargetBuildFile(global_flags,
-                            build_file_path, data,
-                            aux_data, variables,
-                            includes, depth, check):
-  """Wrapper around LoadTargetBuildFile for parallel processing.
-
-     This wrapper is used when LoadTargetBuildFile is executed in
-     a worker process.
-  """
-
-  try:
-    signal.signal(signal.SIGINT, signal.SIG_IGN)
-
-    # Apply globals so that the worker process behaves the same.
-    for key, value in global_flags.iteritems():
-      globals()[key] = value
-
-    # Save the keys so we can return data that changed.
-    data_keys = set(data)
-    aux_data_keys = set(aux_data)
-
-    result = LoadTargetBuildFile(build_file_path, data,
-                                 aux_data, variables,
-                                 includes, depth, check, False)
-    if not result:
-      return result
-
-    (build_file_path, dependencies) = result
-
-    data_out = {}
-    for key in data:
-      if key == 'target_build_files':
-        continue
-      if key not in data_keys:
-        data_out[key] = data[key]
-    aux_data_out = {}
-    for key in aux_data:
-      if key not in aux_data_keys:
-        aux_data_out[key] = aux_data[key]
-
-    # This gets serialized and sent back to the main process via a pipe.
-    # It's handled in LoadTargetBuildFileCallback.
-    return (build_file_path,
-            data_out,
-            aux_data_out,
-            dependencies)
-  except Exception, e:
-    print >>sys.stderr, 'Exception: ', e
-    return None
-
-
-class ParallelProcessingError(Exception):
-  pass
-
-
-class ParallelState(object):
-  """Class to keep track of state when processing input files in parallel.
-
-  If build files are loaded in parallel, use this to keep track of
-  state during farming out and processing parallel jobs. It's stored
-  in a global so that the callback function can have access to it.
-  """
-
-  def __init__(self):
-    # The multiprocessing pool.
-    self.pool = None
-    # The condition variable used to protect this object and notify
-    # the main loop when there might be more data to process.
-    self.condition = None
-    # The "data" dict that was passed to LoadTargetBuildFileParallel
-    self.data = None
-    # The "aux_data" dict that was passed to LoadTargetBuildFileParallel
-    self.aux_data = None
-    # The number of parallel calls outstanding; decremented when a response
-    # was received.
-    self.pending = 0
-    # The set of all build files that have been scheduled, so we don't
-    # schedule the same one twice.
-    self.scheduled = set()
-    # A list of dependency build file paths that haven't been scheduled yet.
-    self.dependencies = []
-    # Flag to indicate if there was an error in a child process.
-    self.error = False
-
-  def LoadTargetBuildFileCallback(self, result):
-    """Handle the results of running LoadTargetBuildFile in another process.
-    """
-    self.condition.acquire()
-    if not result:
-      self.error = True
-      self.condition.notify()
-      self.condition.release()
-      return
-    (build_file_path0, data0, aux_data0, dependencies0) = result
-    self.data['target_build_files'].add(build_file_path0)
-    for key in data0:
-      self.data[key] = data0[key]
-    for key in aux_data0:
-      self.aux_data[key] = aux_data0[key]
-    for new_dependency in dependencies0:
-      if new_dependency not in self.scheduled:
-        self.scheduled.add(new_dependency)
-        self.dependencies.append(new_dependency)
-    self.pending -= 1
-    self.condition.notify()
-    self.condition.release()
-
-
-def LoadTargetBuildFileParallel(build_file_path, data, aux_data,
-                                variables, includes, depth, check):
-  parallel_state = ParallelState()
-  parallel_state.condition = threading.Condition()
-  parallel_state.dependencies = [build_file_path]
-  parallel_state.scheduled = set([build_file_path])
-  parallel_state.pending = 0
-  parallel_state.data = data
-  parallel_state.aux_data = aux_data
-
-  try:
-    parallel_state.condition.acquire()
-    while parallel_state.dependencies or parallel_state.pending:
-      if parallel_state.error:
-        print >>sys.stderr, (
-            '\n'
-            'Note: an error occurred while running gyp using multiprocessing.\n'
-            'For more verbose output, set GYP_PARALLEL=0 in your environment.\n'
-            'If the error only occurs when GYP_PARALLEL=1, '
-            'please report a bug!')
-        break
-      if not parallel_state.dependencies:
-        parallel_state.condition.wait()
-        continue
-
-      dependency = parallel_state.dependencies.pop()
-
-      parallel_state.pending += 1
-      data_in = {}
-      data_in['target_build_files'] = data['target_build_files']
-      aux_data_in = {}
-      global_flags = {
-        'path_sections': globals()['path_sections'],
-        'non_configuration_keys': globals()['non_configuration_keys'],
-        'absolute_build_file_paths': globals()['absolute_build_file_paths'],
-        'multiple_toolsets': globals()['multiple_toolsets']}
-
-      if not parallel_state.pool:
-        parallel_state.pool = multiprocessing.Pool(8)
-      parallel_state.pool.apply_async(
-          CallLoadTargetBuildFile,
-          args = (global_flags, dependency,
-                  data_in, aux_data_in,
-                  variables, includes, depth, check),
-          callback = parallel_state.LoadTargetBuildFileCallback)
-  except KeyboardInterrupt, e:
-    parallel_state.pool.terminate()
-    raise e
-
-  parallel_state.condition.release()
-  if parallel_state.error:
-    sys.exit()
-
-
-# Look for the bracket that matches the first bracket seen in a
-# string, and return the start and end as a tuple.  For example, if
-# the input is something like "<(foo <(bar)) blah", then it would
-# return (1, 13), indicating the entire string except for the leading
-# "<" and trailing " blah".
-LBRACKETS= set('{[(')
-BRACKETS = {'}': '{', ']': '[', ')': '('}
-def FindEnclosingBracketGroup(input_str):
-  stack = []
-  start = -1
-  for index, char in enumerate(input_str):
-    if char in LBRACKETS:
-      stack.append(char)
-      if start == -1:
-        start = index
-    elif char in BRACKETS:
-      if not stack:
-        return (-1, -1)
-      if stack.pop() != BRACKETS[char]:
-        return (-1, -1)
-      if not stack:
-        return (start, index + 1)
-  return (-1, -1)
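A standalone copy of the helper, exercised against the example from the comment above (the inner `<` characters are not brackets, so only `(`/`)` pairing matters here):

```python
LBRACKETS = set('{[(')
BRACKETS = {'}': '{', ']': '[', ')': '('}

def find_enclosing_bracket_group(input_str):
    # Track open brackets on a stack; the group ends when the stack
    # first empties again after the first opening bracket was seen.
    stack = []
    start = -1
    for index, char in enumerate(input_str):
        if char in LBRACKETS:
            stack.append(char)
            if start == -1:
                start = index
        elif char in BRACKETS:
            if not stack or stack.pop() != BRACKETS[char]:
                return (-1, -1)  # unbalanced or mismatched bracket
            if not stack:
                return (start, index + 1)
    return (-1, -1)

print(find_enclosing_bracket_group('<(foo <(bar)) blah'))  # → (1, 13)
```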
-
-
-canonical_int_re = re.compile('(0|-?[1-9][0-9]*)$')
-
-
-def IsStrCanonicalInt(string):
-  """Returns True if |string| is in its canonical integer form.
-
-  The canonical form is such that str(int(string)) == string.
-  """
-  return isinstance(string, str) and canonical_int_re.match(string)
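The canonical form can be checked in isolation; this is a direct restatement of the helper above, showing which strings round-trip through `str(int(s))` and which do not:

```python
import re

canonical_int_re = re.compile('(0|-?[1-9][0-9]*)$')

def is_str_canonical_int(string):
    # Canonical means str(int(string)) == string: no leading zeros,
    # no '+' sign, no '-0', and no surrounding whitespace.
    return isinstance(string, str) and bool(canonical_int_re.match(string))

for s in ('10', '0', '-5'):
    assert is_str_canonical_int(s)
for s in ('010', '-0', '+1', ' 1', '1.0'):
    assert not is_str_canonical_int(s)
```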
-
-
-# This matches things like "<(asdf)", "<!(cmd)", "<!@(cmd)", "<|(list)",
-# "<!interpreter(arguments)", "<([list])", and even "<([)" and "<(<())".
-# In the last case, the inner "<()" is captured in match['content'].
-early_variable_re = re.compile(
-    '(?P<replace>(?P<type><(?:(?:!?@?)|\|)?)'
-    '(?P<command_string>[-a-zA-Z0-9_.]+)?'
-    '\((?P<is_array>\s*\[?)'
-    '(?P<content>.*?)(\]?)\))')
-
-# This matches the same as early_variable_re, but with '>' instead of '<'.
-late_variable_re = re.compile(
-    '(?P<replace>(?P<type>>(?:(?:!?@?)|\|)?)'
-    '(?P<command_string>[-a-zA-Z0-9_.]+)?'
-    '\((?P<is_array>\s*\[?)'
-    '(?P<content>.*?)(\]?)\))')
-
-# This matches the same as early_variable_re, but with '^' instead of '<'.
-latelate_variable_re = re.compile(
-    '(?P<replace>(?P<type>[\^](?:(?:!?@?)|\|)?)'
-    '(?P<command_string>[-a-zA-Z0-9_.]+)?'
-    '\((?P<is_array>\s*\[?)'
-    '(?P<content>.*?)(\]?)\))')
-
-# Global cache of results from running commands so they don't have to be run
-# more than once.
-cached_command_results = {}
-
-
-def FixupPlatformCommand(cmd):
-  if sys.platform == 'win32':
-    if type(cmd) == list:
-      cmd = [re.sub('^cat ', 'type ', cmd[0])] + cmd[1:]
-    else:
-      cmd = re.sub('^cat ', 'type ', cmd)
-  return cmd
-
-
-PHASE_EARLY = 0
-PHASE_LATE = 1
-PHASE_LATELATE = 2
-
-
-def ExpandVariables(input, phase, variables, build_file):
-  # Look for the pattern that gets expanded into variables
-  if phase == PHASE_EARLY:
-    variable_re = early_variable_re
-    expansion_symbol = '<'
-  elif phase == PHASE_LATE:
-    variable_re = late_variable_re
-    expansion_symbol = '>'
-  elif phase == PHASE_LATELATE:
-    variable_re = latelate_variable_re
-    expansion_symbol = '^'
-  else:
-    assert False
-
-  input_str = str(input)
-  if IsStrCanonicalInt(input_str):
-    return int(input_str)
-
-  # Do a quick scan to determine if an expensive regex search is warranted.
-  if expansion_symbol not in input_str:
-    return input_str
-
-  # Get the entire list of matches as a list of MatchObject instances.
-  # (using findall here would return strings instead of MatchObjects).
-  matches = list(variable_re.finditer(input_str))
-  if not matches:
-    return input_str
-
-  output = input_str
-  # Reverse the list of matches so that replacements are done right-to-left.
-  # That ensures that earlier replacements won't mess up the string in a
-  # way that causes later calls to find the earlier substituted text instead
-  # of what's intended for replacement.
-  matches.reverse()
-  for match_group in matches:
-    match = match_group.groupdict()
-    gyp.DebugOutput(gyp.DEBUG_VARIABLES, "Matches: %r", match)
-    # match['replace'] is the substring to look for, match['type']
-    # is the character code for the replacement type (< > <! >! <| >| <@
-    # >@ <!@ >!@), match['is_array'] contains a '[' for command
-    # arrays, and match['content'] is the name of the variable (< >)
-    # or command to run (<! >!). match['command_string'] is an optional
-    # command string. Currently, only 'pymod_do_main' is supported.
-
-    # run_command is true if a ! variant is used.
-    run_command = '!' in match['type']
-    command_string = match['command_string']
-
-    # file_list is true if a | variant is used.
-    file_list = '|' in match['type']
-
-    # Capture these now so we can adjust them later.
-    replace_start = match_group.start('replace')
-    replace_end = match_group.end('replace')
-
-    # Find the ending paren, and re-evaluate the contained string.
-    (c_start, c_end) = FindEnclosingBracketGroup(input_str[replace_start:])
-
-    # Adjust the replacement range to match the entire command
-    # found by FindEnclosingBracketGroup (since the variable_re
-    # probably doesn't match the entire command if it contained
-    # nested variables).
-    replace_end = replace_start + c_end
-
-    # Find the "real" replacement, matching the appropriate closing
-    # paren, and adjust the replacement start and end.
-    replacement = input_str[replace_start:replace_end]
-
-    # Figure out what the contents of the variable parens are.
-    contents_start = replace_start + c_start + 1
-    contents_end = replace_end - 1
-    contents = input_str[contents_start:contents_end]
-
-    # Do filter substitution now for <|().
-    # Admittedly, this is different than the evaluation order in other
-    # contexts. However, since filtration has no chance to run on <|(),
-    # this seems like the only obvious way to give them access to filters.
-    if file_list:
-      processed_variables = copy.deepcopy(variables)
-      ProcessListFiltersInDict(contents, processed_variables)
-      # Recurse to expand variables in the contents
-      contents = ExpandVariables(contents, phase,
-                                 processed_variables, build_file)
-    else:
-      # Recurse to expand variables in the contents
-      contents = ExpandVariables(contents, phase, variables, build_file)
-
-    # Strip off leading/trailing whitespace so that variable matches are
-    # simpler below (and because they are rarely needed).
-    contents = contents.strip()
-
-    # expand_to_list is true if an @ variant is used.  In that case,
-    # the expansion should result in a list.  Note that the caller must
-    # then be expecting a list in return, and not all callers do because
-    # not all are working in list context.  Also, for list expansions,
-    # there can be no other text besides the variable expansion in the
-    # input string.
-    expand_to_list = '@' in match['type'] and input_str == replacement
-
-    if run_command or file_list:
-      # Find the build file's directory, so commands can be run or file lists
-      # generated relative to it.
-      build_file_dir = os.path.dirname(build_file)
-      if build_file_dir == '':
-        # If build_file is just a leaf filename indicating a file in the
-        # current directory, build_file_dir might be an empty string.  Set
-        # it to None to signal to subprocess.Popen that it should run the
-        # command in the current directory.
-        build_file_dir = None
-
-    # Support <|(listfile.txt ...) which generates a file
-    # containing items from a gyp list, generated at gyp time.
-    # This works around actions/rules which have more inputs than will
-    # fit on the command line.
-    if file_list:
-      if type(contents) == list:
-        contents_list = contents
-      else:
-        contents_list = contents.split(' ')
-      replacement = contents_list[0]
-      path = replacement
-      if not os.path.isabs(path):
-        path = os.path.join(build_file_dir, path)
-      f = gyp.common.WriteOnDiff(path)
-      for i in contents_list[1:]:
-        f.write('%s\n' % i)
-      f.close()
-
-    elif run_command:
-      use_shell = True
-      if match['is_array']:
-        contents = eval(contents)
-        use_shell = False
-
-      # Check for a cached value to avoid executing commands, or generating
-      # file lists more than once.
-      # TODO(http://code.google.com/p/gyp/issues/detail?id=112): It is
-      # possible that the command being invoked depends on the current
-      # directory. For that case the syntax needs to be extended so that the
-      # directory is also used in cache_key (it becomes a tuple).
-      # TODO(http://code.google.com/p/gyp/issues/detail?id=111): In theory,
-      # someone could author a set of GYP files where each time the command
-      # is invoked it produces different output by design. When the need
-      # arises, the syntax should be extended to support no caching off a
-      # command's output so it is run every time.
-      cache_key = str(contents)
-      cached_value = cached_command_results.get(cache_key, None)
-      if cached_value is None:
-        gyp.DebugOutput(gyp.DEBUG_VARIABLES,
-                        "Executing command '%s' in directory '%s'",
-                        contents, build_file_dir)
-
-        replacement = ''
-
-        if command_string == 'pymod_do_main':
-          # <!pymod_do_main(modulename param eters) loads |modulename| as a
-          # python module and then calls that module's DoMain() function,
-          # passing ["param", "eters"] as a single list argument. For modules
-          # that don't load quickly, this can be faster than
-          # <!(python modulename param eters). Do this in |build_file_dir|.
-          oldwd = os.getcwd()  # Python doesn't like os.open('.'): no fchdir.
-          os.chdir(build_file_dir)
-          try:
-
-            parsed_contents = shlex.split(contents)
-            try:
-              py_module = __import__(parsed_contents[0])
-            except ImportError as e:
-              raise GypError("Error importing pymod_do_main "
-                             "module (%s): %s" % (parsed_contents[0], e))
-            replacement = str(py_module.DoMain(parsed_contents[1:])).rstrip()
-          finally:
-            os.chdir(oldwd)
-          assert replacement is not None
-        elif command_string:
-          raise GypError("Unknown command string '%s' in '%s'." %
-                         (command_string, contents))
-        else:
-          # Fix up command with platform specific workarounds.
-          contents = FixupPlatformCommand(contents)
-          p = subprocess.Popen(contents, shell=use_shell,
-                               stdout=subprocess.PIPE,
-                               stderr=subprocess.PIPE,
-                               stdin=subprocess.PIPE,
-                               cwd=build_file_dir)
-
-          p_stdout, p_stderr = p.communicate('')
-
-          if p.wait() != 0 or p_stderr:
-            sys.stderr.write(p_stderr)
-            # Simulate check_call behavior, since check_call only exists
-            # in python 2.5 and later.
-            raise GypError("Call to '%s' returned exit status %d." %
-                           (contents, p.returncode))
-          replacement = p_stdout.rstrip()
-
-        cached_command_results[cache_key] = replacement
-      else:
-        gyp.DebugOutput(gyp.DEBUG_VARIABLES,
-                        "Had cache value for command '%s' in directory '%s'",
-                        contents, build_file_dir)
-        replacement = cached_value
-
-    else:
-      if contents not in variables:
-        if contents[-1] in ['!', '/']:
-          # In order to allow cross-compiles (nacl) to happen more naturally,
-          # we will allow references to >(sources/) etc. to resolve to
-          # an empty list if undefined. This allows actions to:
-          # 'action!': [
-          #   '>@(_sources!)',
-          # ],
-          # 'action/': [
-          #   '>@(_sources/)',
-          # ],
-          replacement = []
-        else:
-          raise GypError('Undefined variable ' + contents +
-                         ' in ' + build_file)
-      else:
-        replacement = variables[contents]
-
-    if isinstance(replacement, list):
-      for item in replacement:
-        if (contents[-1] != '/' and
-            not isinstance(item, str) and not isinstance(item, int)):
-          raise GypError('Variable ' + contents +
-                         ' must expand to a string or list of strings; ' +
-                         'list contains a ' +
-                         item.__class__.__name__)
-      # Run through the list and handle variable expansions in it.  Since
-      # the list is guaranteed not to contain dicts, this won't do anything
-      # with conditions sections.
-      ProcessVariablesAndConditionsInList(replacement, phase, variables,
-                                          build_file)
-    elif not isinstance(replacement, str) and \
-         not isinstance(replacement, int):
-      raise GypError('Variable ' + contents +
-                     ' must expand to a string or list of strings; ' +
-                     'found a ' + replacement.__class__.__name__)
-
-    if expand_to_list:
-      # Expanding in list context.  It's guaranteed that there's only one
-      # replacement to do in |input_str| and that it's this replacement.  See
-      # above.
-      if isinstance(replacement, list):
-        # If it's already a list, make a copy.
-        output = replacement[:]
-      else:
-        # Split it the same way sh would split arguments.
-        output = shlex.split(str(replacement))
-    else:
-      # Expanding in string context.
-      encoded_replacement = ''
-      if isinstance(replacement, list):
-        # When expanding a list into string context, turn the list items
-        # into a string in a way that will work with a subprocess call.
-        #
-        # TODO(mark): This isn't completely correct.  This should
-        # call a generator-provided function that observes the
-        # proper list-to-argument quoting rules on a specific
-        # platform instead of just calling the POSIX encoding
-        # routine.
-        encoded_replacement = gyp.common.EncodePOSIXShellList(replacement)
-      else:
-        encoded_replacement = replacement
-
-      output = output[:replace_start] + str(encoded_replacement) + \
-               output[replace_end:]
-    # Prepare for the next match iteration.
-    input_str = output
-
-  # Look for more matches now that we've replaced some, to deal with
-  # expanding local variables (variables defined in the same
-  # variables block as this one).
-  gyp.DebugOutput(gyp.DEBUG_VARIABLES, "Found output %r, recursing.", output)
-  if isinstance(output, list):
-    if output and isinstance(output[0], list):
-      # Leave output alone if it's a list of lists.
-      # We don't want such lists to be stringified.
-      pass
-    else:
-      new_output = []
-      for item in output:
-        new_output.append(
-            ExpandVariables(item, phase, variables, build_file))
-      output = new_output
-  else:
-    output = ExpandVariables(output, phase, variables, build_file)
-
-  # Convert all strings that are canonically-represented integers into integers.
-  if isinstance(output, list):
-    for index in xrange(0, len(output)):
-      if IsStrCanonicalInt(output[index]):
-        output[index] = int(output[index])
-  elif IsStrCanonicalInt(output):
-    output = int(output)
-
-  return output
-
-
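The `matches.reverse()` step in `ExpandVariables` is load-bearing: substituting right-to-left keeps the start/end offsets of earlier matches valid even when a replacement changes the string's length. A standalone sketch of that idea (function and variable names are illustrative, not from the original):

```python
import re

def replace_right_to_left(pattern, repl_fn, s):
    # Collect every match first, then substitute from the end of the
    # string backwards, so the offsets of earlier matches stay correct
    # even if replacements grow or shrink the string.
    for m in reversed(list(re.finditer(pattern, s))):
        s = s[:m.start()] + repl_fn(m) + s[m.end():]
    return s
```

For example, expanding `'<(a)/<(b)'` against `{'a': 'x', 'b': 'yy'}` first rewrites `<(b)`, then `<(a)`, yielding `'x/yy'`.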
-def ProcessConditionsInDict(the_dict, phase, variables, build_file):
-  # Process a 'conditions' or 'target_conditions' section in the_dict,
-  # depending on phase.
-  # early -> conditions
-  # late -> target_conditions
-  # latelate -> no conditions
-  #
-  # Each item in a conditions list consists of cond_expr, a string expression
-  # evaluated as the condition, and true_dict, a dict that will be merged into
-  # the_dict if cond_expr evaluates to true.  Optionally, a third item,
-  # false_dict, may be present.  false_dict is merged into the_dict if
-  # cond_expr evaluates to false.
-  #
-  # Any dict merged into the_dict will be recursively processed for nested
-  # conditionals and other expansions, also according to phase, immediately
-  # prior to being merged.
-
-  if phase == PHASE_EARLY:
-    conditions_key = 'conditions'
-  elif phase == PHASE_LATE:
-    conditions_key = 'target_conditions'
-  elif phase == PHASE_LATELATE:
-    return
-  else:
-    assert False
-
-  if conditions_key not in the_dict:
-    return
-
-  conditions_list = the_dict[conditions_key]
-  # Unhook the conditions list, it's no longer needed.
-  del the_dict[conditions_key]
-
-  for condition in conditions_list:
-    if not isinstance(condition, list):
-      raise GypError(conditions_key + ' must be a list')
-    if len(condition) != 2 and len(condition) != 3:
-      # It's possible that condition[0] won't work in which case this
-      # attempt will raise its own IndexError.  That's probably fine.
-      raise GypError(conditions_key + ' ' + condition[0] +
-                     ' must be length 2 or 3, not ' + str(len(condition)))
-
-    [cond_expr, true_dict] = condition[0:2]
-    false_dict = None
-    if len(condition) == 3:
-      false_dict = condition[2]
-
-    # Do expansions on the condition itself.  Since the condition can naturally
-    # contain variable references without needing to resort to GYP expansion
-    # syntax, this is of dubious value for variables, but someone might want to
-    # use a command expansion directly inside a condition.
-    cond_expr_expanded = ExpandVariables(cond_expr, phase, variables,
-                                         build_file)
-    if not isinstance(cond_expr_expanded, str) and \
-       not isinstance(cond_expr_expanded, int):
-      raise ValueError, \
-            'Variable expansion in this context permits str and int ' + \
-            'only, found ' + cond_expr_expanded.__class__.__name__
-
-    try:
-      ast_code = compile(cond_expr_expanded, '<string>', 'eval')
-
-      if eval(ast_code, {'__builtins__': None}, variables):
-        merge_dict = true_dict
-      else:
-        merge_dict = false_dict
-    except SyntaxError, e:
-      syntax_error = SyntaxError('%s while evaluating condition \'%s\' in %s '
-                                 'at character %d.' %
-                                 (str(e.args[0]), e.text, build_file, e.offset),
-                                 e.filename, e.lineno, e.offset, e.text)
-      raise syntax_error
-    except NameError, e:
-      gyp.common.ExceptionAppend(e, 'while evaluating condition \'%s\' in %s' %
-                                 (cond_expr_expanded, build_file))
-      raise GypError(e)
-
-    if merge_dict != None:
-      # Expand variables and nested conditionals in the merge_dict before
-      # merging it.
-      ProcessVariablesAndConditionsInDict(merge_dict, phase,
-                                          variables, build_file)
-
-      MergeDicts(the_dict, merge_dict, build_file, build_file)
-
-
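The `compile`/`eval` pair above is the entire condition engine: a condition string is evaluated as a Python expression with builtins disabled and the variables dict serving as the namespace. A minimal sketch of just that evaluation step:

```python
def eval_condition(cond_expr, variables):
    # Compile the condition string as a Python expression and evaluate
    # it with builtins disabled, mirroring ProcessConditionsInDict above.
    # Variable names in the expression resolve against |variables|.
    code = compile(cond_expr, '<string>', 'eval')
    return bool(eval(code, {'__builtins__': None}, variables))
```

This is why conditions like `'OS=="mac" and target_arch=="x64"'` can reference variables without GYP expansion syntax.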
-def LoadAutomaticVariablesFromDict(variables, the_dict):
-  # Any keys with plain string values in the_dict become automatic variables.
-  # The variable name is the key name with a "_" character prepended.
-  for key, value in the_dict.iteritems():
-    if isinstance(value, str) or isinstance(value, int) or \
-       isinstance(value, list):
-      variables['_' + key] = value
-
-
-def LoadVariablesFromVariablesDict(variables, the_dict, the_dict_key):
-  # Any keys in the_dict's "variables" dict, if it has one, becomes a
-  # variable.  The variable name is the key name in the "variables" dict.
-  # Variables that end with the % character are set only if they are unset in
-  # the variables dict.  the_dict_key is the name of the key that accesses
-  # the_dict in the_dict's parent dict.  If the_dict's parent is not a dict
-  # (it could be a list or it could be parentless because it is a root dict),
-  # the_dict_key will be None.
-  for key, value in the_dict.get('variables', {}).iteritems():
-    if not isinstance(value, str) and not isinstance(value, int) and \
-       not isinstance(value, list):
-      continue
-
-    if key.endswith('%'):
-      variable_name = key[:-1]
-      if variable_name in variables:
-        # If the variable is already set, don't set it.
-        continue
-      if the_dict_key == 'variables' and variable_name in the_dict:
-        # If the variable is set without a % in the_dict, and the_dict is a
-        # variables dict (making |variables| a variables sub-dict of a
-        # variables dict), use the_dict's definition.
-        value = the_dict[variable_name]
-    else:
-      variable_name = key
-
-    variables[variable_name] = value
-
-
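The `%` suffix rule above is easy to miss: a key like `foo%` supplies a default that loses to any existing value of `foo`. A condensed Python 3 sketch of just that rule; the helper name is editorial and it omits the `the_dict_key == 'variables'` special case:

```python
def load_default_variables(variables, variables_dict):
    # Keys ending in '%' are defaults: they set a variable only when it
    # is not already present in |variables|.  Other keys always win.
    for key, value in variables_dict.items():
        if key.endswith('%'):
            variables.setdefault(key[:-1], value)
        else:
            variables[key] = value
```

So a command-line `-Dos=mac` survives a `'os%': 'linux'` default in a build file, while `'arch%': 'x64'` fills in when nothing else set `arch`.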
-def ProcessVariablesAndConditionsInDict(the_dict, phase, variables_in,
-                                        build_file, the_dict_key=None):
-  """Handle all variable and command expansion and conditional evaluation.
-
-  This function is the public entry point for all variable expansions and
-  conditional evaluations.  The variables_in dictionary will not be modified
-  by this function.
-  """
-
-  # Make a copy of the variables_in dict that can be modified during the
-  # loading of automatics and the loading of the variables dict.
-  variables = variables_in.copy()
-  LoadAutomaticVariablesFromDict(variables, the_dict)
-
-  if 'variables' in the_dict:
-    # Make sure all the local variables are added to the variables
-    # list before we process them so that you can reference one
-    # variable from another.  They will be fully expanded by recursion
-    # in ExpandVariables.
-    for key, value in the_dict['variables'].iteritems():
-      variables[key] = value
-
-    # Handle the associated variables dict first, so that any variable
-    # references within can be resolved prior to using them as variables.
-    # Pass a copy of the variables dict to avoid having it be tainted.
-    # Otherwise, it would have extra automatics added for everything that
-    # should just be an ordinary variable in this scope.
-    ProcessVariablesAndConditionsInDict(the_dict['variables'], phase,
-                                        variables, build_file, 'variables')
-
-  LoadVariablesFromVariablesDict(variables, the_dict, the_dict_key)
-
-  for key, value in the_dict.iteritems():
-    # Skip "variables", which was already processed if present.
-    if key != 'variables' and isinstance(value, str):
-      expanded = ExpandVariables(value, phase, variables, build_file)
-      if not isinstance(expanded, str) and not isinstance(expanded, int):
-        raise ValueError, \
-              'Variable expansion in this context permits str and int ' + \
-              'only, found ' + expanded.__class__.__name__ + ' for ' + key
-      the_dict[key] = expanded
-
-  # Variable expansion may have resulted in changes to automatics.  Reload.
-  # TODO(mark): Optimization: only reload if no changes were made.
-  variables = variables_in.copy()
-  LoadAutomaticVariablesFromDict(variables, the_dict)
-  LoadVariablesFromVariablesDict(variables, the_dict, the_dict_key)
-
-  # Process conditions in this dict.  This is done after variable expansion
-  # so that conditions may take advantage of expanded variables.  For example,
-  # if the_dict contains:
-  #   {'type':       '<(library_type)',
-  #    'conditions': [['_type=="static_library"', { ... }]]},
-  # _type, as used in the condition, will only be set to the value of
-  # library_type if variable expansion is performed before condition
-  # processing.  However, condition processing should occur prior to recursion
-  # so that variables (both automatic and "variables" dict type) may be
-  # adjusted by conditions sections, merged into the_dict, and have the
-  # intended impact on contained dicts.
-  #
-  # This arrangement means that a "conditions" section containing a "variables"
-  # section will only have those variables effective in subdicts, not in
-  # the_dict.  The workaround is to put a "conditions" section within a
-  # "variables" section.  For example:
-  #   {'conditions': [['os=="mac"', {'variables': {'define': 'IS_MAC'}}]],
-  #    'defines':    ['<(define)'],
-  #    'my_subdict': {'defines': ['<(define)']}},
-  # will not result in "IS_MAC" being appended to the "defines" list in the
-  # current scope but would result in it being appended to the "defines" list
-  # within "my_subdict".  By comparison:
-  #   {'variables': {'conditions': [['os=="mac"', {'define': 'IS_MAC'}]]},
-  #    'defines':    ['<(define)'],
-  #    'my_subdict': {'defines': ['<(define)']}},
-  # will append "IS_MAC" to both "defines" lists.
-
-  # Evaluate conditions sections, allowing variable expansions within them
-  # as well as nested conditionals.  This will process a 'conditions' or
-  # 'target_conditions' section, perform appropriate merging and recursive
-  # conditional and variable processing, and then remove the conditions section
-  # from the_dict if it is present.
-  ProcessConditionsInDict(the_dict, phase, variables, build_file)
-
-  # Conditional processing may have resulted in changes to automatics or the
-  # variables dict.  Reload.
-  variables = variables_in.copy()
-  LoadAutomaticVariablesFromDict(variables, the_dict)
-  LoadVariablesFromVariablesDict(variables, the_dict, the_dict_key)
-
-  # Recurse into child dicts, or process child lists which may result in
-  # further recursion into descendant dicts.
-  for key, value in the_dict.iteritems():
-    # Skip "variables" and string values, which were already processed if
-    # present.
-    if key == 'variables' or isinstance(value, str):
-      continue
-    if isinstance(value, dict):
-      # Pass a copy of the variables dict so that subdicts can't influence
-      # parents.
-      ProcessVariablesAndConditionsInDict(value, phase, variables,
-                                          build_file, key)
-    elif isinstance(value, list):
-      # The list itself can't influence the variables dict, and
-      # ProcessVariablesAndConditionsInList will make copies of the variables
-      # dict if it needs to pass it to something that can influence it.  No
-      # copy is necessary here.
-      ProcessVariablesAndConditionsInList(value, phase, variables,
-                                          build_file)
-    elif not isinstance(value, int):
-      raise TypeError, 'Unknown type ' + value.__class__.__name__ + \
-                       ' for ' + key
-
-
-def ProcessVariablesAndConditionsInList(the_list, phase, variables,
-                                        build_file):
-  # Iterate using an index so that new values can be assigned into the_list.
-  index = 0
-  while index < len(the_list):
-    item = the_list[index]
-    if isinstance(item, dict):
-      # Make a copy of the variables dict so that it won't influence anything
-      # outside of its own scope.
-      ProcessVariablesAndConditionsInDict(item, phase, variables, build_file)
-    elif isinstance(item, list):
-      ProcessVariablesAndConditionsInList(item, phase, variables, build_file)
-    elif isinstance(item, str):
-      expanded = ExpandVariables(item, phase, variables, build_file)
-      if isinstance(expanded, str) or isinstance(expanded, int):
-        the_list[index] = expanded
-      elif isinstance(expanded, list):
-        the_list[index:index+1] = expanded
-        index += len(expanded)
-
-        # index now identifies the next item to examine.  Continue right now
-        # without falling into the index increment below.
-        continue
-      else:
-        raise ValueError, \
-              'Variable expansion in this context permits strings and ' + \
-              'lists only, found ' + expanded.__class__.__name__ + \
-              ' at index ' + str(index)
-    elif not isinstance(item, int):
-      raise TypeError, 'Unknown type ' + item.__class__.__name__ + \
-                       ' at index ' + str(index)
-    index = index + 1
-
-
-def BuildTargetsDict(data):
-  """Builds a dict mapping fully-qualified target names to their target dicts.
-
-  |data| is a dict mapping loaded build files by pathname relative to the
-  current directory.  Values in |data| are build file contents.  For each
-  |data| value with a "targets" key, the value of the "targets" key is taken
-  as a list containing target dicts.  Each target's fully-qualified name is
-  constructed from the pathname of the build file (|data| key) and its
-  "target_name" property.  These fully-qualified names are used as the keys
-  in the returned dict.  These keys provide access to the target dicts,
-  the dicts in the "targets" lists.
-  """
-
-  targets = {}
-  for build_file in data['target_build_files']:
-    for target in data[build_file].get('targets', []):
-      target_name = gyp.common.QualifiedTarget(build_file,
-                                               target['target_name'],
-                                               target['toolset'])
-      if target_name in targets:
-        raise GypError('Duplicate target definitions for ' + target_name)
-      targets[target_name] = target
-
-  return targets
-
-
-def QualifyDependencies(targets):
-  """Make dependency links fully-qualified relative to the current directory.
-
-  |targets| is a dict mapping fully-qualified target names to their target
-  dicts.  For each target in this dict, keys known to contain dependency
-  links are examined, and any dependencies referenced will be rewritten
-  so that they are fully-qualified and relative to the current directory.
-  All rewritten dependencies are suitable for use as keys to |targets| or a
-  similar dict.
-  """
-
-  all_dependency_sections = [dep + op
-                             for dep in dependency_sections
-                             for op in ('', '!', '/')]
-
-  for target, target_dict in targets.iteritems():
-    target_build_file = gyp.common.BuildFile(target)
-    toolset = target_dict['toolset']
-    for dependency_key in all_dependency_sections:
-      dependencies = target_dict.get(dependency_key, [])
-      for index in xrange(0, len(dependencies)):
-        dep_file, dep_target, dep_toolset = gyp.common.ResolveTarget(
-            target_build_file, dependencies[index], toolset)
-        if not multiple_toolsets:
-          # Ignore toolset specification in the dependency if it is specified.
-          dep_toolset = toolset
-        dependency = gyp.common.QualifiedTarget(dep_file,
-                                                dep_target,
-                                                dep_toolset)
-        dependencies[index] = dependency
-
-        # Make sure anything appearing in a list other than "dependencies" also
-        # appears in the "dependencies" list.
-        if dependency_key != 'dependencies' and \
-           dependency not in target_dict['dependencies']:
-          raise GypError('Found ' + dependency + ' in ' + dependency_key +
-                         ' of ' + target + ', but not in dependencies')
-
-
-def ExpandWildcardDependencies(targets, data):
-  """Expands dependencies specified as build_file:*.
-
-  For each target in |targets|, examines sections containing links to other
-  targets.  If any such section contains a link of the form build_file:*, it
-  is taken as a wildcard link, and is expanded to list each target in
-  build_file.  The |data| dict provides access to build file dicts.
-
-  Any target that does not wish to be included by wildcard can provide an
-  optional "suppress_wildcard" key in its target dict.  When present and
-  true, a wildcard dependency link will not include such targets.
-
-  All dependency names, including the keys to |targets| and the values in each
-  dependency list, must be qualified when this function is called.
-  """
-
-  for target, target_dict in targets.iteritems():
-    toolset = target_dict['toolset']
-    target_build_file = gyp.common.BuildFile(target)
-    for dependency_key in dependency_sections:
-      dependencies = target_dict.get(dependency_key, [])
-
-      # Loop this way instead of "for dependency in" or "for index in xrange"
-      # because the dependencies list will be modified within the loop body.
-      index = 0
-      while index < len(dependencies):
-        (dependency_build_file, dependency_target, dependency_toolset) = \
-            gyp.common.ParseQualifiedTarget(dependencies[index])
-        if dependency_target != '*' and dependency_toolset != '*':
-          # Not a wildcard.  Keep it moving.
-          index = index + 1
-          continue
-
-        if dependency_build_file == target_build_file:
-          # It's an error for a target to depend on all other targets in
-          # the same file, because a target cannot depend on itself.
-          raise GypError('Found wildcard in ' + dependency_key + ' of ' +
-                         target + ' referring to same build file')
-
-        # Take the wildcard out and adjust the index so that the next
-        # dependency in the list will be processed the next time through the
-        # loop.
-        del dependencies[index]
-        index = index - 1
-
-        # Loop through the targets in the other build file, adding them to
-        # this target's list of dependencies in place of the removed
-        # wildcard.
-        dependency_target_dicts = data[dependency_build_file]['targets']
-        for dependency_target_dict in dependency_target_dicts:
-          if int(dependency_target_dict.get('suppress_wildcard', False)):
-            continue
-          dependency_target_name = dependency_target_dict['target_name']
-          if (dependency_target != '*' and
-              dependency_target != dependency_target_name):
-            continue
-          dependency_target_toolset = dependency_target_dict['toolset']
-          if (dependency_toolset != '*' and
-              dependency_toolset != dependency_target_toolset):
-            continue
-          dependency = gyp.common.QualifiedTarget(dependency_build_file,
-                                                  dependency_target_name,
-                                                  dependency_target_toolset)
-          index = index + 1
-          dependencies.insert(index, dependency)
-
-        index = index + 1
-
-
-def Unify(l):
-  """Removes duplicate elements from l, keeping the first element."""
-  seen = {}
-  return [seen.setdefault(e, e) for e in l if e not in seen]
-
-
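`Unify`'s comprehension leans on `dict.setdefault` for an order-preserving de-duplication. In modern Python the same result is usually written with `dict.fromkeys`, since dicts preserve insertion order (guaranteed from Python 3.7):

```python
def unify(l):
    # Order-preserving de-duplication: dict keys are unique and keep
    # insertion order, so the first occurrence of each element wins.
    return list(dict.fromkeys(l))
```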
-def RemoveDuplicateDependencies(targets):
-  """Makes sure every dependency appears only once in all targets' dependency
-  lists."""
-  for target_name, target_dict in targets.iteritems():
-    for dependency_key in dependency_sections:
-      dependencies = target_dict.get(dependency_key, [])
-      if dependencies:
-        target_dict[dependency_key] = Unify(dependencies)
-
-
-def Filter(l, item):
-  """Removes item from l."""
-  res = {}
-  return [res.setdefault(e, e) for e in l if e != item]
-
-
-def RemoveSelfDependencies(targets):
-  """Remove self dependencies from targets that have the prune_self_dependency
-  variable set."""
-  for target_name, target_dict in targets.iteritems():
-    for dependency_key in dependency_sections:
-      dependencies = target_dict.get(dependency_key, [])
-      if dependencies:
-        for t in dependencies:
-          if t == target_name:
-            if targets[t].get('variables', {}).get('prune_self_dependency', 0):
-              target_dict[dependency_key] = Filter(dependencies, target_name)
-
-
-class DependencyGraphNode(object):
-  """A node in a dependency graph, representing a target or a build file.
-
-  Attributes:
-    ref: A reference to an object that this DependencyGraphNode represents.
-    dependencies: List of DependencyGraphNodes on which this one depends.
-    dependents: List of DependencyGraphNodes that depend on this one.
-  """
-
-  class CircularException(GypError):
-    pass
-
-  def __init__(self, ref):
-    self.ref = ref
-    self.dependencies = []
-    self.dependents = []
-
-  def FlattenToList(self):
-    # flat_list is the sorted list of dependencies - actually, the list items
-    # are the "ref" attributes of DependencyGraphNodes.  Every target will
-    # appear in flat_list after all of its dependencies, and before all of its
-    # dependents.
-    flat_list = []
-
-    # in_degree_zeros is the set of DependencyGraphNodes that have no
-    # dependencies not in flat_list.  Initially, it is a copy of the children
-    # of this node, because when the graph was built, nodes with no
-    # dependencies were made implicit dependents of the root node.
-    in_degree_zeros = set(self.dependents[:])
-
-    while in_degree_zeros:
-      # Nodes in in_degree_zeros have no dependencies not in flat_list, so they
-      # can be appended to flat_list.  Take these nodes out of in_degree_zeros
-      # as work progresses, so that the next node to process from the list can
-      # always be accessed at a consistent position.
-      node = in_degree_zeros.pop()
-      flat_list.append(node.ref)
-
-      # Look at dependents of the node just added to flat_list.  Some of them
-      # may now belong in in_degree_zeros.
-      for node_dependent in node.dependents:
-        is_in_degree_zero = True
-        for node_dependent_dependency in node_dependent.dependencies:
-          if not node_dependent_dependency.ref in flat_list:
-            # The dependent has one or more dependencies not in flat_list.  There
-            # will be more chances to add it to flat_list when examining
-            # it again as a dependent of those other dependencies, provided
-            # that there are no cycles.
-            is_in_degree_zero = False
-            break
-
-        if is_in_degree_zero:
-          # All of the dependent's dependencies are already in flat_list.  Add
-          # it to in_degree_zeros where it will be processed in a future
-          # iteration of the outer loop.
-          in_degree_zeros.add(node_dependent)
-
-    return flat_list
-
-  def DirectDependencies(self, dependencies=None):
-    """Returns a list of just direct dependencies."""
-    if dependencies == None:
-      dependencies = []
-
-    for dependency in self.dependencies:
-      # Check for None, corresponding to the root node.
-      if dependency.ref != None and dependency.ref not in dependencies:
-        dependencies.append(dependency.ref)
-
-    return dependencies
-
-  def _AddImportedDependencies(self, targets, dependencies=None):
-    """Given a list of direct dependencies, adds indirect dependencies that
-    other dependencies have declared to export their settings.
-
-    This method does not operate on self.  Rather, it operates on the list
-    of dependencies in the |dependencies| argument.  For each dependency in
-    that list, if any declares that it exports the settings of one of its
-    own dependencies, those dependencies whose settings are "passed through"
-    are added to the list.  As new items are added to the list, they too will
-    be processed, so it is possible to import settings through multiple levels
-    of dependencies.
-
-    This method is not terribly useful on its own; it depends on being
-    "primed" with a list of direct dependencies such as one provided by
-    DirectDependencies.  DirectAndImportedDependencies is intended to be the
-    public entry point.
-    """
-
-    if dependencies == None:
-      dependencies = []
-
-    index = 0
-    while index < len(dependencies):
-      dependency = dependencies[index]
-      dependency_dict = targets[dependency]
-      # Add any dependencies whose settings should be imported to the list
-      # if not already present.  Newly-added items will be checked for
-      # their own imports when the list iteration reaches them.
-      # Rather than simply appending new items, insert them after the
-      # dependency that exported them.  This is done to more closely match
-      # the depth-first method used by DeepDependencies.
-      add_index = 1
-      for imported_dependency in \
-          dependency_dict.get('export_dependent_settings', []):
-        if imported_dependency not in dependencies:
-          dependencies.insert(index + add_index, imported_dependency)
-          add_index = add_index + 1
-      index = index + 1
-
-    return dependencies
-
-  def DirectAndImportedDependencies(self, targets, dependencies=None):
-    """Returns a list of a target's direct dependencies and any indirect
-    dependencies whose settings a direct dependency has declared should be
-    exported through it.
-    """
-
-    dependencies = self.DirectDependencies(dependencies)
-    return self._AddImportedDependencies(targets, dependencies)
-
-  def DeepDependencies(self, dependencies=None):
-    """Returns a list of all of a target's dependencies, recursively."""
-    if dependencies == None:
-      dependencies = []
-
-    for dependency in self.dependencies:
-      # Check for None, corresponding to the root node.
-      if dependency.ref != None and dependency.ref not in dependencies:
-        dependencies.append(dependency.ref)
-        dependency.DeepDependencies(dependencies)
-
-    return dependencies
-
-  def LinkDependencies(self, targets, dependencies=None, initial=True):
-    """Returns a list of dependency targets that are linked into this target.
-
-    This function has a split personality, depending on the setting of
-    |initial|.  Outside callers should always leave |initial| at its default
-    setting.
-
-    When adding a target to the list of dependencies, this function will
-    recurse into itself with |initial| set to False, to collect dependencies
-    that are linked into the linkable target for which the list is being built.
-    """
-    if dependencies == None:
-      dependencies = []
-
-    # Check for None, corresponding to the root node.
-    if self.ref == None:
-      return dependencies
-
-    # It's kind of sucky that |targets| has to be passed into this function,
-    # but that's presently the easiest way to access the target dicts so that
-    # this function can find target types.
-
-    if 'target_name' not in targets[self.ref]:
-      raise GypError("Missing 'target_name' field in target.")
-
-    if 'type' not in targets[self.ref]:
-      raise GypError("Missing 'type' field in target %s" %
-                     targets[self.ref]['target_name'])
-
-    target_type = targets[self.ref]['type']
-
-    is_linkable = target_type in linkable_types
-
-    if initial and not is_linkable:
-      # If this is the first target being examined and it's not linkable,
-      # return an empty list of link dependencies, because the link
-      # dependencies are intended to apply to the target itself (initial is
-      # True) and this target won't be linked.
-      return dependencies
-
-    # Don't traverse 'none' targets if explicitly excluded.
-    if (target_type == 'none' and
-        not targets[self.ref].get('dependencies_traverse', True)):
-      if self.ref not in dependencies:
-        dependencies.append(self.ref)
-      return dependencies
-
-    # Executables and loadable modules are already fully and finally linked.
-    # Nothing else can be a link dependency of them, there can only be
-    # dependencies in the sense that a dependent target might run an
-    # executable or load the loadable_module.
-    if not initial and target_type in ('executable', 'loadable_module'):
-      return dependencies
-
-    # The target is linkable, add it to the list of link dependencies.
-    if self.ref not in dependencies:
-      dependencies.append(self.ref)
-      if initial or not is_linkable:
-        # If this is a subsequent target and it's linkable, don't look any
-        # further for linkable dependencies, as they'll already be linked into
-        # this linkable target.  Always look at dependencies of the initial
-        # target, and always look at dependencies of non-linkables.
-        for dependency in self.dependencies:
-          dependency.LinkDependencies(targets, dependencies, False)
-
-    return dependencies
-
-
-def BuildDependencyList(targets):
-  # Create a DependencyGraphNode for each target.  Put it into a dict for easy
-  # access.
-  dependency_nodes = {}
-  for target, spec in targets.iteritems():
-    if target not in dependency_nodes:
-      dependency_nodes[target] = DependencyGraphNode(target)
-
-  # Set up the dependency links.  Targets that have no dependencies are treated
-  # as dependent on root_node.
-  root_node = DependencyGraphNode(None)
-  for target, spec in targets.iteritems():
-    target_node = dependency_nodes[target]
-    target_build_file = gyp.common.BuildFile(target)
-    dependencies = spec.get('dependencies')
-    if not dependencies:
-      target_node.dependencies = [root_node]
-      root_node.dependents.append(target_node)
-    else:
-      for dependency in dependencies:
-        dependency_node = dependency_nodes.get(dependency)
-        if not dependency_node:
-          raise GypError("Dependency '%s' not found while "
-                         "trying to load target %s" % (dependency, target))
-        target_node.dependencies.append(dependency_node)
-        dependency_node.dependents.append(target_node)
-
-  flat_list = root_node.FlattenToList()
-
-  # If there's anything left unvisited, there must be a circular dependency
-  # (cycle).  If you need to figure out what's wrong, look for elements of
-  # targets that are not in flat_list.
-  if len(flat_list) != len(targets):
-    raise DependencyGraphNode.CircularException(
-        'Some targets not reachable, cycle in dependency graph detected: ' +
-        ' '.join(set(flat_list) ^ set(targets)))
-
-  return [dependency_nodes, flat_list]
-
-
-def VerifyNoGYPFileCircularDependencies(targets):
-  # Create a DependencyGraphNode for each gyp file containing a target.  Put
-  # it into a dict for easy access.
-  dependency_nodes = {}
-  for target in targets.iterkeys():
-    build_file = gyp.common.BuildFile(target)
-    if not build_file in dependency_nodes:
-      dependency_nodes[build_file] = DependencyGraphNode(build_file)
-
-  # Set up the dependency links.
-  for target, spec in targets.iteritems():
-    build_file = gyp.common.BuildFile(target)
-    build_file_node = dependency_nodes[build_file]
-    target_dependencies = spec.get('dependencies', [])
-    for dependency in target_dependencies:
-      try:
-        dependency_build_file = gyp.common.BuildFile(dependency)
-      except GypError, e:
-        gyp.common.ExceptionAppend(
-            e, 'while computing dependencies of .gyp file %s' % build_file)
-        raise
-
-      if dependency_build_file == build_file:
-        # A .gyp file is allowed to refer back to itself.
-        continue
-      dependency_node = dependency_nodes.get(dependency_build_file)
-      if not dependency_node:
-        raise GypError("Dependency '%s' not found" % dependency_build_file)
-      if dependency_node not in build_file_node.dependencies:
-        build_file_node.dependencies.append(dependency_node)
-        dependency_node.dependents.append(build_file_node)
-
-
-  # Files that have no dependencies are treated as dependent on root_node.
-  root_node = DependencyGraphNode(None)
-  for build_file_node in dependency_nodes.itervalues():
-    if len(build_file_node.dependencies) == 0:
-      build_file_node.dependencies.append(root_node)
-      root_node.dependents.append(build_file_node)
-
-  flat_list = root_node.FlattenToList()
-
-  # If there's anything left unvisited, there must be a circular dependency
-  # (cycle).
-  if len(flat_list) != len(dependency_nodes):
-    bad_files = []
-    for file in dependency_nodes.iterkeys():
-      if not file in flat_list:
-        bad_files.append(file)
-    raise DependencyGraphNode.CircularException, \
-        'Some files not reachable, cycle in .gyp file dependency graph ' + \
-        'detected involving some or all of: ' + \
-        ' '.join(bad_files)
-
-
-def DoDependentSettings(key, flat_list, targets, dependency_nodes):
-  # key should be one of all_dependent_settings, direct_dependent_settings,
-  # or link_settings.
-
-  for target in flat_list:
-    target_dict = targets[target]
-    build_file = gyp.common.BuildFile(target)
-
-    if key == 'all_dependent_settings':
-      dependencies = dependency_nodes[target].DeepDependencies()
-    elif key == 'direct_dependent_settings':
-      dependencies = \
-          dependency_nodes[target].DirectAndImportedDependencies(targets)
-    elif key == 'link_settings':
-      dependencies = dependency_nodes[target].LinkDependencies(targets)
-    else:
-      raise GypError("DoDependentSettings doesn't know how to determine "
-                      'dependencies for ' + key)
-
-    for dependency in dependencies:
-      dependency_dict = targets[dependency]
-      if not key in dependency_dict:
-        continue
-      dependency_build_file = gyp.common.BuildFile(dependency)
-      MergeDicts(target_dict, dependency_dict[key],
-                 build_file, dependency_build_file)
-
-
-def AdjustStaticLibraryDependencies(flat_list, targets, dependency_nodes,
-                                    sort_dependencies):
-  # Recompute target "dependencies" properties.  For each static library
-  # target, remove "dependencies" entries referring to other static libraries,
-  # unless the dependency has the "hard_dependency" attribute set.  For each
-  # linkable target, add a "dependencies" entry referring to all of the
-  # target's computed list of link dependencies (including static libraries)
-  # if no such entry is already present.
-  for target in flat_list:
-    target_dict = targets[target]
-    target_type = target_dict['type']
-
-    if target_type == 'static_library':
-      if not 'dependencies' in target_dict:
-        continue
-
-      target_dict['dependencies_original'] = target_dict.get(
-          'dependencies', [])[:]
-
-      # A static library should not depend on another static library unless
-      # the dependency relationship is "hard," which should only be done when
-      # a dependent relies on some side effect other than just the build
-      # product, like a rule or action output. Further, if a target has a
-      # non-hard dependency, but that dependency exports a hard dependency,
-      # the non-hard dependency can safely be removed, but the exported hard
-      # dependency must be added to the target to keep the same dependency
-      # ordering.
-      dependencies = \
-          dependency_nodes[target].DirectAndImportedDependencies(targets)
-      index = 0
-      while index < len(dependencies):
-        dependency = dependencies[index]
-        dependency_dict = targets[dependency]
-
-        # Remove every non-hard static library dependency and remove every
-        # non-static library dependency that isn't a direct dependency.
-        if (dependency_dict['type'] == 'static_library' and \
-            not dependency_dict.get('hard_dependency', False)) or \
-           (dependency_dict['type'] != 'static_library' and \
-            not dependency in target_dict['dependencies']):
-          # Take the dependency out of the list, and don't increment index
-          # because the next dependency to analyze will shift into the index
-          # formerly occupied by the one being removed.
-          del dependencies[index]
-        else:
-          index = index + 1
-
-      # Update the dependencies. If the dependencies list is empty, it's not
-      # needed, so unhook it.
-      if len(dependencies) > 0:
-        target_dict['dependencies'] = dependencies
-      else:
-        del target_dict['dependencies']
-
-    elif target_type in linkable_types:
-      # Get a list of dependency targets that should be linked into this
-      # target.  Add them to the dependencies list if they're not already
-      # present.
-
-      link_dependencies = dependency_nodes[target].LinkDependencies(targets)
-      for dependency in link_dependencies:
-        if dependency == target:
-          continue
-        if not 'dependencies' in target_dict:
-          target_dict['dependencies'] = []
-        if not dependency in target_dict['dependencies']:
-          target_dict['dependencies'].append(dependency)
-      # Sort the dependencies list in the order from dependents to dependencies.
-      # e.g. If A and B depend on C and C depends on D, sort them in A, B, C, D.
-      # Note: flat_list is already sorted in the order from dependencies to
-      # dependents.
-      if sort_dependencies and 'dependencies' in target_dict:
-        target_dict['dependencies'] = [dep for dep in reversed(flat_list)
-                                       if dep in target_dict['dependencies']]
-
-
-# Initialize this here to speed up MakePathRelative.
-exception_re = re.compile(r'''["']?[-/$<>^]''')
-
-
-def MakePathRelative(to_file, fro_file, item):
-  # If item is a relative path, it's relative to the build file dict that it's
-  # coming from.  Fix it up to make it relative to the build file dict that
-  # it's going into.
-  # Exception: any |item| that begins with these special characters is
-  # returned without modification.
-  #   /   Used when a path is already absolute (shortcut optimization;
-  #       such paths would be returned as absolute anyway)
-  #   $   Used for build environment variables
-  #   -   Used for some build environment flags (such as -lapr-1 in a
-  #       "libraries" section)
-  #   <   Used for our own variable and command expansions (see ExpandVariables)
-  #   >   Used for our own variable and command expansions (see ExpandVariables)
-  #   ^   Used for our own variable and command expansions (see ExpandVariables)
-  #
-  #   "/' Used when a value is quoted.  If these are present, then we
-  #       check the second character instead.
-  #
-  if to_file == fro_file or exception_re.match(item):
-    return item
-  else:
-    # TODO(dglazkov) The backslash/forward-slash replacement at the end is a
-    # temporary measure. This should really be addressed by keeping all paths
-    # in POSIX until actual project generation.
-    ret = os.path.normpath(os.path.join(
-        gyp.common.RelativePath(os.path.dirname(fro_file),
-                                os.path.dirname(to_file)),
-                                item)).replace('\\', '/')
-    if item[-1] == '/':
-      ret += '/'
-    return ret
-
-def MergeLists(to, fro, to_file, fro_file, is_paths=False, append=True):
-  # Python documentation recommends that objects which do not support hashing
-  # set this value to None.  Python library objects follow this rule.
-  is_hashable = lambda val: val.__hash__
-
-  # If x is hashable, returns whether x is in s. Else returns whether x is in l.
-  def is_in_set_or_list(x, s, l):
-    if is_hashable(x):
-      return x in s
-    return x in l
-
-  prepend_index = 0
-
-  # Make membership testing of hashables in |to| (in particular, strings)
-  # faster.
-  hashable_to_set = set(x for x in to if is_hashable(x))
-  for item in fro:
-    singleton = False
-    if isinstance(item, str) or isinstance(item, int):
-      # The cheap and easy case.
-      if is_paths:
-        to_item = MakePathRelative(to_file, fro_file, item)
-      else:
-        to_item = item
-
-      if not isinstance(item, str) or not item.startswith('-'):
-        # Any string that doesn't begin with a "-" is a singleton - it can
-        # only appear once in a list, to be enforced by the list merge append
-        # or prepend.
-        singleton = True
-    elif isinstance(item, dict):
-      # Make a copy of the dictionary, continuing to look for paths to fix.
-      # The other intelligent aspects of merge processing won't apply because
-      # item is being merged into an empty dict.
-      to_item = {}
-      MergeDicts(to_item, item, to_file, fro_file)
-    elif isinstance(item, list):
-      # Recurse, making a copy of the list.  If the list contains any
-      # descendant dicts, path fixing will occur.  Note that here, custom
-      # values for is_paths and append are dropped; those are only to be
-      # applied to |to| and |fro|, not sublists of |fro|.  append shouldn't
-      # matter anyway because the new |to_item| list is empty.
-      to_item = []
-      MergeLists(to_item, item, to_file, fro_file)
-    else:
-      raise TypeError, \
-          'Attempt to merge list item of unsupported type ' + \
-          item.__class__.__name__
-
-    if append:
-      # If appending a singleton that's already in the list, don't append.
-      # This ensures that the earliest occurrence of the item will stay put.
-      if not singleton or not is_in_set_or_list(to_item, hashable_to_set, to):
-        to.append(to_item)
-        if is_hashable(to_item):
-          hashable_to_set.add(to_item)
-    else:
-      # If prepending a singleton that's already in the list, remove the
-      # existing instance and proceed with the prepend.  This ensures that the
-      # item appears at the earliest possible position in the list.
-      while singleton and to_item in to:
-        to.remove(to_item)
-
-      # Don't just insert everything at index 0.  That would prepend the new
-      # items to the list in reverse order, which would be an unwelcome
-      # surprise.
-      to.insert(prepend_index, to_item)
-      if is_hashable(to_item):
-        hashable_to_set.add(to_item)
-      prepend_index = prepend_index + 1
-
-
-def MergeDicts(to, fro, to_file, fro_file):
-  # I wanted to name the parameter "from" but it's a Python keyword...
-  for k, v in fro.iteritems():
-    # It would be nice to do "if not k in to: to[k] = v" but that wouldn't give
-    # copy semantics.  Something else may want to merge from the |fro| dict
-    # later, and having the same dict ref pointed to twice in the tree isn't
-    # what anyone wants considering that the dicts may subsequently be
-    # modified.
-    if k in to:
-      bad_merge = False
-      if isinstance(v, str) or isinstance(v, int):
-        if not (isinstance(to[k], str) or isinstance(to[k], int)):
-          bad_merge = True
-      elif v.__class__ != to[k].__class__:
-        bad_merge = True
-
-      if bad_merge:
-        raise TypeError, \
-            'Attempt to merge dict value of type ' + v.__class__.__name__ + \
-            ' into incompatible type ' + to[k].__class__.__name__ + \
-            ' for key ' + k
-    if isinstance(v, str) or isinstance(v, int):
-      # Overwrite the existing value, if any.  Cheap and easy.
-      is_path = IsPathSection(k)
-      if is_path:
-        to[k] = MakePathRelative(to_file, fro_file, v)
-      else:
-        to[k] = v
-    elif isinstance(v, dict):
-      # Recurse, guaranteeing copies will be made of objects that require it.
-      if not k in to:
-        to[k] = {}
-      MergeDicts(to[k], v, to_file, fro_file)
-    elif isinstance(v, list):
-      # Lists in dicts can be merged with different policies, depending on
-      # how the key in the "from" dict (k, the from-key) is written.
-      #
-      # If the from-key has          ...the to-list will have this action
-      # this character appended:...     applied when receiving the from-list:
-      #                           =  replace
-      #                           +  prepend
-      #                           ?  set, only if to-list does not yet exist
-      #                      (none)  append
-      #
-      # This logic is list-specific, but since it relies on the associated
-      # dict key, it's checked in this dict-oriented function.
-      ext = k[-1]
-      append = True
-      if ext == '=':
-        list_base = k[:-1]
-        lists_incompatible = [list_base, list_base + '?']
-        to[list_base] = []
-      elif ext == '+':
-        list_base = k[:-1]
-        lists_incompatible = [list_base + '=', list_base + '?']
-        append = False
-      elif ext == '?':
-        list_base = k[:-1]
-        lists_incompatible = [list_base, list_base + '=', list_base + '+']
-      else:
-        list_base = k
-        lists_incompatible = [list_base + '=', list_base + '?']
-
-      # Some combinations of merge policies appearing together are meaningless.
-      # It's stupid to replace and append simultaneously, for example.  Append
-      # and prepend are the only policies that can coexist.
-      for list_incompatible in lists_incompatible:
-        if list_incompatible in fro:
-          raise GypError('Incompatible list policies ' + k + ' and ' +
-                         list_incompatible)
-
-      if list_base in to:
-        if ext == '?':
-          # If the key ends in "?", the list will only be merged if it doesn't
-          # already exist.
-          continue
-        if not isinstance(to[list_base], list):
-          # This may not have been checked above if merging in a list with an
-          # extension character.
-          raise TypeError, \
-              'Attempt to merge dict value of type ' + v.__class__.__name__ + \
-              ' into incompatible type ' + to[list_base].__class__.__name__ + \
-              ' for key ' + list_base + '(' + k + ')'
-      else:
-        to[list_base] = []
-
-      # Call MergeLists, which will make copies of objects that require it.
-      # MergeLists can recurse back into MergeDicts, although this will be
-      # to make copies of dicts (with paths fixed), there will be no
-      # subsequent dict "merging" once entering a list because lists are
-      # always replaced, appended to, or prepended to.
-      is_paths = IsPathSection(list_base)
-      MergeLists(to[list_base], v, to_file, fro_file, is_paths, append)
-    else:
-      raise TypeError, \
-          'Attempt to merge dict value of unsupported type ' + \
-          v.__class__.__name__ + ' for key ' + k
-
-
-def MergeConfigWithInheritance(new_configuration_dict, build_file,
-                               target_dict, configuration, visited):
-  # Skip if previously visited.
-  if configuration in visited:
-    return
-
-  # Look at this configuration.
-  configuration_dict = target_dict['configurations'][configuration]
-
-  # Merge in parents.
-  for parent in configuration_dict.get('inherit_from', []):
-    MergeConfigWithInheritance(new_configuration_dict, build_file,
-                               target_dict, parent, visited + [configuration])
-
-  # Merge it into the new config.
-  MergeDicts(new_configuration_dict, configuration_dict,
-             build_file, build_file)
-
-  # Drop abstract.
-  if 'abstract' in new_configuration_dict:
-    del new_configuration_dict['abstract']
-
-
-def SetUpConfigurations(target, target_dict):
-  # key_suffixes is a list of key suffixes that might appear on key names.
-  # These suffixes are handled in conditional evaluations (for =, +, and ?)
-  # and rules/exclude processing (for ! and /).  Keys with these suffixes
-  # should be treated the same as keys without.
-  key_suffixes = ['=', '+', '?', '!', '/']
-
-  build_file = gyp.common.BuildFile(target)
-
-  # Provide a single configuration by default if none exists.
-  # TODO(mark): Signal an error if default_configurations exists but
-  # configurations does not.
-  if not 'configurations' in target_dict:
-    target_dict['configurations'] = {'Default': {}}
-  if not 'default_configuration' in target_dict:
-    concrete = [i for i in target_dict['configurations'].iterkeys()
-                if not target_dict['configurations'][i].get('abstract')]
-    target_dict['default_configuration'] = sorted(concrete)[0]
-
-  for configuration in target_dict['configurations'].keys():
-    old_configuration_dict = target_dict['configurations'][configuration]
-    # Skip abstract configurations (saves work only).
-    if old_configuration_dict.get('abstract'):
-      continue
-    # Configurations inherit (most) settings from the enclosing target scope.
-    # Get the inheritance relationship right by making a copy of the target
-    # dict.
-    new_configuration_dict = copy.deepcopy(target_dict)
-
-    # Take out the bits that don't belong in a "configurations" section.
-    # Since configuration setup is done before conditional, exclude, and rules
-    # processing, be careful with handling of the suffix characters used in
-    # those phases.
-    delete_keys = []
-    for key in new_configuration_dict:
-      key_ext = key[-1:]
-      if key_ext in key_suffixes:
-        key_base = key[:-1]
-      else:
-        key_base = key
-      if key_base in non_configuration_keys:
-        delete_keys.append(key)
-
-    for key in delete_keys:
-      del new_configuration_dict[key]
-
-    # Merge in configuration (with all its parents first).
-    MergeConfigWithInheritance(new_configuration_dict, build_file,
-                               target_dict, configuration, [])
-
-    # Put the new result back into the target dict as a configuration.
-    target_dict['configurations'][configuration] = new_configuration_dict
-
-  # Now drop all the abstract ones.
-  for configuration in target_dict['configurations'].keys():
-    old_configuration_dict = target_dict['configurations'][configuration]
-    if old_configuration_dict.get('abstract'):
-      del target_dict['configurations'][configuration]
-
-  # Now that all of the target's configurations have been built, go through
-  # the target dict's keys and remove everything that's been moved into a
-  # "configurations" section.
-  delete_keys = []
-  for key in target_dict:
-    key_ext = key[-1:]
-    if key_ext in key_suffixes:
-      key_base = key[:-1]
-    else:
-      key_base = key
-    if not key_base in non_configuration_keys:
-      delete_keys.append(key)
-  for key in delete_keys:
-    del target_dict[key]
-
-  # Check the configurations to see if they contain invalid keys.
-  for configuration in target_dict['configurations'].keys():
-    configuration_dict = target_dict['configurations'][configuration]
-    for key in configuration_dict.keys():
-      if key in invalid_configuration_keys:
-        raise GypError('%s not allowed in the %s configuration, found in '
-                       'target %s' % (key, configuration, target))
-
-
-
-def ProcessListFiltersInDict(name, the_dict):
-  """Process regular expression and exclusion-based filters on lists.
-
-  An exclusion list is in a dict key named with a trailing "!", like
-  "sources!".  Every item in such a list is removed from the associated
-  main list, which in this example, would be "sources".  Removed items are
-  placed into a "sources_excluded" list in the dict.
-
-  Regular expression (regex) filters are contained in dict keys named with a
-  trailing "/", such as "sources/" to operate on the "sources" list.  Regex
-  filters in a dict take the form:
-    'sources/': [ ['exclude', '_(linux|mac|win)\\.cc$'],
-                  ['include', '_mac\\.cc$'] ],
-  The first filter says to exclude all files ending in _linux.cc, _mac.cc, and
-  _win.cc.  The second filter then includes all files ending in _mac.cc that
-  are now or were once in the "sources" list.  Items matching an "exclude"
-  filter are subject to the same processing as would occur if they were listed
-  by name in an exclusion list (ending in "!").  Items matching an "include"
-  filter are brought back into the main list if previously excluded by an
-  exclusion list or exclusion regex filter.  Subsequent matching "exclude"
-  patterns can still cause items to be excluded after matching an "include".
-  """
-
-  # Look through the dictionary for any lists whose keys end in "!" or "/".
-  # These are lists that will be treated as exclude lists and regular
-  # expression-based exclude/include lists.  Collect the lists that are
-  # needed first, looking for the lists that they operate on, and assemble
-  # them into |lists|.  This is done in a separate loop up front, because
-  # the _included and _excluded keys need to be added to the_dict, and that
-  # can't be done while iterating through it.
-
-  lists = []
-  del_lists = []
-  for key, value in the_dict.iteritems():
-    operation = key[-1]
-    if operation != '!' and operation != '/':
-      continue
-
-    if not isinstance(value, list):
-      raise ValueError, name + ' key ' + key + ' must be list, not ' + \
-                        value.__class__.__name__
-
-    list_key = key[:-1]
-    if list_key not in the_dict:
-      # This happens when there's a list like "sources!" but no corresponding
-      # "sources" list.  Since there's nothing for it to operate on, queue up
-      # the "sources!" list for deletion now.
-      del_lists.append(key)
-      continue
-
-    if not isinstance(the_dict[list_key], list):
-      raise ValueError, name + ' key ' + list_key + \
-                        ' must be list, not ' + \
-                        value.__class__.__name__ + ' when applying ' + \
-                        {'!': 'exclusion', '/': 'regex'}[operation]
-
-    if not list_key in lists:
-      lists.append(list_key)
-
-  # Delete the lists that are known to be unneeded at this point.
-  for del_list in del_lists:
-    del the_dict[del_list]
-
-  for list_key in lists:
-    the_list = the_dict[list_key]
-
-    # Initialize the list_actions list, which is parallel to the_list.  Each
-    # item in list_actions identifies whether the corresponding item in
-    # the_list should be excluded, unconditionally preserved (included), or
-    # whether no exclusion or inclusion has been applied.  Items for which
-    # no exclusion or inclusion has been applied (yet) have value -1, items
-    # excluded have value 0, and items included have value 1.  Includes and
-    # excludes override previous actions.  All items in list_actions are
-    # initialized to -1 because no excludes or includes have been processed
-    # yet.
-    list_actions = list((-1,) * len(the_list))
-
-    exclude_key = list_key + '!'
-    if exclude_key in the_dict:
-      for exclude_item in the_dict[exclude_key]:
-        for index in xrange(0, len(the_list)):
-          if exclude_item == the_list[index]:
-            # This item matches the exclude_item, so set its action to 0
-            # (exclude).
-            list_actions[index] = 0
-
-      # The "whatever!" list is no longer needed, dump it.
-      del the_dict[exclude_key]
-
-    regex_key = list_key + '/'
-    if regex_key in the_dict:
-      for regex_item in the_dict[regex_key]:
-        [action, pattern] = regex_item
-        pattern_re = re.compile(pattern)
-
-        if action == 'exclude':
-          # This item matches an exclude regex, so set its value to 0 (exclude).
-          action_value = 0
-        elif action == 'include':
-          # This item matches an include regex, so set its value to 1 (include).
-          action_value = 1
-        else:
-          # This is an action that doesn't make any sense.
-          raise ValueError, 'Unrecognized action ' + action + ' in ' + name + \
-                            ' key ' + regex_key
-
-        for index in xrange(0, len(the_list)):
-          list_item = the_list[index]
-          if list_actions[index] == action_value:
-            # Even if the regex matches, nothing will change so continue (regex
-            # searches are expensive).
-            continue
-          if pattern_re.search(list_item):
-            # Regular expression match.
-            list_actions[index] = action_value
-
-      # The "whatever/" list is no longer needed, dump it.
-      del the_dict[regex_key]
-
-    # Add excluded items to the excluded list.
-    #
-    # Note that exclude_key ("sources!") is different from excluded_key
-    # ("sources_excluded").  The exclude_key list is input and it was already
-    # processed and deleted; the excluded_key list is output and it's about
-    # to be created.
-    excluded_key = list_key + '_excluded'
-    if excluded_key in the_dict:
-      raise GypError(name + ' key ' + excluded_key +
-                     ' must not be present prior'
-                     ' to applying exclusion/regex filters for ' + list_key)
-
-    excluded_list = []
-
-    # Go backwards through the list_actions list so that as items are deleted,
-    # the indices of items that haven't been seen yet don't shift.  That means
-    # that things need to be prepended to excluded_list to maintain them in the
-    # same order that they existed in the_list.
-    for index in xrange(len(list_actions) - 1, -1, -1):
-      if list_actions[index] == 0:
-        # Dump anything with action 0 (exclude).  Keep anything with action 1
-        # (include) or -1 (no include or exclude seen for the item).
-        excluded_list.insert(0, the_list[index])
-        del the_list[index]
-
-    # If anything was excluded, put the excluded list into the_dict at
-    # excluded_key.
-    if len(excluded_list) > 0:
-      the_dict[excluded_key] = excluded_list
-
-  # Now recurse into subdicts and lists that may contain dicts.
-  for key, value in the_dict.iteritems():
-    if isinstance(value, dict):
-      ProcessListFiltersInDict(key, value)
-    elif isinstance(value, list):
-      ProcessListFiltersInList(key, value)
-
-
-def ProcessListFiltersInList(name, the_list):
-  for item in the_list:
-    if isinstance(item, dict):
-      ProcessListFiltersInDict(name, item)
-    elif isinstance(item, list):
-      ProcessListFiltersInList(name, item)
-
-
-def ValidateTargetType(target, target_dict):
-  """Ensures the 'type' field on the target is one of the known types.
-
-  Arguments:
-    target: string, name of target.
-    target_dict: dict, target spec.
-
-  Raises an exception on error.
-  """
-  VALID_TARGET_TYPES = ('executable', 'loadable_module',
-                        'static_library', 'shared_library',
-                        'none')
-  target_type = target_dict.get('type', None)
-  if target_type not in VALID_TARGET_TYPES:
-    raise GypError("Target %s has an invalid target type '%s'.  "
-                   "Must be one of %s." %
-                   (target, target_type, '/'.join(VALID_TARGET_TYPES)))
-  if (target_dict.get('standalone_static_library', 0) and
-      not target_type == 'static_library'):
-    raise GypError('Target %s has type %s but standalone_static_library flag is'
-                   ' only valid for static_library type.' % (target,
-                                                             target_type))
-
-
-def ValidateSourcesInTarget(target, target_dict, build_file):
-  # TODO: Check if MSVC allows this for loadable_module targets.
-  if target_dict.get('type', None) not in ('static_library', 'shared_library'):
-    return
-  sources = target_dict.get('sources', [])
-  basenames = {}
-  for source in sources:
-    name, ext = os.path.splitext(source)
-    is_compiled_file = ext in [
-        '.c', '.cc', '.cpp', '.cxx', '.m', '.mm', '.s', '.S']
-    if not is_compiled_file:
-      continue
-    basename = os.path.basename(name)  # Don't include extension.
-    basenames.setdefault(basename, []).append(source)
-
-  error = ''
-  for basename, files in basenames.iteritems():
-    if len(files) > 1:
-      error += '  %s: %s\n' % (basename, ' '.join(files))
-
-  if error:
-    print('static library %s has several files with the same basename:\n' %
-          target + error + 'Some build systems, e.g. MSVC08, '
-          'cannot handle that.')
-    raise GypError('Duplicate basenames in sources section, see list above')
-
-
-def ValidateRulesInTarget(target, target_dict, extra_sources_for_rules):
-  """Ensures that the rules sections in target_dict are valid and consistent,
-  and determines which sources they apply to.
-
-  Arguments:
-    target: string, name of target.
-    target_dict: dict, target spec containing "rules" and "sources" lists.
-    extra_sources_for_rules: a list of keys to scan for rule matches in
-        addition to 'sources'.
-  """
-
-  # Dicts to map between values found in rules' 'rule_name' and 'extension'
-  # keys and the rule dicts themselves.
-  rule_names = {}
-  rule_extensions = {}
-
-  rules = target_dict.get('rules', [])
-  for rule in rules:
-    # Make sure that there's no conflict among rule names and extensions.
-    rule_name = rule['rule_name']
-    if rule_name in rule_names:
-      raise GypError('rule %s exists in duplicate, target %s' %
-                     (rule_name, target))
-    rule_names[rule_name] = rule
-
-    rule_extension = rule['extension']
-    if rule_extension in rule_extensions:
-      raise GypError(('extension %s associated with multiple rules, ' +
-                      'target %s rules %s and %s') %
-                     (rule_extension, target,
-                      rule_extensions[rule_extension]['rule_name'],
-                      rule_name))
-    rule_extensions[rule_extension] = rule
-
-    # Make sure rule_sources isn't already there.  It's going to be
-    # created below if needed.
-    if 'rule_sources' in rule:
-      raise GypError(
-            'rule_sources must not exist in input, target %s rule %s' %
-            (target, rule_name))
-    extension = rule['extension']
-
-    rule_sources = []
-    source_keys = ['sources']
-    source_keys.extend(extra_sources_for_rules)
-    for source_key in source_keys:
-      for source in target_dict.get(source_key, []):
-        (source_root, source_extension) = os.path.splitext(source)
-        if source_extension.startswith('.'):
-          source_extension = source_extension[1:]
-        if source_extension == extension:
-          rule_sources.append(source)
-
-    if len(rule_sources) > 0:
-      rule['rule_sources'] = rule_sources
-
-
-def ValidateRunAsInTarget(target, target_dict, build_file):
-  target_name = target_dict.get('target_name')
-  run_as = target_dict.get('run_as')
-  if not run_as:
-    return
-  if not isinstance(run_as, dict):
-    raise GypError("The 'run_as' in target %s from file %s should be a "
-                   "dictionary." %
-                   (target_name, build_file))
-  action = run_as.get('action')
-  if not action:
-    raise GypError("The 'run_as' in target %s from file %s must have an "
-                   "'action' section." %
-                   (target_name, build_file))
-  if not isinstance(action, list):
-    raise GypError("The 'action' for 'run_as' in target %s from file %s "
-                   "must be a list." %
-                   (target_name, build_file))
-  working_directory = run_as.get('working_directory')
-  if working_directory and not isinstance(working_directory, str):
-    raise GypError("The 'working_directory' for 'run_as' in target %s "
-                   "in file %s should be a string." %
-                   (target_name, build_file))
-  environment = run_as.get('environment')
-  if environment and not isinstance(environment, dict):
-    raise GypError("The 'environment' for 'run_as' in target %s "
-                   "in file %s should be a dictionary." %
-                   (target_name, build_file))
-
-
-def ValidateActionsInTarget(target, target_dict, build_file):
-  '''Validates the inputs to the actions in a target.'''
-  target_name = target_dict.get('target_name')
-  actions = target_dict.get('actions', [])
-  for action in actions:
-    action_name = action.get('action_name')
-    if not action_name:
-      raise GypError("Anonymous action in target %s.  "
-                     "An action must have an 'action_name' field." %
-                     target_name)
-    inputs = action.get('inputs', None)
-    if inputs is None:
-      raise GypError('Action in target %s has no inputs.' % target_name)
-    action_command = action.get('action')
-    if action_command and not action_command[0]:
-      raise GypError("Empty action as command in target %s." % target_name)
-
-
-def TurnIntIntoStrInDict(the_dict):
-  """Given dict the_dict, recursively converts all integers into strings.
-  """
-  # Use items instead of iteritems so the dict can be mutated during
-  # iteration; reinserted keys and their values don't need to be revisited.
-  for k, v in the_dict.items():
-    if isinstance(v, int):
-      v = str(v)
-      the_dict[k] = v
-    elif isinstance(v, dict):
-      TurnIntIntoStrInDict(v)
-    elif isinstance(v, list):
-      TurnIntIntoStrInList(v)
-
-    if isinstance(k, int):
-      the_dict[str(k)] = v
-      del the_dict[k]
-
-
-def TurnIntIntoStrInList(the_list):
-  """Given list the_list, recursively converts all integers into strings.
-  """
-  for index in xrange(0, len(the_list)):
-    item = the_list[index]
-    if isinstance(item, int):
-      the_list[index] = str(item)
-    elif isinstance(item, dict):
-      TurnIntIntoStrInDict(item)
-    elif isinstance(item, list):
-      TurnIntIntoStrInList(item)
-
-
-def VerifyNoCollidingTargets(targets):
-  """Verify that no two targets in the same directory share the same name.
-
-  Arguments:
-    targets: A list of targets in the form 'path/to/file.gyp:target_name'.
-  """
-  # Keep a dict going from 'subdirectory:target_name' to 'foo.gyp'.
-  used = {}
-  for target in targets:
-    # Separate out 'path/to/file.gyp, 'target_name' from
-    # 'path/to/file.gyp:target_name'.
-    path, name = target.rsplit(':', 1)
-    # Separate out 'path/to', 'file.gyp' from 'path/to/file.gyp'.
-    subdir, gyp = os.path.split(path)
-    # Use '.' for the current directory '', so that the error messages make
-    # more sense.
-    if not subdir:
-      subdir = '.'
-    # Prepare a key like 'path/to:target_name'.
-    key = subdir + ':' + name
-    if key in used:
-      # Complain if this target is already used.
-      raise GypError('Duplicate target name "%s" in directory "%s" used both '
-                     'in "%s" and "%s".' % (name, subdir, gyp, used[key]))
-    used[key] = gyp
-
-
-def Load(build_files, variables, includes, depth, generator_input_info, check,
-         circular_check, parallel):
-  # Set up path_sections and non_configuration_keys with the default data plus
-  # the generator-specific data.
-  global path_sections
-  path_sections = base_path_sections[:]
-  path_sections.extend(generator_input_info['path_sections'])
-
-  global non_configuration_keys
-  non_configuration_keys = base_non_configuration_keys[:]
-  non_configuration_keys.extend(generator_input_info['non_configuration_keys'])
-
-  # TODO(mark) handle variants if the generator doesn't want them directly.
-  generator_handles_variants = \
-      generator_input_info['generator_handles_variants']
-
-  global absolute_build_file_paths
-  absolute_build_file_paths = \
-      generator_input_info['generator_wants_absolute_build_file_paths']
-
-  global multiple_toolsets
-  multiple_toolsets = generator_input_info[
-      'generator_supports_multiple_toolsets']
-
-  # A generator can have other lists (in addition to sources) be processed
-  # for rules.
-  extra_sources_for_rules = generator_input_info['extra_sources_for_rules']
-
-  # Load build files.  This loads every target-containing build file into
-  # the |data| dictionary such that the keys to |data| are build file names,
-  # and the values are the entire build file contents after "early" or "pre"
-  # processing has been done and includes have been resolved.
-  # NOTE: data contains both "target" files (.gyp) and "includes" (.gypi), as
-  # well as meta-data (e.g. 'included_files' key). 'target_build_files' keeps
-  # track of the keys corresponding to "target" files.
-  data = {'target_build_files': set()}
-  aux_data = {}
-  for build_file in build_files:
-    # Normalize paths everywhere.  This is important because paths will be
-    # used as keys to the data dict and for references between input files.
-    build_file = os.path.normpath(build_file)
-    try:
-      if parallel:
-        print >>sys.stderr, 'Using parallel processing.'
-        LoadTargetBuildFileParallel(build_file, data, aux_data,
-                                    variables, includes, depth, check)
-      else:
-        LoadTargetBuildFile(build_file, data, aux_data,
-                            variables, includes, depth, check, True)
-    except Exception, e:
-      gyp.common.ExceptionAppend(e, 'while trying to load %s' % build_file)
-      raise
-
-  # Build a dict to access each target's subdict by qualified name.
-  targets = BuildTargetsDict(data)
-
-  # Fully qualify all dependency links.
-  QualifyDependencies(targets)
-
-  # Remove self-dependencies from targets that have 'prune_self_dependencies'
-  # set to 1.
-  RemoveSelfDependencies(targets)
-
-  # Expand dependencies specified as build_file:*.
-  ExpandWildcardDependencies(targets, data)
-
-  # Apply exclude (!) and regex (/) list filters only for dependency_sections.
-  for target_name, target_dict in targets.iteritems():
-    tmp_dict = {}
-    for key_base in dependency_sections:
-      for op in ('', '!', '/'):
-        key = key_base + op
-        if key in target_dict:
-          tmp_dict[key] = target_dict[key]
-          del target_dict[key]
-    ProcessListFiltersInDict(target_name, tmp_dict)
-    # Write the results back to |target_dict|.
-    for key in tmp_dict:
-      target_dict[key] = tmp_dict[key]
-
-  # Make sure every dependency appears at most once.
-  RemoveDuplicateDependencies(targets)
-
-  if circular_check:
-    # Make sure that any targets in a.gyp don't contain dependencies in other
-    # .gyp files that further depend on a.gyp.
-    VerifyNoGYPFileCircularDependencies(targets)
-
-  [dependency_nodes, flat_list] = BuildDependencyList(targets)
-
-  # Check that no two targets in the same directory have the same name.
-  VerifyNoCollidingTargets(flat_list)
-
-  # Handle dependent settings of various types.
-  for settings_type in ['all_dependent_settings',
-                        'direct_dependent_settings',
-                        'link_settings']:
-    DoDependentSettings(settings_type, flat_list, targets, dependency_nodes)
-
-    # Take out the dependent settings now that they've been published to all
-    # of the targets that require them.
-    for target in flat_list:
-      if settings_type in targets[target]:
-        del targets[target][settings_type]
-
-  # Make sure static libraries don't declare dependencies on other static
-  # libraries, but that linkables depend on all unlinked static libraries
-  # that they need so that their link steps will be correct.
-  gii = generator_input_info
-  if gii['generator_wants_static_library_dependencies_adjusted']:
-    AdjustStaticLibraryDependencies(flat_list, targets, dependency_nodes,
-                                    gii['generator_wants_sorted_dependencies'])
-
-  # Apply "post"/"late"/"target" variable expansions and condition evaluations.
-  for target in flat_list:
-    target_dict = targets[target]
-    build_file = gyp.common.BuildFile(target)
-    ProcessVariablesAndConditionsInDict(
-        target_dict, PHASE_LATE, variables, build_file)
-
-  # Move everything that can go into a "configurations" section into one.
-  for target in flat_list:
-    target_dict = targets[target]
-    SetUpConfigurations(target, target_dict)
-
-  # Apply exclude (!) and regex (/) list filters.
-  for target in flat_list:
-    target_dict = targets[target]
-    ProcessListFiltersInDict(target, target_dict)
-
-  # Apply "latelate" variable expansions and condition evaluations.
-  for target in flat_list:
-    target_dict = targets[target]
-    build_file = gyp.common.BuildFile(target)
-    ProcessVariablesAndConditionsInDict(
-        target_dict, PHASE_LATELATE, variables, build_file)
-
-  # Make sure that the rules make sense, and build up rule_sources lists as
-  # needed.  Not all generators will need to use the rule_sources lists, but
-  # some may, and it seems best to build the list in a common spot.
-  # Also validate actions and run_as elements in targets.
-  for target in flat_list:
-    target_dict = targets[target]
-    build_file = gyp.common.BuildFile(target)
-    ValidateTargetType(target, target_dict)
-    # TODO(thakis): Get vpx_scale/arm/scalesystemdependent.c to be renamed to
-    #               scalesystemdependent_arm_additions.c or similar.
-    if 'arm' not in variables.get('target_arch', ''):
-      ValidateSourcesInTarget(target, target_dict, build_file)
-    ValidateRulesInTarget(target, target_dict, extra_sources_for_rules)
-    ValidateRunAsInTarget(target, target_dict, build_file)
-    ValidateActionsInTarget(target, target_dict, build_file)
-
-  # Generators might not expect ints.  Turn them into strs.
-  TurnIntIntoStrInDict(data)
-
-  # TODO(mark): Return |data| for now because the generator needs a list of
-  # build files that came in.  In the future, maybe it should just accept
-  # a list, and not the whole data dict.
-  return [flat_list, targets, data]
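The exclusion ("sources!") and regex ("sources/") filter convention that `ProcessListFiltersInDict` above implements can be sketched compactly. The snippet below is a minimal, self-contained illustration of that convention, not the gyp implementation; `apply_list_filters` is a hypothetical name, and it is written in Python 3 rather than the Python 2 of the original file:

```python
import re

def apply_list_filters(the_dict, list_key):
    """Sketch of gyp's list filters: items named in the "key!" list are
    excluded; "key/" holds [action, pattern] pairs where 'exclude'/'include'
    regexes override earlier decisions. Excluded items land in
    "key_excluded", preserving their original order."""
    the_list = the_dict[list_key]
    # -1 = no decision yet, 0 = excluded, 1 = included; later filters win.
    actions = [-1] * len(the_list)

    # Exact-match exclusion list ("sources!"), consumed once processed.
    for item in the_dict.pop(list_key + '!', []):
        for i, value in enumerate(the_list):
            if value == item:
                actions[i] = 0

    # Regex filters ("sources/"), applied in order.
    for action, pattern in the_dict.pop(list_key + '/', []):
        value = {'exclude': 0, 'include': 1}[action]
        regex = re.compile(pattern)
        for i, item in enumerate(the_list):
            if regex.search(item):
                actions[i] = value

    excluded = [item for item, a in zip(the_list, actions) if a == 0]
    the_dict[list_key] = [item for item, a in zip(the_list, actions) if a != 0]
    if excluded:
        the_dict[list_key + '_excluded'] = excluded

d = {
    'sources': ['a_linux.cc', 'a_mac.cc', 'a_win.cc', 'common.cc'],
    'sources/': [['exclude', '_(linux|mac|win)\\.cc$'],
                 ['include', '_mac\\.cc$']],
}
apply_list_filters(d, 'sources')
# d['sources'] keeps a_mac.cc (re-included) and common.cc (never matched).
```

This mirrors the docstring's own example: the exclude pattern drops all platform-suffixed files, and the subsequent include pattern brings the `_mac.cc` file back.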
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/mac_tool.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,223 +0,0 @@
-#!/usr/bin/env python
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Utility functions to perform Xcode-style build steps.
-
-These functions are executed via gyp-mac-tool when using the Makefile generator.
-"""
-
-import fcntl
-import os
-import plistlib
-import re
-import shutil
-import string
-import subprocess
-import sys
-
-
-def main(args):
-  executor = MacTool()
-  exit_code = executor.Dispatch(args)
-  if exit_code is not None:
-    sys.exit(exit_code)
-
-
-class MacTool(object):
-  """This class performs all the Mac tooling steps. The methods can either be
-  executed directly, or dispatched from an argument list."""
-
-  def Dispatch(self, args):
-    """Dispatches a string command to a method."""
-    if len(args) < 1:
-      raise Exception("Not enough arguments")
-
-    method = "Exec%s" % self._CommandifyName(args[0])
-    return getattr(self, method)(*args[1:])
-
-  def _CommandifyName(self, name_string):
-    """Transforms a tool name like copy-info-plist to CopyInfoPlist"""
-    return name_string.title().replace('-', '')
-
-  def ExecCopyBundleResource(self, source, dest):
-    """Copies a resource file to the bundle/Resources directory, performing any
-    necessary compilation on each resource."""
-    extension = os.path.splitext(source)[1].lower()
-    if os.path.isdir(source):
-      # Copy tree.
-      if os.path.exists(dest):
-        shutil.rmtree(dest)
-      shutil.copytree(source, dest)
-    elif extension == '.xib':
-      return self._CopyXIBFile(source, dest)
-    elif extension == '.strings':
-      self._CopyStringsFile(source, dest)
-    else:
-      shutil.copyfile(source, dest)
-
-  def _CopyXIBFile(self, source, dest):
-    """Compiles a XIB file with ibtool into a binary plist in the bundle."""
-    tools_dir = os.environ.get('DEVELOPER_BIN_DIR', '/usr/bin')
-    args = [os.path.join(tools_dir, 'ibtool'), '--errors', '--warnings',
-        '--notices', '--output-format', 'human-readable-text', '--compile',
-        dest, source]
-    ibtool_section_re = re.compile(r'/\*.*\*/')
-    ibtool_re = re.compile(r'.*note:.*is clipping its content')
-    ibtoolout = subprocess.Popen(args, stdout=subprocess.PIPE)
-    current_section_header = None
-    for line in ibtoolout.stdout:
-      if ibtool_section_re.match(line):
-        current_section_header = line
-      elif not ibtool_re.match(line):
-        if current_section_header:
-          sys.stdout.write(current_section_header)
-          current_section_header = None
-        sys.stdout.write(line)
-    return ibtoolout.returncode
-
-  def _CopyStringsFile(self, source, dest):
-    """Copies a .strings file using iconv to reconvert the input into UTF-16."""
-    input_code = self._DetectInputEncoding(source) or "UTF-8"
-
-    # Xcode's CpyCopyStringsFile / builtin-copyStrings seems to call
-    # CFPropertyListCreateFromXMLData() behind the scenes; at least it prints
-    #     CFPropertyListCreateFromXMLData(): Old-style plist parser: missing
-    #     semicolon in dictionary.
-    # on invalid files. Do the same kind of validation.
-    import CoreFoundation
-    s = open(source).read()
-    d = CoreFoundation.CFDataCreate(None, s, len(s))
-    _, error = CoreFoundation.CFPropertyListCreateFromXMLData(None, d, 0, None)
-    if error:
-      return
-
-    fp = open(dest, 'w')
-    args = ['/usr/bin/iconv', '--from-code', input_code, '--to-code',
-        'UTF-16', source]
-    subprocess.call(args, stdout=fp)
-    fp.close()
-
-  def _DetectInputEncoding(self, file_name):
-    """Reads the first few bytes from file_name and tries to guess the text
-    encoding. Returns None as a guess if it can't detect it."""
-    fp = open(file_name, 'rb')
-    try:
-      header = fp.read(3)
-    except Exception:
-      fp.close()
-      return None
-    fp.close()
-    if header.startswith("\xFE\xFF"):
-      return "UTF-16BE"
-    elif header.startswith("\xFF\xFE"):
-      return "UTF-16LE"
-    elif header.startswith("\xEF\xBB\xBF"):
-      return "UTF-8"
-    else:
-      return None
-
-  def ExecCopyInfoPlist(self, source, dest):
-    """Copies the |source| Info.plist to the destination directory |dest|."""
-    # Read the source Info.plist into memory.
-    fd = open(source, 'r')
-    lines = fd.read()
-    fd.close()
-
-    # Go through all the environment variables and replace them as variables in
-    # the file.
-    for key in os.environ:
-      if key.startswith('_'):
-        continue
-      evar = '${%s}' % key
-      lines = string.replace(lines, evar, os.environ[key])
-
-    # Write out the file with variables replaced.
-    fd = open(dest, 'w')
-    fd.write(lines)
-    fd.close()
-
-    # Write out the PkgInfo file now that the Info.plist file has been
-    # "compiled".
-    self._WritePkgInfo(dest)
-
-  def _WritePkgInfo(self, info_plist):
-    """This writes the PkgInfo file from the data stored in Info.plist."""
-    plist = plistlib.readPlist(info_plist)
-    if not plist:
-      return
-
-    # Only create PkgInfo for executable types.
-    package_type = plist['CFBundlePackageType']
-    if package_type != 'APPL':
-      return
-
-    # The format of PkgInfo is eight characters, representing the bundle type
-    # and bundle signature, each four characters. If that is missing, four
-    # '?' characters are used instead.
-    signature_code = plist.get('CFBundleSignature', '????')
-    if len(signature_code) != 4:  # Wrong length resets everything, too.
-      signature_code = '?' * 4
-
-    dest = os.path.join(os.path.dirname(info_plist), 'PkgInfo')
-    fp = open(dest, 'w')
-    fp.write('%s%s' % (package_type, signature_code))
-    fp.close()
-
-  def ExecFlock(self, lockfile, *cmd_list):
-    """Emulates the most basic behavior of Linux's flock(1)."""
-    # Rely on exception handling to report errors.
-    fd = os.open(lockfile, os.O_RDONLY|os.O_NOCTTY|os.O_CREAT, 0o666)
-    fcntl.flock(fd, fcntl.LOCK_EX)
-    return subprocess.call(cmd_list)
-
-  def ExecFilterLibtool(self, *cmd_list):
-    """Calls libtool and filters out 'libtool: file: foo.o has no symbols'."""
-    libtool_re = re.compile(r'^libtool: file: .* has no symbols$')
-    libtoolout = subprocess.Popen(cmd_list, stderr=subprocess.PIPE)
-    _, err = libtoolout.communicate()
-    for line in err.splitlines():
-      if not libtool_re.match(line):
-        print >>sys.stderr, line
-    return libtoolout.returncode
-
-  def ExecPackageFramework(self, framework, version):
-    """Takes a path to Something.framework and the Current version of that and
-    sets up all the symlinks."""
-    # Find the name of the binary based on the part before the ".framework".
-    binary = os.path.basename(framework).split('.')[0]
-
-    CURRENT = 'Current'
-    RESOURCES = 'Resources'
-    VERSIONS = 'Versions'
-
-    if not os.path.exists(os.path.join(framework, VERSIONS, version, binary)):
-      # Binary-less frameworks don't seem to contain symlinks (see e.g.
-      # chromium's out/Debug/org.chromium.Chromium.manifest/ bundle).
-      return
-
-    # Move into the framework directory to set the symlinks correctly.
-    pwd = os.getcwd()
-    os.chdir(framework)
-
-    # Set up the Current version.
-    self._Relink(version, os.path.join(VERSIONS, CURRENT))
-
-    # Set up the root symlinks.
-    self._Relink(os.path.join(VERSIONS, CURRENT, binary), binary)
-    self._Relink(os.path.join(VERSIONS, CURRENT, RESOURCES), RESOURCES)
-
-    # Back to where we were before!
-    os.chdir(pwd)
-
-  def _Relink(self, dest, link):
-    """Creates a symlink to |dest| named |link|. If |link| already exists,
-    it is overwritten."""
-    if os.path.lexists(link):
-      os.remove(link)
-    os.symlink(dest, link)
-
-
-if __name__ == '__main__':
-  sys.exit(main(sys.argv[1:]))
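The BOM sniffing in `_DetectInputEncoding` above reduces to three byte-prefix checks. A minimal Python 3 sketch of that logic (`detect_bom_encoding` is a hypothetical name; the original is Python 2 and reads the header bytes from a file itself):

```python
def detect_bom_encoding(header):
    """Guess a text encoding from the leading bytes of a file by looking
    for a byte-order mark; returns None when no BOM is recognized."""
    if header.startswith(b'\xFE\xFF'):
        return 'UTF-16BE'
    if header.startswith(b'\xFF\xFE'):
        return 'UTF-16LE'
    if header.startswith(b'\xEF\xBB\xBF'):
        return 'UTF-8'
    return None
```

Note the ordering does not matter here because the three BOMs have no common prefix; files without a BOM (the common case for ASCII `.strings` files) fall through to `None`, which the caller treats as "assume UTF-8".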
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/msvs_emulation.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,771 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""
-This module helps emulate Visual Studio 2008 behavior on top of other
-build systems, primarily ninja.
-"""
-
-import os
-import re
-import subprocess
-import sys
-
-import gyp.MSVSVersion
-
-windows_quoter_regex = re.compile(r'(\\*)"')
-
-def QuoteForRspFile(arg):
-  """Quote a command line argument so that it appears as one argument when
-  processed via cmd.exe and parsed by CommandLineToArgvW (as is typical for
-  Windows programs)."""
-  # See http://goo.gl/cuFbX and http://goo.gl/dhPnp including the comment
-  # threads. This is actually the quoting rules for CommandLineToArgvW, not
-  # for the shell, because the shell doesn't do anything in Windows. This
-  # works more or less because most programs (including the compiler, etc.)
-  # use that function to handle command line arguments.
-
-  # For a literal quote, CommandLineToArgvW requires 2n+1 backslashes
-  # preceding it, and results in n backslashes + the quote. So we substitute
-  # in 2* what we match, +1 more, plus the quote.
-  arg = windows_quoter_regex.sub(lambda mo: 2 * mo.group(1) + '\\"', arg)
-
-  # %'s also need to be doubled, otherwise they're interpreted as batch
-  # positional arguments. Doubling lets each %% collapse back to a single
-  # literal % when the batch processor expands it. Otherwise, passing a
-  # literal that happens to look like an environment variable reference
-  # (e.g. %PATH%) through the shell would fail.
-  arg = arg.replace('%', '%%')
-
-  # These commands are used in rsp files, so no escaping for the shell (via ^)
-  # is necessary.
-
-  # Finally, wrap the whole thing in quotes so that the above quote rule
-  # applies and whitespace isn't a word break.
-  return '"' + arg + '"'
-
-
-def EncodeRspFileList(args):
-  """Process a list of arguments using QuoteForRspFile."""
-  # Note that the first argument is assumed to be the command. Don't add
-  # quotes around it because then built-ins like 'echo', etc. won't work.
-  # Take care to normpath only the path in the case of 'call ../x.bat' because
-  # otherwise the whole thing is incorrectly interpreted as a path and not
-  # normalized correctly.
-  if not args: return ''
-  if args[0].startswith('call '):
-    call, program = args[0].split(' ', 1)
-    program = call + ' ' + os.path.normpath(program)
-  else:
-    program = os.path.normpath(args[0])
-  return program + ' ' + ' '.join(QuoteForRspFile(arg) for arg in args[1:])
-
-
-def _GenericRetrieve(root, default, path):
-  """Given a list of dictionary keys |path| and a tree of dicts |root|, find
-  value at path, or return |default| if any of the path doesn't exist."""
-  if not root:
-    return default
-  if not path:
-    return root
-  return _GenericRetrieve(root.get(path[0]), default, path[1:])
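_GenericRetrieve is the lookup primitive the rest of this module builds on. A standalone sketch, using a hypothetical msvs_settings-style tree for illustration only:

```python
def generic_retrieve(root, default, path):
    # Walk |path| (a list of keys) down nested dicts, as _GenericRetrieve
    # above does, returning |default| as soon as any key is missing.
    if not root:
        return default
    if not path:
        return root
    return generic_retrieve(root.get(path[0]), default, path[1:])

# Hypothetical settings tree for illustration only.
settings = {'VCCLCompilerTool': {'WarningLevel': '4'}}
print(generic_retrieve(settings, None, ['VCCLCompilerTool', 'WarningLevel']))
print(generic_retrieve(settings, [], ['VCLinkerTool', 'OutputFile']))
```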
-
-
-def _AddPrefix(element, prefix):
-  """Add |prefix| to |element| or each subelement if element is iterable."""
-  if element is None:
-    return element
-  # Note: not Iterable, because we don't want strings treated as iterables.
-  if isinstance(element, list) or isinstance(element, tuple):
-    return [prefix + e for e in element]
-  else:
-    return prefix + element
-
-
-def _DoRemapping(element, map):
-  """If |element| is set, remap it through |map|. If |element| is iterable,
-  each item will be remapped. Any elements not found will be removed."""
-  if map is not None and element is not None:
-    if not callable(map):
-      map = map.get # Assume it's a dict, otherwise a callable to do the remap.
-    if isinstance(element, list) or isinstance(element, tuple):
-      element = filter(None, [map(elem) for elem in element])
-    else:
-      element = map(element)
-  return element
-
-
-def _AppendOrReturn(append, element):
-  """If |append| is None, simply return |element|. If |append| is not None,
-  then add |element| to it, adding each item in |element| if it's a list or
-  tuple."""
-  if append is not None and element is not None:
-    if isinstance(element, list) or isinstance(element, tuple):
-      append.extend(element)
-    else:
-      append.append(element)
-  else:
-    return element
-
-
-def _FindDirectXInstallation():
-  """Try to find an installation location for the DirectX SDK. Check for the
-  standard environment variable, and if that doesn't exist, try to find it
-  via the registry. May return None if not found in either location."""
-  # Return previously calculated value, if there is one
-  if hasattr(_FindDirectXInstallation, 'dxsdk_dir'):
-    return _FindDirectXInstallation.dxsdk_dir
-
-  dxsdk_dir = os.environ.get('DXSDK_DIR')
-  if not dxsdk_dir:
-    # Set up the parameters for, and attempt to launch, reg.exe.
-    cmd = ['reg.exe', 'query', r'HKLM\Software\Microsoft\DirectX', '/s']
-    p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
-    for line in p.communicate()[0].splitlines():
-      if 'InstallPath' in line:
-        dxsdk_dir = line.split('    ')[3] + "\\"
-
-  # Cache return value
-  _FindDirectXInstallation.dxsdk_dir = dxsdk_dir
-  return dxsdk_dir
-
-
-class MsvsSettings(object):
-  """A class that understands the gyp 'msvs_...' values (especially the
-  msvs_settings field). They largely correspond to the VS2008 IDE DOM. This
-  class helps map those settings to command line options."""
-
-  def __init__(self, spec, generator_flags):
-    self.spec = spec
-    self.vs_version = GetVSVersion(generator_flags)
-    self.dxsdk_dir = _FindDirectXInstallation()
-
-    # Try to find an installation location for the Windows DDK by checking
-    # the WDK_DIR environment variable, may be None.
-    self.wdk_dir = os.environ.get('WDK_DIR')
-
-    supported_fields = [
-        ('msvs_configuration_attributes', dict),
-        ('msvs_settings', dict),
-        ('msvs_system_include_dirs', list),
-        ('msvs_disabled_warnings', list),
-        ('msvs_precompiled_header', str),
-        ('msvs_precompiled_source', str),
-        ('msvs_configuration_platform', str),
-        ('msvs_target_platform', str),
-        ]
-    configs = spec['configurations']
-    for field, default in supported_fields:
-      setattr(self, field, {})
-      for configname, config in configs.iteritems():
-        getattr(self, field)[configname] = config.get(field, default())
-
-    self.msvs_cygwin_dirs = spec.get('msvs_cygwin_dirs', ['.'])
-
-  def GetVSMacroEnv(self, base_to_build=None, config=None):
-    """Get a dict of variables mapping internal VS macro names to their gyp
-    equivalents."""
-    target_platform = 'Win32' if self.GetArch(config) == 'x86' else 'x64'
-    replacements = {
-        '$(OutDir)\\': base_to_build + '\\' if base_to_build else '',
-        '$(IntDir)': '$!INTERMEDIATE_DIR',
-        '$(InputPath)': '${source}',
-        '$(InputName)': '${root}',
-        '$(ProjectName)': self.spec['target_name'],
-        '$(PlatformName)': target_platform,
-        '$(ProjectDir)\\': '',
-    }
-    # '$(VSInstallDir)' and '$(VCInstallDir)' are available when and only when
-    # Visual Studio is actually installed.
-    if self.vs_version.Path():
-      replacements['$(VSInstallDir)'] = self.vs_version.Path()
-      replacements['$(VCInstallDir)'] = os.path.join(self.vs_version.Path(),
-                                                     'VC') + '\\'
-    # Chromium uses DXSDK_DIR in include/lib paths, but it may or may not be
-    # set. This happens when the SDK is sync'd via src-internal, rather than
-    # by typical end-user installation of the SDK. If it's not set, we don't
-    # want to leave the unexpanded variable in the path, so simply strip it.
-    replacements['$(DXSDK_DIR)'] = self.dxsdk_dir if self.dxsdk_dir else ''
-    replacements['$(WDK_DIR)'] = self.wdk_dir if self.wdk_dir else ''
-    return replacements
-
-  def ConvertVSMacros(self, s, base_to_build=None, config=None):
-    """Convert from VS macro names to something equivalent."""
-    env = self.GetVSMacroEnv(base_to_build, config=config)
-    return ExpandMacros(s, env)
-
-  def AdjustLibraries(self, libraries):
-    """Strip -l from library if it's specified with that."""
-    return [lib[2:] if lib.startswith('-l') else lib for lib in libraries]
-
-  def _GetAndMunge(self, field, path, default, prefix, append, map):
-    """Retrieve a value from |field| at |path| or return |default|. If
-    |append| is specified, and the item is found, it will be appended to that
-    object instead of returned. If |map| is specified, results will be
-    remapped through |map| before being returned or appended."""
-    result = _GenericRetrieve(field, default, path)
-    result = _DoRemapping(result, map)
-    result = _AddPrefix(result, prefix)
-    return _AppendOrReturn(append, result)
-
-  class _GetWrapper(object):
-    def __init__(self, parent, field, base_path, append=None):
-      self.parent = parent
-      self.field = field
-      self.base_path = [base_path]
-      self.append = append
-    def __call__(self, name, map=None, prefix='', default=None):
-      return self.parent._GetAndMunge(self.field, self.base_path + [name],
-          default=default, prefix=prefix, append=self.append, map=map)
-
-  def GetArch(self, config):
-    """Get architecture based on msvs_configuration_platform and
-    msvs_target_platform. Returns either 'x86' or 'x64'."""
-    configuration_platform = self.msvs_configuration_platform.get(config, '')
-    platform = self.msvs_target_platform.get(config, '')
-    if not platform: # If no specific override, use the configuration's.
-      platform = configuration_platform
-    # Map from platform to architecture.
-    return {'Win32': 'x86', 'x64': 'x64'}.get(platform, 'x86')
-
-  def _TargetConfig(self, config):
-    """Returns the target-specific configuration."""
-    # There are two levels of architecture/platform specification in VS. The
-    # first level is globally for the configuration (this is what we consider
-    # "the" config at the gyp level, which will be something like 'Debug' or
-    # 'Release_x64'), and a second target-specific configuration, which is an
-    # override for the global one. |config| is remapped here to take into
-    # account the local target-specific overrides to the global configuration.
-    arch = self.GetArch(config)
-    if arch == 'x64' and not config.endswith('_x64'):
-      config += '_x64'
-    if arch == 'x86' and config.endswith('_x64'):
-      config = config.rsplit('_', 1)[0]
-    return config
-
-  def _Setting(self, path, config,
-              default=None, prefix='', append=None, map=None):
-    """_GetAndMunge for msvs_settings."""
-    return self._GetAndMunge(
-        self.msvs_settings[config], path, default, prefix, append, map)
-
-  def _ConfigAttrib(self, path, config,
-                   default=None, prefix='', append=None, map=None):
-    """_GetAndMunge for msvs_configuration_attributes."""
-    return self._GetAndMunge(
-        self.msvs_configuration_attributes[config],
-        path, default, prefix, append, map)
-
-  def AdjustIncludeDirs(self, include_dirs, config):
-    """Updates include_dirs to expand VS specific paths, and adds the system
-    include dirs used for platform SDK and similar."""
-    config = self._TargetConfig(config)
-    includes = include_dirs + self.msvs_system_include_dirs[config]
-    includes.extend(self._Setting(
-      ('VCCLCompilerTool', 'AdditionalIncludeDirectories'), config, default=[]))
-    return [self.ConvertVSMacros(p, config=config) for p in includes]
-
-  def GetComputedDefines(self, config):
-    """Returns the set of defines that are injected to the defines list based
-    on other VS settings."""
-    config = self._TargetConfig(config)
-    defines = []
-    if self._ConfigAttrib(['CharacterSet'], config) == '1':
-      defines.extend(('_UNICODE', 'UNICODE'))
-    if self._ConfigAttrib(['CharacterSet'], config) == '2':
-      defines.append('_MBCS')
-    defines.extend(self._Setting(
-        ('VCCLCompilerTool', 'PreprocessorDefinitions'), config, default=[]))
-    return defines
-
-  def GetCompilerPdbName(self, config, expand_special):
-    """Get the pdb file name that should be used for compiler invocations, or
-    None if there's no explicit name specified."""
-    config = self._TargetConfig(config)
-    pdbname = self._Setting(
-        ('VCCLCompilerTool', 'ProgramDataBaseFileName'), config)
-    if pdbname:
-      pdbname = expand_special(self.ConvertVSMacros(pdbname))
-    return pdbname
-
-  def GetOutputName(self, config, expand_special):
-    """Gets the explicitly overridden output name for a target or returns None
-    if it's not overridden."""
-    config = self._TargetConfig(config)
-    type = self.spec['type']
-    root = 'VCLibrarianTool' if type == 'static_library' else 'VCLinkerTool'
-    # TODO(scottmg): Handle OutputDirectory without OutputFile.
-    output_file = self._Setting((root, 'OutputFile'), config)
-    if output_file:
-      output_file = expand_special(self.ConvertVSMacros(
-          output_file, config=config))
-    return output_file
-
-  def GetPDBName(self, config, expand_special):
-    """Gets the explicitly overridden pdb name for a target or returns None
-    if it's not overridden."""
-    config = self._TargetConfig(config)
-    output_file = self._Setting(('VCLinkerTool', 'ProgramDatabaseFile'), config)
-    if output_file:
-      output_file = expand_special(self.ConvertVSMacros(
-          output_file, config=config))
-    return output_file
-
-  def GetCflags(self, config):
-    """Returns the flags that need to be added to .c and .cc compilations."""
-    config = self._TargetConfig(config)
-    cflags = []
-    cflags.extend(['/wd' + w for w in self.msvs_disabled_warnings[config]])
-    cl = self._GetWrapper(self, self.msvs_settings[config],
-                          'VCCLCompilerTool', append=cflags)
-    cl('Optimization',
-       map={'0': 'd', '1': '1', '2': '2', '3': 'x'}, prefix='/O')
-    cl('InlineFunctionExpansion', prefix='/Ob')
-    cl('OmitFramePointers', map={'false': '-', 'true': ''}, prefix='/Oy')
-    cl('EnableIntrinsicFunctions', map={'false': '-', 'true': ''}, prefix='/Oi')
-    cl('FavorSizeOrSpeed', map={'1': 't', '2': 's'}, prefix='/O')
-    cl('WholeProgramOptimization', map={'true': '/GL'})
-    cl('WarningLevel', prefix='/W')
-    cl('WarnAsError', map={'true': '/WX'})
-    cl('DebugInformationFormat',
-        map={'1': '7', '3': 'i', '4': 'I'}, prefix='/Z')
-    cl('RuntimeTypeInfo', map={'true': '/GR', 'false': '/GR-'})
-    cl('EnableFunctionLevelLinking', map={'true': '/Gy', 'false': '/Gy-'})
-    cl('MinimalRebuild', map={'true': '/Gm'})
-    cl('BufferSecurityCheck', map={'true': '/GS', 'false': '/GS-'})
-    cl('BasicRuntimeChecks', map={'1': 's', '2': 'u', '3': '1'}, prefix='/RTC')
-    cl('RuntimeLibrary',
-        map={'0': 'T', '1': 'Td', '2': 'D', '3': 'Dd'}, prefix='/M')
-    cl('ExceptionHandling', map={'1': 'sc', '2': 'a'}, prefix='/EH')
-    cl('DefaultCharIsUnsigned', map={'true': '/J'})
-    cl('TreatWChar_tAsBuiltInType',
-        map={'false': '-', 'true': ''}, prefix='/Zc:wchar_t')
-    cl('EnablePREfast', map={'true': '/analyze'})
-    cl('AdditionalOptions', prefix='')
-    cflags.extend(['/FI' + f for f in self._Setting(
-        ('VCCLCompilerTool', 'ForcedIncludeFiles'), config, default=[])])
-    # ninja handles parallelism by itself, don't have the compiler do it too.
-    cflags = filter(lambda x: not x.startswith('/MP'), cflags)
-    return cflags
-
-  def GetPrecompiledHeader(self, config, gyp_to_build_path):
-    """Returns an object that handles the generation of precompiled header
-    build steps."""
-    config = self._TargetConfig(config)
-    return _PchHelper(self, config, gyp_to_build_path)
-
-  def _GetPchFlags(self, config, extension):
-    """Get the flags to be added to the cflags for precompiled header support.
-    """
-    config = self._TargetConfig(config)
-    # The PCH is only built once by a particular source file. Usage of PCH must
-    # only be for the same language (i.e. C vs. C++), so only include the pch
-    # flags when the language matches.
-    if self.msvs_precompiled_header[config]:
-      source_ext = os.path.splitext(self.msvs_precompiled_source[config])[1]
-      if _LanguageMatchesForPch(source_ext, extension):
-        pch = os.path.split(self.msvs_precompiled_header[config])[1]
-        return ['/Yu' + pch, '/FI' + pch, '/Fp${pchprefix}.' + pch + '.pch']
-    return []
-
-  def GetCflagsC(self, config):
-    """Returns the flags that need to be added to .c compilations."""
-    config = self._TargetConfig(config)
-    return self._GetPchFlags(config, '.c')
-
-  def GetCflagsCC(self, config):
-    """Returns the flags that need to be added to .cc compilations."""
-    config = self._TargetConfig(config)
-    return ['/TP'] + self._GetPchFlags(config, '.cc')
-
-  def _GetAdditionalLibraryDirectories(self, root, config, gyp_to_build_path):
-    """Get and normalize the list of paths in AdditionalLibraryDirectories
-    setting."""
-    config = self._TargetConfig(config)
-    libpaths = self._Setting((root, 'AdditionalLibraryDirectories'),
-                             config, default=[])
-    libpaths = [os.path.normpath(
-                    gyp_to_build_path(self.ConvertVSMacros(p, config=config)))
-                for p in libpaths]
-    return ['/LIBPATH:"' + p + '"' for p in libpaths]
-
-  def GetLibFlags(self, config, gyp_to_build_path):
-    """Returns the flags that need to be added to lib commands."""
-    config = self._TargetConfig(config)
-    libflags = []
-    lib = self._GetWrapper(self, self.msvs_settings[config],
-                          'VCLibrarianTool', append=libflags)
-    libflags.extend(self._GetAdditionalLibraryDirectories(
-        'VCLibrarianTool', config, gyp_to_build_path))
-    lib('LinkTimeCodeGeneration', map={'true': '/LTCG'})
-    lib('AdditionalOptions')
-    return libflags
-
-  def _GetDefFileAsLdflags(self, spec, ldflags, gyp_to_build_path):
-    """.def files get implicitly converted to a ModuleDefinitionFile for the
-    linker in the VS generator. Emulate that behaviour here."""
-    def_file = ''
-    if spec['type'] in ('shared_library', 'loadable_module', 'executable'):
-      def_files = [s for s in spec.get('sources', []) if s.endswith('.def')]
-      if len(def_files) == 1:
-        ldflags.append('/DEF:"%s"' % gyp_to_build_path(def_files[0]))
-      elif len(def_files) > 1:
-        raise Exception("Multiple .def files")
-
-  def GetLdflags(self, config, gyp_to_build_path, expand_special,
-                 manifest_base_name, is_executable):
-    """Returns the flags that need to be added to link commands, and the
-    manifest files."""
-    config = self._TargetConfig(config)
-    ldflags = []
-    ld = self._GetWrapper(self, self.msvs_settings[config],
-                          'VCLinkerTool', append=ldflags)
-    self._GetDefFileAsLdflags(self.spec, ldflags, gyp_to_build_path)
-    ld('GenerateDebugInformation', map={'true': '/DEBUG'})
-    ld('TargetMachine', map={'1': 'X86', '17': 'X64'}, prefix='/MACHINE:')
-    ldflags.extend(self._GetAdditionalLibraryDirectories(
-        'VCLinkerTool', config, gyp_to_build_path))
-    ld('DelayLoadDLLs', prefix='/DELAYLOAD:')
-    out = self.GetOutputName(config, expand_special)
-    if out:
-      ldflags.append('/OUT:' + out)
-    pdb = self.GetPDBName(config, expand_special)
-    if pdb:
-      ldflags.append('/PDB:' + pdb)
-    ld('AdditionalOptions', prefix='')
-    ld('SubSystem', map={'1': 'CONSOLE', '2': 'WINDOWS'}, prefix='/SUBSYSTEM:')
-    ld('TerminalServerAware', map={'1': ':NO', '2': ''}, prefix='/TSAWARE')
-    ld('LinkIncremental', map={'1': ':NO', '2': ''}, prefix='/INCREMENTAL')
-    ld('FixedBaseAddress', map={'1': ':NO', '2': ''}, prefix='/FIXED')
-    ld('RandomizedBaseAddress',
-        map={'1': ':NO', '2': ''}, prefix='/DYNAMICBASE')
-    ld('DataExecutionPrevention',
-        map={'1': ':NO', '2': ''}, prefix='/NXCOMPAT')
-    ld('OptimizeReferences', map={'1': 'NOREF', '2': 'REF'}, prefix='/OPT:')
-    ld('EnableCOMDATFolding', map={'1': 'NOICF', '2': 'ICF'}, prefix='/OPT:')
-    ld('LinkTimeCodeGeneration', map={'1': '/LTCG'})
-    ld('IgnoreDefaultLibraryNames', prefix='/NODEFAULTLIB:')
-    ld('ResourceOnlyDLL', map={'true': '/NOENTRY'})
-    ld('EntryPointSymbol', prefix='/ENTRY:')
-    ld('Profile', map={'true': '/PROFILE'})
-    ld('LargeAddressAware',
-        map={'1': ':NO', '2': ''}, prefix='/LARGEADDRESSAWARE')
-    # TODO(scottmg): This should sort of be somewhere else (not really a flag).
-    ld('AdditionalDependencies', prefix='')
-
-    # If the base address is not specifically controlled, DYNAMICBASE should
-    # be on by default.
-    base_flags = filter(lambda x: 'DYNAMICBASE' in x or x == '/FIXED',
-                        ldflags)
-    if not base_flags:
-      ldflags.append('/DYNAMICBASE')
-
-    # If the NXCOMPAT flag has not been specified, default to on. Despite the
-    # documentation that says this only defaults to on when the subsystem is
-    # Vista or greater (which applies to the linker), the IDE defaults it on
-    # unless it's explicitly off.
-    if not filter(lambda x: 'NXCOMPAT' in x, ldflags):
-      ldflags.append('/NXCOMPAT')
-
-    have_def_file = filter(lambda x: x.startswith('/DEF:'), ldflags)
-    manifest_flags, intermediate_manifest_file = self._GetLdManifestFlags(
-        config, manifest_base_name, is_executable and not have_def_file)
-    ldflags.extend(manifest_flags)
-    manifest_files = self._GetAdditionalManifestFiles(config, gyp_to_build_path)
-    manifest_files.append(intermediate_manifest_file)
-
-    return ldflags, manifest_files
-
-  def _GetLdManifestFlags(self, config, name, allow_isolation):
-    """Returns the set of flags that need to be added to the link to generate
-    a default manifest, as well as the name of the generated file."""
-    # Add manifest flags that mirror the defaults in VS. Chromium dev builds
-    # do not currently use any non-default settings, but we could parse
-    # VCManifestTool blocks if Chromium or other projects need them in the
-    # future. Of particular note, we do not yet support EmbedManifest because
-    # it complicates incremental linking.
-    output_name = name + '.intermediate.manifest'
-    flags = [
-      '/MANIFEST',
-      '/ManifestFile:' + output_name,
-      '''/MANIFESTUAC:"level='asInvoker' uiAccess='false'"'''
-    ]
-    if allow_isolation:
-      flags.append('/ALLOWISOLATION')
-    return flags, output_name
-
-  def _GetAdditionalManifestFiles(self, config, gyp_to_build_path):
-    """Gets additional manifest files that are added to the default one
-    generated by the linker."""
-    files = self._Setting(('VCManifestTool', 'AdditionalManifestFiles'), config,
-                          default=[])
-    if (self._Setting(
-        ('VCManifestTool', 'EmbedManifest'), config, default='') == 'true'):
-      print 'gyp/msvs_emulation.py: "EmbedManifest: true" not yet supported.'
-    if isinstance(files, str):
-      files = files.split(';')
-    return [os.path.normpath(
-                gyp_to_build_path(self.ConvertVSMacros(f, config=config)))
-            for f in files]
-
-  def IsUseLibraryDependencyInputs(self, config):
-    """Returns whether the target should be linked via Use Library Dependency
-    Inputs (using component .objs of a given .lib)."""
-    config = self._TargetConfig(config)
-    uldi = self._Setting(('VCLinkerTool', 'UseLibraryDependencyInputs'), config)
-    return uldi == 'true'
-
-  def GetRcflags(self, config, gyp_to_ninja_path):
-    """Returns the flags that need to be added to invocations of the resource
-    compiler."""
-    config = self._TargetConfig(config)
-    rcflags = []
-    rc = self._GetWrapper(self, self.msvs_settings[config],
-        'VCResourceCompilerTool', append=rcflags)
-    rc('AdditionalIncludeDirectories', map=gyp_to_ninja_path, prefix='/I')
-    rcflags.append('/I' + gyp_to_ninja_path('.'))
-    rc('PreprocessorDefinitions', prefix='/d')
-    # /l arg must be in hex without leading '0x'
-    rc('Culture', prefix='/l', map=lambda x: hex(int(x))[2:])
-    return rcflags
-
-  def BuildCygwinBashCommandLine(self, args, path_to_base):
-    """Build a command line that runs args via cygwin bash. We assume that all
-    incoming paths are in Windows normpath'd form, so they need to be
-    converted to posix style for the part of the command line that's passed to
-    bash. We also have to do some Visual Studio macro emulation here because
-    various rules use magic VS names for things. Also note that rules that
-    contain ninja variables cannot be fixed here (for example ${source}), so
-    the outer generator needs to make sure that the paths that are written out
-    are in posix style, if the command line will be used here."""
-    cygwin_dir = os.path.normpath(
-        os.path.join(path_to_base, self.msvs_cygwin_dirs[0]))
-    cd = ('cd %s' % path_to_base).replace('\\', '/')
-    args = [a.replace('\\', '/').replace('"', '\\"') for a in args]
-    args = ["'%s'" % a.replace("'", "'\\''") for a in args]
-    bash_cmd = ' '.join(args)
-    cmd = (
-        'call "%s\\setup_env.bat" && set CYGWIN=nontsec && ' % cygwin_dir +
-        'bash -c "%s ; %s"' % (cd, bash_cmd))
-    return cmd
-
-  def IsRuleRunUnderCygwin(self, rule):
-    """Determine if an action should be run under cygwin. If the variable is
-    unset or set to 1, we use cygwin."""
-    return int(rule.get('msvs_cygwin_shell',
-                        self.spec.get('msvs_cygwin_shell', 1))) != 0
-
-  def _HasExplicitRuleForExtension(self, spec, extension):
-    """Determine if there's an explicit rule for a particular extension."""
-    for rule in spec.get('rules', []):
-      if rule['extension'] == extension:
-        return True
-    return False
-
-  def HasExplicitIdlRules(self, spec):
-    """Determine if there's an explicit rule for idl files. When there isn't we
-    need to generate implicit rules to build MIDL .idl files."""
-    return self._HasExplicitRuleForExtension(spec, 'idl')
-
-  def HasExplicitAsmRules(self, spec):
-    """Determine if there's an explicit rule for asm files. When there isn't we
-    need to generate implicit rules to assemble .asm files."""
-    return self._HasExplicitRuleForExtension(spec, 'asm')
-
-  def GetIdlBuildData(self, source, config):
-    """Determine the implicit outputs for an idl file. Returns output
-    directory, outputs, and variables and flags that are required."""
-    config = self._TargetConfig(config)
-    midl_get = self._GetWrapper(self, self.msvs_settings[config], 'VCMIDLTool')
-    def midl(name, default=None):
-      return self.ConvertVSMacros(midl_get(name, default=default),
-                                  config=config)
-    tlb = midl('TypeLibraryName', default='${root}.tlb')
-    header = midl('HeaderFileName', default='${root}.h')
-    dlldata = midl('DLLDataFileName', default='dlldata.c')
-    iid = midl('InterfaceIdentifierFileName', default='${root}_i.c')
-    proxy = midl('ProxyFileName', default='${root}_p.c')
-    # Note that .tlb is not included in the outputs as it is not always
-    # generated depending on the content of the input idl file.
-    outdir = midl('OutputDirectory', default='')
-    output = [header, dlldata, iid, proxy]
-    variables = [('tlb', tlb),
-                 ('h', header),
-                 ('dlldata', dlldata),
-                 ('iid', iid),
-                 ('proxy', proxy)]
-    # TODO(scottmg): Are there configuration settings to set these flags?
-    target_platform = 'win32' if self.GetArch(config) == 'x86' else 'x64'
-    flags = ['/char', 'signed', '/env', target_platform, '/Oicf']
-    return outdir, output, variables, flags
-
-
-def _LanguageMatchesForPch(source_ext, pch_source_ext):
-  c_exts = ('.c',)
-  cc_exts = ('.cc', '.cxx', '.cpp')
-  return ((source_ext in c_exts and pch_source_ext in c_exts) or
-          (source_ext in cc_exts and pch_source_ext in cc_exts))
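The extension-matching rule can be exercised directly — a Python 3 restatement of _LanguageMatchesForPch, outside the diff:

```python
def language_matches_for_pch(source_ext, pch_source_ext):
    # A C source may only share a PCH built from a C source, and likewise
    # for the C++ extensions; cross-language reuse is rejected.
    c_exts = ('.c',)
    cc_exts = ('.cc', '.cxx', '.cpp')
    return ((source_ext in c_exts and pch_source_ext in c_exts) or
            (source_ext in cc_exts and pch_source_ext in cc_exts))

print(language_matches_for_pch('.cpp', '.cc'))  # True
print(language_matches_for_pch('.c', '.cc'))    # False
```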
-
-
-class PrecompiledHeader(object):
-  """Helper to generate dependencies and build rules to handle generation of
-  precompiled headers. Interface matches the GCH handler in xcode_emulation.py.
-  """
-  def __init__(
-      self, settings, config, gyp_to_build_path, gyp_to_unique_output, obj_ext):
-    self.settings = settings
-    self.config = config
-    pch_source = self.settings.msvs_precompiled_source[self.config]
-    self.pch_source = gyp_to_build_path(pch_source)
-    filename, _ = os.path.splitext(pch_source)
-    self.output_obj = gyp_to_unique_output(filename + obj_ext).lower()
-
-  def _PchHeader(self):
-    """Get the header that will appear in an #include line for all source
-    files."""
-    return os.path.split(self.settings.msvs_precompiled_header[self.config])[1]
-
-  def GetObjDependencies(self, sources, objs):
-    """Given a list of source files and the corresponding object files,
-    returns a list of the pch files that should be depended upon. The
-    additional wrapping in the return value is for interface compatibility
-    with make.py on Mac, and xcode_emulation.py."""
-    if not self._PchHeader():
-      return []
-    pch_ext = os.path.splitext(self.pch_source)[1]
-    for source in sources:
-      if _LanguageMatchesForPch(os.path.splitext(source)[1], pch_ext):
-        return [(None, None, self.output_obj)]
-    return []
-
-  def GetPchBuildCommands(self):
-    """Not used on Windows as there are no additional build steps required
-    (instead, existing steps are modified in GetFlagsModifications below)."""
-    return []
-
-  def GetFlagsModifications(self, input, output, implicit, command,
-                            cflags_c, cflags_cc, expand_special):
-    """Get the modified cflags and implicit dependencies that should be used
-    for the pch compilation step."""
-    if input == self.pch_source:
-      pch_output = ['/Yc' + self._PchHeader()]
-      if command == 'cxx':
-        return ([('cflags_cc', map(expand_special, cflags_cc + pch_output))],
-                self.output_obj, [])
-      elif command == 'cc':
-        return ([('cflags_c', map(expand_special, cflags_c + pch_output))],
-                self.output_obj, [])
-    return [], output, implicit
-
-
-vs_version = None
-def GetVSVersion(generator_flags):
-  global vs_version
-  if not vs_version:
-    vs_version = gyp.MSVSVersion.SelectVisualStudioVersion(
-        generator_flags.get('msvs_version', 'auto'))
-  return vs_version
-
-def _GetVsvarsSetupArgs(generator_flags, arch):
-  vs = GetVSVersion(generator_flags)
-  return vs.SetupScript()
-
-def ExpandMacros(string, expansions):
-  """Expand $(Variable) per expansions dict. See MsvsSettings.GetVSMacroEnv
-  for the canonical way to retrieve a suitable dict."""
-  if '$' in string:
-    for old, new in expansions.iteritems():
-      assert '$(' not in new, new
-      string = string.replace(old, new)
-  return string
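A Python 3 port of ExpandMacros (iteritems becomes items), shown with a hypothetical macro environment — in the module itself the dict comes from MsvsSettings.GetVSMacroEnv:

```python
def expand_macros(string, expansions):
    # Replace each $(Macro) key found in |expansions|; the assert guards
    # against an expansion value that itself still contains a macro.
    if '$' in string:
        for old, new in expansions.items():
            assert '$(' not in new, new
            string = string.replace(old, new)
    return string

# Hypothetical macro environment for illustration only.
env = {'$(ProjectName)': 'node', '$(PlatformName)': 'Win32'}
print(expand_macros('$(ProjectName)_$(PlatformName).lib', env))  # node_Win32.lib
```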
-
-def _ExtractImportantEnvironment(output_of_set):
-  """Extracts environment variables required for the toolchain to run from
-  a textual dump output by the cmd.exe 'set' command."""
-  envvars_to_save = (
-      'goma_.*', # TODO(scottmg): This is ugly, but needed for goma.
-      'include',
-      'lib',
-      'libpath',
-      'path',
-      'pathext',
-      'systemroot',
-      'temp',
-      'tmp',
-      )
-  env = {}
-  for line in output_of_set.splitlines():
-    for envvar in envvars_to_save:
-      if re.match(envvar + '=', line.lower()):
-        var, setting = line.split('=', 1)
-        if envvar == 'path':
-          # Our own rules (for running gyp-win-tool) and other actions in
-          # Chromium rely on python being in the path. Add the path to this
-          # python here so that if it's not in the path when ninja is run
-          # later, python will still be found.
-          setting = os.path.dirname(sys.executable) + os.pathsep + setting
-        env[var.upper()] = setting
-        break
-  for required in ('SYSTEMROOT', 'TEMP', 'TMP'):
-    if required not in env:
-      raise Exception('Environment variable "%s" '
-                      'required to be set to valid path' % required)
-  return env
-
-def _FormatAsEnvironmentBlock(envvar_dict):
-  """Format as an 'environment block' directly suitable for CreateProcess.
-  Briefly this is a list of key=value\0, terminated by an additional \0. See
-  CreateProcess documentation for more details."""
-  block = ''
-  nul = '\0'
-  for key, value in envvar_dict.iteritems():
-    block += key + '=' + value + nul
-  block += nul
-  return block
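The block layout is easiest to see on a tiny dict — a Python 3 sketch of _FormatAsEnvironmentBlock (iteritems becomes items; the keys here are illustrative):

```python
def format_as_environment_block(envvar_dict):
    # key=value pairs, each NUL-terminated, with one extra NUL closing the
    # whole block -- the layout CreateProcess expects for lpEnvironment.
    block = ''
    nul = '\0'
    for key, value in envvar_dict.items():
        block += key + '=' + value + nul
    block += nul
    return block

block = format_as_environment_block({'TEMP': r'C:\Temp', 'TMP': r'C:\Temp'})
print(repr(block))  # 'TEMP=C:\\Temp\x00TMP=C:\\Temp\x00\x00'
```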
-
-def GenerateEnvironmentFiles(toplevel_build_dir, generator_flags, open_out):
-  """It's not sufficient to have the absolute path to the compiler, linker,
-  etc. on Windows, as those tools rely on .dlls being in the PATH. We also
-  need to support both x86 and x64 compilers within the same build (to support
-  msvs_target_platform hackery). Different architectures require a different
-  compiler binary, and different supporting environment variables (INCLUDE,
-  LIB, LIBPATH). So, we extract the environment here, wrap all invocations
-  of compiler tools (cl, link, lib, rc, midl, etc.) via win_tool.py which
-  sets up the environment, and then we do not prefix the compiler with
-  an absolute path, instead preferring something like "cl.exe" in the rule
-  which will then run whichever the environment setup has put in the path.
-  When this procedure for generating environment files does not meet your
-  requirements (e.g. for custom toolchains), you can pass
-  "-G ninja_use_custom_environment_files" to gyp to suppress file
-  generation and use custom environment files you have prepared yourself."""
-  if generator_flags.get('ninja_use_custom_environment_files', 0):
-    return
-  vs = GetVSVersion(generator_flags)
-  for arch in ('x86', 'x64'):
-    args = vs.SetupScript(arch)
-    args.extend(('&&', 'set'))
-    popen = subprocess.Popen(
-        args, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
-    variables, _ = popen.communicate()
-    env = _ExtractImportantEnvironment(variables)
-    env_block = _FormatAsEnvironmentBlock(env)
-    f = open_out(os.path.join(toplevel_build_dir, 'environment.' + arch), 'wb')
-    f.write(env_block)
-    f.close()
-
-def VerifyMissingSources(sources, build_dir, generator_flags, gyp_to_ninja):
-  """Emulate behavior of msvs_error_on_missing_sources present in the msvs
-  generator: Check that all regular source files, i.e. not created at run time,
-  exist on disk. Missing files cause needless recompilation when building via
-  VS, and we want this check to match for people/bots that build using ninja,
-  so they're not surprised when the VS build fails."""
-  if int(generator_flags.get('msvs_error_on_missing_sources', 0)):
-    no_specials = filter(lambda x: '$' not in x, sources)
-    relative = [os.path.join(build_dir, gyp_to_ninja(s)) for s in no_specials]
-    missing = filter(lambda x: not os.path.exists(x), relative)
-    if missing:
-      # They'll look like out\Release\..\..\stuff\things.cc, so normalize the
-      # path for a slightly less crazy looking output.
-      cleaned_up = [os.path.normpath(x) for x in missing]
-      raise Exception('Missing input files:\n%s' % '\n'.join(cleaned_up))
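The missing-source check above can be exercised with a small standalone sketch (Python 3; `find_missing` and `to_rel` are illustrative names, not part of gyp): entries containing '$' are run-time expansions and are skipped, and everything else must exist on disk.

```python
import os
import tempfile

def find_missing(sources, build_dir, to_rel):
    # Skip '$'-expansion entries (generated at run time), resolve the rest
    # relative to the build directory, and report any that do not exist.
    regular = [s for s in sources if '$' not in s]
    paths = [os.path.join(build_dir, to_rel(s)) for s in regular]
    return [os.path.normpath(p) for p in paths if not os.path.exists(p)]

build_dir = tempfile.mkdtemp()
open(os.path.join(build_dir, 'a.cc'), 'w').close()
missing = find_missing(['a.cc', 'b.cc', '$(InputDir)/gen.cc'], build_dir, lambda s: s)
```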
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/ninja_syntax.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,152 +0,0 @@
-# This file comes from
-#   https://github.com/martine/ninja/blob/master/misc/ninja_syntax.py
-# Do not edit!  Edit the upstream one instead.
-
-"""Python module for generating .ninja files.
-
-Note that this is emphatically not a required piece of Ninja; it's
-just a helpful utility for build-file-generation systems that already
-use Python.
-"""
-
-import textwrap
-import re
-
-def escape_path(word):
-    return word.replace('$ ','$$ ').replace(' ','$ ').replace(':', '$:')
-
-class Writer(object):
-    def __init__(self, output, width=78):
-        self.output = output
-        self.width = width
-
-    def newline(self):
-        self.output.write('\n')
-
-    def comment(self, text):
-        for line in textwrap.wrap(text, self.width - 2):
-            self.output.write('# ' + line + '\n')
-
-    def variable(self, key, value, indent=0):
-        if value is None:
-            return
-        if isinstance(value, list):
-            value = ' '.join(filter(None, value))  # Filter out empty strings.
-        self._line('%s = %s' % (key, value), indent)
-
-    def rule(self, name, command, description=None, depfile=None,
-             generator=False, restat=False, rspfile=None, rspfile_content=None):
-        self._line('rule %s' % name)
-        self.variable('command', command, indent=1)
-        if description:
-            self.variable('description', description, indent=1)
-        if depfile:
-            self.variable('depfile', depfile, indent=1)
-        if generator:
-            self.variable('generator', '1', indent=1)
-        if restat:
-            self.variable('restat', '1', indent=1)
-        if rspfile:
-            self.variable('rspfile', rspfile, indent=1)
-        if rspfile_content:
-            self.variable('rspfile_content', rspfile_content, indent=1)
-
-    def build(self, outputs, rule, inputs=None, implicit=None, order_only=None,
-              variables=None):
-        outputs = self._as_list(outputs)
-        all_inputs = self._as_list(inputs)[:]
-        out_outputs = list(map(escape_path, outputs))
-        all_inputs = list(map(escape_path, all_inputs))
-
-        if implicit:
-            implicit = map(escape_path, self._as_list(implicit))
-            all_inputs.append('|')
-            all_inputs.extend(implicit)
-        if order_only:
-            order_only = map(escape_path, self._as_list(order_only))
-            all_inputs.append('||')
-            all_inputs.extend(order_only)
-
-        self._line('build %s: %s %s' % (' '.join(out_outputs),
-                                        rule,
-                                        ' '.join(all_inputs)))
-
-        if variables:
-            if isinstance(variables, dict):
-                iterator = variables.iteritems()
-            else:
-                iterator = iter(variables)
-
-            for key, val in iterator:
-                self.variable(key, val, indent=1)
-
-        return outputs
-
-    def include(self, path):
-        self._line('include %s' % path)
-
-    def subninja(self, path):
-        self._line('subninja %s' % path)
-
-    def default(self, paths):
-        self._line('default %s' % ' '.join(self._as_list(paths)))
-
-    def _count_dollars_before_index(self, s, i):
-      """Returns the number of '$' characters right in front of s[i]."""
-      dollar_count = 0
-      dollar_index = i - 1
-      while dollar_index >= 0 and s[dollar_index] == '$':
-        dollar_count += 1
-        dollar_index -= 1
-      return dollar_count
-
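The escape test driving the wrapping logic can be stated independently. A minimal Python 3 sketch (`count_dollars_before` is an illustrative name) that counts the run of '$' characters immediately before an index; a space is escaped exactly when that count is odd.

```python
def count_dollars_before(s, i):
    # Count consecutive '$' characters immediately preceding s[i].
    n = 0
    j = i - 1
    while j >= 0 and s[j] == '$':
        n += 1
        j -= 1
    return n

# 'a$ b': one '$' before the space -> odd -> the space is escaped.
escaped = count_dollars_before('a$ b', 2) % 2 == 1
# 'a$$ b': two '$' before the space -> even -> a real word break.
unescaped = count_dollars_before('a$$ b', 3) % 2 == 0
```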
-    def _line(self, text, indent=0):
-        """Write 'text' word-wrapped at self.width characters."""
-        leading_space = '  ' * indent
-        while len(leading_space) + len(text) > self.width:
-            # The text is too wide; wrap if possible.
-
-            # Find the rightmost space that would obey our width constraint and
-            # that's not an escaped space.
-            available_space = self.width - len(leading_space) - len(' $')
-            space = available_space
-            while True:
-              space = text.rfind(' ', 0, space)
-              if space < 0 or \
-                 self._count_dollars_before_index(text, space) % 2 == 0:
-                break
-
-            if space < 0:
-                # No such space; just use the first unescaped space we can find.
-                space = available_space - 1
-                while True:
-                  space = text.find(' ', space + 1)
-                  if space < 0 or \
-                     self._count_dollars_before_index(text, space) % 2 == 0:
-                    break
-            if space < 0:
-                # Give up on breaking.
-                break
-
-            self.output.write(leading_space + text[0:space] + ' $\n')
-            text = text[space+1:]
-
-            # Subsequent lines are continuations, so indent them.
-            leading_space = '  ' * (indent+2)
-
-        self.output.write(leading_space + text + '\n')
-
-    def _as_list(self, input):
-        if input is None:
-            return []
-        if isinstance(input, list):
-            return input
-        return [input]
-
-
-def escape(string):
-    """Escape a string such that it can be embedded into a Ninja file without
-    further interpretation."""
-    assert '\n' not in string, 'Ninja syntax does not allow newlines'
-    # We only have one special metacharacter: '$'.
-    return string.replace('$', '$$')
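The escaping rules in this file are compact enough to demonstrate end to end. A hedged Python 3 sketch (helper names are illustrative, not part of the module) that emits one rule and one build line the way Writer would, with paths containing spaces escaped via '$ ':

```python
import io

def escape_path(word):
    # Same three rewrites as ninja_syntax.escape_path above.
    return word.replace('$ ', '$$ ').replace(' ', '$ ').replace(':', '$:')

def emit_build(out, outputs, rule, inputs):
    # Emit a single unwrapped 'build' statement with escaped paths.
    out.write('build %s: %s %s\n' % (
        ' '.join(escape_path(o) for o in outputs),
        rule,
        ' '.join(escape_path(i) for i in inputs)))

buf = io.StringIO()
buf.write('rule cc\n  command = gcc -MMD -c $in -o $out\n')
emit_build(buf, ['my dir/foo.o'], 'cc', ['my dir/foo.c'])
text = buf.getvalue()
```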
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/sun_tool.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,51 +0,0 @@
-#!/usr/bin/env python
-# Copyright (c) 2011 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""These functions are executed via gyp-sun-tool when using the Makefile
-generator."""
-
-import fcntl
-import os
-import struct
-import subprocess
-import sys
-
-
-def main(args):
-  executor = SunTool()
-  executor.Dispatch(args)
-
-
-class SunTool(object):
-  """This class performs all the SunOS tooling steps. The methods can either be
-  executed directly, or dispatched from an argument list."""
-
-  def Dispatch(self, args):
-    """Dispatches a string command to a method."""
-    if len(args) < 1:
-      raise Exception("Not enough arguments")
-
-    method = "Exec%s" % self._CommandifyName(args[0])
-    getattr(self, method)(*args[1:])
-
-  def _CommandifyName(self, name_string):
-    """Transforms a tool name like copy-info-plist to CopyInfoPlist"""
-    return name_string.title().replace('-', '')
-
-  def ExecFlock(self, lockfile, *cmd_list):
-    """Emulates the most basic behavior of Linux's flock(1)."""
-    # Rely on exception handling to report errors.
-    # Note that the stock python on SunOS has a bug
-    # where fcntl.flock(fd, LOCK_EX) always fails
-    # with EBADF, that's why we use this F_SETLK
-    # hack instead.
-    fd = os.open(lockfile, os.O_WRONLY|os.O_NOCTTY|os.O_CREAT, 0666)
-    op = struct.pack('hhllhhl', fcntl.F_WRLCK, 0, 0, 0, 0, 0, 0)
-    fcntl.fcntl(fd, fcntl.F_SETLK, op)
-    return subprocess.call(cmd_list)
-
-
-if __name__ == '__main__':
-  sys.exit(main(sys.argv[1:]))
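SunTool (and WinTool further below) share a small dispatch idiom worth isolating. A sketch (Python 3; `MiniTool` and `ExecEcho` are made-up names) of how a hyphenated command name is mapped to an `Exec*` method:

```python
class MiniTool(object):
    """Minimal restatement of the Dispatch pattern used above."""

    def Dispatch(self, args):
        # 'copy-info-plist' -> ExecCopyInfoPlist(...); remaining args pass through.
        if len(args) < 1:
            raise Exception("Not enough arguments")
        method = "Exec%s" % self._CommandifyName(args[0])
        return getattr(self, method)(*args[1:])

    def _CommandifyName(self, name_string):
        # title() capitalizes each hyphen-separated word, then the hyphens go.
        return name_string.title().replace('-', '')

    def ExecEcho(self, *words):
        return ' '.join(words)

result = MiniTool().Dispatch(['echo', 'hello', 'world'])
```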
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/win_tool.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,193 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Utility functions for Windows builds.
-
-These functions are executed via gyp-win-tool when using the ninja generator.
-"""
-
-from ctypes import windll, wintypes
-import os
-import shutil
-import subprocess
-import sys
-
-BASE_DIR = os.path.dirname(os.path.abspath(__file__))
-
-
-def main(args):
-  executor = WinTool()
-  exit_code = executor.Dispatch(args)
-  if exit_code is not None:
-    sys.exit(exit_code)
-
-
-class LinkLock(object):
-  """A flock-style lock to limit the number of concurrent links to one.
-
-  Uses a session-local mutex based on the file's directory.
-  """
-  def __enter__(self):
-    name = 'Local\\%s' % BASE_DIR.replace('\\', '_').replace(':', '_')
-    self.mutex = windll.kernel32.CreateMutexW(
-        wintypes.c_int(0),
-        wintypes.c_int(0),
-        wintypes.create_unicode_buffer(name))
-    assert self.mutex
-    result = windll.kernel32.WaitForSingleObject(
-        self.mutex, wintypes.c_int(0xFFFFFFFF))
-    # 0x80 means another process was killed without releasing the mutex, but
-    # that this process has been given ownership. This is fine for our
-    # purposes.
-    assert result in (0, 0x80), (
-        "%s, %s" % (result, windll.kernel32.GetLastError()))
-
-  def __exit__(self, type, value, traceback):
-    windll.kernel32.ReleaseMutex(self.mutex)
-    windll.kernel32.CloseHandle(self.mutex)
-
-
-class WinTool(object):
-  """This class performs all the Windows tooling steps. The methods can either
-  be executed directly, or dispatched from an argument list."""
-
-  def Dispatch(self, args):
-    """Dispatches a string command to a method."""
-    if len(args) < 1:
-      raise Exception("Not enough arguments")
-
-    method = "Exec%s" % self._CommandifyName(args[0])
-    return getattr(self, method)(*args[1:])
-
-  def _CommandifyName(self, name_string):
-    """Transforms a tool name like recursive-mirror to RecursiveMirror."""
-    return name_string.title().replace('-', '')
-
-  def _GetEnv(self, arch):
-    """Gets the saved environment from a file for a given architecture."""
-    # The environment is saved as an "environment block" (see CreateProcess
-    # and msvs_emulation for details). We convert to a dict here.
-    # Drop the last two NULs: one terminating the final entry, one
-    # terminating the block itself.
-    pairs = open(arch).read()[:-2].split('\0')
-    kvs = [item.split('=', 1) for item in pairs]
-    return dict(kvs)
-
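_GetEnv's parsing can be restated as a standalone Python 3 sketch (`parse_environment_block` is an illustrative name) that inverts the environment-block writer: strip the two trailing NULs, split entries on NUL, then split each entry on the first '='.

```python
def parse_environment_block(block):
    # Drop the final-entry NUL and the block-terminating NUL, then split
    # into 'KEY=value' entries; split each on the first '=' only, since
    # values may themselves contain '='.
    pairs = block[:-2].split('\0')
    return dict(item.split('=', 1) for item in pairs)

env = parse_environment_block('PATH=C:\\bin\0TEMP=C:\\tmp\0\0')
```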
-  def ExecStamp(self, path):
-    """Simple stamp command."""
-    open(path, 'w').close()
-
-  def ExecRecursiveMirror(self, source, dest):
-    """Emulation of rm -rf out && cp -af in out."""
-    if os.path.exists(dest):
-      if os.path.isdir(dest):
-        shutil.rmtree(dest)
-      else:
-        os.unlink(dest)
-    if os.path.isdir(source):
-      shutil.copytree(source, dest)
-    else:
-      shutil.copy2(source, dest)
-
-  def ExecLinkWrapper(self, arch, *args):
-    """Filter diagnostic output from link that looks like:
-    '   Creating library ui.dll.lib and object ui.dll.exp'
-    This happens when there are exports from the dll or exe.
-    """
-    with LinkLock():
-      env = self._GetEnv(arch)
-      popen = subprocess.Popen(args, shell=True, env=env,
-                               stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
-      out, _ = popen.communicate()
-      for line in out.splitlines():
-        if not line.startswith('   Creating library '):
-          print line
-      return popen.returncode
-
-  def ExecManifestWrapper(self, arch, *args):
-    """Run manifest tool with environment set. Strip out undesirable warning
-    (some XML blocks are recognized by the OS loader, but not the manifest
-    tool)."""
-    env = self._GetEnv(arch)
-    popen = subprocess.Popen(args, shell=True, env=env,
-                             stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
-    out, _ = popen.communicate()
-    for line in out.splitlines():
-      if line and 'manifest authoring warning 81010002' not in line:
-        print line
-    return popen.returncode
-
-  def ExecMidlWrapper(self, arch, outdir, tlb, h, dlldata, iid, proxy, idl,
-                      *flags):
-    """Filter noisy filenames output from MIDL compile step that isn't
-    quietable via command line flags.
-    """
-    args = ['midl', '/nologo'] + list(flags) + [
-        '/out', outdir,
-        '/tlb', tlb,
-        '/h', h,
-        '/dlldata', dlldata,
-        '/iid', iid,
-        '/proxy', proxy,
-        idl]
-    env = self._GetEnv(arch)
-    popen = subprocess.Popen(args, shell=True, env=env,
-                             stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
-    out, _ = popen.communicate()
-    # Filter junk out of stdout, and write filtered versions. Output we want
-    # to filter is pairs of lines that look like this:
-    # Processing C:\Program Files (x86)\Microsoft SDKs\...\include\objidl.idl
-    # objidl.idl
-    lines = out.splitlines()
-    prefix = 'Processing '
-    processing = set(os.path.basename(x) for x in lines if x.startswith(prefix))
-    for line in lines:
-      if not line.startswith(prefix) and line not in processing:
-        print line
-    return popen.returncode
-
-  def ExecAsmWrapper(self, arch, *args):
-    """Filter logo banner from invocations of asm.exe."""
-    env = self._GetEnv(arch)
-    # MSVS doesn't assemble x64 asm files.
-    if arch == 'environment.x64':
-      return 0
-    popen = subprocess.Popen(args, shell=True, env=env,
-                             stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
-    out, _ = popen.communicate()
-    for line in out.splitlines():
-      if (not line.startswith('Copyright (C) Microsoft Corporation') and
-          not line.startswith('Microsoft (R) Macro Assembler') and
-          not line.startswith(' Assembling: ') and
-          line):
-        print line
-    return popen.returncode
-
-  def ExecRcWrapper(self, arch, *args):
-    """Filter logo banner from invocations of rc.exe. Older versions of RC
-    don't support the /nologo flag."""
-    env = self._GetEnv(arch)
-    popen = subprocess.Popen(args, shell=True, env=env,
-                             stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
-    out, _ = popen.communicate()
-    for line in out.splitlines():
-      if (not line.startswith('Microsoft (R) Windows (R) Resource Compiler') and
-          not line.startswith('Copyright (C) Microsoft Corporation') and
-          line):
-        print line
-    return popen.returncode
-
-  def ExecActionWrapper(self, arch, rspfile, *dir):
-    """Runs an action command line from a response file using the environment
-    for |arch|. If |dir| is supplied, use that as the working directory."""
-    env = self._GetEnv(arch)
-    args = open(rspfile).read()
-    dir = dir[0] if dir else None
-    popen = subprocess.Popen(args, shell=True, env=env, cwd=dir)
-    popen.wait()
-    return popen.returncode
-
-if __name__ == '__main__':
-  sys.exit(main(sys.argv[1:]))
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/xcode_emulation.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1083 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""
-This module contains classes that help to emulate xcodebuild behavior on top of
-other build systems, such as make and ninja.
-"""
-
-import gyp.common
-import os.path
-import re
-import shlex
-import subprocess
-import sys
-from gyp.common import GypError
-
-class XcodeSettings(object):
-  """A class that understands the gyp 'xcode_settings' object."""
-
-  # Populated lazily by _SdkPath(). Shared by all XcodeSettings, so cached
-  # at class-level for efficiency.
-  _sdk_path_cache = {}
-
-  def __init__(self, spec):
-    self.spec = spec
-
-    # Per-target 'xcode_settings' are pushed down into configs earlier by gyp.
-    # This means self.xcode_settings[config] always contains all settings
-    # for that config -- the per-target settings as well. Settings that are
-    # the same for all configs are implicitly per-target settings.
-    self.xcode_settings = {}
-    configs = spec['configurations']
-    for configname, config in configs.iteritems():
-      self.xcode_settings[configname] = config.get('xcode_settings', {})
-
-    # This is only non-None temporarily during the execution of some methods.
-    self.configname = None
-
-    # Used by _AdjustLibrary to match .a and .dylib entries in libraries.
-    self.library_re = re.compile(r'^lib([^/]+)\.(a|dylib)$')
-
-  def _Settings(self):
-    assert self.configname
-    return self.xcode_settings[self.configname]
-
-  def _Test(self, test_key, cond_key, default):
-    return self._Settings().get(test_key, default) == cond_key
-
-  def _Appendf(self, lst, test_key, format_str, default=None):
-    if test_key in self._Settings():
-      lst.append(format_str % str(self._Settings()[test_key]))
-    elif default:
-      lst.append(format_str % str(default))
-
-  def _WarnUnimplemented(self, test_key):
-    if test_key in self._Settings():
-      print 'Warning: Ignoring not yet implemented key "%s".' % test_key
-
-  def _IsBundle(self):
-    return int(self.spec.get('mac_bundle', 0)) != 0
-
-  def GetFrameworkVersion(self):
-    """Returns the framework version of the current target. Only valid for
-    bundles."""
-    assert self._IsBundle()
-    return self.GetPerTargetSetting('FRAMEWORK_VERSION', default='A')
-
-  def GetWrapperExtension(self):
-    """Returns the bundle extension (.app, .framework, .plugin, etc).  Only
-    valid for bundles."""
-    assert self._IsBundle()
-    if self.spec['type'] in ('loadable_module', 'shared_library'):
-      default_wrapper_extension = {
-        'loadable_module': 'bundle',
-        'shared_library': 'framework',
-      }[self.spec['type']]
-      wrapper_extension = self.GetPerTargetSetting(
-          'WRAPPER_EXTENSION', default=default_wrapper_extension)
-      return '.' + self.spec.get('product_extension', wrapper_extension)
-    elif self.spec['type'] == 'executable':
-      return '.app'
-    else:
-      assert False, "Don't know extension for '%s', target '%s'" % (
-          self.spec['type'], self.spec['target_name'])
-
-  def GetProductName(self):
-    """Returns PRODUCT_NAME."""
-    return self.spec.get('product_name', self.spec['target_name'])
-
-  def GetFullProductName(self):
-    """Returns FULL_PRODUCT_NAME."""
-    if self._IsBundle():
-      return self.GetWrapperName()
-    else:
-      return self._GetStandaloneBinaryPath()
-
-  def GetWrapperName(self):
-    """Returns the directory name of the bundle represented by this target.
-    Only valid for bundles."""
-    assert self._IsBundle()
-    return self.GetProductName() + self.GetWrapperExtension()
-
-  def GetBundleContentsFolderPath(self):
-    """Returns the qualified path to the bundle's contents folder. E.g.
-    Chromium.app/Contents or Foo.bundle/Versions/A. Only valid for bundles."""
-    assert self._IsBundle()
-    if self.spec['type'] == 'shared_library':
-      return os.path.join(
-          self.GetWrapperName(), 'Versions', self.GetFrameworkVersion())
-    else:
-      # loadable_modules have a 'Contents' folder like executables.
-      return os.path.join(self.GetWrapperName(), 'Contents')
-
-  def GetBundleResourceFolder(self):
-    """Returns the qualified path to the bundle's resource folder. E.g.
-    Chromium.app/Contents/Resources. Only valid for bundles."""
-    assert self._IsBundle()
-    return os.path.join(self.GetBundleContentsFolderPath(), 'Resources')
-
-  def GetBundlePlistPath(self):
-    """Returns the qualified path to the bundle's plist file. E.g.
-    Chromium.app/Contents/Info.plist. Only valid for bundles."""
-    assert self._IsBundle()
-    if self.spec['type'] in ('executable', 'loadable_module'):
-      return os.path.join(self.GetBundleContentsFolderPath(), 'Info.plist')
-    else:
-      return os.path.join(self.GetBundleContentsFolderPath(),
-                          'Resources', 'Info.plist')
-
-  def GetProductType(self):
-    """Returns the PRODUCT_TYPE of this target."""
-    if self._IsBundle():
-      return {
-        'executable': 'com.apple.product-type.application',
-        'loadable_module': 'com.apple.product-type.bundle',
-        'shared_library': 'com.apple.product-type.framework',
-      }[self.spec['type']]
-    else:
-      return {
-        'executable': 'com.apple.product-type.tool',
-        'loadable_module': 'com.apple.product-type.library.dynamic',
-        'shared_library': 'com.apple.product-type.library.dynamic',
-        'static_library': 'com.apple.product-type.library.static',
-      }[self.spec['type']]
-
-  def GetMachOType(self):
-    """Returns the MACH_O_TYPE of this target."""
-    # Weird, but matches Xcode.
-    if not self._IsBundle() and self.spec['type'] == 'executable':
-      return ''
-    return {
-      'executable': 'mh_execute',
-      'static_library': 'staticlib',
-      'shared_library': 'mh_dylib',
-      'loadable_module': 'mh_bundle',
-    }[self.spec['type']]
-
-  def _GetBundleBinaryPath(self):
-    """Returns the name of the bundle binary of by this target.
-    E.g. Chromium.app/Contents/MacOS/Chromium. Only valid for bundles."""
-    assert self._IsBundle()
-    if self.spec['type'] in ('shared_library',):
-      path = self.GetBundleContentsFolderPath()
-    elif self.spec['type'] in ('executable', 'loadable_module'):
-      path = os.path.join(self.GetBundleContentsFolderPath(), 'MacOS')
-    return os.path.join(path, self.GetExecutableName())
-
-  def _GetStandaloneExecutableSuffix(self):
-    if 'product_extension' in self.spec:
-      return '.' + self.spec['product_extension']
-    return {
-      'executable': '',
-      'static_library': '.a',
-      'shared_library': '.dylib',
-      'loadable_module': '.so',
-    }[self.spec['type']]
-
-  def _GetStandaloneExecutablePrefix(self):
-    return self.spec.get('product_prefix', {
-      'executable': '',
-      'static_library': 'lib',
-      'shared_library': 'lib',
-      # Non-bundled loadable_modules are called foo.so for some reason
-      # (that is, .so and no prefix) with the xcode build -- match that.
-      'loadable_module': '',
-    }[self.spec['type']])
-
-  def _GetStandaloneBinaryPath(self):
-    """Returns the name of the non-bundle binary represented by this target.
-    E.g. hello_world. Only valid for non-bundles."""
-    assert not self._IsBundle()
-    assert self.spec['type'] in (
-        'executable', 'shared_library', 'static_library', 'loadable_module'), (
-        'Unexpected type %s' % self.spec['type'])
-    target = self.spec['target_name']
-    if self.spec['type'] == 'static_library':
-      if target[:3] == 'lib':
-        target = target[3:]
-    elif self.spec['type'] in ('loadable_module', 'shared_library'):
-      if target[:3] == 'lib':
-        target = target[3:]
-
-    target_prefix = self._GetStandaloneExecutablePrefix()
-    target = self.spec.get('product_name', target)
-    target_ext = self._GetStandaloneExecutableSuffix()
-    return target_prefix + target + target_ext
-
-  def GetExecutableName(self):
-    """Returns the executable name of the bundle represented by this target.
-    E.g. Chromium."""
-    if self._IsBundle():
-      return self.spec.get('product_name', self.spec['target_name'])
-    else:
-      return self._GetStandaloneBinaryPath()
-
-  def GetExecutablePath(self):
-    """Returns the directory name of the bundle represented by this target. E.g.
-    Chromium.app/Contents/MacOS/Chromium."""
-    if self._IsBundle():
-      return self._GetBundleBinaryPath()
-    else:
-      return self._GetStandaloneBinaryPath()
-
-  def _GetSdkVersionInfoItem(self, sdk, infoitem):
-    job = subprocess.Popen(['xcodebuild', '-version', '-sdk', sdk, infoitem],
-                           stdout=subprocess.PIPE,
-                           stderr=subprocess.STDOUT)
-    out = job.communicate()[0]
-    if job.returncode != 0:
-      sys.stderr.write(out + '\n')
-      raise GypError('Error %d running xcodebuild' % job.returncode)
-    return out.rstrip('\n')
-
-  def _SdkPath(self):
-    sdk_root = self.GetPerTargetSetting('SDKROOT', default='macosx')
-    if sdk_root not in XcodeSettings._sdk_path_cache:
-      XcodeSettings._sdk_path_cache[sdk_root] = self._GetSdkVersionInfoItem(
-          sdk_root, 'Path')
-    return XcodeSettings._sdk_path_cache[sdk_root]
-
-  def _AppendPlatformVersionMinFlags(self, lst):
-    self._Appendf(lst, 'MACOSX_DEPLOYMENT_TARGET', '-mmacosx-version-min=%s')
-    if 'IPHONEOS_DEPLOYMENT_TARGET' in self._Settings():
-      # TODO: Implement this better?
-      sdk_path_basename = os.path.basename(self._SdkPath())
-      if sdk_path_basename.lower().startswith('iphonesimulator'):
-        self._Appendf(lst, 'IPHONEOS_DEPLOYMENT_TARGET',
-                      '-mios-simulator-version-min=%s')
-      else:
-        self._Appendf(lst, 'IPHONEOS_DEPLOYMENT_TARGET',
-                      '-miphoneos-version-min=%s')
-
-  def GetCflags(self, configname):
-    """Returns flags that need to be added to .c, .cc, .m, and .mm
-    compilations."""
-    # These functions (and the similar ones below) do not offer complete
-    # emulation of all xcode_settings keys. They're implemented on demand.
-
-    self.configname = configname
-    cflags = []
-
-    sdk_root = self._SdkPath()
-    if 'SDKROOT' in self._Settings():
-      cflags.append('-isysroot %s' % sdk_root)
-
-    if self._Test('CLANG_WARN_CONSTANT_CONVERSION', 'YES', default='NO'):
-      cflags.append('-Wconstant-conversion')
-
-    if self._Test('GCC_CHAR_IS_UNSIGNED_CHAR', 'YES', default='NO'):
-      cflags.append('-funsigned-char')
-
-    if self._Test('GCC_CW_ASM_SYNTAX', 'YES', default='YES'):
-      cflags.append('-fasm-blocks')
-
-    if 'GCC_DYNAMIC_NO_PIC' in self._Settings():
-      if self._Settings()['GCC_DYNAMIC_NO_PIC'] == 'YES':
-        cflags.append('-mdynamic-no-pic')
-    else:
-      pass
-      # TODO: In this case, it depends on the target. xcode passes
-      # mdynamic-no-pic by default for executable and possibly static lib
-      # according to mento
-
-    if self._Test('GCC_ENABLE_PASCAL_STRINGS', 'YES', default='YES'):
-      cflags.append('-mpascal-strings')
-
-    self._Appendf(cflags, 'GCC_OPTIMIZATION_LEVEL', '-O%s', default='s')
-
-    if self._Test('GCC_GENERATE_DEBUGGING_SYMBOLS', 'YES', default='YES'):
-      dbg_format = self._Settings().get('DEBUG_INFORMATION_FORMAT', 'dwarf')
-      if dbg_format == 'dwarf':
-        cflags.append('-gdwarf-2')
-      elif dbg_format == 'stabs':
-        raise NotImplementedError('stabs debug format is not supported yet.')
-      elif dbg_format == 'dwarf-with-dsym':
-        cflags.append('-gdwarf-2')
-      else:
-        raise NotImplementedError('Unknown debug format %s' % dbg_format)
-
-    if self._Test('GCC_SYMBOLS_PRIVATE_EXTERN', 'YES', default='NO'):
-      cflags.append('-fvisibility=hidden')
-
-    if self._Test('GCC_TREAT_WARNINGS_AS_ERRORS', 'YES', default='NO'):
-      cflags.append('-Werror')
-
-    if self._Test('GCC_WARN_ABOUT_MISSING_NEWLINE', 'YES', default='NO'):
-      cflags.append('-Wnewline-eof')
-
-    self._AppendPlatformVersionMinFlags(cflags)
-
-    # TODO:
-    if self._Test('COPY_PHASE_STRIP', 'YES', default='NO'):
-      self._WarnUnimplemented('COPY_PHASE_STRIP')
-    self._WarnUnimplemented('GCC_DEBUGGING_SYMBOLS')
-    self._WarnUnimplemented('GCC_ENABLE_OBJC_EXCEPTIONS')
-
-    # TODO: This is exported correctly, but assigning to it is not supported.
-    self._WarnUnimplemented('MACH_O_TYPE')
-    self._WarnUnimplemented('PRODUCT_TYPE')
-
-    archs = self._Settings().get('ARCHS', ['i386'])
-    if len(archs) != 1:
-      # TODO: Supporting fat binaries will be annoying.
-      self._WarnUnimplemented('ARCHS')
-      archs = ['i386']
-    cflags.append('-arch ' + archs[0])
-
-    if archs[0] in ('i386', 'x86_64'):
-      if self._Test('GCC_ENABLE_SSE3_EXTENSIONS', 'YES', default='NO'):
-        cflags.append('-msse3')
-      if self._Test('GCC_ENABLE_SUPPLEMENTAL_SSE3_INSTRUCTIONS', 'YES',
-                    default='NO'):
-        cflags.append('-mssse3')  # Note 3rd 's'.
-      if self._Test('GCC_ENABLE_SSE41_EXTENSIONS', 'YES', default='NO'):
-        cflags.append('-msse4.1')
-      if self._Test('GCC_ENABLE_SSE42_EXTENSIONS', 'YES', default='NO'):
-        cflags.append('-msse4.2')
-
-    cflags += self._Settings().get('WARNING_CFLAGS', [])
-
-    config = self.spec['configurations'][self.configname]
-    framework_dirs = config.get('mac_framework_dirs', [])
-    for directory in framework_dirs:
-      cflags.append('-F' + directory.replace('$(SDKROOT)', sdk_root))
-
-    self.configname = None
-    return cflags
-
-  def GetCflagsC(self, configname):
-    """Returns flags that need to be added to .c, and .m compilations."""
-    self.configname = configname
-    cflags_c = []
-    self._Appendf(cflags_c, 'GCC_C_LANGUAGE_STANDARD', '-std=%s')
-    cflags_c += self._Settings().get('OTHER_CFLAGS', [])
-    self.configname = None
-    return cflags_c
-
-  def GetCflagsCC(self, configname):
-    """Returns flags that need to be added to .cc, and .mm compilations."""
-    self.configname = configname
-    cflags_cc = []
-
-    clang_cxx_language_standard = self._Settings().get(
-        'CLANG_CXX_LANGUAGE_STANDARD')
-    # Note: Don't map c++0x to c++11, so that c++0x can still be used with
-    # older clangs that don't understand c++11 yet (like Xcode 4.2's).
-    if clang_cxx_language_standard:
-      cflags_cc.append('-std=%s' % clang_cxx_language_standard)
-
-    self._Appendf(cflags_cc, 'CLANG_CXX_LIBRARY', '-stdlib=%s')
-
-    if self._Test('GCC_ENABLE_CPP_RTTI', 'NO', default='YES'):
-      cflags_cc.append('-fno-rtti')
-    if self._Test('GCC_ENABLE_CPP_EXCEPTIONS', 'NO', default='YES'):
-      cflags_cc.append('-fno-exceptions')
-    if self._Test('GCC_INLINES_ARE_PRIVATE_EXTERN', 'YES', default='NO'):
-      cflags_cc.append('-fvisibility-inlines-hidden')
-    if self._Test('GCC_THREADSAFE_STATICS', 'NO', default='YES'):
-      cflags_cc.append('-fno-threadsafe-statics')
-    # Note: This flag is a no-op for clang, it only has an effect for gcc.
-    if self._Test('GCC_WARN_ABOUT_INVALID_OFFSETOF_MACRO', 'NO', default='YES'):
-      cflags_cc.append('-Wno-invalid-offsetof')
-
-    other_ccflags = []
-
-    for flag in self._Settings().get('OTHER_CPLUSPLUSFLAGS', ['$(inherited)']):
-      # TODO: More general variable expansion. Missing in many other places too.
-      if flag in ('$inherited', '$(inherited)', '${inherited}'):
-        flag = '$OTHER_CFLAGS'
-      if flag in ('$OTHER_CFLAGS', '$(OTHER_CFLAGS)', '${OTHER_CFLAGS}'):
-        other_ccflags += self._Settings().get('OTHER_CFLAGS', [])
-      else:
-        other_ccflags.append(flag)
-    cflags_cc += other_ccflags
-
-    self.configname = None
-    return cflags_cc
-
-  def _AddObjectiveCGarbageCollectionFlags(self, flags):
-    gc_policy = self._Settings().get('GCC_ENABLE_OBJC_GC', 'unsupported')
-    if gc_policy == 'supported':
-      flags.append('-fobjc-gc')
-    elif gc_policy == 'required':
-      flags.append('-fobjc-gc-only')
-
-  def GetCflagsObjC(self, configname):
-    """Returns flags that need to be added to .m compilations."""
-    self.configname = configname
-    cflags_objc = []
-
-    self._AddObjectiveCGarbageCollectionFlags(cflags_objc)
-
-    self.configname = None
-    return cflags_objc
-
-  def GetCflagsObjCC(self, configname):
-    """Returns flags that need to be added to .mm compilations."""
-    self.configname = configname
-    cflags_objcc = []
-    self._AddObjectiveCGarbageCollectionFlags(cflags_objcc)
-    if self._Test('GCC_OBJC_CALL_CXX_CDTORS', 'YES', default='NO'):
-      cflags_objcc.append('-fobjc-call-cxx-cdtors')
-    self.configname = None
-    return cflags_objcc
-
-  def GetInstallNameBase(self):
-    """Return DYLIB_INSTALL_NAME_BASE for this target."""
-    # Xcode sets this for shared_libraries, and for nonbundled loadable_modules.
-    if (self.spec['type'] != 'shared_library' and
-        (self.spec['type'] != 'loadable_module' or self._IsBundle())):
-      return None
-    install_base = self.GetPerTargetSetting(
-        'DYLIB_INSTALL_NAME_BASE',
-        default='/Library/Frameworks' if self._IsBundle() else '/usr/local/lib')
-    return install_base
-
-  def _StandardizePath(self, path):
-    """Do :standardizepath processing for path."""
-    # I'm not quite sure what :standardizepath does. Just call normpath(),
-    # but don't let @executable_path/../foo collapse to foo.
-    if '/' in path:
-      prefix, rest = '', path
-      if path.startswith('@'):
-        prefix, rest = path.split('/', 1)
-      rest = os.path.normpath(rest)  # :standardizepath
-      path = os.path.join(prefix, rest)
-    return path
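The `:standardizepath` approximation above can be exercised as a standalone sketch; the helper name `standardize_path` is ours, for illustration only:

```python
import os

def standardize_path(path):
    """Approximate Xcode's :standardizepath: normpath() the path, but keep
    an @-prefix like @executable_path/../foo from being collapsed away."""
    if '/' in path:
        prefix, rest = '', path
        if path.startswith('@'):
            prefix, rest = path.split('/', 1)
        path = os.path.join(prefix, os.path.normpath(rest))
    return path

# Ordinary paths are normalized; @-prefixed paths keep their prefix intact.
print(standardize_path('/usr/local//lib/../lib'))
print(standardize_path('@executable_path/../Frameworks'))
```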
-
-  def GetInstallName(self):
-    """Return LD_DYLIB_INSTALL_NAME for this target."""
-    # Xcode sets this for shared_libraries, and for nonbundled loadable_modules.
-    if (self.spec['type'] != 'shared_library' and
-        (self.spec['type'] != 'loadable_module' or self._IsBundle())):
-      return None
-
-    default_install_name = \
-        '$(DYLIB_INSTALL_NAME_BASE:standardizepath)/$(EXECUTABLE_PATH)'
-    install_name = self.GetPerTargetSetting(
-        'LD_DYLIB_INSTALL_NAME', default=default_install_name)
-
-    # Hardcode support for the variables used in chromium for now, to
-    # unblock people using the make build.
-    if '$' in install_name:
-      assert install_name in ('$(DYLIB_INSTALL_NAME_BASE:standardizepath)/'
-          '$(WRAPPER_NAME)/$(PRODUCT_NAME)', default_install_name), (
-          'Variables in LD_DYLIB_INSTALL_NAME are not generally supported '
-          'yet in target \'%s\' (got \'%s\')' %
-              (self.spec['target_name'], install_name))
-
-      install_name = install_name.replace(
-          '$(DYLIB_INSTALL_NAME_BASE:standardizepath)',
-          self._StandardizePath(self.GetInstallNameBase()))
-      if self._IsBundle():
-        # These are only valid for bundles, hence the |if|.
-        install_name = install_name.replace(
-            '$(WRAPPER_NAME)', self.GetWrapperName())
-        install_name = install_name.replace(
-            '$(PRODUCT_NAME)', self.GetProductName())
-      else:
-        assert '$(WRAPPER_NAME)' not in install_name
-        assert '$(PRODUCT_NAME)' not in install_name
-
-      install_name = install_name.replace(
-          '$(EXECUTABLE_PATH)', self.GetExecutablePath())
-    return install_name
-
-  def _MapLinkerFlagFilename(self, ldflag, gyp_to_build_path):
-    """Checks if ldflag contains a filename and if so remaps it from
-    gyp-directory-relative to build-directory-relative."""
-    # This list is expanded on demand.
-    # They get matched as:
-    #   -exported_symbols_list file
-    #   -Wl,exported_symbols_list file
-    #   -Wl,exported_symbols_list,file
-    LINKER_FILE = r'(\S+)'
-    WORD = r'\S+'
-    linker_flags = [
-      ['-exported_symbols_list', LINKER_FILE],    # Needed for NaCl.
-      ['-unexported_symbols_list', LINKER_FILE],
-      ['-reexported_symbols_list', LINKER_FILE],
-      ['-sectcreate', WORD, WORD, LINKER_FILE],   # Needed for remoting.
-    ]
-    for flag_pattern in linker_flags:
-      regex = re.compile('(?:-Wl,)?' + '[ ,]'.join(flag_pattern))
-      m = regex.match(ldflag)
-      if m:
-        ldflag = ldflag[:m.start(1)] + gyp_to_build_path(m.group(1)) + \
-                 ldflag[m.end(1):]
-    # Required for ffmpeg (no idea why they don't use LIBRARY_SEARCH_PATHS,
-    # TODO(thakis): Update ffmpeg.gyp):
-    if ldflag.startswith('-L'):
-      ldflag = '-L' + gyp_to_build_path(ldflag[len('-L'):])
-    return ldflag
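A minimal sketch of one pattern from the table above, showing how the captured filename is remapped while the rest of the flag is preserved (the names `EXPORTED` and `remap_linker_file` are illustrative, not gyp API):

```python
import re

# -exported_symbols_list with its file argument separated by a space or a
# comma, optionally wrapped in -Wl, -- the same shape the code above builds.
LINKER_FILE = r'(\S+)'
EXPORTED = re.compile(r'(?:-Wl,)?' + '[ ,]'.join(['-exported_symbols_list',
                                                  LINKER_FILE]))

def remap_linker_file(ldflag, gyp_to_build_path):
    """Rewrite only the captured filename; leave the flag text intact."""
    m = EXPORTED.match(ldflag)
    if m:
        return ldflag[:m.start(1)] + gyp_to_build_path(m.group(1)) + \
               ldflag[m.end(1):]
    return ldflag

print(remap_linker_file('-Wl,-exported_symbols_list,syms.txt',
                        lambda p: 'out/' + p))
```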
-
-  def GetLdflags(self, configname, product_dir, gyp_to_build_path):
-    """Returns flags that need to be passed to the linker.
-
-    Args:
-        configname: The name of the configuration to get ld flags for.
-        product_dir: The directory where products such as static and dynamic
-            libraries are placed. This is added to the library search path.
-        gyp_to_build_path: A function that converts paths relative to the
-            current gyp file to paths relative to the build directory.
-    """
-    self.configname = configname
-    ldflags = []
-
-    # The xcode build is relative to a gyp file's directory, and OTHER_LDFLAGS
-    # can contain entries that depend on this. Explicitly absolutify these.
-    for ldflag in self._Settings().get('OTHER_LDFLAGS', []):
-      ldflags.append(self._MapLinkerFlagFilename(ldflag, gyp_to_build_path))
-
-    if self._Test('DEAD_CODE_STRIPPING', 'YES', default='NO'):
-      ldflags.append('-Wl,-dead_strip')
-
-    if self._Test('PREBINDING', 'YES', default='NO'):
-      ldflags.append('-Wl,-prebind')
-
-    self._Appendf(
-        ldflags, 'DYLIB_COMPATIBILITY_VERSION', '-compatibility_version %s')
-    self._Appendf(
-        ldflags, 'DYLIB_CURRENT_VERSION', '-current_version %s')
-
-    self._AppendPlatformVersionMinFlags(ldflags)
-
-    if 'SDKROOT' in self._Settings():
-      ldflags.append('-isysroot ' + self._SdkPath())
-
-    for library_path in self._Settings().get('LIBRARY_SEARCH_PATHS', []):
-      ldflags.append('-L' + gyp_to_build_path(library_path))
-
-    if 'ORDER_FILE' in self._Settings():
-      ldflags.append('-Wl,-order_file ' +
-                     '-Wl,' + gyp_to_build_path(
-                                  self._Settings()['ORDER_FILE']))
-
-    archs = self._Settings().get('ARCHS', ['i386'])
-    if len(archs) != 1:
-      # TODO: Supporting fat binaries will be annoying.
-      self._WarnUnimplemented('ARCHS')
-      archs = ['i386']
-    ldflags.append('-arch ' + archs[0])
-
-    # Xcode adds the product directory by default.
-    ldflags.append('-L' + product_dir)
-
-    install_name = self.GetInstallName()
-    if install_name:
-      ldflags.append('-install_name ' + install_name.replace(' ', r'\ '))
-
-    for rpath in self._Settings().get('LD_RUNPATH_SEARCH_PATHS', []):
-      ldflags.append('-Wl,-rpath,' + rpath)
-
-    config = self.spec['configurations'][self.configname]
-    framework_dirs = config.get('mac_framework_dirs', [])
-    for directory in framework_dirs:
-      ldflags.append('-F' + directory.replace('$(SDKROOT)', self._SdkPath()))
-
-    self.configname = None
-    return ldflags
-
-  def GetLibtoolflags(self, configname):
-    """Returns flags that need to be passed to the static linker.
-
-    Args:
-        configname: The name of the configuration to get libtool flags for.
-    """
-    self.configname = configname
-    libtoolflags = []
-
-    for libtoolflag in self._Settings().get('OTHER_LDFLAGS', []):
-      libtoolflags.append(libtoolflag)
-    # TODO(thakis): ARCHS?
-
-    self.configname = None
-    return libtoolflags
-
-  def GetPerTargetSettings(self):
-    """Gets a list of all the per-target settings. This will only fetch keys
-    whose values are the same across all configurations."""
-    first_pass = True
-    result = {}
-    for configname in sorted(self.xcode_settings.keys()):
-      if first_pass:
-        result = dict(self.xcode_settings[configname])
-        first_pass = False
-      else:
-        for key, value in self.xcode_settings[configname].iteritems():
-          if key not in result:
-            continue
-          elif result[key] != value:
-            del result[key]
-    return result
-
-  def GetPerTargetSetting(self, setting, default=None):
-    """Tries to get xcode_settings.setting from spec. Assumes that the setting
-       has the same value in all configurations and throws otherwise."""
-    first_pass = True
-    result = None
-    for configname in sorted(self.xcode_settings.keys()):
-      if first_pass:
-        result = self.xcode_settings[configname].get(setting, None)
-        first_pass = False
-      else:
-        assert result == self.xcode_settings[configname].get(setting, None), (
-            "Expected per-target setting for '%s', got per-config setting "
-            "(target %s)" % (setting, self.spec['target_name']))
-    if result is None:
-      return default
-    return result
-
-  def _GetStripPostbuilds(self, configname, output_binary, quiet):
-    """Returns a list of the shell commands necessary to strip this target's
-    binary. These should be run as postbuilds before the actual postbuilds
-    run."""
-    self.configname = configname
-
-    result = []
-    if (self._Test('DEPLOYMENT_POSTPROCESSING', 'YES', default='NO') and
-        self._Test('STRIP_INSTALLED_PRODUCT', 'YES', default='NO')):
-
-      default_strip_style = 'debugging'
-      if self._IsBundle():
-        default_strip_style = 'non-global'
-      elif self.spec['type'] == 'executable':
-        default_strip_style = 'all'
-
-      strip_style = self._Settings().get('STRIP_STYLE', default_strip_style)
-      strip_flags = {
-        'all': '',
-        'non-global': '-x',
-        'debugging': '-S',
-      }[strip_style]
-
-      explicit_strip_flags = self._Settings().get('STRIPFLAGS', '')
-      if explicit_strip_flags:
-        strip_flags += ' ' + _NormalizeEnvVarReferences(explicit_strip_flags)
-
-      if not quiet:
-        result.append('echo STRIP\\(%s\\)' % self.spec['target_name'])
-      result.append('strip %s %s' % (strip_flags, output_binary))
-
-    self.configname = None
-    return result
-
-  def _GetDebugInfoPostbuilds(self, configname, output, output_binary, quiet):
-    """Returns a list of the shell commands necessary to massage this target's
-    debug information. These should be run as postbuilds before the actual
-    postbuilds run."""
-    self.configname = configname
-
-    # For static libraries, no dSYMs are created.
-    result = []
-    if (self._Test('GCC_GENERATE_DEBUGGING_SYMBOLS', 'YES', default='YES') and
-        self._Test(
-            'DEBUG_INFORMATION_FORMAT', 'dwarf-with-dsym', default='dwarf') and
-        self.spec['type'] != 'static_library'):
-      if not quiet:
-        result.append('echo DSYMUTIL\\(%s\\)' % self.spec['target_name'])
-      result.append('dsymutil %s -o %s' % (output_binary, output + '.dSYM'))
-
-    self.configname = None
-    return result
-
-  def GetTargetPostbuilds(self, configname, output, output_binary, quiet=False):
-    """Returns a list of shell commands that contain the shell commands
-    to run as postbuilds for this target, before the actual postbuilds."""
-    # dSYMs need to build before stripping happens.
-    return (
-        self._GetDebugInfoPostbuilds(configname, output, output_binary, quiet) +
-        self._GetStripPostbuilds(configname, output_binary, quiet))
-
-  def _AdjustLibrary(self, library):
-    if library.endswith('.framework'):
-      l = '-framework ' + os.path.splitext(os.path.basename(library))[0]
-    else:
-      m = self.library_re.match(library)
-      if m:
-        l = '-l' + m.group(1)
-      else:
-        l = library
-    return l.replace('$(SDKROOT)', self._SdkPath())
-
-  def AdjustLibraries(self, libraries):
-    """Transforms entries like 'Cocoa.framework' in libraries into entries like
-    '-framework Cocoa', 'libcrypto.dylib' into '-lcrypto', etc.
-    """
-    libraries = [ self._AdjustLibrary(library) for library in libraries]
-    return libraries
-
-
-class MacPrefixHeader(object):
-  """A class that helps with emulating Xcode's GCC_PREFIX_HEADER feature.
-
-  This feature consists of several pieces:
-  * If GCC_PREFIX_HEADER is present, all compilations in that project get an
-    additional |-include path_to_prefix_header| cflag.
-  * If GCC_PRECOMPILE_PREFIX_HEADER is present too, then the prefix header is
-    instead compiled, and all other compilations in the project get an
-    additional |-include path_to_compiled_header| instead.
-    + Compiled prefix headers have the extension gch. There is one gch file for
-      every language used in the project (c, cc, m, mm), since gch files for
-      different languages aren't compatible.
-    + gch files themselves are built with the target's normal cflags, but they
-      obviously don't get the |-include| flag. Instead, they need a -x flag that
-      describes their language.
-    + All o files in the target need to depend on the gch file, to make sure
-      it's built before any o file is built.
-
-  This class helps with some of these tasks, but it needs help from the build
-  system for writing dependencies to the gch files, for writing build commands
-  for the gch files, and for figuring out the location of the gch files.
-  """
-  def __init__(self, xcode_settings,
-               gyp_path_to_build_path, gyp_path_to_build_output):
-    """If xcode_settings is None, all methods on this class are no-ops.
-
-    Args:
-        gyp_path_to_build_path: A function that takes a gyp-relative path,
-            and returns a path relative to the build directory.
-        gyp_path_to_build_output: A function that takes a gyp-relative path and
-            a language code ('c', 'cc', 'm', or 'mm'), and that returns a path
-            to where the output of precompiling that path for that language
-            should be placed (without the trailing '.gch').
-    """
-    # This doesn't support per-configuration prefix headers. Good enough
-    # for now.
-    self.header = None
-    self.compile_headers = False
-    if xcode_settings:
-      self.header = xcode_settings.GetPerTargetSetting('GCC_PREFIX_HEADER')
-      self.compile_headers = xcode_settings.GetPerTargetSetting(
-          'GCC_PRECOMPILE_PREFIX_HEADER', default='NO') != 'NO'
-    self.compiled_headers = {}
-    if self.header:
-      if self.compile_headers:
-        for lang in ['c', 'cc', 'm', 'mm']:
-          self.compiled_headers[lang] = gyp_path_to_build_output(
-              self.header, lang)
-      self.header = gyp_path_to_build_path(self.header)
-
-  def GetInclude(self, lang):
-    """Gets the cflags to include the prefix header for language |lang|."""
-    if self.compile_headers and lang in self.compiled_headers:
-      return '-include %s' % self.compiled_headers[lang]
-    elif self.header:
-      return '-include %s' % self.header
-    else:
-      return ''
-
-  def _Gch(self, lang):
-    """Returns the actual file name of the prefix header for language |lang|."""
-    assert self.compile_headers
-    return self.compiled_headers[lang] + '.gch'
-
-  def GetObjDependencies(self, sources, objs):
-    """Given a list of source files and the corresponding object files, returns
-    a list of (source, object, gch) tuples, where |gch| is the build-directory
-    relative path to the gch file each object file depends on.  |compilable[i]|
-    has to be the source file belonging to |objs[i]|."""
-    if not self.header or not self.compile_headers:
-      return []
-
-    result = []
-    for source, obj in zip(sources, objs):
-      ext = os.path.splitext(source)[1]
-      lang = {
-        '.c': 'c',
-        '.cpp': 'cc', '.cc': 'cc', '.cxx': 'cc',
-        '.m': 'm',
-        '.mm': 'mm',
-      }.get(ext, None)
-      if lang:
-        result.append((source, obj, self._Gch(lang)))
-    return result
-
-  def GetPchBuildCommands(self):
-    """Returns [(path_to_gch, language_flag, language, header)].
-    |path_to_gch| and |header| are relative to the build directory.
-    """
-    if not self.header or not self.compile_headers:
-      return []
-    return [
-      (self._Gch('c'), '-x c-header', 'c', self.header),
-      (self._Gch('cc'), '-x c++-header', 'cc', self.header),
-      (self._Gch('m'), '-x objective-c-header', 'm', self.header),
-      (self._Gch('mm'), '-x objective-c++-header', 'mm', self.header),
-    ]
-
-
-def MergeGlobalXcodeSettingsToSpec(global_dict, spec):
-  """Merges the global xcode_settings dictionary into each configuration of the
-  target represented by spec. For keys that are both in the global and the local
-  xcode_settings dict, the local key gets precedence.
-  """
-  # The xcode generator special-cases global xcode_settings and does something
-  # that amounts to merging in the global xcode_settings into each local
-  # xcode_settings dict.
-  global_xcode_settings = global_dict.get('xcode_settings', {})
-  for config in spec['configurations'].values():
-    if 'xcode_settings' in config:
-      new_settings = global_xcode_settings.copy()
-      new_settings.update(config['xcode_settings'])
-      config['xcode_settings'] = new_settings
-
-
-def IsMacBundle(flavor, spec):
-  """Returns whether |spec| should be treated as a bundle.
-
-  Bundles are directories with a certain subdirectory structure, instead of
-  just a single file. Bundle rules not only produce a binary but also package
-  resources into that directory."""
-  is_mac_bundle = (int(spec.get('mac_bundle', 0)) != 0 and flavor == 'mac')
-  if is_mac_bundle:
-    assert spec['type'] != 'none', (
-        'mac_bundle targets cannot have type none (target "%s")' %
-        spec['target_name'])
-  return is_mac_bundle
-
-
-def GetMacBundleResources(product_dir, xcode_settings, resources):
-  """Yields (output, resource) pairs for every resource in |resources|.
-  Only call this for mac bundle targets.
-
-  Args:
-      product_dir: Path to the directory containing the output bundle,
-          relative to the build directory.
-      xcode_settings: The XcodeSettings of the current target.
-      resources: A list of bundle resources, relative to the build directory.
-  """
-  dest = os.path.join(product_dir,
-                      xcode_settings.GetBundleResourceFolder())
-  for res in resources:
-    output = dest
-
-    # The make generator doesn't support it, so forbid it everywhere
-    # to keep the generators more interchangeable.
-    assert ' ' not in res, (
-      "Spaces in resource filenames not supported (%s)"  % res)
-
-    # Split into (path,file).
-    res_parts = os.path.split(res)
-
-    # Now split the path into (prefix,maybe.lproj).
-    lproj_parts = os.path.split(res_parts[0])
-    # If the resource lives in a .lproj bundle, add that to the destination.
-    if lproj_parts[1].endswith('.lproj'):
-      output = os.path.join(output, lproj_parts[1])
-
-    output = os.path.join(output, res_parts[1])
-    # Compiled XIB files are referred to by .nib.
-    if output.endswith('.xib'):
-      output = output[0:-3] + 'nib'
-
-    yield output, res
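The `.lproj` and `.xib` handling in the loop above can be condensed into a standalone helper (the name `bundle_resource_output` is hypothetical, chosen for this sketch):

```python
import os

def bundle_resource_output(dest, res):
    """Map a bundle resource path to its destination inside the bundle."""
    path, name = os.path.split(res)
    output = dest
    # If the resource lives in a .lproj localization directory, keep it.
    lproj = os.path.split(path)[1]
    if lproj.endswith('.lproj'):
        output = os.path.join(output, lproj)
    output = os.path.join(output, name)
    # Compiled XIB files are referred to by .nib.
    if output.endswith('.xib'):
        output = output[:-3] + 'nib'
    return output

print(bundle_resource_output('App.app/Contents/Resources',
                             'res/en.lproj/Main.xib'))
```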
-
-
-def GetMacInfoPlist(product_dir, xcode_settings, gyp_path_to_build_path):
-  """Returns (info_plist, dest_plist, defines, extra_env), where:
-  * |info_plist| is the source plist path, relative to the
-    build directory,
-  * |dest_plist| is the destination plist path, relative to the
-    build directory,
-  * |defines| is a list of preprocessor defines (empty if the plist
-    shouldn't be preprocessed),
-  * |extra_env| is a dict of env variables that should be exported when
-    invoking |mac_tool copy-info-plist|.
-
-  Only call this for mac bundle targets.
-
-  Args:
-      product_dir: Path to the directory containing the output bundle,
-          relative to the build directory.
-      xcode_settings: The XcodeSettings of the current target.
-      gyp_path_to_build_path: A function that converts paths relative to the
-          current gyp file to paths relative to the build directory.
-  """
-  info_plist = xcode_settings.GetPerTargetSetting('INFOPLIST_FILE')
-  if not info_plist:
-    return None, None, [], {}
-
-  # The make generator doesn't support it, so forbid it everywhere
-  # to keep the generators more interchangeable.
-  assert ' ' not in info_plist, (
-    "Spaces in Info.plist filenames not supported (%s)"  % info_plist)
-
-  info_plist = gyp_path_to_build_path(info_plist)
-
-  # If explicitly set to preprocess the plist, invoke the C preprocessor and
-  # specify any defines as -D flags.
-  if xcode_settings.GetPerTargetSetting(
-      'INFOPLIST_PREPROCESS', default='NO') == 'YES':
-    # Create an intermediate file based on the path.
-    defines = shlex.split(xcode_settings.GetPerTargetSetting(
-        'INFOPLIST_PREPROCESSOR_DEFINITIONS', default=''))
-  else:
-    defines = []
-
-  dest_plist = os.path.join(product_dir, xcode_settings.GetBundlePlistPath())
-  extra_env = xcode_settings.GetPerTargetSettings()
-
-  return info_plist, dest_plist, defines, extra_env
-
-
-def _GetXcodeEnv(xcode_settings, built_products_dir, srcroot, configuration,
-                additional_settings=None):
-  """Return the environment variables that Xcode would set. See
-  http://developer.apple.com/library/mac/#documentation/DeveloperTools/Reference/XcodeBuildSettingRef/1-Build_Setting_Reference/build_setting_ref.html#//apple_ref/doc/uid/TP40003931-CH3-SW153
-  for a full list.
-
-  Args:
-      xcode_settings: An XcodeSettings object. If this is None, this function
-          returns an empty dict.
-      built_products_dir: Absolute path to the built products dir.
-      srcroot: Absolute path to the source root.
-      configuration: The build configuration name.
-      additional_settings: An optional dict with more values to add to the
-          result.
-  """
-  if not xcode_settings: return {}
-
-  # This function is considered a friend of XcodeSettings, so let it reach into
-  # its implementation details.
-  spec = xcode_settings.spec
-
-  # These are filled in on an as-needed basis.
-  env = {
-    'BUILT_PRODUCTS_DIR' : built_products_dir,
-    'CONFIGURATION' : configuration,
-    'PRODUCT_NAME' : xcode_settings.GetProductName(),
-    # See /Developer/Platforms/MacOSX.platform/Developer/Library/Xcode/Specifications/MacOSX\ Product\ Types.xcspec for FULL_PRODUCT_NAME
-    'SRCROOT' : srcroot,
-    'SOURCE_ROOT': '${SRCROOT}',
-    # This is not true for static libraries, but currently the env is only
-    # written for bundles:
-    'TARGET_BUILD_DIR' : built_products_dir,
-    'TEMP_DIR' : '${TMPDIR}',
-  }
-  if xcode_settings.GetPerTargetSetting('SDKROOT'):
-    env['SDKROOT'] = xcode_settings._SdkPath()
-  else:
-    env['SDKROOT'] = ''
-
-  if spec['type'] in (
-      'executable', 'static_library', 'shared_library', 'loadable_module'):
-    env['EXECUTABLE_NAME'] = xcode_settings.GetExecutableName()
-    env['EXECUTABLE_PATH'] = xcode_settings.GetExecutablePath()
-    env['FULL_PRODUCT_NAME'] = xcode_settings.GetFullProductName()
-    mach_o_type = xcode_settings.GetMachOType()
-    if mach_o_type:
-      env['MACH_O_TYPE'] = mach_o_type
-    env['PRODUCT_TYPE'] = xcode_settings.GetProductType()
-  if xcode_settings._IsBundle():
-    env['CONTENTS_FOLDER_PATH'] = \
-      xcode_settings.GetBundleContentsFolderPath()
-    env['UNLOCALIZED_RESOURCES_FOLDER_PATH'] = \
-        xcode_settings.GetBundleResourceFolder()
-    env['INFOPLIST_PATH'] = xcode_settings.GetBundlePlistPath()
-    env['WRAPPER_NAME'] = xcode_settings.GetWrapperName()
-
-  install_name = xcode_settings.GetInstallName()
-  if install_name:
-    env['LD_DYLIB_INSTALL_NAME'] = install_name
-  install_name_base = xcode_settings.GetInstallNameBase()
-  if install_name_base:
-    env['DYLIB_INSTALL_NAME_BASE'] = install_name_base
-
-  if not additional_settings:
-    additional_settings = {}
-  else:
-    # Flatten lists to strings.
-    for k in additional_settings:
-      if not isinstance(additional_settings[k], str):
-        additional_settings[k] = ' '.join(additional_settings[k])
-  additional_settings.update(env)
-
-  for k in additional_settings:
-    additional_settings[k] = _NormalizeEnvVarReferences(additional_settings[k])
-
-  return additional_settings
-
-
-def _NormalizeEnvVarReferences(s):
-  """Takes a string containing variable references in the form ${FOO}, $(FOO),
-  or $FOO, and returns a string with all variable references in the form ${FOO}.
-  """
-  # $FOO -> ${FOO}
-  s = re.sub(r'\$([a-zA-Z_][a-zA-Z0-9_]*)', r'${\1}', s)
-
-  # $(FOO) -> ${FOO}
-  matches = re.findall(r'(\$\(([a-zA-Z0-9\-_]+)\))', s)
-  for match in matches:
-    to_replace, variable = match
-    assert '$(' not in variable, \
-        '$($(FOO)) variables not supported: ' + to_replace
-    s = s.replace(to_replace, '${' + variable + '}')
-
-  return s
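The normalization is easy to check in isolation; a minimal standalone sketch using the same two regexes:

```python
import re

def normalize_env_var_references(s):
    """Rewrite $FOO and $(FOO) variable references as ${FOO}."""
    # $FOO -> ${FOO} (the pattern cannot match '$(' or '${', so bracketed
    # forms pass through this substitution untouched).
    s = re.sub(r'\$([a-zA-Z_][a-zA-Z0-9_]*)', r'${\1}', s)
    # $(FOO) -> ${FOO}
    for to_replace, variable in re.findall(r'(\$\(([a-zA-Z0-9\-_]+)\))', s):
        s = s.replace(to_replace, '${' + variable + '}')
    return s

print(normalize_env_var_references('$(SRCROOT)/lib $FOO ${BAR}'))
```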
-
-
-def ExpandEnvVars(string, expansions):
-  """Expands ${VARIABLES}, $(VARIABLES), and $VARIABLES in string per the
-  expansions list. If the variable expands to something that references
-  another variable, this variable is expanded as well if it's in env --
-  until no variables present in env are left."""
-  for k, v in reversed(expansions):
-    string = string.replace('${' + k + '}', v)
-    string = string.replace('$(' + k + ')', v)
-    string = string.replace('$' + k, v)
-  return string
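The reverse iteration is what makes chained references resolve: the expansions list is expected in dependency order (as produced by GetSortedXcodeEnv below), so walking it backwards expands users before the variables they use. A self-contained sketch:

```python
def expand_env_vars(string, expansions):
    """Expand ${K}, $(K), and $K from an ordered (key, value) list.
    Iterating in reverse lets an entry whose value references an earlier
    entry be substituted first, then resolved by the earlier entry."""
    for k, v in reversed(expansions):
        string = string.replace('${' + k + '}', v)
        string = string.replace('$(' + k + ')', v)
        string = string.replace('$' + k, v)
    return string

# FULL_PRODUCT_NAME references PRODUCT_NAME, so it must come later in the
# list -- reversed() then expands it first.
pairs = [('PRODUCT_NAME', 'Foo'), ('FULL_PRODUCT_NAME', '${PRODUCT_NAME}.app')]
print(expand_env_vars('${FULL_PRODUCT_NAME}', pairs))
```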
-
-
-def _TopologicallySortedEnvVarKeys(env):
-  """Takes a dict |env| whose values are strings that can refer to other keys,
-  for example env['foo'] = '${bar} and ${baz}'. Returns a list L of all keys of
-  env such that key2 is after key1 in L if env[key2] refers to env[key1].
-
-  Throws an Exception in case of dependency cycles.
-  """
-  # Since environment variables can refer to other variables, the evaluation
-  # order is important. Below is the logic to compute the dependency graph
-  # and sort it.
-  regex = re.compile(r'\$\{([a-zA-Z0-9\-_]+)\}')
-  def GetEdges(node):
-    # Use a definition of edges such that user_of_variable -> used_variable.
-    # This happens to be easier in this case, since a variable's
-    # definition contains all variables it references in a single string.
-    # We can then reverse the result of the topological sort at the end.
-    # Since: reverse(topsort(DAG)) = topsort(reverse_edges(DAG))
-    matches = set([v for v in regex.findall(env[node]) if v in env])
-    for dependee in matches:
-      assert '${' not in dependee, 'Nested variables not supported: ' + dependee
-    return matches
-
-  try:
-    # Topologically sort, and then reverse, because we used an edge definition
-    # that's inverted from the expected result of this function (see comment
-    # above).
-    order = gyp.common.TopologicallySorted(env.keys(), GetEdges)
-    order.reverse()
-    return order
-  except gyp.common.CycleError, e:
-    raise GypError(
-        'Xcode environment variables are cyclically dependent: ' + str(e.nodes))
-
-
-def GetSortedXcodeEnv(xcode_settings, built_products_dir, srcroot,
-                      configuration, additional_settings=None):
-  env = _GetXcodeEnv(xcode_settings, built_products_dir, srcroot, configuration,
-                    additional_settings)
-  return [(key, env[key]) for key in _TopologicallySortedEnvVarKeys(env)]
-
-
-def GetSpecPostbuildCommands(spec, quiet=False):
-  """Returns the list of postbuilds explicitly defined on |spec|, in a form
-  executable by a shell."""
-  postbuilds = []
-  for postbuild in spec.get('postbuilds', []):
-    if not quiet:
-      postbuilds.append('echo POSTBUILD\\(%s\\) %s' % (
-            spec['target_name'], postbuild['postbuild_name']))
-    postbuilds.append(gyp.common.EncodePOSIXShellList(postbuild['action']))
-  return postbuilds
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/xcodeproj_file.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2870 +0,0 @@
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Xcode project file generator.
-
-This module is both an Xcode project file generator and a documentation of the
-Xcode project file format.  Knowledge of the project file format was gained
-based on extensive experience with Xcode, and by making changes to projects in
-Xcode.app and observing the resultant changes in the associated project files.
-
-XCODE PROJECT FILES
-
-The generator targets the file format as written by Xcode 3.2 (specifically,
-3.2.6), but past experience has taught that the format has not changed
-significantly in the past several years, and future versions of Xcode are able
-to read older project files.
-
-Xcode project files are "bundled": the project "file" from an end-user's
-perspective is actually a directory with an ".xcodeproj" extension.  The
-project file from this module's perspective is actually a file inside this
-directory, always named "project.pbxproj".  This file contains a complete
-description of the project and is all that is needed to use the xcodeproj.
-Other files contained in the xcodeproj directory are simply used to store
-per-user settings, such as the state of various UI elements in the Xcode
-application.
-
-The project.pbxproj file is a property list, stored in a format almost
-identical to the NeXTstep property list format.  The file is able to carry
-Unicode data, and is encoded in UTF-8.  The root element in the property list
-is a dictionary that contains several properties of minimal interest, and two
-properties of immense interest.  The most important property is a dictionary
-named "objects".  The entire structure of the project is represented by the
-children of this property.  The objects dictionary is keyed by unique 96-bit
-values represented by 24 uppercase hexadecimal characters.  Each value in the
-objects dictionary is itself a dictionary, describing an individual object.
-
-Each object in the dictionary is a member of a class, which is identified by
-the "isa" property of each object.  A variety of classes are represented in a
-project file.  Objects can refer to other objects by ID, using the 24-character
-hexadecimal object key.  A project's objects form a tree, with a root object
-of class PBXProject at the root.  As an example, the PBXProject object serves
-as parent to an XCConfigurationList object defining the build configurations
-used in the project, a PBXGroup object serving as a container for all files
-referenced in the project, and a list of target objects, each of which defines
-a target in the project.  There are several different types of target object,
-such as PBXNativeTarget and PBXAggregateTarget.  In this module, this
-relationship is expressed by having each target type derive from an abstract
-base named XCTarget.
-
-The project.pbxproj file's root dictionary also contains a property, sibling to
-the "objects" dictionary, named "rootObject".  The value of rootObject is a
-24-character object key referring to the root PBXProject object in the
-objects dictionary.
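The structure just described (an "objects" dictionary keyed by 24-character hex IDs, plus a "rootObject" key naming the PBXProject) can be sketched as a plain Python dict. Note Xcode derives its identifiers by its own scheme; the uuid-based generator here is purely illustrative:

```python
import uuid

def new_object_id():
    # 96 bits rendered as 24 uppercase hex characters, shaped like Xcode's
    # object keys (Xcode computes its IDs differently; assumption for demo).
    return uuid.uuid4().hex[:24].upper()

project_id = new_object_id()
main_group_id = new_object_id()

# A minimal skeleton of a project.pbxproj root dictionary.
pbxproj = {
    'archiveVersion': 1,
    'objectVersion': 46,
    'objects': {
        project_id: {'isa': 'PBXProject', 'mainGroup': main_group_id},
        main_group_id: {'isa': 'PBXGroup', 'children': []},
    },
    'rootObject': project_id,
}
```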
-
-In Xcode, every file used as input to a target or produced as a final product
-of a target must appear somewhere in the hierarchy rooted at the PBXGroup
-object referenced by the PBXProject's mainGroup property.  A PBXGroup is
-generally represented as a folder in the Xcode application.  PBXGroups can
-contain other PBXGroups as well as PBXFileReferences, which are pointers to
-actual files.
-
-Each XCTarget contains a list of build phases, represented in this module by
-the abstract base XCBuildPhase.  Examples of concrete XCBuildPhase derivations
-are PBXSourcesBuildPhase and PBXFrameworksBuildPhase, which correspond to the
-"Compile Sources" and "Link Binary With Libraries" phases displayed in the
-Xcode application.  Files used as input to these phases (for example, source
-files in the former case and libraries and frameworks in the latter) are
-represented by PBXBuildFile objects, referenced by elements of "files" lists
-in XCBuildPhase objects.  Each PBXBuildFile object refers to a PBXFileReference
-object as a "weak" reference: it does not "own" the PBXFileReference, which is
-owned by the root object's mainGroup or a descendant group.  In most cases, the
-layer of indirection between an XCBuildPhase and a PBXFileReference via a
-PBXBuildFile appears extraneous, but there's actually one reason for this:
-file-specific compiler flags are added to the PBXBuildFile object so as to
-allow a single file to be a member of multiple targets while having distinct
-compiler flags for each.  These flags can be modified in the Xcode application
-in the "Build" tab of a File Info window.
-
-When a project is open in the Xcode application, Xcode will rewrite it.  As
-such, this module is careful to adhere to the formatting used by Xcode, to
-avoid insignificant changes appearing in the file when it is used in the
-Xcode application.  This will keep version control repositories happy, and
-makes it possible to compare a project file used in Xcode to one generated by
-this module to determine if any significant changes were made in the
-application.
-
-Xcode has its own way of assigning 24-character identifiers to each object,
-which is not duplicated here.  Because the identifier is only generated
-once, when an object is created, and is then left unchanged, there is no need
-to attempt to duplicate Xcode's behavior in this area.  The generator is free
-to select any identifier, even at random, to refer to the objects it creates,
-and Xcode will retain those identifiers and use them when subsequently
-rewriting the project file.  However, the generator would choose new random
-identifiers each time the project files are generated, leading to difficulties
-comparing "used" project files to "pristine" ones produced by this module,
-and causing the appearance of changes as every object identifier is changed
-when updated projects are checked in to a version control repository.  To
-mitigate this problem, this module chooses identifiers in a more deterministic
-way, by hashing a description of each object as well as its parent and ancestor
-objects.  This strategy should result in minimal "shift" in IDs as successive
-generations of project files are produced.
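The deterministic ID scheme described above (length-prefixed hashing of each object's description plus its ancestors, folded down to 96 bits) can be sketched as standalone Python. This is a simplified, hypothetical reimplementation for illustration, not the module's actual API; the real logic lives in XCObject.ComputeIDs.

```python
import hashlib
import struct

def compute_xcode_id(hashables):
    """Fold a SHA-1 digest of the given strings into a 24-character
    uppercase hexadecimal Xcode-style identifier (96 bits)."""
    h = hashlib.sha1()
    for item in hashables:
        data = item.encode('utf-8')
        # Length-prefix each item so concatenation cannot cause collisions.
        h.update(struct.pack('>i', len(data)))
        h.update(data)
    # SHA-1 yields 160 bits (5 big-endian 32-bit words); xor the extra
    # words into the first three rather than discarding them.
    digest_ints = struct.unpack('>5I', h.digest())
    id_ints = [0, 0, 0]
    for index, value in enumerate(digest_ints):
        id_ints[index % 3] ^= value
    return '%08X%08X%08X' % tuple(id_ints)

# The same inputs always produce the same identifier.
print(compute_xcode_id(['PBXProject', 'MyApp']))
```

Because the inputs are the object's class name, name, and ancestry rather than random bytes, regenerating the project file reproduces the same IDs.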
-
-THIS MODULE
-
-This module introduces several classes, all derived from the XCObject class.
-Nearly all of the "brains" are built into the XCObject class, which understands
-how to create and modify objects, maintain the proper tree structure, compute
-identifiers, and print objects.  For the most part, classes derived from
-XCObject need only provide a _schema class object, a dictionary that
-expresses what properties objects of the class may contain.
-
-Given this structure, it's possible to build a minimal project file by creating
-objects of the appropriate types and making the proper connections:
-
-  config_list = XCConfigurationList()
-  group = PBXGroup()
-  project = PBXProject({'buildConfigurationList': config_list,
-                        'mainGroup': group})
-
-With the project object set up, it can be added to an XCProjectFile object.
-XCProjectFile is a pseudo-class in the sense that it is a concrete XCObject
-subclass that does not actually correspond to a class type found in a project
-file.  Rather, it is used to represent the project file's root dictionary.
-Printing an XCProjectFile will print the entire project file, including the
-full "objects" dictionary.
-
-  project_file = XCProjectFile({'rootObject': project})
-  project_file.ComputeIDs()
-  project_file.Print()
-
-Xcode project files are always encoded in UTF-8.  This module will accept
-strings of either the str class or the unicode class.  Strings of class str
-are assumed to already be encoded in UTF-8.  Obviously, if you're just using
-ASCII, you won't encounter difficulties because ASCII is a UTF-8 subset.
-Strings of class unicode are handled properly and encoded in UTF-8 when
-a project file is output.
-"""
-
-import gyp.common
-import posixpath
-import re
-import struct
-import sys
-
-# hashlib is supplied as of Python 2.5 as the replacement interface for sha
-# and other secure hashes.  In 2.6, sha is deprecated.  Import hashlib if
-# available, avoiding a deprecation warning under 2.6.  Import sha otherwise,
-# preserving 2.4 compatibility.
-try:
-  import hashlib
-  _new_sha1 = hashlib.sha1
-except ImportError:
-  import sha
-  _new_sha1 = sha.new
-
-
-# See XCObject._EncodeString.  This pattern is used to determine when a string
-# can be printed unquoted.  Strings that match this pattern may be printed
-# unquoted.  Strings that do not match must be quoted and may be further
-# transformed to be properly encoded.  Note that this expression matches the
-# characters listed with "+", for 1 or more occurrences: if a string is empty,
-# it must not match this pattern, because it needs to be encoded as "".
-_unquoted = re.compile('^[A-Za-z0-9$./_]+$')
-
-# Strings that match this pattern are quoted regardless of what _unquoted says.
-# Oddly, Xcode will quote any string with a run of three or more underscores.
-_quoted = re.compile('___')
-
-# This pattern should match any character that needs to be escaped by
-# XCObject._EncodeString.  See that function.
-_escaped = re.compile('[\\\\"]|[^ -~]')
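The interaction of these three patterns can be sketched with a simplified standalone encoder (a hypothetical reduction of XCObject._EncodeString that only handles backslash and double-quote escapes, not the full control-character transform tables):

```python
import re

# Same patterns as above: a string prints unquoted only when it consists
# entirely of safe characters and contains no run of three underscores.
_unquoted = re.compile(r'^[A-Za-z0-9$./_]+$')
_quoted = re.compile('___')
_escaped = re.compile(r'[\\"]|[^ -~]')

def encode_string(value):
    if _unquoted.search(value) and not _quoted.search(value):
        return value
    # Simplified: escape backslashes and quotes; the real module also maps
    # nonprintable characters through its _encode_transforms list.
    return '"' + _escaped.sub(lambda m: '\\' + m.group(0), value) + '"'

print(encode_string('main.c'))       # main.c
print(encode_string('hello world'))  # "hello world"
print(encode_string(''))             # ""
print(encode_string('a___b'))        # "a___b"
```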
-
-
-# Used by SourceTreeAndPathFromPath
-_path_leading_variable = re.compile('^\$\((.*?)\)(/(.*))?$')
-
-def SourceTreeAndPathFromPath(input_path):
-  """Given input_path, returns a tuple with sourceTree and path values.
-
-  Examples:
-    input_path     (source_tree, output_path)
-    '$(VAR)/path'  ('VAR', 'path')
-    '$(VAR)'       ('VAR', None)
-    'path'         (None, 'path')
-  """
-
-  source_group_match = _path_leading_variable.match(input_path)
-  if source_group_match:
-    source_tree = source_group_match.group(1)
-    output_path = source_group_match.group(3)  # This may be None.
-  else:
-    source_tree = None
-    output_path = input_path
-
-  return (source_tree, output_path)
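The behavior tabulated in the docstring above can be exercised with a self-contained copy of the function (the regex is reproduced locally so the sketch runs on its own):

```python
import re

# Mirrors SourceTreeAndPathFromPath: split a leading $(VAR) source-tree
# variable from the remainder of the path.
_path_leading_variable = re.compile(r'^\$\((.*?)\)(/(.*))?$')

def source_tree_and_path(input_path):
    match = _path_leading_variable.match(input_path)
    if match:
        return (match.group(1), match.group(3))  # group(3) may be None
    return (None, input_path)

print(source_tree_and_path('$(SDKROOT)/usr/lib'))  # ('SDKROOT', 'usr/lib')
print(source_tree_and_path('$(SDKROOT)'))          # ('SDKROOT', None)
print(source_tree_and_path('relative/path'))       # (None, 'relative/path')
```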
-
-def ConvertVariablesToShellSyntax(input_string):
-  return re.sub('\$\((.*?)\)', '${\\1}', input_string)
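ConvertVariablesToShellSyntax rewrites Xcode-style $(VAR) references into shell-style ${VAR} references, as a standalone sketch shows:

```python
import re

# Standalone copy of ConvertVariablesToShellSyntax: $(VAR) -> ${VAR}.
def convert_variables_to_shell_syntax(input_string):
    return re.sub(r'\$\((.*?)\)', r'${\1}', input_string)

print(convert_variables_to_shell_syntax('$(BUILT_PRODUCTS_DIR)/name'))
# ${BUILT_PRODUCTS_DIR}/name
```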
-
-class XCObject(object):
-  """The abstract base of all class types used in Xcode project files.
-
-  Class variables:
-    _schema: A dictionary defining the properties of this class.  The keys to
-             _schema are string property keys as used in project files.  Values
-             are a list of four or five elements:
-             [ is_list, property_type, is_strong, is_required, default ]
-             is_list: True if the property described is a list, as opposed
-                      to a single element.
-             property_type: The type to use as the value of the property,
-                            or if is_list is True, the type to use for each
-                            element of the value's list.  property_type must
-                            be an XCObject subclass, or one of the built-in
-                            types str, int, or dict.
-             is_strong: If property_type is an XCObject subclass, is_strong
-                        is True to assert that this class "owns," or serves
-                        as parent, to the property value (or, if is_list is
-                        True, values).  is_strong must be False if
-                        property_type is not an XCObject subclass.
-             is_required: True if the property is required for the class.
-                          Note that is_required being True does not preclude
-                          an empty string ("", in the case of property_type
-                          str) or list ([], in the case of is_list True) from
-                          being set for the property.
-             default: Optional.  If is_required is True, default may be set
-                      to provide a default value for objects that do not supply
-                      their own value.  If is_required is True and default
-                      is not provided, users of the class must supply their own
-                      value for the property.
-             Note that although the values of the array are expressed in
-             boolean terms, subclasses provide values as integers to conserve
-             horizontal space.
-    _should_print_single_line: False in XCObject.  Subclasses whose objects
-                               should be written to the project file in the
-                               alternate single-line format, such as
-                               PBXFileReference and PBXBuildFile, should
-                               set this to True.
-    _encode_transforms: Used by _EncodeString to encode unprintable characters.
-                        The index into this list is the ordinal of the
-                        character to transform; each value is a string
-                        used to represent the character in the output.  XCObject
-                        provides an _encode_transforms list suitable for most
-                        XCObject subclasses.
-    _alternate_encode_transforms: Provided for subclasses that wish to use
-                                  the alternate encoding rules.  Xcode seems
-                                  to use these rules when printing objects in
-                                  single-line format.  Subclasses that desire
-                                  this behavior should set _encode_transforms
-                                  to _alternate_encode_transforms.
-    _hashables: A list of XCObject subclasses that can be hashed by ComputeIDs
-                to construct this object's ID.  Most classes that need custom
-                hashing behavior should do it by overriding Hashables,
-                but in some cases an object's parent may wish to push a
-                hashable value into its child, and it can do so by appending
-                to _hashables.
-  Attributes:
-    id: The object's identifier, a 24-character uppercase hexadecimal string.
-        Usually, objects being created should not set id until the entire
-        project file structure is built.  At that point, ComputeIDs() should
-        be called on the root object to assign deterministic values for id to
-        each object in the tree.
-    parent: The object's parent.  This is set by a parent XCObject when a child
-            object is added to it.
-    _properties: The object's property dictionary.  An object's properties are
-                 described by its class' _schema variable.
-  """
-
-  _schema = {}
-  _should_print_single_line = False
-
-  # See _EncodeString.
-  _encode_transforms = []
-  i = 0
-  while i < ord(' '):
-    _encode_transforms.append('\\U%04x' % i)
-    i = i + 1
-  _encode_transforms[7] = '\\a'
-  _encode_transforms[8] = '\\b'
-  _encode_transforms[9] = '\\t'
-  _encode_transforms[10] = '\\n'
-  _encode_transforms[11] = '\\v'
-  _encode_transforms[12] = '\\f'
-  _encode_transforms[13] = '\\n'
-
-  _alternate_encode_transforms = list(_encode_transforms)
-  _alternate_encode_transforms[9] = chr(9)
-  _alternate_encode_transforms[10] = chr(10)
-  _alternate_encode_transforms[11] = chr(11)
-
-  def __init__(self, properties=None, id=None, parent=None):
-    self.id = id
-    self.parent = parent
-    self._properties = {}
-    self._hashables = []
-    self._SetDefaultsFromSchema()
-    self.UpdateProperties(properties)
-
-  def __repr__(self):
-    try:
-      name = self.Name()
-    except NotImplementedError:
-      return '<%s at 0x%x>' % (self.__class__.__name__, id(self))
-    return '<%s %r at 0x%x>' % (self.__class__.__name__, name, id(self))
-
-  def Copy(self):
-    """Make a copy of this object.
-
-    The new object will have its own copy of lists and dicts.  Any XCObject
-    objects owned by this object (marked "strong") will be copied in the
-    new object, even those found in lists.  If this object has any weak
-    references to other XCObjects, the same references are added to the new
-    object without making a copy.
-    """
-
-    that = self.__class__(id=self.id, parent=self.parent)
-    for key, value in self._properties.iteritems():
-      is_strong = self._schema[key][2]
-
-      if isinstance(value, XCObject):
-        if is_strong:
-          new_value = value.Copy()
-          new_value.parent = that
-          that._properties[key] = new_value
-        else:
-          that._properties[key] = value
-      elif isinstance(value, str) or isinstance(value, unicode) or \
-           isinstance(value, int):
-        that._properties[key] = value
-      elif isinstance(value, list):
-        if is_strong:
-          # If is_strong is True, each element is an XCObject, so it's safe to
-          # call Copy.
-          that._properties[key] = []
-          for item in value:
-            new_item = item.Copy()
-            new_item.parent = that
-            that._properties[key].append(new_item)
-        else:
-          that._properties[key] = value[:]
-      elif isinstance(value, dict):
-        # dicts are never strong.
-        if is_strong:
-          raise TypeError, 'Strong dict for key ' + key + ' in ' + \
-                           self.__class__.__name__
-        else:
-          that._properties[key] = value.copy()
-      else:
-        raise TypeError, 'Unexpected type ' + value.__class__.__name__ + \
-                         ' for key ' + key + ' in ' + self.__class__.__name__
-
-    return that
-
-  def Name(self):
-    """Return the name corresponding to an object.
-
-    Not all objects necessarily need to be nameable, and not all that do have
-    a "name" property.  Override as needed.
-    """
-
-    # If the schema indicates that "name" is required, try to access the
-    # property even if it doesn't exist.  This will result in a KeyError
-    # being raised for the property that should be present, which seems more
-    # appropriate than NotImplementedError in this case.
-    if 'name' in self._properties or \
-        ('name' in self._schema and self._schema['name'][3]):
-      return self._properties['name']
-
-    raise NotImplementedError, \
-          self.__class__.__name__ + ' must implement Name'
-
-  def Comment(self):
-    """Return a comment string for the object.
-
-    Most objects just use their name as the comment, but PBXProject uses
-    different values.
-
-    The returned comment is not escaped and does not have any comment marker
-    strings applied to it.
-    """
-
-    return self.Name()
-
-  def Hashables(self):
-    hashables = [self.__class__.__name__]
-
-    name = self.Name()
-    if name != None:
-      hashables.append(name)
-
-    hashables.extend(self._hashables)
-
-    return hashables
-
-  def HashablesForChild(self):
-    return None
-
-  def ComputeIDs(self, recursive=True, overwrite=True, seed_hash=None):
-    """Set "id" properties deterministically.
-
-    An object's "id" property is set based on a hash of its class type and
-    name, as well as the class type and name of all ancestor objects.  As
-    such, it is only advisable to call ComputeIDs once an entire project file
-    tree is built.
-
-    If recursive is True, recurse into all descendant objects and update their
-    hashes.
-
-    If overwrite is True, any existing value set in the "id" property will be
-    replaced.
-    """
-
-    def _HashUpdate(hash, data):
-      """Update hash with data's length and contents.
-
-      If the hash were updated only with the value of data, it would be
-      possible for clowns to induce collisions by manipulating the names of
-      their objects.  By adding the length, it's far less likely that
-      ID collisions will be encountered, intentionally or not.
-      """
-
-      hash.update(struct.pack('>i', len(data)))
-      hash.update(data)
-
-    if seed_hash is None:
-      seed_hash = _new_sha1()
-
-    hash = seed_hash.copy()
-
-    hashables = self.Hashables()
-    assert len(hashables) > 0
-    for hashable in hashables:
-      _HashUpdate(hash, hashable)
-
-    if recursive:
-      hashables_for_child = self.HashablesForChild()
-      if hashables_for_child is None:
-        child_hash = hash
-      else:
-        assert len(hashables_for_child) > 0
-        child_hash = seed_hash.copy()
-        for hashable in hashables_for_child:
-          _HashUpdate(child_hash, hashable)
-
-      for child in self.Children():
-        child.ComputeIDs(recursive, overwrite, child_hash)
-
-    if overwrite or self.id is None:
-      # Xcode IDs are only 96 bits (24 hex characters), but a SHA-1 digest is
-      # 160 bits.  Instead of throwing out 64 bits of the digest, xor them
-      # into the portion that gets used.
-      assert hash.digest_size % 4 == 0
-      digest_int_count = hash.digest_size / 4
-      digest_ints = struct.unpack('>' + 'I' * digest_int_count, hash.digest())
-      id_ints = [0, 0, 0]
-      for index in xrange(0, digest_int_count):
-        id_ints[index % 3] ^= digest_ints[index]
-      self.id = '%08X%08X%08X' % tuple(id_ints)
-
-  def EnsureNoIDCollisions(self):
-    """Verifies that no two objects have the same ID.  Checks all descendants.
-    """
-
-    ids = {}
-    descendants = self.Descendants()
-    for descendant in descendants:
-      if descendant.id in ids:
-        other = ids[descendant.id]
-        raise KeyError, \
-              'Duplicate ID %s, objects "%s" and "%s" in "%s"' % \
-              (descendant.id, str(descendant._properties),
-               str(other._properties), self._properties['rootObject'].Name())
-      ids[descendant.id] = descendant
-
-  def Children(self):
-    """Returns a list of all of this object's owned (strong) children."""
-
-    children = []
-    for property, attributes in self._schema.iteritems():
-      (is_list, property_type, is_strong) = attributes[0:3]
-      if is_strong and property in self._properties:
-        if not is_list:
-          children.append(self._properties[property])
-        else:
-          children.extend(self._properties[property])
-    return children
-
-  def Descendants(self):
-    """Returns a list of all of this object's descendants, including this
-    object.
-    """
-
-    children = self.Children()
-    descendants = [self]
-    for child in children:
-      descendants.extend(child.Descendants())
-    return descendants
-
-  def PBXProjectAncestor(self):
-    # The base case for recursion is defined at PBXProject.PBXProjectAncestor.
-    if self.parent:
-      return self.parent.PBXProjectAncestor()
-    return None
-
-  def _EncodeComment(self, comment):
-    """Encodes a comment to be placed in the project file output, mimicking
-    Xcode behavior.
-    """
-
-    # This mimics Xcode behavior by wrapping the comment in "/*" and "*/".  If
-    # the string already contains a "*/", it is turned into "(*)/".  This keeps
-    # the file writer from outputting something that would be treated as the
-    # end of a comment in the middle of something intended to be entirely a
-    # comment.
-
-    return '/* ' + comment.replace('*/', '(*)/') + ' */'
-
-  def _EncodeTransform(self, match):
-    # This function works closely with _EncodeString.  It will only be called
-    # by re.sub with match.group(0) containing a character matched by the
-    # _escaped expression.
-    char = match.group(0)
-
-    # Backslashes (\) and quotation marks (") are always replaced with a
-    # backslash-escaped version of the same.  Everything else gets its
-    # replacement from the class' _encode_transforms array.
-    if char == '\\':
-      return '\\\\'
-    if char == '"':
-      return '\\"'
-    return self._encode_transforms[ord(char)]
-
-  def _EncodeString(self, value):
-    """Encodes a string to be placed in the project file output, mimicking
-    Xcode behavior.
-    """
-
-    # Use quotation marks when any character outside of the range A-Z, a-z, 0-9,
-    # $ (dollar sign), . (period), and _ (underscore) is present.  Also use
-    # quotation marks to represent empty strings.
-    #
-    # Escape " (double-quote) and \ (backslash) by preceding them with a
-    # backslash.
-    #
-    # Some characters below the printable ASCII range are encoded specially:
-    #     7 ^G BEL is encoded as "\a"
-    #     8 ^H BS  is encoded as "\b"
-    #    11 ^K VT  is encoded as "\v"
-    #    12 ^L NP  is encoded as "\f"
-    #   127 ^? DEL is passed through as-is without escaping
-    #  - In PBXFileReference and PBXBuildFile objects:
-    #     9 ^I HT  is passed through as-is without escaping
-    #    10 ^J NL  is passed through as-is without escaping
-    #    13 ^M CR  is passed through as-is without escaping
-    #  - In other objects:
-    #     9 ^I HT  is encoded as "\t"
-    #    10 ^J NL  is encoded as "\n"
-    #    13 ^M CR  is encoded as "\n" rendering it indistinguishable from
-    #              10 ^J NL
-    # All other nonprintable characters within the ASCII range (0 through 127
-    # inclusive) are encoded as "\U001f" referring to the Unicode code point in
-    # hexadecimal.  For example, character 14 (^N SO) is encoded as "\U000e".
-    # Characters above the ASCII range are passed through to the output encoded
-    # as UTF-8 without any escaping.  These mappings are contained in the
-    # class' _encode_transforms list.
-
-    if _unquoted.search(value) and not _quoted.search(value):
-      return value
-
-    return '"' + _escaped.sub(self._EncodeTransform, value) + '"'
-
-  def _XCPrint(self, file, tabs, line):
-    file.write('\t' * tabs + line)
-
-  def _XCPrintableValue(self, tabs, value, flatten_list=False):
-    """Returns a representation of value that may be printed in a project file,
-    mimicking Xcode's behavior.
-
-    _XCPrintableValue can handle str and int values, XCObjects (which are
-    made printable by returning their id property), and list and dict objects
-    composed of any of the above types.  When printing a list or dict, and
-    _should_print_single_line is False, the tabs parameter is used to determine
-    how much to indent the lines corresponding to the items in the list or
-    dict.
-
-    If flatten_list is True, single-element lists will be transformed into
-    strings.
-    """
-
-    printable = ''
-    comment = None
-
-    if self._should_print_single_line:
-      sep = ' '
-      element_tabs = ''
-      end_tabs = ''
-    else:
-      sep = '\n'
-      element_tabs = '\t' * (tabs + 1)
-      end_tabs = '\t' * tabs
-
-    if isinstance(value, XCObject):
-      printable += value.id
-      comment = value.Comment()
-    elif isinstance(value, str):
-      printable += self._EncodeString(value)
-    elif isinstance(value, unicode):
-      printable += self._EncodeString(value.encode('utf-8'))
-    elif isinstance(value, int):
-      printable += str(value)
-    elif isinstance(value, list):
-      if flatten_list and len(value) <= 1:
-        if len(value) == 0:
-          printable += self._EncodeString('')
-        else:
-          printable += self._EncodeString(value[0])
-      else:
-        printable = '(' + sep
-        for item in value:
-          printable += element_tabs + \
-                       self._XCPrintableValue(tabs + 1, item, flatten_list) + \
-                       ',' + sep
-        printable += end_tabs + ')'
-    elif isinstance(value, dict):
-      printable = '{' + sep
-      for item_key, item_value in sorted(value.iteritems()):
-        printable += element_tabs + \
-            self._XCPrintableValue(tabs + 1, item_key, flatten_list) + ' = ' + \
-            self._XCPrintableValue(tabs + 1, item_value, flatten_list) + ';' + \
-            sep
-      printable += end_tabs + '}'
-    else:
-      raise TypeError, "Can't make " + value.__class__.__name__ + ' printable'
-
-    if comment != None:
-      printable += ' ' + self._EncodeComment(comment)
-
-    return printable
-
-  def _XCKVPrint(self, file, tabs, key, value):
-    """Prints a key and value, members of an XCObject's _properties dictionary,
-    to file.
-
-    tabs is an int identifying the indentation level.  If the class'
-    _should_print_single_line variable is True, tabs is ignored and the
-    key-value pair will be followed by a space instead of a newline.
-    """
-
-    if self._should_print_single_line:
-      printable = ''
-      after_kv = ' '
-    else:
-      printable = '\t' * tabs
-      after_kv = '\n'
-
-    # Xcode usually prints remoteGlobalIDString values in PBXContainerItemProxy
-    # objects without comments.  Sometimes it prints them with comments, but
-    # the majority of the time, it doesn't.  To avoid unnecessary changes to
-    # the project file after Xcode opens it, don't write comments for
-    # remoteGlobalIDString.  This is a sucky hack and it would certainly be
-    # cleaner to extend the schema to indicate whether or not a comment should
-    # be printed, but since this is the only case where the problem occurs and
-    # Xcode itself can't seem to make up its mind, the hack will suffice.
-    #
-    # Also see PBXContainerItemProxy._schema['remoteGlobalIDString'].
-    if key == 'remoteGlobalIDString' and isinstance(self,
-                                                    PBXContainerItemProxy):
-      value_to_print = value.id
-    else:
-      value_to_print = value
-
-    # PBXBuildFile's settings property is represented in the output as a dict,
-    # but a hack here has it represented as a string. Arrange to strip off the
-    # quotes so that it shows up in the output as expected.
-    if key == 'settings' and isinstance(self, PBXBuildFile):
-      strip_value_quotes = True
-    else:
-      strip_value_quotes = False
-
-    # In another one-off, let's set flatten_list on buildSettings properties
-    # of XCBuildConfiguration objects, because that's how Xcode treats them.
-    if key == 'buildSettings' and isinstance(self, XCBuildConfiguration):
-      flatten_list = True
-    else:
-      flatten_list = False
-
-    try:
-      printable_key = self._XCPrintableValue(tabs, key, flatten_list)
-      printable_value = self._XCPrintableValue(tabs, value_to_print,
-                                               flatten_list)
-      if strip_value_quotes and len(printable_value) > 1 and \
-          printable_value[0] == '"' and printable_value[-1] == '"':
-        printable_value = printable_value[1:-1]
-      printable += printable_key + ' = ' + printable_value + ';' + after_kv
-    except TypeError, e:
-      gyp.common.ExceptionAppend(e,
-                                 'while printing key "%s"' % key)
-      raise
-
-    self._XCPrint(file, 0, printable)
-
-  def Print(self, file=sys.stdout):
-    """Prints a representation of this object to file, adhering to Xcode output
-    formatting.
-    """
-
-    self.VerifyHasRequiredProperties()
-
-    if self._should_print_single_line:
-      # When printing an object in a single line, Xcode doesn't put any space
-      # between the beginning of a dictionary (or presumably a list) and the
-      # first contained item, so you wind up with snippets like
-      #   ...CDEF = {isa = PBXFileReference; fileRef = 0123...
-      # If it were me, I would have put a space in there after the opening
-      # curly, but I guess this is just another one of those inconsistencies
-      # between how Xcode prints PBXFileReference and PBXBuildFile objects as
-      # compared to other objects.  Mimic Xcode's behavior here by using an
-      # empty string for sep.
-      sep = ''
-      end_tabs = 0
-    else:
-      sep = '\n'
-      end_tabs = 2
-
-    # Start the object.  For example, '\t\tPBXProject = {\n'.
-    self._XCPrint(file, 2, self._XCPrintableValue(2, self) + ' = {' + sep)
-
-    # "isa" isn't in the _properties dictionary, it's an intrinsic property
-    # of the class which the object belongs to.  Xcode always outputs "isa"
-    # as the first element of an object dictionary.
-    self._XCKVPrint(file, 3, 'isa', self.__class__.__name__)
-
-    # The remaining elements of an object dictionary are sorted alphabetically.
-    for property, value in sorted(self._properties.iteritems()):
-      self._XCKVPrint(file, 3, property, value)
-
-    # End the object.
-    self._XCPrint(file, end_tabs, '};\n')
-
-  def UpdateProperties(self, properties, do_copy=False):
-    """Merge the supplied properties into the _properties dictionary.
-
-    The input properties must adhere to the class schema or a KeyError or
-    TypeError exception will be raised.  If adding an object of an XCObject
-    subclass and the schema indicates a strong relationship, the object's
-    parent will be set to this object.
-
-    If do_copy is True, then lists, dicts, strong-owned XCObjects, and
-    strong-owned XCObjects in lists will be copied instead of having their
-    references added.
-    """
-
-    if properties is None:
-      return
-
-    for property, value in properties.iteritems():
-      # Make sure the property is in the schema.
-      if not property in self._schema:
-        raise KeyError, property + ' not in ' + self.__class__.__name__
-
-      # Make sure the property conforms to the schema.
-      (is_list, property_type, is_strong) = self._schema[property][0:3]
-      if is_list:
-        if value.__class__ != list:
-          raise TypeError, \
-                property + ' of ' + self.__class__.__name__ + \
-                ' must be list, not ' + value.__class__.__name__
-        for item in value:
-          if not isinstance(item, property_type) and \
-             not (item.__class__ == unicode and property_type == str):
-            # Accept unicode where str is specified.  str is treated as
-            # UTF-8-encoded.
-            raise TypeError, \
-                  'item of ' + property + ' of ' + self.__class__.__name__ + \
-                  ' must be ' + property_type.__name__ + ', not ' + \
-                  item.__class__.__name__
-      elif not isinstance(value, property_type) and \
-           not (value.__class__ == unicode and property_type == str):
-        # Accept unicode where str is specified.  str is treated as
-        # UTF-8-encoded.
-        raise TypeError, \
-              property + ' of ' + self.__class__.__name__ + ' must be ' + \
-              property_type.__name__ + ', not ' + value.__class__.__name__
-
-      # Checks passed, perform the assignment.
-      if do_copy:
-        if isinstance(value, XCObject):
-          if is_strong:
-            self._properties[property] = value.Copy()
-          else:
-            self._properties[property] = value
-        elif isinstance(value, str) or isinstance(value, unicode) or \
-             isinstance(value, int):
-          self._properties[property] = value
-        elif isinstance(value, list):
-          if is_strong:
-            # If is_strong is True, each element is an XCObject, so it's safe
-            # to call Copy.
-            self._properties[property] = []
-            for item in value:
-              self._properties[property].append(item.Copy())
-          else:
-            self._properties[property] = value[:]
-        elif isinstance(value, dict):
-          self._properties[property] = value.copy()
-        else:
-          raise TypeError, "Don't know how to copy a " + \
-                           value.__class__.__name__ + ' object for ' + \
-                           property + ' in ' + self.__class__.__name__
-      else:
-        self._properties[property] = value
-
-      # Set up the child's back-reference to this object.  Don't use |value|
-      # any more because it may not be right if do_copy is true.
-      if is_strong:
-        if not is_list:
-          self._properties[property].parent = self
-        else:
-          for item in self._properties[property]:
-            item.parent = self
-
-  def HasProperty(self, key):
-    return key in self._properties
-
-  def GetProperty(self, key):
-    return self._properties[key]
-
-  def SetProperty(self, key, value):
-    self.UpdateProperties({key: value})
-
-  def DelProperty(self, key):
-    if key in self._properties:
-      del self._properties[key]
-
-  def AppendProperty(self, key, value):
-    # TODO(mark): Support ExtendProperty too (and make this call that)?
-
-    # Schema validation.
-    if not key in self._schema:
-      raise KeyError, key + ' not in ' + self.__class__.__name__
-
-    (is_list, property_type, is_strong) = self._schema[key][0:3]
-    if not is_list:
-      raise TypeError, key + ' of ' + self.__class__.__name__ + ' must be list'
-    if not isinstance(value, property_type):
-      raise TypeError, 'item of ' + key + ' of ' + self.__class__.__name__ + \
-                       ' must be ' + property_type.__name__ + ', not ' + \
-                       value.__class__.__name__
-
-    # If the property doesn't exist yet, create a new empty list to receive the
-    # item.
-    if not key in self._properties:
-      self._properties[key] = []
-
-    # Set up the ownership link.
-    if is_strong:
-      value.parent = self
-
-    # Store the item.
-    self._properties[key].append(value)
-
-  def VerifyHasRequiredProperties(self):
-    """Ensure that all properties identified as required by the schema are
-    set.
-    """
-
-    # TODO(mark): A stronger verification mechanism is needed.  Some
-    # subclasses need to perform validation beyond what the schema can enforce.
-    for property, attributes in self._schema.iteritems():
-      (is_list, property_type, is_strong, is_required) = attributes[0:4]
-      if is_required and not property in self._properties:
-        raise KeyError, self.__class__.__name__ + ' requires ' + property
-
-  def _SetDefaultsFromSchema(self):
-    """Assign object default values according to the schema.  This will not
-    overwrite properties that have already been set."""
-
-    defaults = {}
-    for property, attributes in self._schema.iteritems():
-      (is_list, property_type, is_strong, is_required) = attributes[0:4]
-      if is_required and len(attributes) >= 5 and \
-          not property in self._properties:
-        default = attributes[4]
-
-        defaults[property] = default
-
-    if len(defaults) > 0:
-      # Use do_copy=True so that each new object gets its own copy of strong
-      # objects, lists, and dicts.
-      self.UpdateProperties(defaults, do_copy=True)
-
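The validation and defaulting machinery above follows a compact schema convention: each entry is `[is_list, property_type, is_strong, is_required]` with an optional fifth element holding a default. A minimal standalone sketch of that convention (hypothetical helper names, not part of gyp):

```python
# Each schema entry is [is_list, property_type, is_strong, is_required],
# optionally followed by a default value, mirroring XCObject._schema.
SCHEMA = {
    'name':     [0, str,  0, 1],      # scalar str, required, no default
    'children': [1, dict, 1, 1, []],  # list of dicts, strong, default []
}

def validate(properties, schema):
    """Type-check properties the way UpdateProperties does."""
    for key, value in properties.items():
        if key not in schema:
            raise KeyError(key + ' not in schema')
        is_list, property_type = schema[key][0], schema[key][1]
        if is_list:
            if not isinstance(value, list):
                raise TypeError(key + ' must be list')
            for item in value:
                if not isinstance(item, property_type):
                    raise TypeError('item of ' + key + ' must be ' +
                                    property_type.__name__)
        elif not isinstance(value, property_type):
            raise TypeError(key + ' must be ' + property_type.__name__)

def schema_defaults(schema):
    """Collect default values the way _SetDefaultsFromSchema does."""
    return dict((key, attrs[4]) for key, attrs in schema.items()
                if attrs[3] and len(attrs) >= 5)
```

As in `_SetDefaultsFromSchema`, only required entries carry defaults, so optional properties stay absent until explicitly set.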
-
-class XCHierarchicalElement(XCObject):
-  """Abstract base for PBXGroup and PBXFileReference.  Not represented in a
-  project file."""
-
-  # TODO(mark): Do name and path belong here?  Probably so.
-  # If path is set and name is not, name may have a default value.  Name will
-  # be set to the basename of path, if the basename of path is different from
-  # the full value of path.  If path is already just a leaf name, name will
-  # not be set.
-  _schema = XCObject._schema.copy()
-  _schema.update({
-    'comments':       [0, str, 0, 0],
-    'fileEncoding':   [0, str, 0, 0],
-    'includeInIndex': [0, int, 0, 0],
-    'indentWidth':    [0, int, 0, 0],
-    'lineEnding':     [0, int, 0, 0],
-    'sourceTree':     [0, str, 0, 1, '<group>'],
-    'tabWidth':       [0, int, 0, 0],
-    'usesTabs':       [0, int, 0, 0],
-    'wrapsLines':     [0, int, 0, 0],
-  })
-
-  def __init__(self, properties=None, id=None, parent=None):
-    # super
-    XCObject.__init__(self, properties, id, parent)
-    if 'path' in self._properties and not 'name' in self._properties:
-      path = self._properties['path']
-      name = posixpath.basename(path)
-      if name != '' and path != name:
-        self.SetProperty('name', name)
-
-    if 'path' in self._properties and \
-        (not 'sourceTree' in self._properties or \
-         self._properties['sourceTree'] == '<group>'):
-      # If the pathname begins with an Xcode variable like "$(SDKROOT)/", take
-      # the variable out and make the path be relative to that variable by
-      # assigning the variable name as the sourceTree.
-      (source_tree, path) = SourceTreeAndPathFromPath(self._properties['path'])
-      if source_tree != None:
-        self._properties['sourceTree'] = source_tree
-      if path != None:
-        self._properties['path'] = path
-      if source_tree != None and path is None and \
-         not 'name' in self._properties:
-        # The path was of the form "$(SDKROOT)" with no path following it.
-        # This object is now relative to that variable, so it has no path
-        # attribute of its own.  It does, however, keep a name.
-        del self._properties['path']
-        self._properties['name'] = source_tree
-
-  def Name(self):
-    if 'name' in self._properties:
-      return self._properties['name']
-    elif 'path' in self._properties:
-      return self._properties['path']
-    else:
-      # This happens in the case of the root PBXGroup.
-      return None
-
-  def Hashables(self):
-    """Custom hashables for XCHierarchicalElements.
-
-    XCHierarchicalElements are special.  Generally, their hashes shouldn't
-    change if the paths don't change.  The normal XCObject implementation of
-    Hashables adds a hashable for each object, which means that if
-    the hierarchical structure changes (possibly due to changes caused when
-    TakeOverOnlyChild runs and encounters slight changes in the hierarchy),
-    the hashes will change.  For example, if a project file initially contains
-    a/b/f1, and the groups a and b become collapsed into a single group a/b,
-    f1 will have a single parent, a/b.  If someone later adds a/f2 to the
-    project file, a and b can no longer be collapsed, and f1 winds up with
-    parent b and grandparent a.  That would be sufficient to change f1's hash.
-
-    To counteract this problem, hashables for all XCHierarchicalElements except
-    for the main group (which has neither a name nor a path) are taken to be
-    just the set of path components.  Because hashables are inherited from
-    parents, this provides assurance that a/b/f1 has the same set of hashables
-    whether its parent is b or a/b.
-
-    The main group is a special case.  As it is permitted to have no name or
-    path, it is permitted to use the standard XCObject hash mechanism.  This
-    is not considered a problem because there can be only one main group.
-    """
-
-    if self == self.PBXProjectAncestor()._properties['mainGroup']:
-      # super
-      return XCObject.Hashables(self)
-
-    hashables = []
-
-    # Put the name in first, ensuring that if TakeOverOnlyChild collapses
-    # children into a top-level group like "Source", the name always goes
-    # into the list of hashables without interfering with path components.
-    if 'name' in self._properties:
-      # Make it less likely for people to manipulate hashes by following the
-      # pattern of always pushing an object type value onto the list first.
-      hashables.append(self.__class__.__name__ + '.name')
-      hashables.append(self._properties['name'])
-
-    # NOTE: This still has the problem that if an absolute path is encountered,
-    # including paths with a sourceTree, they'll still inherit their parents'
-    # hashables, even though the paths aren't relative to their parents.  This
-    # is not expected to be much of a problem in practice.
-    path = self.PathFromSourceTreeAndPath()
-    if path != None:
-      components = path.split(posixpath.sep)
-      for component in components:
-        hashables.append(self.__class__.__name__ + '.path')
-        hashables.append(component)
-
-    hashables.extend(self._hashables)
-
-    return hashables
-
-  def Compare(self, other):
-    # Allow comparison of these types.  PBXGroup has the highest sort rank;
-    # PBXVariantGroup is treated as equal to PBXFileReference.
-    valid_class_types = {
-      PBXFileReference: 'file',
-      PBXGroup:         'group',
-      PBXVariantGroup:  'file',
-    }
-    self_type = valid_class_types[self.__class__]
-    other_type = valid_class_types[other.__class__]
-
-    if self_type == other_type:
-      # If the two objects are of the same sort rank, compare their names.
-      return cmp(self.Name(), other.Name())
-
-    # Otherwise, sort groups before everything else.
-    if self_type == 'group':
-      return -1
-    return 1
-
-  def CompareRootGroup(self, other):
-    # This function should be used only to compare direct children of the
-    # containing PBXProject's mainGroup.  These groups should appear in the
-    # listed order.
-    # TODO(mark): "Build" is used by gyp.generator.xcode, perhaps the
-    # generator should have a way of influencing this list rather than having
-    # to hardcode for the generator here.
-    order = ['Source', 'Intermediates', 'Projects', 'Frameworks', 'Products',
-             'Build']
-
-    # If the groups aren't in the listed order, do a name comparison.
-    # Otherwise, groups in the listed order should come before those that
-    # aren't.
-    self_name = self.Name()
-    other_name = other.Name()
-    self_in = isinstance(self, PBXGroup) and self_name in order
-    other_in = isinstance(other, PBXGroup) and other_name in order
-    if not self_in and not other_in:
-      return self.Compare(other)
-    if self_name in order and not other_name in order:
-      return -1
-    if other_name in order and not self_name in order:
-      return 1
-
-    # If both groups are in the listed order, go by the defined order.
-    self_index = order.index(self_name)
-    other_index = order.index(other_name)
-    if self_index < other_index:
-      return -1
-    if self_index > other_index:
-      return 1
-    return 0
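The root-group ordering above can be captured as a sort key (a simplified sketch with a hypothetical helper name; the real method is cmp-style and falls back to the type-aware `Compare` rather than plain name order):

```python
# Names in ORDER sort first, by their position; everything else sorts
# after them, alphabetically among themselves.
ORDER = ['Source', 'Intermediates', 'Projects', 'Frameworks', 'Products',
         'Build']

def root_group_sort_key(name):
    if name in ORDER:
        return (0, ORDER.index(name))
    return (1, name)
```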
-
-  def PathFromSourceTreeAndPath(self):
-    # Turn the object's sourceTree and path properties into a single flat
-    # string of a form comparable to the path parameter.  If there's a
-    # sourceTree property other than "<group>", wrap it in $(...) for the
-    # comparison.
-    components = []
-    if self._properties['sourceTree'] != '<group>':
-      components.append('$(' + self._properties['sourceTree'] + ')')
-    if 'path' in self._properties:
-      components.append(self._properties['path'])
-
-    if len(components) > 0:
-      return posixpath.join(*components)
-
-    return None
-
-  def FullPath(self):
-    # Returns a full path to self relative to the project file, or relative
-    # to some other source tree.  Start with self, and walk up the chain of
-    # parents prepending their paths, if any, until no more parents are
-    # available (project-relative path) or until a path relative to some
-    # source tree is found.
-    xche = self
-    path = None
-    while isinstance(xche, XCHierarchicalElement) and \
-          (path is None or \
-           (not path.startswith('/') and not path.startswith('$'))):
-      this_path = xche.PathFromSourceTreeAndPath()
-      if this_path != None and path != None:
-        path = posixpath.join(this_path, path)
-      elif this_path != None:
-        path = this_path
-      xche = xche.parent
-
-    return path
-
-
-class PBXGroup(XCHierarchicalElement):
-  """
-  Attributes:
-    _children_by_path: Maps pathnames of children of this PBXGroup to the
-      actual child XCHierarchicalElement objects.
-    _variant_children_by_name_and_path: Maps (name, path) tuples of
-      PBXVariantGroup children to the actual child PBXVariantGroup objects.
-  """
-
-  _schema = XCHierarchicalElement._schema.copy()
-  _schema.update({
-    'children': [1, XCHierarchicalElement, 1, 1, []],
-    'name':     [0, str,                   0, 0],
-    'path':     [0, str,                   0, 0],
-  })
-
-  def __init__(self, properties=None, id=None, parent=None):
-    # super
-    XCHierarchicalElement.__init__(self, properties, id, parent)
-    self._children_by_path = {}
-    self._variant_children_by_name_and_path = {}
-    for child in self._properties.get('children', []):
-      self._AddChildToDicts(child)
-
-  def Hashables(self):
-    # super
-    hashables = XCHierarchicalElement.Hashables(self)
-
-    # It is not sufficient to rely on just name and parent to build a unique
-    # hashable: a node could have two child PBXGroups sharing a common name.
-    # To add entropy, the hashable is enhanced with the names of all of its
-    # children.
-    for child in self._properties.get('children', []):
-      child_name = child.Name()
-      if child_name != None:
-        hashables.append(child_name)
-
-    return hashables
-
-  def HashablesForChild(self):
-    # To avoid a circular reference the hashables used to compute a child id do
-    # not include the child names.
-    return XCHierarchicalElement.Hashables(self)
-
-  def _AddChildToDicts(self, child):
-    # Sets up this PBXGroup object's dicts to reference the child properly.
-    child_path = child.PathFromSourceTreeAndPath()
-    if child_path:
-      if child_path in self._children_by_path:
-        raise ValueError, 'Found multiple children with path ' + child_path
-      self._children_by_path[child_path] = child
-
-    if isinstance(child, PBXVariantGroup):
-      child_name = child._properties.get('name', None)
-      key = (child_name, child_path)
-      if key in self._variant_children_by_name_and_path:
-        raise ValueError, 'Found multiple PBXVariantGroup children with ' + \
-                          'name ' + str(child_name) + ' and path ' + \
-                          str(child_path)
-      self._variant_children_by_name_and_path[key] = child
-
-  def AppendChild(self, child):
-    # Callers should use this instead of calling
-    # AppendProperty('children', child) directly because this function
-    # maintains the group's dicts.
-    self.AppendProperty('children', child)
-    self._AddChildToDicts(child)
-
-  def GetChildByName(self, name):
-    # This is not currently optimized with a dict as GetChildByPath is because
-    # it has few callers.  Most callers probably want GetChildByPath.  This
-    # function is only useful to get children that have names but no paths,
-    # which is rare.  The children of the main group ("Source", "Products",
-    # etc.) are pretty much the only case where this is likely to come up.
-    #
-    # TODO(mark): Maybe this should raise an error if more than one child is
-    # present with the same name.
-    if not 'children' in self._properties:
-      return None
-
-    for child in self._properties['children']:
-      if child.Name() == name:
-        return child
-
-    return None
-
-  def GetChildByPath(self, path):
-    if not path:
-      return None
-
-    if path in self._children_by_path:
-      return self._children_by_path[path]
-
-    return None
-
-  def GetChildByRemoteObject(self, remote_object):
-    # This method is a little bit esoteric.  Given a remote_object, which
-    # should be a PBXFileReference in another project file, this method will
-    # return this group's PBXReferenceProxy object serving as a local proxy
-    # for the remote PBXFileReference.
-    #
-    # This function might benefit from a dict optimization as GetChildByPath
-    # for some workloads, but profiling shows that it's not currently a
-    # problem.
-    if not 'children' in self._properties:
-      return None
-
-    for child in self._properties['children']:
-      if not isinstance(child, PBXReferenceProxy):
-        continue
-
-      container_proxy = child._properties['remoteRef']
-      if container_proxy._properties['remoteGlobalIDString'] == remote_object:
-        return child
-
-    return None
-
-  def AddOrGetFileByPath(self, path, hierarchical):
-    """Returns an existing or new file reference corresponding to path.
-
-    If hierarchical is True, this method will create or use the necessary
-    hierarchical group structure corresponding to path.  Otherwise, it will
-    look in and create an item in the current group only.
-
-    If an existing matching reference is found, it is returned, otherwise, a
-    new one will be created, added to the correct group, and returned.
-
-    If path identifies a directory by virtue of carrying a trailing slash,
-    this method returns a PBXFileReference of "folder" type.  If path
-    identifies a variant, by virtue of it identifying a file inside a directory
-    with an ".lproj" extension, this method returns a PBXVariantGroup
-    containing the variant named by path, and possibly other variants.  For
-    all other paths, a "normal" PBXFileReference will be returned.
-    """
-
-    # Adding or getting a directory?  Directories end with a trailing slash.
-    is_dir = False
-    if path.endswith('/'):
-      is_dir = True
-    path = posixpath.normpath(path)
-    if is_dir:
-      path = path + '/'
-
-    # Adding or getting a variant?  Variants are files inside directories
-    # with an ".lproj" extension.  Xcode uses variants for localization.  For
-    # a variant path/to/Language.lproj/MainMenu.nib, put a variant group named
-    # MainMenu.nib inside path/to, and give it a variant named Language.  In
-    # this example, grandparent would be set to path/to and parent_root would
-    # be set to Language.
-    variant_name = None
-    parent = posixpath.dirname(path)
-    grandparent = posixpath.dirname(parent)
-    parent_basename = posixpath.basename(parent)
-    (parent_root, parent_ext) = posixpath.splitext(parent_basename)
-    if parent_ext == '.lproj':
-      variant_name = parent_root
-    if grandparent == '':
-      grandparent = None
-
-    # Putting a directory inside a variant group is not currently supported.
-    assert not is_dir or variant_name is None
-
-    path_split = path.split(posixpath.sep)
-    if len(path_split) == 1 or \
-       ((is_dir or variant_name != None) and len(path_split) == 2) or \
-       not hierarchical:
-      # The PBXFileReference or PBXVariantGroup will be added to or gotten from
-      # this PBXGroup, no recursion necessary.
-      if variant_name is None:
-        # Add or get a PBXFileReference.
-        file_ref = self.GetChildByPath(path)
-        if file_ref != None:
-          assert file_ref.__class__ == PBXFileReference
-        else:
-          file_ref = PBXFileReference({'path': path})
-          self.AppendChild(file_ref)
-      else:
-        # Add or get a PBXVariantGroup.  The variant group name is the same
-        # as the basename (MainMenu.nib in the example above).  grandparent
-        # specifies the path to the variant group itself, and path_split[-2:]
-        # is the path of the specific variant relative to its group.
-        variant_group_name = posixpath.basename(path)
-        variant_group_ref = self.AddOrGetVariantGroupByNameAndPath(
-            variant_group_name, grandparent)
-        variant_path = posixpath.sep.join(path_split[-2:])
-        variant_ref = variant_group_ref.GetChildByPath(variant_path)
-        if variant_ref != None:
-          assert variant_ref.__class__ == PBXFileReference
-        else:
-          variant_ref = PBXFileReference({'name': variant_name,
-                                          'path': variant_path})
-          variant_group_ref.AppendChild(variant_ref)
-        # The caller is interested in the variant group, not the specific
-        # variant file.
-        file_ref = variant_group_ref
-      return file_ref
-    else:
-      # Hierarchical recursion.  Add or get a PBXGroup corresponding to the
-      # outermost path component, and then recurse into it, chopping off that
-      # path component.
-      next_dir = path_split[0]
-      group_ref = self.GetChildByPath(next_dir)
-      if group_ref != None:
-        assert group_ref.__class__ == PBXGroup
-      else:
-        group_ref = PBXGroup({'path': next_dir})
-        self.AppendChild(group_ref)
-      return group_ref.AddOrGetFileByPath(posixpath.sep.join(path_split[1:]),
-                                          hierarchical)
-
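The variant-detection rule used by AddOrGetFileByPath can be isolated into a few lines (hypothetical helper name, not gyp's API): a file directly inside a directory carrying an ".lproj" extension is a localized variant, named after that directory's root.

```python
import posixpath

def variant_name_of(path):
    # For path/to/Language.lproj/MainMenu.nib, the parent directory is
    # Language.lproj, so the variant name is "Language".
    parent = posixpath.dirname(path)
    root, ext = posixpath.splitext(posixpath.basename(parent))
    return root if ext == '.lproj' else None
```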
-  def AddOrGetVariantGroupByNameAndPath(self, name, path):
-    """Returns an existing or new PBXVariantGroup for name and path.
-
-    If a PBXVariantGroup identified by the name and path arguments is already
-    present as a child of this object, it is returned.  Otherwise, a new
-    PBXVariantGroup with the correct properties is created, added as a child,
-    and returned.
-
-    This method will generally be called by AddOrGetFileByPath, which knows
-    when to create a variant group based on the structure of the pathnames
-    passed to it.
-    """
-
-    key = (name, path)
-    if key in self._variant_children_by_name_and_path:
-      variant_group_ref = self._variant_children_by_name_and_path[key]
-      assert variant_group_ref.__class__ == PBXVariantGroup
-      return variant_group_ref
-
-    variant_group_properties = {'name': name}
-    if path != None:
-      variant_group_properties['path'] = path
-    variant_group_ref = PBXVariantGroup(variant_group_properties)
-    self.AppendChild(variant_group_ref)
-
-    return variant_group_ref
-
-  def TakeOverOnlyChild(self, recurse=False):
-    """If this PBXGroup has only one child and it's also a PBXGroup, take
-    it over by making all of its children this object's children.
-
-    This function will continue to take over only children when those children
-    are groups.  If there are three PBXGroups representing a, b, and c, with
-    c inside b and b inside a, and a and b have no other children, this will
-    result in a taking over both b and c, forming a PBXGroup for a/b/c.
-
-    If recurse is True, this function will recurse into children and ask them
-    to collapse themselves by taking over only children as well.  Assuming
-    an example hierarchy with files at a/b/c/d1, a/b/c/d2, and a/b/c/d3/e/f
-    (d1, d2, and f are files, the rest are groups), recursion will result in
-    a group for a/b/c containing a group for d3/e.
-    """
-
-    # At this stage, check that child class types are PBXGroup exactly,
-    # instead of using isinstance.  The only subclass of PBXGroup,
-    # PBXVariantGroup, should not participate in reparenting in the same way:
-    # reparenting by merging different object types would be wrong.
-    while len(self._properties['children']) == 1 and \
-          self._properties['children'][0].__class__ == PBXGroup:
-      # Loop to take over the innermost only-child group possible.
-
-      child = self._properties['children'][0]
-
-      # Assume the child's properties, including its children.  Save a copy
-      # of this object's old properties, because they'll still be needed.
-      # This object retains its existing id and parent attributes.
-      old_properties = self._properties
-      self._properties = child._properties
-      self._children_by_path = child._children_by_path
-
-      if not 'sourceTree' in self._properties or \
-         self._properties['sourceTree'] == '<group>':
-        # The child was relative to its parent.  Fix up the path.  Note that
-        # children with a sourceTree other than "<group>" are not relative to
-        # their parents, so no path fix-up is needed in that case.
-        if 'path' in old_properties:
-          if 'path' in self._properties:
-            # Both the original parent and child have paths set.
-            self._properties['path'] = posixpath.join(old_properties['path'],
-                                                      self._properties['path'])
-          else:
-            # Only the original parent has a path, use it.
-            self._properties['path'] = old_properties['path']
-        if 'sourceTree' in old_properties:
-          # The original parent had a sourceTree set, use it.
-          self._properties['sourceTree'] = old_properties['sourceTree']
-
-      # If the original parent had a name set, keep using it.  If the original
-      # parent didn't have a name but the child did, let the child's name
-      # live on.  If the name attribute seems unnecessary now, get rid of it.
-      if 'name' in old_properties and old_properties['name'] != None and \
-         old_properties['name'] != self.Name():
-        self._properties['name'] = old_properties['name']
-      if 'name' in self._properties and 'path' in self._properties and \
-         self._properties['name'] == self._properties['path']:
-        del self._properties['name']
-
-      # Notify all children of their new parent.
-      for child in self._properties['children']:
-        child.parent = self
-
-    # If asked to recurse, recurse.
-    if recurse:
-      for child in self._properties['children']:
-        if child.__class__ == PBXGroup:
-          child.TakeOverOnlyChild(recurse)
-
-  def SortGroup(self):
-    self._properties['children'] = \
-        sorted(self._properties['children'], cmp=lambda x,y: x.Compare(y))
-
-    # Recurse.
-    for child in self._properties['children']:
-      if isinstance(child, PBXGroup):
-        child.SortGroup()
-
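The collapse performed by TakeOverOnlyChild can be sketched with a toy class (not gyp's; paths joined group-relative as in the `<group>` sourceTree case): while a group's only child is itself a group, the parent absorbs the child's children and joins the two paths.

```python
import posixpath

class Node:
    def __init__(self, path, is_group=False, children=None):
        self.path = path
        self.is_group = is_group
        self.children = children or []

def take_over_only_child(node):
    # Loop to take over the innermost only-child group possible.
    while len(node.children) == 1 and node.children[0].is_group:
        child = node.children[0]
        node.path = posixpath.join(node.path, child.path)
        node.children = child.children
    return node
```

With groups a > b > c containing a single file f1, this yields one group with path `a/b/c` holding f1, matching the docstring's example.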
-
-class XCFileLikeElement(XCHierarchicalElement):
-  # Abstract base for objects that can be used as the fileRef property of
-  # PBXBuildFile.
-
-  def PathHashables(self):
-    # A PBXBuildFile that refers to this object will call this method to
-    # obtain additional hashables specific to this XCFileLikeElement.  Don't
-    # just use this object's hashables; they're not specific and unique enough
-    # on their own (without access to the parent hashables).  Instead, provide
-    # hashables that identify this object by path by getting its hashables as
-    # well as the hashables of ancestor XCHierarchicalElement objects.
-
-    hashables = []
-    xche = self
-    while xche != None and isinstance(xche, XCHierarchicalElement):
-      xche_hashables = xche.Hashables()
-      for index in xrange(0, len(xche_hashables)):
-        hashables.insert(index, xche_hashables[index])
-      xche = xche.parent
-    return hashables
-
-
-class XCContainerPortal(XCObject):
-  # Abstract base for objects that can be used as the containerPortal property
-  # of PBXContainerItemProxy.
-  pass
-
-
-class XCRemoteObject(XCObject):
-  # Abstract base for objects that can be used as the remoteGlobalIDString
-  # property of PBXContainerItemProxy.
-  pass
-
-
-class PBXFileReference(XCFileLikeElement, XCContainerPortal, XCRemoteObject):
-  _schema = XCFileLikeElement._schema.copy()
-  _schema.update({
-    'explicitFileType':  [0, str, 0, 0],
-    'lastKnownFileType': [0, str, 0, 0],
-    'name':              [0, str, 0, 0],
-    'path':              [0, str, 0, 1],
-  })
-
-  # Weird output rules for PBXFileReference.
-  _should_print_single_line = True
-  # super
-  _encode_transforms = XCFileLikeElement._alternate_encode_transforms
-
-  def __init__(self, properties=None, id=None, parent=None):
-    # super
-    XCFileLikeElement.__init__(self, properties, id, parent)
-    if 'path' in self._properties and self._properties['path'].endswith('/'):
-      self._properties['path'] = self._properties['path'][:-1]
-      is_dir = True
-    else:
-      is_dir = False
-
-    if 'path' in self._properties and \
-        not 'lastKnownFileType' in self._properties and \
-        not 'explicitFileType' in self._properties:
-      # TODO(mark): This is the replacement for a replacement for a quick hack.
-      # It is no longer incredibly sucky, but this list needs to be extended.
-      extension_map = {
-        'a':           'archive.ar',
-        'app':         'wrapper.application',
-        'bdic':        'file',
-        'bundle':      'wrapper.cfbundle',
-        'c':           'sourcecode.c.c',
-        'cc':          'sourcecode.cpp.cpp',
-        'cpp':         'sourcecode.cpp.cpp',
-        'css':         'text.css',
-        'cxx':         'sourcecode.cpp.cpp',
-        'dylib':       'compiled.mach-o.dylib',
-        'framework':   'wrapper.framework',
-        'h':           'sourcecode.c.h',
-        'hxx':         'sourcecode.cpp.h',
-        'icns':        'image.icns',
-        'java':        'sourcecode.java',
-        'js':          'sourcecode.javascript',
-        'm':           'sourcecode.c.objc',
-        'mm':          'sourcecode.cpp.objcpp',
-        'nib':         'wrapper.nib',
-        'o':           'compiled.mach-o.objfile',
-        'pdf':         'image.pdf',
-        'pl':          'text.script.perl',
-        'plist':       'text.plist.xml',
-        'pm':          'text.script.perl',
-        'png':         'image.png',
-        'py':          'text.script.python',
-        'r':           'sourcecode.rez',
-        'rez':         'sourcecode.rez',
-        's':           'sourcecode.asm',
-        'storyboard':  'file.storyboard',
-        'strings':     'text.plist.strings',
-        'ttf':         'file',
-        'xcconfig':    'text.xcconfig',
-        'xcdatamodel': 'wrapper.xcdatamodel',
-        'xib':         'file.xib',
-        'y':           'sourcecode.yacc',
-      }
-
-      if is_dir:
-        file_type = 'folder'
-      else:
-        basename = posixpath.basename(self._properties['path'])
-        (root, ext) = posixpath.splitext(basename)
-        # Check the map using a lowercase extension.
-        # TODO(mark): Maybe it should try with the original case first and fall
-        # back to lowercase, in case there are any instances where case
-        # matters.  There currently aren't.
-        if ext != '':
-          ext = ext[1:].lower()
-
-        # TODO(mark): "text" is the default value, but "file" is appropriate
-        # for unrecognized files not containing text.  Xcode seems to choose
-        # based on content.
-        file_type = extension_map.get(ext, 'text')
-
-      self._properties['lastKnownFileType'] = file_type
-
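The lookup rule implemented in PBXFileReference's constructor reduces to a small function (sketch with a subset of the map and a hypothetical helper name): a trailing slash marks a folder, the extension is lowercased before lookup, and anything unrecognized falls back to 'text'.

```python
import posixpath

EXTENSION_MAP = {
    'c':   'sourcecode.c.c',
    'h':   'sourcecode.c.h',
    'mm':  'sourcecode.cpp.objcpp',
    'png': 'image.png',
}

def last_known_file_type(path):
    if path.endswith('/'):
        return 'folder'
    ext = posixpath.splitext(posixpath.basename(path))[1]
    if ext:
        ext = ext[1:].lower()  # drop the dot, match case-insensitively
    return EXTENSION_MAP.get(ext, 'text')
```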
-
-class PBXVariantGroup(PBXGroup, XCFileLikeElement):
-  """PBXVariantGroup is used by Xcode to represent localizations."""
-  # No additions to the schema relative to PBXGroup.
-  pass
-
-
-# PBXReferenceProxy is also an XCFileLikeElement subclass.  It is defined below
-# because it uses PBXContainerItemProxy, defined below.
-
-
-class XCBuildConfiguration(XCObject):
-  _schema = XCObject._schema.copy()
-  _schema.update({
-    'baseConfigurationReference': [0, PBXFileReference, 0, 0],
-    'buildSettings':              [0, dict, 0, 1, {}],
-    'name':                       [0, str,  0, 1],
-  })
-
-  def HasBuildSetting(self, key):
-    return key in self._properties['buildSettings']
-
-  def GetBuildSetting(self, key):
-    return self._properties['buildSettings'][key]
-
-  def SetBuildSetting(self, key, value):
-    # TODO(mark): If a list, copy?
-    self._properties['buildSettings'][key] = value
-
-  def AppendBuildSetting(self, key, value):
-    if not key in self._properties['buildSettings']:
-      self._properties['buildSettings'][key] = []
-    self._properties['buildSettings'][key].append(value)
-
-  def DelBuildSetting(self, key):
-    if key in self._properties['buildSettings']:
-      del self._properties['buildSettings'][key]
-
-  def SetBaseConfiguration(self, value):
-    self._properties['baseConfigurationReference'] = value
-
-class XCConfigurationList(XCObject):
-  # _configs is the default list of configurations.
-  _configs = [ XCBuildConfiguration({'name': 'Debug'}),
-               XCBuildConfiguration({'name': 'Release'}) ]
-
-  _schema = XCObject._schema.copy()
-  _schema.update({
-    'buildConfigurations':           [1, XCBuildConfiguration, 1, 1, _configs],
-    'defaultConfigurationIsVisible': [0, int,                  0, 1, 1],
-    'defaultConfigurationName':      [0, str,                  0, 1, 'Release'],
-  })
-
-  def Name(self):
-    return 'Build configuration list for ' + \
-           self.parent.__class__.__name__ + ' "' + self.parent.Name() + '"'
-
-  def ConfigurationNamed(self, name):
-    """Convenience accessor to obtain an XCBuildConfiguration by name."""
-    for configuration in self._properties['buildConfigurations']:
-      if configuration._properties['name'] == name:
-        return configuration
-
-    raise KeyError, name
-
-  def DefaultConfiguration(self):
-    """Convenience accessor to obtain the default XCBuildConfiguration."""
-    return self.ConfigurationNamed(self._properties['defaultConfigurationName'])
-
-  def HasBuildSetting(self, key):
-    """Determines the state of a build setting in all XCBuildConfiguration
-    child objects.
-
-    If all child objects have key in their build settings, and the value is the
-    same in all child objects, returns 1.
-
-    If no child objects have the key in their build settings, returns 0.
-
-    If some, but not all, child objects have the key in their build settings,
-    or if any children have different values for the key, returns -1.
-    """
-
-    has = None
-    value = None
-    for configuration in self._properties['buildConfigurations']:
-      configuration_has = configuration.HasBuildSetting(key)
-      if has is None:
-        has = configuration_has
-      elif has != configuration_has:
-        return -1
-
-      if configuration_has:
-        configuration_value = configuration.GetBuildSetting(key)
-        if value is None:
-          value = configuration_value
-        elif value != configuration_value:
-          return -1
-
-    if not has:
-      return 0
-
-    return 1
-
-  def GetBuildSetting(self, key):
-    """Gets the build setting for key.
-
-    All child XCConfiguration objects must have the same value set for the
-    setting, or a ValueError will be raised.
-    """
-
-    # TODO(mark): This is wrong for build settings that are lists.  The list
-    # contents should be compared (and a list copy returned?)
-
-    value = None
-    for configuration in self._properties['buildConfigurations']:
-      configuration_value = configuration.GetBuildSetting(key)
-      if value is None:
-        value = configuration_value
-      else:
-        if value != configuration_value:
-          raise ValueError, 'Variant values for ' + key
-
-    return value
-
-  def SetBuildSetting(self, key, value):
-    """Sets the build setting for key to value in all child
-    XCBuildConfiguration objects.
-    """
-
-    for configuration in self._properties['buildConfigurations']:
-      configuration.SetBuildSetting(key, value)
-
-  def AppendBuildSetting(self, key, value):
-    """Appends value to the build setting for key, which is treated as a list,
-    in all child XCBuildConfiguration objects.
-    """
-
-    for configuration in self._properties['buildConfigurations']:
-      configuration.AppendBuildSetting(key, value)
-
-  def DelBuildSetting(self, key):
-    """Deletes the build setting key from all child XCBuildConfiguration
-    objects.
-    """
-
-    for configuration in self._properties['buildConfigurations']:
-      configuration.DelBuildSetting(key)
-
-  def SetBaseConfiguration(self, value):
-    """Sets the build configuration in all child XCBuildConfiguration objects.
-    """
-
-    for configuration in self._properties['buildConfigurations']:
-      configuration.SetBaseConfiguration(value)
-
-
-class PBXBuildFile(XCObject):
-  _schema = XCObject._schema.copy()
-  _schema.update({
-    'fileRef':  [0, XCFileLikeElement, 0, 1],
-    'settings': [0, str,               0, 0],  # hack, it's a dict
-  })
-
-  # Weird output rules for PBXBuildFile.
-  _should_print_single_line = True
-  _encode_transforms = XCObject._alternate_encode_transforms
-
-  def Name(self):
-    # Example: "main.cc in Sources"
-    return self._properties['fileRef'].Name() + ' in ' + self.parent.Name()
-
-  def Hashables(self):
-    # super
-    hashables = XCObject.Hashables(self)
-
-    # It is not sufficient to just rely on Name() to get the
-    # XCFileLikeElement's name, because that is not a complete pathname.
-    # PathHashables returns hashables unique enough that no two
-    # PBXBuildFiles should wind up with the same set of hashables, unless
-    # someone adds the same file multiple times to the same target.  That
-    # would be considered invalid anyway.
-    hashables.extend(self._properties['fileRef'].PathHashables())
-
-    return hashables
-
-
-class XCBuildPhase(XCObject):
-  """Abstract base for build phase classes.  Not represented in a project
-  file.
-
-  Attributes:
-    _files_by_path: A dict mapping the path of each child in the files list
-      (keys) to the corresponding PBXBuildFile children (values).
-    _files_by_xcfilelikeelement: A dict mapping each XCFileLikeElement (keys)
-      to the corresponding PBXBuildFile children (values).
-  """
-
-  # TODO(mark): Some build phase types, like PBXShellScriptBuildPhase, don't
-  # actually have a "files" list.  XCBuildPhase should not have "files" but
-  # another abstract subclass of it should provide this, and concrete build
-  # phase types that do have "files" lists should be derived from that new
-  # abstract subclass.  XCBuildPhase should only provide buildActionMask and
-  # runOnlyForDeploymentPostprocessing, and not files or the various
-  # file-related methods and attributes.
-
-  _schema = XCObject._schema.copy()
-  _schema.update({
-    'buildActionMask':                    [0, int,          0, 1, 0x7fffffff],
-    'files':                              [1, PBXBuildFile, 1, 1, []],
-    'runOnlyForDeploymentPostprocessing': [0, int,          0, 1, 0],
-  })
-
-  def __init__(self, properties=None, id=None, parent=None):
-    # super
-    XCObject.__init__(self, properties, id, parent)
-
-    self._files_by_path = {}
-    self._files_by_xcfilelikeelement = {}
-    for pbxbuildfile in self._properties.get('files', []):
-      self._AddBuildFileToDicts(pbxbuildfile)
-
-  def FileGroup(self, path):
-    # Subclasses must override this by returning a two-element tuple.  The
-    # first item in the tuple should be the PBXGroup to which "path" should be
-    # added, either as a child or deeper descendant.  The second item should
-    # be a boolean indicating whether files should be added into hierarchical
-    # groups or one single flat group.
-    raise NotImplementedError, \
-          self.__class__.__name__ + ' must implement FileGroup'
-
-  def _AddPathToDict(self, pbxbuildfile, path):
-    """Adds path to the dict tracking paths belonging to this build phase.
-
-    If the path is already a member of this build phase, raises an exception.
-    """
-
-    if path in self._files_by_path:
-      raise ValueError, 'Found multiple build files with path ' + path
-    self._files_by_path[path] = pbxbuildfile
-
-  def _AddBuildFileToDicts(self, pbxbuildfile, path=None):
-    """Maintains the _files_by_path and _files_by_xcfilelikeelement dicts.
-
-    If path is specified, then it is the path that is being added to the
-    phase, and pbxbuildfile must contain either a PBXFileReference directly
-    referencing that path, or it must contain a PBXVariantGroup that itself
-    contains a PBXFileReference referencing the path.
-
-    If path is not specified, either the PBXFileReference's path or the paths
-    of all children of the PBXVariantGroup are taken as being added to the
-    phase.
-
-    If the path is already present in the phase, raises an exception.
-
-    If the PBXFileReference or PBXVariantGroup referenced by pbxbuildfile
-    are already present in the phase, referenced by a different PBXBuildFile
-    object, raises an exception.  This does not raise an exception when
-    a PBXFileReference or PBXVariantGroup reappears and is referenced by the
-    same PBXBuildFile that has already introduced it, because in the case
-    of PBXVariantGroup objects, they may correspond to multiple paths that are
-    not all added simultaneously.  When this situation occurs, the path needs
-    to be added to _files_by_path, but nothing needs to change in
-    _files_by_xcfilelikeelement, and the caller should have avoided adding
-    the PBXBuildFile if it is already present in the list of children.
-    """
-
-    xcfilelikeelement = pbxbuildfile._properties['fileRef']
-
-    paths = []
-    if path != None:
-      # It's best when the caller provides the path.
-      if isinstance(xcfilelikeelement, PBXVariantGroup):
-        paths.append(path)
-    else:
-      # If the caller didn't provide a path, there can be either multiple
-      # paths (PBXVariantGroup) or one.
-      if isinstance(xcfilelikeelement, PBXVariantGroup):
-        for variant in xcfilelikeelement._properties['children']:
-          paths.append(variant.FullPath())
-      else:
-        paths.append(xcfilelikeelement.FullPath())
-
-    # Add the paths first, because if something's going to raise, the
-    # messages provided by _AddPathToDict are more useful owing to its
-    # having access to a real pathname and not just an object's Name().
-    for a_path in paths:
-      self._AddPathToDict(pbxbuildfile, a_path)
-
-    # If another PBXBuildFile references this XCFileLikeElement, there's a
-    # problem.
-    if xcfilelikeelement in self._files_by_xcfilelikeelement and \
-       self._files_by_xcfilelikeelement[xcfilelikeelement] != pbxbuildfile:
-      raise ValueError, 'Found multiple build files for ' + \
-                        xcfilelikeelement.Name()
-    self._files_by_xcfilelikeelement[xcfilelikeelement] = pbxbuildfile
-
-  def AppendBuildFile(self, pbxbuildfile, path=None):
-    # Callers should use this instead of calling
-    # AppendProperty('files', pbxbuildfile) directly because this function
-    # maintains the object's dicts.  Better yet, callers can just call AddFile
-    # with a pathname and not worry about building their own PBXBuildFile
-    # objects.
-    self.AppendProperty('files', pbxbuildfile)
-    self._AddBuildFileToDicts(pbxbuildfile, path)
-
-  def AddFile(self, path, settings=None):
-    (file_group, hierarchical) = self.FileGroup(path)
-    file_ref = file_group.AddOrGetFileByPath(path, hierarchical)
-
-    if file_ref in self._files_by_xcfilelikeelement and \
-       isinstance(file_ref, PBXVariantGroup):
-      # There's already a PBXBuildFile in this phase corresponding to the
-      # PBXVariantGroup.  path just provides a new variant that belongs to
-      # the group.  Add the path to the dict.
-      pbxbuildfile = self._files_by_xcfilelikeelement[file_ref]
-      self._AddBuildFileToDicts(pbxbuildfile, path)
-    else:
-      # Add a new PBXBuildFile to get file_ref into the phase.
-      if settings is None:
-        pbxbuildfile = PBXBuildFile({'fileRef': file_ref})
-      else:
-        pbxbuildfile = PBXBuildFile({'fileRef': file_ref, 'settings': settings})
-      self.AppendBuildFile(pbxbuildfile, path)
-
-
-class PBXHeadersBuildPhase(XCBuildPhase):
-  # No additions to the schema relative to XCBuildPhase.
-
-  def Name(self):
-    return 'Headers'
-
-  def FileGroup(self, path):
-    return self.PBXProjectAncestor().RootGroupForPath(path)
-
-
-class PBXResourcesBuildPhase(XCBuildPhase):
-  # No additions to the schema relative to XCBuildPhase.
-
-  def Name(self):
-    return 'Resources'
-
-  def FileGroup(self, path):
-    return self.PBXProjectAncestor().RootGroupForPath(path)
-
-
-class PBXSourcesBuildPhase(XCBuildPhase):
-  # No additions to the schema relative to XCBuildPhase.
-
-  def Name(self):
-    return 'Sources'
-
-  def FileGroup(self, path):
-    return self.PBXProjectAncestor().RootGroupForPath(path)
-
-
-class PBXFrameworksBuildPhase(XCBuildPhase):
-  # No additions to the schema relative to XCBuildPhase.
-
-  def Name(self):
-    return 'Frameworks'
-
-  def FileGroup(self, path):
-    (root, ext) = posixpath.splitext(path)
-    if ext != '':
-      ext = ext[1:].lower()
-    if ext == 'o':
-      # .o files are added to Xcode Frameworks phases, but conceptually aren't
-      # frameworks; they're more like sources or intermediates. Redirect them
-      # to show up in one of those other groups.
-      return self.PBXProjectAncestor().RootGroupForPath(path)
-    else:
-      return (self.PBXProjectAncestor().FrameworksGroup(), False)
-
-
-class PBXShellScriptBuildPhase(XCBuildPhase):
-  _schema = XCBuildPhase._schema.copy()
-  _schema.update({
-    'inputPaths':       [1, str, 0, 1, []],
-    'name':             [0, str, 0, 0],
-    'outputPaths':      [1, str, 0, 1, []],
-    'shellPath':        [0, str, 0, 1, '/bin/sh'],
-    'shellScript':      [0, str, 0, 1],
-    'showEnvVarsInLog': [0, int, 0, 0],
-  })
-
-  def Name(self):
-    if 'name' in self._properties:
-      return self._properties['name']
-
-    return 'ShellScript'
-
-
-class PBXCopyFilesBuildPhase(XCBuildPhase):
-  _schema = XCBuildPhase._schema.copy()
-  _schema.update({
-    'dstPath':          [0, str, 0, 1],
-    'dstSubfolderSpec': [0, int, 0, 1],
-    'name':             [0, str, 0, 0],
-  })
-
-  # path_tree_re matches "$(DIR)/path" or just "$(DIR)".  Match group 1 is
-  # "DIR", match group 3 is "path" or None.
-  path_tree_re = re.compile('^\\$\\((.*)\\)(/(.*)|)$')
-
-  # path_tree_to_subfolder maps names of Xcode variables to the associated
-  # dstSubfolderSpec property value used in a PBXCopyFilesBuildPhase object.
-  path_tree_to_subfolder = {
-    'BUILT_PRODUCTS_DIR': 16,  # Products Directory
-    # Other types that can be chosen via the Xcode UI.
-    # TODO(mark): Map Xcode variable names to these.
-    # : 1,  # Wrapper
-    # : 6,  # Executables
-    # : 7,  # Resources
-    # : 15,  # Java Resources
-    # : 10,  # Frameworks
-    # : 11,  # Shared Frameworks
-    # : 12,  # Shared Support
-    # : 13,  # PlugIns
-  }
-
-  def Name(self):
-    if 'name' in self._properties:
-      return self._properties['name']
-
-    return 'CopyFiles'
-
-  def FileGroup(self, path):
-    return self.PBXProjectAncestor().RootGroupForPath(path)
-
-  def SetDestination(self, path):
-    """Set the dstSubfolderSpec and dstPath properties from path.
-
-    path may be specified in the same notation used for XCHierarchicalElements,
-    specifically, "$(DIR)/path".
-    """
-
-    path_tree_match = self.path_tree_re.search(path)
-    if path_tree_match:
-      # Everything else needs to be relative to an Xcode variable.
-      path_tree = path_tree_match.group(1)
-      relative_path = path_tree_match.group(3)
-
-      if path_tree in self.path_tree_to_subfolder:
-        subfolder = self.path_tree_to_subfolder[path_tree]
-        if relative_path is None:
-          relative_path = ''
-      else:
-        # The path starts with an unrecognized Xcode variable
-        # name like $(SRCROOT).  Xcode will still handle this
-        # as an "absolute path" that starts with the variable.
-        subfolder = 0
-        relative_path = path
-    elif path.startswith('/'):
-      # Special case.  Absolute paths are in dstSubfolderSpec 0.
-      subfolder = 0
-      relative_path = path[1:]
-    else:
-      raise ValueError, 'Can\'t use path %s in a %s' % \
-                        (path, self.__class__.__name__)
-
-    self._properties['dstPath'] = relative_path
-    self._properties['dstSubfolderSpec'] = subfolder
-
-
-class PBXBuildRule(XCObject):
-  _schema = XCObject._schema.copy()
-  _schema.update({
-    'compilerSpec': [0, str, 0, 1],
-    'filePatterns': [0, str, 0, 0],
-    'fileType':     [0, str, 0, 1],
-    'isEditable':   [0, int, 0, 1, 1],
-    'outputFiles':  [1, str, 0, 1, []],
-    'script':       [0, str, 0, 0],
-  })
-
-  def Name(self):
-    # Not very inspired, but it's what Xcode uses.
-    return self.__class__.__name__
-
-  def Hashables(self):
-    # super
-    hashables = XCObject.Hashables(self)
-
-    # Use the hashables of the weak objects that this object refers to.
-    hashables.append(self._properties['fileType'])
-    if 'filePatterns' in self._properties:
-      hashables.append(self._properties['filePatterns'])
-    return hashables
-
-
-class PBXContainerItemProxy(XCObject):
-  # When referencing an item in this project file, containerPortal is the
-  # PBXProject root object of this project file.  When referencing an item in
-  # another project file, containerPortal is a PBXFileReference identifying
-  # the other project file.
-  #
-  # When serving as a proxy to an XCTarget (in this project file or another),
-  # proxyType is 1.  When serving as a proxy to a PBXFileReference (in another
-  # project file), proxyType is 2.  Type 2 is used for references to the
-  # products of the other project file's targets.
-  #
-  # Xcode is weird about remoteGlobalIDString.  Usually, it's printed without
-  # a comment, indicating that it's tracked internally simply as a string, but
-  # sometimes it's printed with a comment (usually when the object is initially
-  # created), indicating that it's tracked as a project file object at least
-  # sometimes.  This module always tracks it as an object, but contains a hack
-  # to prevent it from printing the comment in the project file output.  See
-  # _XCKVPrint.
-  _schema = XCObject._schema.copy()
-  _schema.update({
-    'containerPortal':      [0, XCContainerPortal, 0, 1],
-    'proxyType':            [0, int,               0, 1],
-    'remoteGlobalIDString': [0, XCRemoteObject,    0, 1],
-    'remoteInfo':           [0, str,               0, 1],
-  })
-
-  def __repr__(self):
-    props = self._properties
-    name = '%s.gyp:%s' % (props['containerPortal'].Name(), props['remoteInfo'])
-    return '<%s %r at 0x%x>' % (self.__class__.__name__, name, id(self))
-
-  def Name(self):
-    # Admittedly not the best name, but it's what Xcode uses.
-    return self.__class__.__name__
-
-  def Hashables(self):
-    # super
-    hashables = XCObject.Hashables(self)
-
-    # Use the hashables of the weak objects that this object refers to.
-    hashables.extend(self._properties['containerPortal'].Hashables())
-    hashables.extend(self._properties['remoteGlobalIDString'].Hashables())
-    return hashables
-
-
-class PBXTargetDependency(XCObject):
-  # The "target" property accepts an XCTarget object, and obviously not
-  # NoneType.  But XCTarget is defined below, so it can't be put into the
-  # schema yet.  The definition of PBXTargetDependency can't be moved below
-  # XCTarget because XCTarget's own schema references PBXTargetDependency.
-  # Python doesn't deal well with this circular relationship, and doesn't have
-  # a real way to do forward declarations.  To work around, the type of
-  # the "target" property is reset below, after XCTarget is defined.
-  #
-  # At least one of "name" and "target" is required.
-  _schema = XCObject._schema.copy()
-  _schema.update({
-    'name':        [0, str,                   0, 0],
-    'target':      [0, None.__class__,        0, 0],
-    'targetProxy': [0, PBXContainerItemProxy, 1, 1],
-  })
-
-  def __repr__(self):
-    name = self._properties.get('name') or self._properties['target'].Name()
-    return '<%s %r at 0x%x>' % (self.__class__.__name__, name, id(self))
-
-  def Name(self):
-    # Admittedly not the best name, but it's what Xcode uses.
-    return self.__class__.__name__
-
-  def Hashables(self):
-    # super
-    hashables = XCObject.Hashables(self)
-
-    # Use the hashables of the weak objects that this object refers to.
-    hashables.extend(self._properties['targetProxy'].Hashables())
-    return hashables
-
-
-class PBXReferenceProxy(XCFileLikeElement):
-  _schema = XCFileLikeElement._schema.copy()
-  _schema.update({
-    'fileType':  [0, str,                   0, 1],
-    'path':      [0, str,                   0, 1],
-    'remoteRef': [0, PBXContainerItemProxy, 1, 1],
-  })
-
-
-class XCTarget(XCRemoteObject):
-  # An XCTarget is really just an XCObject, the XCRemoteObject thing is just
-  # to allow PBXProject to be used in the remoteGlobalIDString property of
-  # PBXContainerItemProxy.
-  #
-  # Setting a "name" property at instantiation may also affect "productName",
-  # which may in turn affect the "PRODUCT_NAME" build setting in children of
-  # "buildConfigurationList".  See __init__ below.
-  _schema = XCRemoteObject._schema.copy()
-  _schema.update({
-    'buildConfigurationList': [0, XCConfigurationList, 1, 1,
-                               XCConfigurationList()],
-    'buildPhases':            [1, XCBuildPhase,        1, 1, []],
-    'dependencies':           [1, PBXTargetDependency, 1, 1, []],
-    'name':                   [0, str,                 0, 1],
-    'productName':            [0, str,                 0, 1],
-  })
-
-  def __init__(self, properties=None, id=None, parent=None,
-               force_outdir=None, force_prefix=None, force_extension=None):
-    # super
-    XCRemoteObject.__init__(self, properties, id, parent)
-
-    # Set up additional defaults not expressed in the schema.  If a "name"
-    # property was supplied, set "productName" if it is not present.  Also set
-    # the "PRODUCT_NAME" build setting in each configuration, but only if
-    # the setting is not present in any build configuration.
-    if 'name' in self._properties:
-      if not 'productName' in self._properties:
-        self.SetProperty('productName', self._properties['name'])
-
-    if 'productName' in self._properties:
-      if 'buildConfigurationList' in self._properties:
-        configs = self._properties['buildConfigurationList']
-        if configs.HasBuildSetting('PRODUCT_NAME') == 0:
-          configs.SetBuildSetting('PRODUCT_NAME',
-                                  self._properties['productName'])
-
-  def AddDependency(self, other):
-    pbxproject = self.PBXProjectAncestor()
-    other_pbxproject = other.PBXProjectAncestor()
-    if pbxproject == other_pbxproject:
-      # Add a dependency to another target in the same project file.
-      container = PBXContainerItemProxy({'containerPortal':      pbxproject,
-                                         'proxyType':            1,
-                                         'remoteGlobalIDString': other,
-                                         'remoteInfo':           other.Name()})
-      dependency = PBXTargetDependency({'target':      other,
-                                        'targetProxy': container})
-      self.AppendProperty('dependencies', dependency)
-    else:
-      # Add a dependency to a target in a different project file.
-      other_project_ref = \
-          pbxproject.AddOrGetProjectReference(other_pbxproject)[1]
-      container = PBXContainerItemProxy({
-            'containerPortal':      other_project_ref,
-            'proxyType':            1,
-            'remoteGlobalIDString': other,
-            'remoteInfo':           other.Name(),
-          })
-      dependency = PBXTargetDependency({'name':        other.Name(),
-                                        'targetProxy': container})
-      self.AppendProperty('dependencies', dependency)
-
-  # Proxy all of these through to the build configuration list.
-
-  def ConfigurationNamed(self, name):
-    return self._properties['buildConfigurationList'].ConfigurationNamed(name)
-
-  def DefaultConfiguration(self):
-    return self._properties['buildConfigurationList'].DefaultConfiguration()
-
-  def HasBuildSetting(self, key):
-    return self._properties['buildConfigurationList'].HasBuildSetting(key)
-
-  def GetBuildSetting(self, key):
-    return self._properties['buildConfigurationList'].GetBuildSetting(key)
-
-  def SetBuildSetting(self, key, value):
-    return self._properties['buildConfigurationList'].SetBuildSetting(key, \
-                                                                      value)
-
-  def AppendBuildSetting(self, key, value):
-    return self._properties['buildConfigurationList'].AppendBuildSetting(key, \
-                                                                         value)
-
-  def DelBuildSetting(self, key):
-    return self._properties['buildConfigurationList'].DelBuildSetting(key)
-
-
-# Redefine the type of the "target" property.  See PBXTargetDependency._schema
-# above.
-PBXTargetDependency._schema['target'][1] = XCTarget
-
-
-class PBXNativeTarget(XCTarget):
-  # buildPhases is overridden in the schema to be able to set defaults.
-  #
-  # NOTE: Contrary to most objects, it is advisable to set parent when
-  # constructing PBXNativeTarget.  A parent of an XCTarget must be a PBXProject
-  # object.  A parent reference is required for a PBXNativeTarget during
-  # construction to be able to set up the target defaults for productReference,
-  # because a PBXBuildFile object must be created for the target and it must
-  # be added to the PBXProject's mainGroup hierarchy.
-  _schema = XCTarget._schema.copy()
-  _schema.update({
-    'buildPhases':      [1, XCBuildPhase,     1, 1,
-                         [PBXSourcesBuildPhase(), PBXFrameworksBuildPhase()]],
-    'buildRules':       [1, PBXBuildRule,     1, 1, []],
-    'productReference': [0, PBXFileReference, 0, 1],
-    'productType':      [0, str,              0, 1],
-  })
-
-  # Mapping from Xcode product-types to settings.  The settings are:
-  #  filetype : used for explicitFileType in the project file
-  #  prefix : the prefix for the file name
-  #  suffix : the suffix for the file name
-  _product_filetypes = {
-    'com.apple.product-type.application':     ['wrapper.application',
-                                               '', '.app'],
-    'com.apple.product-type.bundle':          ['wrapper.cfbundle',
-                                               '', '.bundle'],
-    'com.apple.product-type.framework':       ['wrapper.framework',
-                                               '', '.framework'],
-    'com.apple.product-type.library.dynamic': ['compiled.mach-o.dylib',
-                                               'lib', '.dylib'],
-    'com.apple.product-type.library.static':  ['archive.ar',
-                                               'lib', '.a'],
-    'com.apple.product-type.tool':            ['compiled.mach-o.executable',
-                                               '', ''],
-    'com.googlecode.gyp.xcode.bundle':        ['compiled.mach-o.dylib',
-                                               '', '.so'],
-  }
-
-  def __init__(self, properties=None, id=None, parent=None,
-               force_outdir=None, force_prefix=None, force_extension=None):
-    # super
-    XCTarget.__init__(self, properties, id, parent)
-
-    if 'productName' in self._properties and \
-       'productType' in self._properties and \
-       not 'productReference' in self._properties and \
-       self._properties['productType'] in self._product_filetypes:
-      products_group = None
-      pbxproject = self.PBXProjectAncestor()
-      if pbxproject != None:
-        products_group = pbxproject.ProductsGroup()
-
-      if products_group != None:
-        (filetype, prefix, suffix) = \
-            self._product_filetypes[self._properties['productType']]
-        # Xcode does not have a distinct type for loadable modules that are
-        # pure BSD targets (not in a bundle wrapper). GYP allows such modules
-        # to be specified by setting a target type to loadable_module without
-        # having mac_bundle set. These are mapped to the pseudo-product type
-        # com.googlecode.gyp.xcode.bundle.
-        #
-        # By picking up this special type and converting it to a dynamic
-        # library (com.apple.product-type.library.dynamic) with fix-ups,
-        # single-file loadable modules can be produced.
-        #
-        # MACH_O_TYPE is changed to mh_bundle to produce the proper file type
-        # (as opposed to mh_dylib). In order for linking to succeed,
-        # DYLIB_CURRENT_VERSION and DYLIB_COMPATIBILITY_VERSION must be
-        # cleared. They are meaningless for type mh_bundle.
-        #
-        # Finally, the .so extension is forcibly applied over the default
-        # (.dylib), unless another forced extension is already selected.
-        # .dylib is plainly wrong, and .bundle is used by loadable_modules in
-        # bundle wrappers (com.apple.product-type.bundle). .so seems an odd
-        # choice because it's used as the extension on many other systems that
-        # don't distinguish between linkable shared libraries and non-linkable
-        # loadable modules, but there's precedent: Python loadable modules on
-        # Mac OS X use an .so extension.
-        if self._properties['productType'] == 'com.googlecode.gyp.xcode.bundle':
-          self._properties['productType'] = \
-              'com.apple.product-type.library.dynamic'
-          self.SetBuildSetting('MACH_O_TYPE', 'mh_bundle')
-          self.SetBuildSetting('DYLIB_CURRENT_VERSION', '')
-          self.SetBuildSetting('DYLIB_COMPATIBILITY_VERSION', '')
-          if force_extension is None:
-            force_extension = suffix[1:]
-
-        if force_extension is not None:
-          # If it's a wrapper (bundle), set WRAPPER_EXTENSION.
-          if filetype.startswith('wrapper.'):
-            self.SetBuildSetting('WRAPPER_EXTENSION', force_extension)
-          else:
-            # Extension override.
-            suffix = '.' + force_extension
-            self.SetBuildSetting('EXECUTABLE_EXTENSION', force_extension)
-
-          if filetype.startswith('compiled.mach-o.executable'):
-            product_name = self._properties['productName']
-            product_name += suffix
-            suffix = ''
-            self.SetProperty('productName', product_name)
-            self.SetBuildSetting('PRODUCT_NAME', product_name)
-
-        # Xcode handles most prefixes based on the target type, however there
-        # are exceptions.  If a "BSD Dynamic Library" target is added in the
-        # Xcode UI, Xcode sets EXECUTABLE_PREFIX.  This check duplicates that
-        # behavior.
-        if force_prefix is not None:
-          prefix = force_prefix
-        if filetype.startswith('wrapper.'):
-          self.SetBuildSetting('WRAPPER_PREFIX', prefix)
-        else:
-          self.SetBuildSetting('EXECUTABLE_PREFIX', prefix)
-
-        if force_outdir is not None:
-          self.SetBuildSetting('TARGET_BUILD_DIR', force_outdir)
-
-        # TODO(tvl): Remove the below hack.
-        #    http://code.google.com/p/gyp/issues/detail?id=122
-
-        # Some targets include the prefix in the target_name.  These targets
-        # really should just add a product_name setting that doesn't include
-        # the prefix.  For example:
-        #  target_name = 'libevent', product_name = 'event'
-        # This check cleans up for them.
-        product_name = self._properties['productName']
-        prefix_len = len(prefix)
-        if prefix_len and (product_name[:prefix_len] == prefix):
-          product_name = product_name[prefix_len:]
-          self.SetProperty('productName', product_name)
-          self.SetBuildSetting('PRODUCT_NAME', product_name)
-
-        ref_props = {
-          'explicitFileType': filetype,
-          'includeInIndex':   0,
-          'path':             prefix + product_name + suffix,
-          'sourceTree':       'BUILT_PRODUCTS_DIR',
-        }
-        file_ref = PBXFileReference(ref_props)
-        products_group.AppendChild(file_ref)
-        self.SetProperty('productReference', file_ref)
-
-  def GetBuildPhaseByType(self, type):
-    if not 'buildPhases' in self._properties:
-      return None
-
-    the_phase = None
-    for phase in self._properties['buildPhases']:
-      if isinstance(phase, type):
-        # Some phase types may appear multiple times in a well-formed project
-        # file, but phases like PBXSourcesBuildPhase may only be present
-        # singly, and this function is intended as an aid to singular-phase
-        # accessors such as HeadersPhase and SourcesPhase.  Loop over the
-        # entire list of phases and assert if more than one of the desired
-        # type is found.
-        assert the_phase is None
-        the_phase = phase
-
-    return the_phase
-
-  def HeadersPhase(self):
-    headers_phase = self.GetBuildPhaseByType(PBXHeadersBuildPhase)
-    if headers_phase is None:
-      headers_phase = PBXHeadersBuildPhase()
-
-      # The headers phase should come before the resources, sources, and
-      # frameworks phases, if any.
-      insert_at = len(self._properties['buildPhases'])
-      for index in xrange(0, len(self._properties['buildPhases'])):
-        phase = self._properties['buildPhases'][index]
-        if isinstance(phase, PBXResourcesBuildPhase) or \
-           isinstance(phase, PBXSourcesBuildPhase) or \
-           isinstance(phase, PBXFrameworksBuildPhase):
-          insert_at = index
-          break
-
-      self._properties['buildPhases'].insert(insert_at, headers_phase)
-      headers_phase.parent = self
-
-    return headers_phase
-
-  def ResourcesPhase(self):
-    resources_phase = self.GetBuildPhaseByType(PBXResourcesBuildPhase)
-    if resources_phase is None:
-      resources_phase = PBXResourcesBuildPhase()
-
-      # The resources phase should come before the sources and frameworks
-      # phases, if any.
-      insert_at = len(self._properties['buildPhases'])
-      for index in xrange(0, len(self._properties['buildPhases'])):
-        phase = self._properties['buildPhases'][index]
-        if isinstance(phase, PBXSourcesBuildPhase) or \
-           isinstance(phase, PBXFrameworksBuildPhase):
-          insert_at = index
-          break
-
-      self._properties['buildPhases'].insert(insert_at, resources_phase)
-      resources_phase.parent = self
-
-    return resources_phase
-
-  def SourcesPhase(self):
-    sources_phase = self.GetBuildPhaseByType(PBXSourcesBuildPhase)
-    if sources_phase is None:
-      sources_phase = PBXSourcesBuildPhase()
-      self.AppendProperty('buildPhases', sources_phase)
-
-    return sources_phase
-
-  def FrameworksPhase(self):
-    frameworks_phase = self.GetBuildPhaseByType(PBXFrameworksBuildPhase)
-    if frameworks_phase is None:
-      frameworks_phase = PBXFrameworksBuildPhase()
-      self.AppendProperty('buildPhases', frameworks_phase)
-
-    return frameworks_phase
-
-  def AddDependency(self, other):
-    # super
-    XCTarget.AddDependency(self, other)
-
-    static_library_type = 'com.apple.product-type.library.static'
-    shared_library_type = 'com.apple.product-type.library.dynamic'
-    framework_type = 'com.apple.product-type.framework'
-    if isinstance(other, PBXNativeTarget) and \
-       'productType' in self._properties and \
-       self._properties['productType'] != static_library_type and \
-       'productType' in other._properties and \
-       (other._properties['productType'] == static_library_type or \
-        ((other._properties['productType'] == shared_library_type or \
-          other._properties['productType'] == framework_type) and \
-         ((not other.HasBuildSetting('MACH_O_TYPE')) or
-          other.GetBuildSetting('MACH_O_TYPE') != 'mh_bundle'))):
-
-      file_ref = other.GetProperty('productReference')
-
-      pbxproject = self.PBXProjectAncestor()
-      other_pbxproject = other.PBXProjectAncestor()
-      if pbxproject != other_pbxproject:
-        other_project_product_group = \
-            pbxproject.AddOrGetProjectReference(other_pbxproject)[0]
-        file_ref = other_project_product_group.GetChildByRemoteObject(file_ref)
-
-      self.FrameworksPhase().AppendProperty('files',
-                                            PBXBuildFile({'fileRef': file_ref}))
-
-
-class PBXAggregateTarget(XCTarget):
-  pass
-
-
-class PBXProject(XCContainerPortal):
-  # A PBXProject is really just an XCObject, the XCContainerPortal thing is
-  # just to allow PBXProject to be used in the containerPortal property of
-  # PBXContainerItemProxy.
-  """
-
-  Attributes:
-    path: "sample.xcodeproj".  TODO(mark) Document me!
-    _other_pbxprojects: A dictionary, keyed by other PBXProject objects.  Each
-                        value is a reference to the dict in the
-                        projectReferences list associated with the keyed
-                        PBXProject.
-  """
-
-  _schema = XCContainerPortal._schema.copy()
-  _schema.update({
-    'attributes':             [0, dict,                0, 0],
-    'buildConfigurationList': [0, XCConfigurationList, 1, 1,
-                               XCConfigurationList()],
-    'compatibilityVersion':   [0, str,                 0, 1, 'Xcode 3.2'],
-    'hasScannedForEncodings': [0, int,                 0, 1, 1],
-    'mainGroup':              [0, PBXGroup,            1, 1, PBXGroup()],
-    'projectDirPath':         [0, str,                 0, 1, ''],
-    'projectReferences':      [1, dict,                0, 0],
-    'projectRoot':            [0, str,                 0, 1, ''],
-    'targets':                [1, XCTarget,            1, 1, []],
-  })
-
-  def __init__(self, properties=None, id=None, parent=None, path=None):
-    self.path = path
-    self._other_pbxprojects = {}
-    # super
-    return XCContainerPortal.__init__(self, properties, id, parent)
-
-  def Name(self):
-    name = self.path
-    if name[-10:] == '.xcodeproj':
-      name = name[:-10]
-    return posixpath.basename(name)
-
-  def Path(self):
-    return self.path
-
-  def Comment(self):
-    return 'Project object'
-
-  def Children(self):
-    # super
-    children = XCContainerPortal.Children(self)
-
-    # Add children that the schema doesn't know about.  Maybe there's a more
-    # elegant way around this, but this is the only case where we need to own
-    # objects in a dictionary (that is itself in a list), and three lines for
-    # a one-off isn't that big a deal.
-    if 'projectReferences' in self._properties:
-      for reference in self._properties['projectReferences']:
-        children.append(reference['ProductGroup'])
-
-    return children
-
-  def PBXProjectAncestor(self):
-    return self
-
-  def _GroupByName(self, name):
-    if not 'mainGroup' in self._properties:
-      self.SetProperty('mainGroup', PBXGroup())
-
-    main_group = self._properties['mainGroup']
-    group = main_group.GetChildByName(name)
-    if group is None:
-      group = PBXGroup({'name': name})
-      main_group.AppendChild(group)
-
-    return group
-
-  # SourceGroup and ProductsGroup are created by default in Xcode's own
-  # templates.
-  def SourceGroup(self):
-    return self._GroupByName('Source')
-
-  def ProductsGroup(self):
-    return self._GroupByName('Products')
-
-  # IntermediatesGroup is used to collect source-like files that are generated
-  # by rules or script phases and are placed in intermediate directories such
-  # as DerivedSources.
-  def IntermediatesGroup(self):
-    return self._GroupByName('Intermediates')
-
-  # FrameworksGroup and ProjectsGroup are top-level groups used to collect
-  # frameworks and projects.
-  def FrameworksGroup(self):
-    return self._GroupByName('Frameworks')
-
-  def ProjectsGroup(self):
-    return self._GroupByName('Projects')
-
-  def RootGroupForPath(self, path):
-    """Returns a PBXGroup child of this object to which path should be added.
-
-    This method is intended to choose between SourceGroup and
-    IntermediatesGroup on the basis of whether path is present in a source
-    directory or an intermediates directory.  For the purposes of this
-    determination, any path located within a derived file directory such as
-    PROJECT_DERIVED_FILE_DIR is treated as being in an intermediates
-    directory.
-
-    The returned value is a two-element tuple.  The first element is the
-    PBXGroup, and the second element specifies whether that group should be
-    organized hierarchically (True) or as a single flat list (False).
-    """
-
-    # TODO(mark): make this a class variable and bind to self on call?
-    # Also, this list is nowhere near exhaustive.
-    # INTERMEDIATE_DIR and SHARED_INTERMEDIATE_DIR are used by
-    # gyp.generator.xcode.  There should probably be some way for that module
-    # to push the names in, rather than having to hard-code them here.
-    source_tree_groups = {
-      'DERIVED_FILE_DIR':         (self.IntermediatesGroup, True),
-      'INTERMEDIATE_DIR':         (self.IntermediatesGroup, True),
-      'PROJECT_DERIVED_FILE_DIR': (self.IntermediatesGroup, True),
-      'SHARED_INTERMEDIATE_DIR':  (self.IntermediatesGroup, True),
-    }
-
-    (source_tree, path) = SourceTreeAndPathFromPath(path)
-    if source_tree != None and source_tree in source_tree_groups:
-      (group_func, hierarchical) = source_tree_groups[source_tree]
-      group = group_func()
-      return (group, hierarchical)
-
-    # TODO(mark): make additional choices based on file extension.
-
-    return (self.SourceGroup(), True)
-
-  def AddOrGetFileInRootGroup(self, path):
-    """Returns a PBXFileReference corresponding to path in the correct group
-    according to RootGroupForPath's heuristics.
-
-    If an existing PBXFileReference for path exists, it will be returned.
-    Otherwise, one will be created and returned.
-    """
-
-    (group, hierarchical) = self.RootGroupForPath(path)
-    return group.AddOrGetFileByPath(path, hierarchical)
-
-  def RootGroupsTakeOverOnlyChildren(self, recurse=False):
-    """Calls TakeOverOnlyChild for all groups in the main group."""
-
-    for group in self._properties['mainGroup']._properties['children']:
-      if isinstance(group, PBXGroup):
-        group.TakeOverOnlyChild(recurse)
-
-  def SortGroups(self):
-    # Sort the children of the mainGroup (like "Source" and "Products")
-    # according to their defined order.
-    self._properties['mainGroup']._properties['children'] = \
-        sorted(self._properties['mainGroup']._properties['children'],
-               cmp=lambda x,y: x.CompareRootGroup(y))
-
-    # Sort everything else by putting group before files, and going
-    # alphabetically by name within sections of groups and files.  SortGroup
-    # is recursive.
-    for group in self._properties['mainGroup']._properties['children']:
-      if not isinstance(group, PBXGroup):
-        continue
-
-      if group.Name() == 'Products':
-        # The Products group is a special case.  Instead of sorting
-        # alphabetically, sort things in the order of the targets that
-        # produce the products.  To do this, just build up a new list of
-        # products based on the targets.
-        products = []
-        for target in self._properties['targets']:
-          if not isinstance(target, PBXNativeTarget):
-            continue
-          product = target._properties['productReference']
-          # Make sure that the product is already in the products group.
-          assert product in group._properties['children']
-          products.append(product)
-
-        # Make sure that this process doesn't miss anything that was already
-        # in the products group.
-        assert len(products) == len(group._properties['children'])
-        group._properties['children'] = products
-      else:
-        group.SortGroup()
-
-  def AddOrGetProjectReference(self, other_pbxproject):
-    """Add a reference to another project file (via PBXProject object) to this
-    one.
-
-    Returns [ProductGroup, ProjectRef].  ProductGroup is a PBXGroup object in
-    this project file that contains a PBXReferenceProxy object for each
-    product of each PBXNativeTarget in the other project file.  ProjectRef is
-    a PBXFileReference to the other project file.
-
-    If this project file already references the other project file, the
-    existing ProductGroup and ProjectRef are returned.  The ProductGroup will
-    still be updated if necessary.
-    """
-
-    if not 'projectReferences' in self._properties:
-      self._properties['projectReferences'] = []
-
-    product_group = None
-    project_ref = None
-
-    if not other_pbxproject in self._other_pbxprojects:
-      # This project file isn't yet linked to the other one.  Establish the
-      # link.
-      product_group = PBXGroup({'name': 'Products'})
-
-      # ProductGroup is strong.
-      product_group.parent = self
-
-      # There's nothing unique about this PBXGroup, and if left alone, it will
-      # wind up with the same set of hashables as all other PBXGroup objects
-      # owned by the projectReferences list.  Add the hashables of the
-      # remote PBXProject that it's related to.
-      product_group._hashables.extend(other_pbxproject.Hashables())
-
-      # The other project reports its path as relative to the same directory
-      # that this project's path is relative to.  The other project's path
-      # is not necessarily already relative to this project.  Figure out the
-      # pathname that this project needs to use to refer to the other one.
-      this_path = posixpath.dirname(self.Path())
-      projectDirPath = self.GetProperty('projectDirPath')
-      if projectDirPath:
-        if posixpath.isabs(projectDirPath[0]):
-          this_path = projectDirPath
-        else:
-          this_path = posixpath.join(this_path, projectDirPath)
-      other_path = gyp.common.RelativePath(other_pbxproject.Path(), this_path)
-
-      # ProjectRef is weak (it's owned by the mainGroup hierarchy).
-      project_ref = PBXFileReference({
-            'lastKnownFileType': 'wrapper.pb-project',
-            'path':              other_path,
-            'sourceTree':        'SOURCE_ROOT',
-          })
-      self.ProjectsGroup().AppendChild(project_ref)
-
-      ref_dict = {'ProductGroup': product_group, 'ProjectRef': project_ref}
-      self._other_pbxprojects[other_pbxproject] = ref_dict
-      self.AppendProperty('projectReferences', ref_dict)
-
-      # Xcode seems to sort this list case-insensitively
-      self._properties['projectReferences'] = \
-          sorted(self._properties['projectReferences'], cmp=lambda x,y:
-                 cmp(x['ProjectRef'].Name().lower(),
-                     y['ProjectRef'].Name().lower()))
-    else:
-      # The link already exists.  Pull out the relevant data.
-      project_ref_dict = self._other_pbxprojects[other_pbxproject]
-      product_group = project_ref_dict['ProductGroup']
-      project_ref = project_ref_dict['ProjectRef']
-
-    self._SetUpProductReferences(other_pbxproject, product_group, project_ref)
-
-    return [product_group, project_ref]
-
-  def _SetUpProductReferences(self, other_pbxproject, product_group,
-                              project_ref):
-    # TODO(mark): This only adds references to products in other_pbxproject
-    # when they don't exist in this pbxproject.  Perhaps it should also
-    # remove references from this pbxproject that are no longer present in
-    # other_pbxproject.  Perhaps it should update various properties if they
-    # change.
-    for target in other_pbxproject._properties['targets']:
-      if not isinstance(target, PBXNativeTarget):
-        continue
-
-      other_fileref = target._properties['productReference']
-      if product_group.GetChildByRemoteObject(other_fileref) is None:
-        # Xcode sets remoteInfo to the name of the target and not the name
-        # of its product, despite this proxy being a reference to the product.
-        container_item = PBXContainerItemProxy({
-              'containerPortal':      project_ref,
-              'proxyType':            2,
-              'remoteGlobalIDString': other_fileref,
-              'remoteInfo':           target.Name()
-            })
-        # TODO(mark): Does sourceTree get copied straight over from the other
-        # project?  Can the other project ever have lastKnownFileType here
-        # instead of explicitFileType?  (Use it if so?)  Can path ever be
-        # unset?  (I don't think so.)  Can other_fileref have name set, and
-        # does it impact the PBXReferenceProxy if so?  These are the questions
-        # that perhaps will be answered one day.
-        reference_proxy = PBXReferenceProxy({
-              'fileType':   other_fileref._properties['explicitFileType'],
-              'path':       other_fileref._properties['path'],
-              'sourceTree': other_fileref._properties['sourceTree'],
-              'remoteRef':  container_item,
-            })
-
-        product_group.AppendChild(reference_proxy)
-
-  def SortRemoteProductReferences(self):
-    # For each remote project file, sort the associated ProductGroup in the
-    # same order that the targets are sorted in the remote project file.  This
-    # is the sort order used by Xcode.
-
-    def CompareProducts(x, y, remote_products):
-      # x and y are PBXReferenceProxy objects.  Go through their associated
-      # PBXContainerItem to get the remote PBXFileReference, which will be
-      # present in the remote_products list.
-      x_remote = x._properties['remoteRef']._properties['remoteGlobalIDString']
-      y_remote = y._properties['remoteRef']._properties['remoteGlobalIDString']
-      x_index = remote_products.index(x_remote)
-      y_index = remote_products.index(y_remote)
-
-      # Use the order of each remote PBXFileReference in remote_products to
-      # determine the sort order.
-      return cmp(x_index, y_index)
-
-    for other_pbxproject, ref_dict in self._other_pbxprojects.iteritems():
-      # Build up a list of products in the remote project file, ordered the
-      # same as the targets that produce them.
-      remote_products = []
-      for target in other_pbxproject._properties['targets']:
-        if not isinstance(target, PBXNativeTarget):
-          continue
-        remote_products.append(target._properties['productReference'])
-
-      # Sort the PBXReferenceProxy children according to the list of remote
-      # products.
-      product_group = ref_dict['ProductGroup']
-      product_group._properties['children'] = sorted(
-          product_group._properties['children'],
-          cmp=lambda x, y: CompareProducts(x, y, remote_products))
-
-
-class XCProjectFile(XCObject):
-  _schema = XCObject._schema.copy()
-  _schema.update({
-    'archiveVersion': [0, int,        0, 1, 1],
-    'classes':        [0, dict,       0, 1, {}],
-    'objectVersion':  [0, int,        0, 1, 45],
-    'rootObject':     [0, PBXProject, 1, 1],
-  })
-
-  def SetXcodeVersion(self, version):
-    version_to_object_version = {
-      '2.4': 45,
-      '3.0': 45,
-      '3.1': 45,
-      '3.2': 46,
-    }
-    if not version in version_to_object_version:
-      supported_str = ', '.join(sorted(version_to_object_version.keys()))
-      raise Exception(
-          'Unsupported Xcode version %s (supported: %s)' %
-          ( version, supported_str ) )
-    compatibility_version = 'Xcode %s' % version
-    self._properties['rootObject'].SetProperty('compatibilityVersion',
-                                               compatibility_version)
-    self.SetProperty('objectVersion', version_to_object_version[version]);
-
-  def ComputeIDs(self, recursive=True, overwrite=True, hash=None):
-    # Although XCProjectFile is implemented here as an XCObject, it's not a
-    # proper object in the Xcode sense, and it certainly doesn't have its own
-    # ID.  Pass through an attempt to update IDs to the real root object.
-    if recursive:
-      self._properties['rootObject'].ComputeIDs(recursive, overwrite, hash)
-
-  def Print(self, file=sys.stdout):
-    self.VerifyHasRequiredProperties()
-
-    # Add the special "objects" property, which will be caught and handled
-    # separately during printing.  This structure allows a fairly standard
-    # loop to do the normal printing.
-    self._properties['objects'] = {}
-    self._XCPrint(file, 0, '// !$*UTF8*$!\n')
-    if self._should_print_single_line:
-      self._XCPrint(file, 0, '{ ')
-    else:
-      self._XCPrint(file, 0, '{\n')
-    for property, value in sorted(self._properties.iteritems(),
-                                  cmp=lambda x, y: cmp(x, y)):
-      if property == 'objects':
-        self._PrintObjects(file)
-      else:
-        self._XCKVPrint(file, 1, property, value)
-    self._XCPrint(file, 0, '}\n')
-    del self._properties['objects']
-
-  def _PrintObjects(self, file):
-    if self._should_print_single_line:
-      self._XCPrint(file, 0, 'objects = {')
-    else:
-      self._XCPrint(file, 1, 'objects = {\n')
-
-    objects_by_class = {}
-    for object in self.Descendants():
-      if object == self:
-        continue
-      class_name = object.__class__.__name__
-      if not class_name in objects_by_class:
-        objects_by_class[class_name] = []
-      objects_by_class[class_name].append(object)
-
-    for class_name in sorted(objects_by_class):
-      self._XCPrint(file, 0, '\n')
-      self._XCPrint(file, 0, '/* Begin ' + class_name + ' section */\n')
-      for object in sorted(objects_by_class[class_name],
-                           cmp=lambda x, y: cmp(x.id, y.id)):
-        object.Print(file)
-      self._XCPrint(file, 0, '/* End ' + class_name + ' section */\n')
-
-    if self._should_print_single_line:
-      self._XCPrint(file, 0, '}; ')
-    else:
-      self._XCPrint(file, 1, '};\n')
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/xml_fix.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,69 +0,0 @@
-# Copyright (c) 2011 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Applies a fix to CR LF TAB handling in xml.dom.
-
-Fixes this: http://code.google.com/p/chromium/issues/detail?id=76293
-Working around this: http://bugs.python.org/issue5752
-TODO(bradnelson): Consider dropping this when we drop XP support.
-"""
-
-
-import xml.dom.minidom
-
-
-def _Replacement_write_data(writer, data, is_attrib=False):
-  """Writes datachars to writer."""
-  data = data.replace("&", "&amp;").replace("<", "&lt;")
-  data = data.replace("\"", "&quot;").replace(">", "&gt;")
-  if is_attrib:
-    data = data.replace(
-        "\r", "&#xD;").replace(
-        "\n", "&#xA;").replace(
-        "\t", "&#x9;")
-  writer.write(data)
-
-
-def _Replacement_writexml(self, writer, indent="", addindent="", newl=""):
-  # indent = current indentation
-  # addindent = indentation to add to higher levels
-  # newl = newline string
-  writer.write(indent+"<" + self.tagName)
-
-  attrs = self._get_attributes()
-  a_names = attrs.keys()
-  a_names.sort()
-
-  for a_name in a_names:
-    writer.write(" %s=\"" % a_name)
-    _Replacement_write_data(writer, attrs[a_name].value, is_attrib=True)
-    writer.write("\"")
-  if self.childNodes:
-    writer.write(">%s" % newl)
-    for node in self.childNodes:
-      node.writexml(writer, indent + addindent, addindent, newl)
-    writer.write("%s</%s>%s" % (indent, self.tagName, newl))
-  else:
-    writer.write("/>%s" % newl)
-
-
-class XmlFix(object):
-  """Object to manage temporary patching of xml.dom.minidom."""
-
-  def __init__(self):
-    # Preserve current xml.dom.minidom functions.
-    self.write_data = xml.dom.minidom._write_data
-    self.writexml = xml.dom.minidom.Element.writexml
-    # Inject replacement versions of a function and a method.
-    xml.dom.minidom._write_data = _Replacement_write_data
-    xml.dom.minidom.Element.writexml = _Replacement_writexml
-
-  def Cleanup(self):
-    if self.write_data:
-      xml.dom.minidom._write_data = self.write_data
-      xml.dom.minidom.Element.writexml = self.writexml
-      self.write_data = None
-
-  def __del__(self):
-    self.Cleanup()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/pylintrc	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,307 +0,0 @@
-[MASTER]
-
-# Specify a configuration file.
-#rcfile=
-
-# Python code to execute, usually for sys.path manipulation such as
-# pygtk.require().
-#init-hook=
-
-# Profiled execution.
-profile=no
-
-# Add files or directories to the blacklist. They should be base names, not
-# paths.
-ignore=CVS
-
-# Pickle collected data for later comparisons.
-persistent=yes
-
-# List of plugins (as comma separated values of python modules names) to load,
-# usually to register additional checkers.
-load-plugins=
-
-
-[MESSAGES CONTROL]
-
-# Enable the message, report, category or checker with the given id(s). You can
-# either give multiple identifier separated by comma (,) or put this option
-# multiple time.
-#enable=
-
-# Disable the message, report, category or checker with the given id(s). You
-# can either give multiple identifier separated by comma (,) or put this option
-# multiple time (only on the command line, not in the configuration file where
-# it should appear only once).
-# C0103: Invalid name "NN" (should match [a-z_][a-z0-9_]{2,30}$)
-# C0111: Missing docstring
-# C0302: Too many lines in module (NN)
-# R0902: Too many instance attributes (N/7)
-# R0903: Too few public methods (N/2)
-# R0904: Too many public methods (NN/20)
-# R0912: Too many branches (NN/12)
-# R0913: Too many arguments (N/5)
-# R0914: Too many local variables (NN/15)
-# R0915: Too many statements (NN/50)
-# W0141: Used builtin function 'map'
-# W0142: Used * or ** magic
-# W0232: Class has no __init__ method
-# W0511: TODO
-# W0603: Using the global statement
-#
-# These should be enabled eventually:
-# C0112: Empty docstring
-# C0301: Line too long (NN/80)
-# C0321: More than one statement on single line
-# C0322: Operator not preceded by a space
-# C0323: Operator not followed by a space
-# C0324: Comma not followed by a space
-# E0101: Explicit return in __init__
-# E0102: function already defined line NN
-# E1002: Use of super on an old style class
-# E1101: Instance of 'XX' has no 'YY' member
-# E1103: Instance of 'XX' has no 'XX' member (but some types could not be inferred)
-# E0602: Undefined variable 'XX'
-# F0401: Unable to import 'XX'
-# R0201: Method could be a function
-# R0801: Similar lines in N files
-# W0102: Dangerous default value {} as argument
-# W0104: Statement seems to have no effect
-# W0105: String statement has no effect
-# W0108: Lambda may not be necessary
-# W0201: Attribute 'XX' defined outside __init__
-# W0212: Access to a protected member XX of a client class
-# W0221: Arguments number differs from overridden method
-# W0223: Method 'XX' is abstract in class 'YY' but is not overridden
-# W0231: __init__ method from base class 'XX' is not called
-# W0301: Unnecessary semicolon
-# W0311: Bad indentation. Found NN spaces, expected NN
-# W0401: Wildcard import XX
-# W0402: Uses of a deprecated module 'string'
-# W0403: Relative import 'XX', should be 'YY.XX'
-# W0404: Reimport 'XX' (imported line NN)
-# W0601: Global variable 'XX' undefined at the module level
-# W0602: Using global for 'XX' but no assignment is done
-# W0611: Unused import pprint
-# W0612: Unused variable 'XX'
-# W0613: Unused argument 'XX'
-# W0614: Unused import XX from wildcard import
-# W0621: Redefining name 'XX' from outer scope (line NN)
-# W0622: Redefining built-in 'NN'
-# W0631: Using possibly undefined loop variable 'XX'
-# W0701: Raising a string exception
-# W0702: No exception type(s) specified
-disable=C0103,C0111,C0302,R0902,R0903,R0904,R0912,R0913,R0914,R0915,W0141,W0142,W0232,W0511,W0603,C0112,C0301,C0321,C0322,C0323,C0324,E0101,E0102,E1002,E1101,E1103,E0602,F0401,R0201,R0801,W0102,W0104,W0105,W0108,W0201,W0212,W0221,W0223,W0231,W0301,W0311,W0401,W0402,W0403,W0404,W0601,W0602,W0611,W0612,W0613,W0614,W0621,W0622,W0631,W0701,W0702
-
-
-[REPORTS]
-
-# Set the output format. Available formats are text, parseable, colorized, msvs
-# (visual studio) and html
-output-format=text
-
-# Include message's id in output
-include-ids=yes
-
-# Put messages in a separate file for each module / package specified on the
-# command line instead of printing them on stdout. Reports (if any) will be
-# written in a file name "pylint_global.[txt|html]".
-files-output=no
-
-# Tells whether to display a full report or only the messages
-reports=no
-
-# Python expression which should return a note less than 10 (10 is the highest
-# note). You have access to the variables errors warning, statement which
-# respectively contain the number of errors / warnings messages and the total
-# number of statements analyzed. This is used by the global evaluation report
-# (RP0004).
-evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)
-
-# Add a comment according to your evaluation note. This is used by the global
-# evaluation report (RP0004).
-comment=no
-
-
-[VARIABLES]
-
-# Tells whether we should check for unused import in __init__ files.
-init-import=no
-
-# A regular expression matching the beginning of the name of dummy variables
-# (i.e. not used).
-dummy-variables-rgx=_|dummy
-
-# List of additional names supposed to be defined in builtins. Remember that
-# you should avoid to define new builtins when possible.
-additional-builtins=
-
-
-[TYPECHECK]
-
-# Tells whether missing members accessed in mixin class should be ignored. A
-# mixin class is detected if its name ends with "mixin" (case insensitive).
-ignore-mixin-members=yes
-
-# List of classes names for which member attributes should not be checked
-# (useful for classes with attributes dynamically set).
-ignored-classes=SQLObject
-
-# When zope mode is activated, add a predefined set of Zope acquired attributes
-# to generated-members.
-zope=no
-
-# List of members which are set dynamically and missed by pylint inference
-# system, and so shouldn't trigger E0201 when accessed. Python regular
-# expressions are accepted.
-generated-members=REQUEST,acl_users,aq_parent
-
-
-[MISCELLANEOUS]
-
-# List of note tags to take in consideration, separated by a comma.
-notes=FIXME,XXX,TODO
-
-
-[SIMILARITIES]
-
-# Minimum lines number of a similarity.
-min-similarity-lines=4
-
-# Ignore comments when computing similarities.
-ignore-comments=yes
-
-# Ignore docstrings when computing similarities.
-ignore-docstrings=yes
-
-
-[FORMAT]
-
-# Maximum number of characters on a single line.
-max-line-length=80
-
-# Maximum number of lines in a module
-max-module-lines=1000
-
-# String used as indentation unit. This is usually " " (4 spaces) or "\t" (1
-# tab).
-indent-string='  '
-
-
-[BASIC]
-
-# Required attributes for module, separated by a comma
-required-attributes=
-
-# List of builtins function names that should not be used, separated by a comma
-bad-functions=map,filter,apply,input
-
-# Regular expression which should only match correct module names
-module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$
-
-# Regular expression which should only match correct module level names
-const-rgx=(([A-Z_][A-Z0-9_]*)|(__.*__))$
-
-# Regular expression which should only match correct class names
-class-rgx=[A-Z_][a-zA-Z0-9]+$
-
-# Regular expression which should only match correct function names
-function-rgx=[a-z_][a-z0-9_]{2,30}$
-
-# Regular expression which should only match correct method names
-method-rgx=[a-z_][a-z0-9_]{2,30}$
-
-# Regular expression which should only match correct instance attribute names
-attr-rgx=[a-z_][a-z0-9_]{2,30}$
-
-# Regular expression which should only match correct argument names
-argument-rgx=[a-z_][a-z0-9_]{2,30}$
-
-# Regular expression which should only match correct variable names
-variable-rgx=[a-z_][a-z0-9_]{2,30}$
-
-# Regular expression which should only match correct list comprehension /
-# generator expression variable names
-inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$
-
-# Good variable names which should always be accepted, separated by a comma
-good-names=i,j,k,ex,Run,_
-
-# Bad variable names which should always be refused, separated by a comma
-bad-names=foo,bar,baz,toto,tutu,tata
-
-# Regular expression which should only match functions or classes name which do
-# not require a docstring
-no-docstring-rgx=__.*__
-
-
-[DESIGN]
-
-# Maximum number of arguments for function / method
-max-args=5
-
-# Argument names that match this expression will be ignored. Default to name
-# with leading underscore
-ignored-argument-names=_.*
-
-# Maximum number of locals for function / method body
-max-locals=15
-
-# Maximum number of return / yield for function / method body
-max-returns=6
-
-# Maximum number of branch for function / method body
-max-branchs=12
-
-# Maximum number of statements in function / method body
-max-statements=50
-
-# Maximum number of parents for a class (see R0901).
-max-parents=7
-
-# Maximum number of attributes for a class (see R0902).
-max-attributes=7
-
-# Minimum number of public methods for a class (see R0903).
-min-public-methods=2
-
-# Maximum number of public methods for a class (see R0904).
-max-public-methods=20
-
-
-[CLASSES]
-
-# List of interface methods to ignore, separated by a comma. This is used for
-# instance to not check methods defines in Zope's Interface base class.
-ignore-iface-methods=isImplementedBy,deferred,extends,names,namesAndDescriptions,queryDescriptionFor,getBases,getDescriptionFor,getDoc,getName,getTaggedValue,getTaggedValueTags,isEqualOrExtendedBy,setTaggedValue,isImplementedByInstancesOf,adaptWith,is_implemented_by
-
-# List of method names used to declare (i.e. assign) instance attributes.
-defining-attr-methods=__init__,__new__,setUp
-
-# List of valid names for the first argument in a class method.
-valid-classmethod-first-arg=cls
-
-
-[IMPORTS]
-
-# Deprecated modules which should not be used, separated by a comma
-deprecated-modules=regsub,string,TERMIOS,Bastion,rexec
-
-# Create a graph of every (i.e. internal and external) dependencies in the
-# given file (report RP0402 must not be disabled)
-import-graph=
-
-# Create a graph of external dependencies in the given file (report RP0402 must
-# not be disabled)
-ext-import-graph=
-
-# Create a graph of internal dependencies in the given file (report RP0402 must
-# not be disabled)
-int-import-graph=
-
-
-[EXCEPTIONS]
-
-# Exceptions that will emit a warning when being caught. Defaults to
-# "Exception"
-overgeneral-exceptions=Exception
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/samples/samples	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,81 +0,0 @@
-#!/usr/bin/python
-
-# Copyright (c) 2009 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import os.path
-import shutil
-import sys
-
-
-gyps = [
-    'app/app.gyp',
-    'base/base.gyp',
-    'build/temp_gyp/googleurl.gyp',
-    'build/all.gyp',
-    'build/common.gypi',
-    'build/external_code.gypi',
-    'chrome/test/security_tests/security_tests.gyp',
-    'chrome/third_party/hunspell/hunspell.gyp',
-    'chrome/chrome.gyp',
-    'media/media.gyp',
-    'net/net.gyp',
-    'printing/printing.gyp',
-    'sdch/sdch.gyp',
-    'skia/skia.gyp',
-    'testing/gmock.gyp',
-    'testing/gtest.gyp',
-    'third_party/bzip2/bzip2.gyp',
-    'third_party/icu38/icu38.gyp',
-    'third_party/libevent/libevent.gyp',
-    'third_party/libjpeg/libjpeg.gyp',
-    'third_party/libpng/libpng.gyp',
-    'third_party/libxml/libxml.gyp',
-    'third_party/libxslt/libxslt.gyp',
-    'third_party/lzma_sdk/lzma_sdk.gyp',
-    'third_party/modp_b64/modp_b64.gyp',
-    'third_party/npapi/npapi.gyp',
-    'third_party/sqlite/sqlite.gyp',
-    'third_party/zlib/zlib.gyp',
-    'v8/tools/gyp/v8.gyp',
-    'webkit/activex_shim/activex_shim.gyp',
-    'webkit/activex_shim_dll/activex_shim_dll.gyp',
-    'webkit/build/action_csspropertynames.py',
-    'webkit/build/action_cssvaluekeywords.py',
-    'webkit/build/action_jsconfig.py',
-    'webkit/build/action_makenames.py',
-    'webkit/build/action_maketokenizer.py',
-    'webkit/build/action_useragentstylesheets.py',
-    'webkit/build/rule_binding.py',
-    'webkit/build/rule_bison.py',
-    'webkit/build/rule_gperf.py',
-    'webkit/tools/test_shell/test_shell.gyp',
-    'webkit/webkit.gyp',
-]
-
-
-def Main(argv):
-  if len(argv) != 3 or argv[1] not in ['push', 'pull']:
-    print 'Usage: %s push/pull PATH_TO_CHROME' % argv[0]
-    return 1
-
-  path_to_chrome = argv[2]
-
-  for g in gyps:
-    chrome_file = os.path.join(path_to_chrome, g)
-    local_file = os.path.join(os.path.dirname(argv[0]), os.path.split(g)[1])
-    if argv[1] == 'push':
-      print 'Copying %s to %s' % (local_file, chrome_file)
-      shutil.copyfile(local_file, chrome_file)
-    elif argv[1] == 'pull':
-      print 'Copying %s to %s' % (chrome_file, local_file)
-      shutil.copyfile(chrome_file, local_file)
-    else:
-      assert False
-
-  return 0
-
-
-if __name__ == '__main__':
-  sys.exit(Main(sys.argv))
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/samples/samples.bat	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-@rem Copyright (c) 2009 Google Inc. All rights reserved.
-@rem Use of this source code is governed by a BSD-style license that can be
-@rem found in the LICENSE file.
-
-@python %~dp0/samples %*
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/setup.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,26 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright (c) 2009 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-from distutils.core import setup
-from distutils.command.install import install
-from distutils.command.install_lib import install_lib
-from distutils.command.install_scripts import install_scripts
-
-setup(
-  name='gyp',
-  version='0.1',
-  description='Generate Your Projects',
-  author='Chromium Authors',
-  author_email='chromium-dev@googlegroups.com',
-  url='http://code.google.com/p/gyp',
-  package_dir = {'': 'pylib'},
-  packages=['gyp', 'gyp.generator'],
-
-  scripts = ['gyp'],
-  cmdclass = {'install': install,
-              'install_lib': install_lib,
-              'install_scripts': install_scripts},
-)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/README	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,15 +0,0 @@
-pretty_vcproj:
-  Usage: pretty_vcproj.py "c:\path\to\vcproj.vcproj" [key1=value1] [key2=value2]
-
-  They key/value pair are used to resolve vsprops name.
-
-  For example, if I want to diff the base.vcproj project:
-
-  pretty_vcproj.py z:\dev\src-chrome\src\base\build\base.vcproj "$(SolutionDir)=z:\dev\src-chrome\src\chrome\\" "$(CHROMIUM_BUILD)=" "$(CHROME_BUILD_TYPE)=" > orignal.txt
-  pretty_vcproj.py z:\dev\src-chrome\src\base\base_gyp.vcproj "$(SolutionDir)=z:\dev\src-chrome\src\chrome\\" "$(CHROMIUM_BUILD)=" "$(CHROME_BUILD_TYPE)=" > gyp.txt
-
-  And you can use your favorite diff tool to see the changes.
-
-  Note: In the case of base.vcproj, the original vcproj is one level up the generated one.
-        I suggest you do a search and replace for '"..\' and replace it with '"' in original.txt
-        before you perform the diff.
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/Xcode/README	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-Specifications contains syntax formatters for Xcode 3. These do not appear to be supported yet on Xcode 4. To use these with Xcode 3 please install both the gyp.pbfilespec and gyp.xclangspec files in
-
-~/Library/Application Support/Developer/Shared/Xcode/Specifications/
-
-and restart Xcode.
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/Xcode/Specifications/gyp.pbfilespec	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-/*
-	gyp.pbfilespec
-	GYP source file spec for Xcode 3
-
-	There is not much documentation available regarding the format
-	of .pbfilespec files. As a starting point, see for instance the
-	outdated documentation at:
-	http://maxao.free.fr/xcode-plugin-interface/specifications.html
-	and the files in:
-	/Developer/Library/PrivateFrameworks/XcodeEdit.framework/Versions/A/Resources/
-
-	Place this file in directory:
-	~/Library/Application Support/Developer/Shared/Xcode/Specifications/
-*/
-
-(
-	{
-		Identifier = sourcecode.gyp;
-		BasedOn = sourcecode;
-		Name = "GYP Files";
-		Extensions = ("gyp", "gypi");
-		MIMETypes = ("text/gyp");
-		Language = "xcode.lang.gyp";
-		IsTextFile = YES;
-		IsSourceFile = YES;
-	}
-)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/Xcode/Specifications/gyp.xclangspec	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,226 +0,0 @@
-/*
-	Copyright (c) 2011 Google Inc. All rights reserved.
-	Use of this source code is governed by a BSD-style license that can be
-	found in the LICENSE file.
-	
-	gyp.xclangspec
-	GYP language specification for Xcode 3
-
-	There is not much documentation available regarding the format
-	of .xclangspec files. As a starting point, see for instance the
-	outdated documentation at:
-	http://maxao.free.fr/xcode-plugin-interface/specifications.html
-	and the files in:
-	/Developer/Library/PrivateFrameworks/XcodeEdit.framework/Versions/A/Resources/
-
-	Place this file in directory:
-	~/Library/Application Support/Developer/Shared/Xcode/Specifications/
-*/
-
-(
-
-    {
-        Identifier = "xcode.lang.gyp.keyword";
-        Syntax = {
-            Words = (
-                "and",
-                "or",
-                "<!",
-                "<",
-             );
-            Type = "xcode.syntax.keyword";
-        };
-    },
-
-    {
-        Identifier = "xcode.lang.gyp.target.declarator";
-        Syntax = {
-        	Words = (
-        		"'target_name'",
-        	);
-            Type = "xcode.syntax.identifier.type";
-        };
-    },
-
-	{
-		Identifier = "xcode.lang.gyp.string.singlequote";
-		Syntax = {
-			IncludeRules = (
-				"xcode.lang.string",
-				"xcode.lang.gyp.keyword",
-				"xcode.lang.number",
-			);
-			Start = "'";
-			End = "'";
-		};
-	},
-	
-	{
-		Identifier = "xcode.lang.gyp.comma";
-		Syntax = {
-			Words = ( ",", );
-			
-		};
-	},
-
-	{
-		Identifier = "xcode.lang.gyp";
-		Description = "GYP Coloring";
-		BasedOn = "xcode.lang.simpleColoring";
-		IncludeInMenu = YES;
-		Name = "GYP";
-		Syntax = {
-			Tokenizer = "xcode.lang.gyp.lexer.toplevel";
-			IncludeRules = (
-				"xcode.lang.gyp.dictionary",
-			);
-			Type = "xcode.syntax.plain";
-		};
-	},
-
-	// The following rule returns tokens to the other rules
-	{
-		Identifier = "xcode.lang.gyp.lexer";
-		Syntax = {
-			IncludeRules = (
-				"xcode.lang.gyp.comment",
-				"xcode.lang.string",
-				'xcode.lang.gyp.targetname.declarator',
-				"xcode.lang.gyp.string.singlequote",
-				"xcode.lang.number",
-				"xcode.lang.gyp.comma",
-			);
-		};
-	},
-
-	{
-		Identifier = "xcode.lang.gyp.lexer.toplevel";
-		Syntax = {
-			IncludeRules = (
-				"xcode.lang.gyp.comment",
-			);
-		};
-	},
-
-	{
-        Identifier = "xcode.lang.gyp.assignment";
-        Syntax = {
-            Tokenizer = "xcode.lang.gyp.lexer";
-            Rules = (
-            	"xcode.lang.gyp.assignment.lhs",
-            	":",
-                "xcode.lang.gyp.assignment.rhs",
-            );
-        };
-       
-    },
-    
-    {
-        Identifier = "xcode.lang.gyp.target.declaration";
-        Syntax = {
-            Tokenizer = "xcode.lang.gyp.lexer";
-            Rules = (
-                "xcode.lang.gyp.target.declarator",
-                ":",
-                "xcode.lang.gyp.target.name",
-            );
-        };
-   },
-   
-   {
-        Identifier = "xcode.lang.gyp.target.name";
-        Syntax = {
-            Tokenizer = "xcode.lang.gyp.lexer";
-            Rules = (
-                "xcode.lang.gyp.string.singlequote",
-            );
-        	Type = "xcode.syntax.definition.function";
-        };
-    },
-    
-	{
-        Identifier = "xcode.lang.gyp.assignment.lhs";
-        Syntax = {
-            Tokenizer = "xcode.lang.gyp.lexer";
-            Rules = (
-            	"xcode.lang.gyp.string.singlequote",
-            );
-         	Type = "xcode.syntax.identifier.type";
-        };
-    },
-    
-    {
-        Identifier = "xcode.lang.gyp.assignment.rhs";
-        Syntax = {
-        	Tokenizer = "xcode.lang.gyp.lexer";
-            Rules = (
-            	"xcode.lang.gyp.string.singlequote?",
-                "xcode.lang.gyp.array?",
-				"xcode.lang.gyp.dictionary?",
-				"xcode.lang.number?",
-            );
-        };
-    },
-
-	{
-		Identifier = "xcode.lang.gyp.dictionary";
-		Syntax = {
-			Tokenizer = "xcode.lang.gyp.lexer";
-			Start = "{";
-			End = "}";
-			Foldable = YES;
-			Recursive = YES;
-			IncludeRules = (
-				"xcode.lang.gyp.target.declaration",
-				"xcode.lang.gyp.assignment",
-			);
-		};
-	},
-
-	{
-		Identifier = "xcode.lang.gyp.array";
-		Syntax = {
-			Tokenizer = "xcode.lang.gyp.lexer";
-			Start = "[";
-			End = "]";
-			Foldable = YES;
-			Recursive = YES;
-			IncludeRules = (
-				"xcode.lang.gyp.array",
-				"xcode.lang.gyp.dictionary",
-				"xcode.lang.gyp.string.singlequote",
-			);
-		};
-	},
-
-    {
-        Identifier = "xcode.lang.gyp.todo.mark";
-        Syntax = {
-            StartChars = "T";
-            Match = (
-                "^\(TODO\(.*\):[ \t]+.*\)$",       // include "TODO: " in the markers list
-            );
-            // This is the order of captures. All of the match strings above need the same order.
-            CaptureTypes = (
-                "xcode.syntax.mark"
-            );
-            Type = "xcode.syntax.comment";
-        };
-    },
-
-	{
-		Identifier = "xcode.lang.gyp.comment";
-		BasedOn = "xcode.lang.comment"; // for text macros
-		Syntax = {
-			Start = "#";
-			End = "\n";
-			IncludeRules = (
-				"xcode.lang.url",
-				"xcode.lang.url.mail",
-				"xcode.lang.comment.mark",
-				"xcode.lang.gyp.todo.mark",
-			);
-			Type = "xcode.syntax.comment";
-		};
-	},
-)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/README	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,12 +0,0 @@
-How to install gyp-mode for emacs:
-
-Add the following to your ~/.emacs (replace ... with the path to your gyp
-checkout).
-
-(setq load-path (cons ".../tools/emacs" load-path))
-(require 'gyp)
-
-Restart emacs (or eval-region the added lines) and you should be all set.
-
-Please note that ert is required for running the tests, which is included in
-Emacs 24, or available separately from https://github.com/ohler/ert
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/gyp-tests.el	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,63 +0,0 @@
-;;; gyp-tests.el - unit tests for gyp-mode.
-
-;; Copyright (c) 2012 Google Inc. All rights reserved.
-;; Use of this source code is governed by a BSD-style license that can be
-;; found in the LICENSE file.
-
-;; The recommended way to run these tests is to run them from the command-line,
-;; with the run-unit-tests.sh script.
-
-(require 'cl)
-(require 'ert)
-(require 'gyp)
-
-(defconst samples (directory-files "testdata" t ".gyp$")
-  "List of golden samples to check")
-
-(defun fontify (filename)
-  (with-temp-buffer
-    (insert-file-contents-literally filename)
-    (gyp-mode)
-    (font-lock-fontify-buffer)
-    (buffer-string)))
-
-(defun read-golden-sample (filename)
-  (with-temp-buffer
-    (insert-file-contents-literally (concat filename ".fontified"))
-    (read (current-buffer))))
-
-(defun equivalent-face (face)
-  "For the purposes of face comparison, we're not interested in the
-   differences between certain faces. For example, the difference between
-   font-lock-comment-delimiter and font-lock-comment-face."
-  (case face
-    ((font-lock-comment-delimiter-face) font-lock-comment-face)
-    (t face)))
-
-(defun text-face-properties (s)
-  "Extract the text properties from s"
-  (let ((result (list t)))
-    (dotimes (i (length s))
-      (setq result (cons (equivalent-face (get-text-property i 'face s))
-                         result)))
-    (nreverse result)))
-
-(ert-deftest test-golden-samples ()
-  "Check that fontification produces the same results as the golden samples"
-  (dolist (sample samples)
-    (let ((golden (read-golden-sample sample))
-          (fontified (fontify sample)))
-      (should (equal golden fontified))
-      (should (equal (text-face-properties golden)
-                     (text-face-properties fontified))))))
-
-(defun create-golden-sample (filename)
-  "Create a golden sample by fontifying filename and writing out the printable
-   representation of the fontified buffer (with text properties) to the
-   FILENAME.fontified"
-  (with-temp-file (concat filename ".fontified")
-    (print (fontify filename) (current-buffer))))
-
-(defun create-golden-samples ()
-  "Recreate the golden samples"
-  (dolist (sample samples) (create-golden-sample sample)))
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/gyp.el	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,251 +0,0 @@
-;;; gyp.el - font-lock-mode support for gyp files.
-
-;; Copyright (c) 2012 Google Inc. All rights reserved.
-;; Use of this source code is governed by a BSD-style license that can be
-;; found in the LICENSE file.
-
-;; Put this somewhere in your load-path and
-;; (require 'gyp)
-
-(require 'python)
-(require 'cl)
-
-(when (string-match "python-mode.el" (symbol-file 'python-mode 'defun))
-  (error (concat "python-mode must be loaded from python.el (bundled with "
-                 "recent emacsen), not from the older and less maintained "
-                 "python-mode.el")))
-
-(defadvice python-calculate-indentation (after ami-outdent-closing-parens
-                                               activate)
-  "De-indent closing parens, braces, and brackets in gyp-mode."
-  (if (and (eq major-mode 'gyp-mode)
-           (string-match "^ *[])}][],)}]* *$"
-                         (buffer-substring-no-properties
-                          (line-beginning-position) (line-end-position))))
-      (setq ad-return-value (- ad-return-value 2))))
-
-(define-derived-mode gyp-mode python-mode "Gyp"
-  "Major mode for editing .gyp files. See http://code.google.com/p/gyp/"
-  ;; gyp-parse-history is a stack of (POSITION . PARSE-STATE) tuples,
-  ;; with greater positions at the top of the stack. PARSE-STATE
-  ;; is a list of section symbols (see gyp-section-name and gyp-parse-to)
-  ;; with most nested section symbol at the front of the list.
-  (set (make-local-variable 'gyp-parse-history) '((1 . (list))))
-  (gyp-add-font-lock-keywords))
-
-(defun gyp-set-indentation ()
-  "Hook function to configure python indentation to suit gyp mode."
-  (setq python-continuation-offset 2
-        python-indent 2
-        python-guess-indent nil))
-
-(add-hook 'gyp-mode-hook 'gyp-set-indentation)
-
-(add-to-list 'auto-mode-alist '("\\.gyp\\'" . gyp-mode))
-(add-to-list 'auto-mode-alist '("\\.gypi\\'" . gyp-mode))
-
-;;; Font-lock support
-
-(defconst gyp-dependencies-regexp
-  (regexp-opt (list "dependencies" "export_dependent_settings"))
-  "Regular expression to introduce 'dependencies' section")
-
-(defconst gyp-sources-regexp
-  (regexp-opt (list "action" "files" "include_dirs" "includes" "inputs"
-                    "libraries" "outputs" "sources"))
-  "Regular expression to introduce 'sources' sections")
-
-(defconst gyp-conditions-regexp
-  (regexp-opt (list "conditions" "target_conditions"))
-  "Regular expression to introduce conditions sections")
-
-(defconst gyp-variables-regexp
-  "^variables"
-  "Regular expression to introduce variables sections")
-
-(defconst gyp-defines-regexp
-  "^defines"
-  "Regular expression to introduce 'defines' sections")
-
-(defconst gyp-targets-regexp
-  "^targets"
-  "Regular expression to introduce 'targets' sections")
-
-(defun gyp-section-name (section)
-  "Map the sections we are interested in from SECTION to symbol.
-
-   SECTION is a string from the buffer that introduces a section.  The result is
-   a symbol representing the kind of section.
-
-   This allows us to treat (for the purposes of font-lock) several different
-   section names as the same kind of section. For example, a 'sources section
-   can be introduced by the 'sources', 'inputs', 'outputs' keyword.
-
-   'other is the default section kind when a more specific match is not made."
-  (cond ((string-match-p gyp-dependencies-regexp section) 'dependencies)
-        ((string-match-p gyp-sources-regexp section) 'sources)
-        ((string-match-p gyp-variables-regexp section) 'variables)
-        ((string-match-p gyp-conditions-regexp section) 'conditions)
-        ((string-match-p gyp-targets-regexp section) 'targets)
-        ((string-match-p gyp-defines-regexp section) 'defines)
-        (t 'other)))
-
-(defun gyp-invalidate-parse-states-after (target-point)
-  "Erase any parse information after target-point."
-  (while (> (caar gyp-parse-history) target-point)
-    (setq gyp-parse-history (cdr gyp-parse-history))))
-
-(defun gyp-parse-point ()
-  "The point of the last parse state added by gyp-parse-to."
-  (caar gyp-parse-history))
-
-(defun gyp-parse-sections ()
-  "A list of section symbols holding at the last parse state point."
-  (cdar gyp-parse-history))
-
-(defun gyp-inside-dictionary-p ()
-  "Predicate returning true if the parser is inside a dictionary."
-  (not (eq (cadar gyp-parse-history) 'list)))
-
-(defun gyp-add-parse-history (point sections)
-  "Add parse state SECTIONS to the parse history at POINT so that parsing can be
-   resumed instantly."
-  (while (>= (caar gyp-parse-history) point)
-    (setq gyp-parse-history (cdr gyp-parse-history)))
-  (setq gyp-parse-history (cons (cons point sections) gyp-parse-history)))
-
-(defun gyp-parse-to (target-point)
-  "Parses from (point) to TARGET-POINT adding the parse state information to
-   gyp-parse-state-history. Parsing stops if TARGET-POINT is reached or if a
-   string literal has been parsed. Returns nil if no further parsing can be
-   done, otherwise returns the position of the start of a parsed string, leaving
-   the point at the end of the string."
-  (let ((parsing t)
-        string-start)
-    (while parsing
-      (setq string-start nil)
-      ;; Parse up to a character that starts a sexp, or if the nesting
-      ;; level decreases.
-      (let ((state (parse-partial-sexp (gyp-parse-point)
-                                       target-point
-                                       -1
-                                       t))
-            (sections (gyp-parse-sections)))
-        (if (= (nth 0 state) -1)
-            (setq sections (cdr sections)) ; pop out a level
-          (cond ((looking-at-p "['\"]") ; a string
-                 (setq string-start (point))
-                 (goto-char (scan-sexps (point) 1))
-                 (if (gyp-inside-dictionary-p)
-                     ;; Look for sections inside a dictionary
-                     (let ((section (gyp-section-name
-                                     (buffer-substring-no-properties
-                                      (+ 1 string-start)
-                                      (- (point) 1)))))
-                       (setq sections (cons section (cdr sections)))))
-                 ;; Stop after the string so it can be fontified.
-                 (setq target-point (point)))
-                ((looking-at-p "{")
-                 ;; Inside a dictionary. Increase nesting.
-                 (forward-char 1)
-                 (setq sections (cons 'unknown sections)))
-                ((looking-at-p "\\[")
-                 ;; Inside a list. Increase nesting
-                 (forward-char 1)
-                 (setq sections (cons 'list sections)))
-                ((not (eobp))
-                 ;; other
-                 (forward-char 1))))
-        (gyp-add-parse-history (point) sections)
-        (setq parsing (< (point) target-point))))
-    string-start))
-
-(defun gyp-section-at-point ()
-  "Transform the last parse state, which is a list of nested sections and return
-   the section symbol that should be used to determine font-lock information for
-   the string. Can return nil indicating the string should not have any attached
-   section."
-  (let ((sections (gyp-parse-sections)))
-    (cond
-     ((eq (car sections) 'conditions)
-      ;; conditions can occur in a variables section, but we still want to
-      ;; highlight it as a keyword.
-      nil)
-     ((and (eq (car sections) 'list)
-           (eq (cadr sections) 'list))
-      ;; conditions and sources can have items in [[ ]]
-      (caddr sections))
-     (t (cadr sections)))))
-
-(defun gyp-section-match (limit)
-  "Parse from (point) to LIMIT returning by means of match data what was
-   matched. The group of the match indicates what style font-lock should apply.
-   See also `gyp-add-font-lock-keywords'."
-  (gyp-invalidate-parse-states-after (point))
-  (let ((group nil)
-        (string-start t))
-    (while (and (< (point) limit)
-                (not group)
-                string-start)
-      (setq string-start (gyp-parse-to limit))
-      (if string-start
-          (setq group (case (gyp-section-at-point)
-                        ('dependencies 1)
-                        ('variables 2)
-                        ('conditions 2)
-                        ('sources 3)
-                        ('defines 4)
-                        (nil nil)))))
-    (if group
-        (progn
-          ;; Set the match data to indicate to the font-lock mechanism the
-          ;; highlighting to be performed.
-          (set-match-data (append (list string-start (point))
-                                  (make-list (* (1- group) 2) nil)
-                                  (list (1+ string-start) (1- (point)))))
-          t))))
-
-;;; Please see http://code.google.com/p/gyp/wiki/GypLanguageSpecification for
-;;; canonical list of keywords.
-(defun gyp-add-font-lock-keywords ()
-  "Add gyp-mode keywords to font-lock mechanism."
-  ;; TODO(jknotten): Move all the keyword highlighting into gyp-section-match
-  ;; so that we can do the font-locking in a single font-lock pass.
-  (font-lock-add-keywords
-   nil
-   (list
-    ;; Top-level keywords
-    (list (concat "['\"]\\("
-              (regexp-opt (list "action" "action_name" "actions" "cflags"
-                                "conditions" "configurations" "copies" "defines"
-                                "dependencies" "destination"
-                                "direct_dependent_settings"
-                                "export_dependent_settings" "extension" "files"
-                                "include_dirs" "includes" "inputs" "libraries"
-                                "link_settings" "mac_bundle" "message"
-                                "msvs_external_rule" "outputs" "product_name"
-                                "process_outputs_as_sources" "rules" "rule_name"
-                                "sources" "suppress_wildcard"
-                                "target_conditions" "target_defaults"
-                                "target_defines" "target_name" "toolsets"
-                                "targets" "type" "variables" "xcode_settings"))
-              "[!/+=]?\\)") 1 'font-lock-keyword-face t)
-    ;; Type of target
-    (list (concat "['\"]\\("
-              (regexp-opt (list "loadable_module" "static_library"
-                                "shared_library" "executable" "none"))
-              "\\)") 1 'font-lock-type-face t)
-    (list "\\(?:target\\|action\\)_name['\"]\\s-*:\\s-*['\"]\\([^ '\"]*\\)" 1
-          'font-lock-function-name-face t)
-    (list 'gyp-section-match
-          (list 1 'font-lock-function-name-face t t) ; dependencies
-          (list 2 'font-lock-variable-name-face t t) ; variables, conditions
-          (list 3 'font-lock-constant-face t t) ; sources
-          (list 4 'font-lock-preprocessor-face t t)) ; preprocessor
-    ;; Variable expansion
-    (list "<@?(\\([^\n )]+\\))" 1 'font-lock-variable-name-face t)
-    ;; Command expansion
-    (list "<!@?(\\([^\n )]+\\))" 1 'font-lock-variable-name-face t)
-    )))
-
-(provide 'gyp)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/run-unit-tests.sh	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,7 +0,0 @@
-#!/bin/sh
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-emacs --no-site-file --no-init-file --batch \
-      --load ert.el --load gyp.el --load gyp-tests.el \
-      -f ert-run-tests-batch-and-exit
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/testdata/media.gyp	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1105 +0,0 @@
-# Copyright (c) 2012 The Chromium Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-{
-  'variables': {
-    'chromium_code': 1,
-    # Override to dynamically link the PulseAudio library.
-    'use_pulseaudio%': 0,
-    # Override to dynamically link the cras (ChromeOS audio) library.
-    'use_cras%': 0,
-  },
-  'targets': [
-    {
-      'target_name': 'media',
-      'type': '<(component)',
-      'dependencies': [
-        'yuv_convert',
-        '../base/base.gyp:base',
-        '../base/third_party/dynamic_annotations/dynamic_annotations.gyp:dynamic_annotations',
-        '../build/temp_gyp/googleurl.gyp:googleurl',
-        '../crypto/crypto.gyp:crypto',
-        '../third_party/openmax/openmax.gyp:il',
-        '../ui/ui.gyp:ui',
-      ],
-      'defines': [
-        'MEDIA_IMPLEMENTATION',
-      ],
-      'include_dirs': [
-        '..',
-      ],
-      'sources': [
-        'audio/android/audio_manager_android.cc',
-        'audio/android/audio_manager_android.h',
-        'audio/android/audio_track_output_android.cc',
-        'audio/android/audio_track_output_android.h',
-        'audio/android/opensles_input.cc',
-        'audio/android/opensles_input.h',
-        'audio/android/opensles_output.cc',
-        'audio/android/opensles_output.h',
-        'audio/async_socket_io_handler.h',
-        'audio/async_socket_io_handler_posix.cc',
-        'audio/async_socket_io_handler_win.cc',
-        'audio/audio_buffers_state.cc',
-        'audio/audio_buffers_state.h',
-        'audio/audio_io.h',
-        'audio/audio_input_controller.cc',
-        'audio/audio_input_controller.h',
-        'audio/audio_input_stream_impl.cc',
-        'audio/audio_input_stream_impl.h',
-        'audio/audio_device_name.cc',
-        'audio/audio_device_name.h',
-        'audio/audio_manager.cc',
-        'audio/audio_manager.h',
-        'audio/audio_manager_base.cc',
-        'audio/audio_manager_base.h',
-        'audio/audio_output_controller.cc',
-        'audio/audio_output_controller.h',
-        'audio/audio_output_dispatcher.cc',
-        'audio/audio_output_dispatcher.h',
-        'audio/audio_output_dispatcher_impl.cc',
-        'audio/audio_output_dispatcher_impl.h',
-        'audio/audio_output_mixer.cc',
-        'audio/audio_output_mixer.h',
-        'audio/audio_output_proxy.cc',
-        'audio/audio_output_proxy.h',
-        'audio/audio_parameters.cc',
-        'audio/audio_parameters.h',
-        'audio/audio_util.cc',
-        'audio/audio_util.h',
-        'audio/cross_process_notification.cc',
-        'audio/cross_process_notification.h',
-        'audio/cross_process_notification_win.cc',
-        'audio/cross_process_notification_posix.cc',
-        'audio/fake_audio_input_stream.cc',
-        'audio/fake_audio_input_stream.h',
-        'audio/fake_audio_output_stream.cc',
-        'audio/fake_audio_output_stream.h',
-        'audio/linux/audio_manager_linux.cc',
-        'audio/linux/audio_manager_linux.h',
-        'audio/linux/alsa_input.cc',
-        'audio/linux/alsa_input.h',
-        'audio/linux/alsa_output.cc',
-        'audio/linux/alsa_output.h',
-        'audio/linux/alsa_util.cc',
-        'audio/linux/alsa_util.h',
-        'audio/linux/alsa_wrapper.cc',
-        'audio/linux/alsa_wrapper.h',
-        'audio/linux/cras_output.cc',
-        'audio/linux/cras_output.h',
-        'audio/openbsd/audio_manager_openbsd.cc',
-        'audio/openbsd/audio_manager_openbsd.h',
-        'audio/mac/audio_input_mac.cc',
-        'audio/mac/audio_input_mac.h',
-        'audio/mac/audio_low_latency_input_mac.cc',
-        'audio/mac/audio_low_latency_input_mac.h',
-        'audio/mac/audio_low_latency_output_mac.cc',
-        'audio/mac/audio_low_latency_output_mac.h',
-        'audio/mac/audio_manager_mac.cc',
-        'audio/mac/audio_manager_mac.h',
-        'audio/mac/audio_output_mac.cc',
-        'audio/mac/audio_output_mac.h',
-        'audio/null_audio_sink.cc',
-        'audio/null_audio_sink.h',
-        'audio/pulse/pulse_output.cc',
-        'audio/pulse/pulse_output.h',
-        'audio/sample_rates.cc',
-        'audio/sample_rates.h',
-        'audio/simple_sources.cc',
-        'audio/simple_sources.h',
-        'audio/win/audio_low_latency_input_win.cc',
-        'audio/win/audio_low_latency_input_win.h',
-        'audio/win/audio_low_latency_output_win.cc',
-        'audio/win/audio_low_latency_output_win.h',
-        'audio/win/audio_manager_win.cc',
-        'audio/win/audio_manager_win.h',
-        'audio/win/avrt_wrapper_win.cc',
-        'audio/win/avrt_wrapper_win.h',
-        'audio/win/device_enumeration_win.cc',
-        'audio/win/device_enumeration_win.h',
-        'audio/win/wavein_input_win.cc',
-        'audio/win/wavein_input_win.h',
-        'audio/win/waveout_output_win.cc',
-        'audio/win/waveout_output_win.h',
-        'base/android/media_jni_registrar.cc',
-        'base/android/media_jni_registrar.h',
-        'base/audio_decoder.cc',
-        'base/audio_decoder.h',
-        'base/audio_decoder_config.cc',
-        'base/audio_decoder_config.h',
-        'base/audio_renderer.h',
-        'base/audio_renderer_mixer.cc',
-        'base/audio_renderer_mixer.h',
-        'base/audio_renderer_mixer_input.cc',
-        'base/audio_renderer_mixer_input.h',
-        'base/bitstream_buffer.h',
-        'base/buffers.cc',
-        'base/buffers.h',
-        'base/byte_queue.cc',
-        'base/byte_queue.h',
-        'base/channel_layout.cc',
-        'base/channel_layout.h',
-        'base/clock.cc',
-        'base/clock.h',
-        'base/composite_filter.cc',
-        'base/composite_filter.h',
-        'base/data_buffer.cc',
-        'base/data_buffer.h',
-        'base/data_source.cc',
-        'base/data_source.h',
-        'base/decoder_buffer.cc',
-        'base/decoder_buffer.h',
-        'base/decrypt_config.cc',
-        'base/decrypt_config.h',
-        'base/decryptor.h',
-        'base/decryptor_client.h',
-        'base/demuxer.cc',
-        'base/demuxer.h',
-        'base/demuxer_stream.cc',
-        'base/demuxer_stream.h',
-        'base/djb2.cc',
-        'base/djb2.h',
-        'base/filter_collection.cc',
-        'base/filter_collection.h',
-        'base/filter_host.h',
-        'base/filters.cc',
-        'base/filters.h',
-        'base/h264_bitstream_converter.cc',
-        'base/h264_bitstream_converter.h',
-        'base/media.h',
-        'base/media_android.cc',
-        'base/media_export.h',
-        'base/media_log.cc',
-        'base/media_log.h',
-        'base/media_log_event.h',
-        'base/media_posix.cc',
-        'base/media_switches.cc',
-        'base/media_switches.h',
-        'base/media_win.cc',
-        'base/message_loop_factory.cc',
-        'base/message_loop_factory.h',
-        'base/pipeline.cc',
-        'base/pipeline.h',
-        'base/pipeline_status.cc',
-        'base/pipeline_status.h',
-        'base/ranges.cc',
-        'base/ranges.h',
-        'base/seekable_buffer.cc',
-        'base/seekable_buffer.h',
-        'base/state_matrix.cc',
-        'base/state_matrix.h',
-        'base/stream_parser.cc',
-        'base/stream_parser.h',
-        'base/stream_parser_buffer.cc',
-        'base/stream_parser_buffer.h',
-        'base/video_decoder.cc',
-        'base/video_decoder.h',
-        'base/video_decoder_config.cc',
-        'base/video_decoder_config.h',
-        'base/video_frame.cc',
-        'base/video_frame.h',
-        'base/video_renderer.h',
-        'base/video_util.cc',
-        'base/video_util.h',
-        'crypto/aes_decryptor.cc',
-        'crypto/aes_decryptor.h',
-        'ffmpeg/ffmpeg_common.cc',
-        'ffmpeg/ffmpeg_common.h',
-        'ffmpeg/file_protocol.cc',
-        'ffmpeg/file_protocol.h',
-        'filters/audio_file_reader.cc',
-        'filters/audio_file_reader.h',
-        'filters/audio_renderer_algorithm.cc',
-        'filters/audio_renderer_algorithm.h',
-        'filters/audio_renderer_impl.cc',
-        'filters/audio_renderer_impl.h',
-        'filters/bitstream_converter.cc',
-        'filters/bitstream_converter.h',
-        'filters/chunk_demuxer.cc',
-        'filters/chunk_demuxer.h',
-        'filters/chunk_demuxer_client.h',
-        'filters/dummy_demuxer.cc',
-        'filters/dummy_demuxer.h',
-        'filters/ffmpeg_audio_decoder.cc',
-        'filters/ffmpeg_audio_decoder.h',
-        'filters/ffmpeg_demuxer.cc',
-        'filters/ffmpeg_demuxer.h',
-        'filters/ffmpeg_h264_bitstream_converter.cc',
-        'filters/ffmpeg_h264_bitstream_converter.h',
-        'filters/ffmpeg_glue.cc',
-        'filters/ffmpeg_glue.h',
-        'filters/ffmpeg_video_decoder.cc',
-        'filters/ffmpeg_video_decoder.h',
-        'filters/file_data_source.cc',
-        'filters/file_data_source.h',
-        'filters/gpu_video_decoder.cc',
-        'filters/gpu_video_decoder.h',
-        'filters/in_memory_url_protocol.cc',
-        'filters/in_memory_url_protocol.h',
-        'filters/source_buffer_stream.cc',
-        'filters/source_buffer_stream.h',
-        'filters/video_frame_generator.cc',
-        'filters/video_frame_generator.h',
-        'filters/video_renderer_base.cc',
-        'filters/video_renderer_base.h',
-        'video/capture/fake_video_capture_device.cc',
-        'video/capture/fake_video_capture_device.h',
-        'video/capture/linux/video_capture_device_linux.cc',
-        'video/capture/linux/video_capture_device_linux.h',
-        'video/capture/mac/video_capture_device_mac.h',
-        'video/capture/mac/video_capture_device_mac.mm',
-        'video/capture/mac/video_capture_device_qtkit_mac.h',
-        'video/capture/mac/video_capture_device_qtkit_mac.mm',
-        'video/capture/video_capture.h',
-        'video/capture/video_capture_device.h',
-        'video/capture/video_capture_device_dummy.cc',
-        'video/capture/video_capture_device_dummy.h',
-        'video/capture/video_capture_proxy.cc',
-        'video/capture/video_capture_proxy.h',
-        'video/capture/video_capture_types.h',
-        'video/capture/win/filter_base_win.cc',
-        'video/capture/win/filter_base_win.h',
-        'video/capture/win/pin_base_win.cc',
-        'video/capture/win/pin_base_win.h',
-        'video/capture/win/sink_filter_observer_win.h',
-        'video/capture/win/sink_filter_win.cc',
-        'video/capture/win/sink_filter_win.h',
-        'video/capture/win/sink_input_pin_win.cc',
-        'video/capture/win/sink_input_pin_win.h',
-        'video/capture/win/video_capture_device_win.cc',
-        'video/capture/win/video_capture_device_win.h',
-        'video/picture.cc',
-        'video/picture.h',
-        'video/video_decode_accelerator.cc',
-        'video/video_decode_accelerator.h',
-        'webm/webm_constants.h',
-        'webm/webm_cluster_parser.cc',
-        'webm/webm_cluster_parser.h',
-        'webm/webm_content_encodings.cc',
-        'webm/webm_content_encodings.h',
-        'webm/webm_content_encodings_client.cc',
-        'webm/webm_content_encodings_client.h',
-        'webm/webm_info_parser.cc',
-        'webm/webm_info_parser.h',
-        'webm/webm_parser.cc',
-        'webm/webm_parser.h',
-        'webm/webm_stream_parser.cc',
-        'webm/webm_stream_parser.h',
-        'webm/webm_tracks_parser.cc',
-        'webm/webm_tracks_parser.h',
-      ],
-      'direct_dependent_settings': {
-        'include_dirs': [
-          '..',
-        ],
-      },
-      'conditions': [
-        # Android doesn't use ffmpeg, so make the dependency conditional
-        # and exclude the sources which depend on ffmpeg.
-        ['OS != "android"', {
-          'dependencies': [
-            '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg',
-          ],
-        }],
-        ['OS == "android"', {
-          'sources!': [
-            'base/media_posix.cc',
-            'ffmpeg/ffmpeg_common.cc',
-            'ffmpeg/ffmpeg_common.h',
-            'ffmpeg/file_protocol.cc',
-            'ffmpeg/file_protocol.h',
-            'filters/audio_file_reader.cc',
-            'filters/audio_file_reader.h',
-            'filters/bitstream_converter.cc',
-            'filters/bitstream_converter.h',
-            'filters/chunk_demuxer.cc',
-            'filters/chunk_demuxer.h',
-            'filters/chunk_demuxer_client.h',
-            'filters/ffmpeg_audio_decoder.cc',
-            'filters/ffmpeg_audio_decoder.h',
-            'filters/ffmpeg_demuxer.cc',
-            'filters/ffmpeg_demuxer.h',
-            'filters/ffmpeg_h264_bitstream_converter.cc',
-            'filters/ffmpeg_h264_bitstream_converter.h',
-            'filters/ffmpeg_glue.cc',
-            'filters/ffmpeg_glue.h',
-            'filters/ffmpeg_video_decoder.cc',
-            'filters/ffmpeg_video_decoder.h',
-            'filters/gpu_video_decoder.cc',
-            'filters/gpu_video_decoder.h',
-            'webm/webm_cluster_parser.cc',
-            'webm/webm_cluster_parser.h',
-            'webm/webm_stream_parser.cc',
-            'webm/webm_stream_parser.h',
-          ],
-        }],
-        # The below 'android' condition were added temporarily and should be
-        # removed in downstream, because there is no Java environment setup in
-        # upstream yet.
-        ['OS == "android"', {
-          'sources!':[
-            'audio/android/audio_track_output_android.cc',
-          ],
-          'sources':[
-            'audio/android/audio_track_output_stub_android.cc',
-          ],
-          'link_settings': {
-            'libraries': [
-              '-lOpenSLES',
-            ],
-          },
-        }],
-        ['OS=="linux" or OS=="freebsd" or OS=="solaris"', {
-          'link_settings': {
-            'libraries': [
-              '-lasound',
-            ],
-          },
-        }],
-        ['OS=="openbsd"', {
-          'sources/': [ ['exclude', '/alsa_' ],
-                        ['exclude', '/audio_manager_linux' ] ],
-          'link_settings': {
-            'libraries': [
-            ],
-          },
-        }],
-        ['OS!="openbsd"', {
-          'sources!': [
-            'audio/openbsd/audio_manager_openbsd.cc',
-            'audio/openbsd/audio_manager_openbsd.h',
-          ],
-        }],
-        ['OS=="linux"', {
-          'variables': {
-            'conditions': [
-              ['sysroot!=""', {
-                'pkg-config': '../build/linux/pkg-config-wrapper "<(sysroot)" "<(target_arch)"',
-              }, {
-                'pkg-config': 'pkg-config'
-              }],
-            ],
-          },
-          'conditions': [
-            ['use_cras == 1', {
-              'cflags': [
-                '<!@(<(pkg-config) --cflags libcras)',
-              ],
-              'link_settings': {
-                'libraries': [
-                  '<!@(<(pkg-config) --libs libcras)',
-                ],
-              },
-              'defines': [
-                'USE_CRAS',
-              ],
-            }, {  # else: use_cras == 0
-              'sources!': [
-                'audio/linux/cras_output.cc',
-                'audio/linux/cras_output.h',
-              ],
-            }],
-          ],
-        }],
-        ['os_posix == 1', {
-          'conditions': [
-            ['use_pulseaudio == 1', {
-              'cflags': [
-                '<!@(pkg-config --cflags libpulse)',
-              ],
-              'link_settings': {
-                'libraries': [
-                  '<!@(pkg-config --libs-only-l libpulse)',
-                ],
-              },
-              'defines': [
-                'USE_PULSEAUDIO',
-              ],
-            }, {  # else: use_pulseaudio == 0
-              'sources!': [
-                'audio/pulse/pulse_output.cc',
-                'audio/pulse/pulse_output.h',
-              ],
-            }],
-          ],
-        }],
-        ['os_posix == 1 and OS != "android"', {
-          # Video capture isn't supported in Android yet.
-          'sources!': [
-            'video/capture/video_capture_device_dummy.cc',
-            'video/capture/video_capture_device_dummy.h',
-          ],
-        }],
-        ['OS=="mac"', {
-          'link_settings': {
-            'libraries': [
-              '$(SDKROOT)/System/Library/Frameworks/AudioUnit.framework',
-              '$(SDKROOT)/System/Library/Frameworks/AudioToolbox.framework',
-              '$(SDKROOT)/System/Library/Frameworks/CoreAudio.framework',
-              '$(SDKROOT)/System/Library/Frameworks/CoreVideo.framework',
-              '$(SDKROOT)/System/Library/Frameworks/QTKit.framework',
-            ],
-          },
-        }],
-        ['OS=="win"', {
-          'sources!': [
-            'audio/pulse/pulse_output.cc',
-            'audio/pulse/pulse_output.h',
-            'video/capture/video_capture_device_dummy.cc',
-            'video/capture/video_capture_device_dummy.h',
-          ],
-        }],
-        ['proprietary_codecs==1 or branding=="Chrome"', {
-          'sources': [
-            'mp4/avc.cc',
-            'mp4/avc.h',
-            'mp4/box_definitions.cc',
-            'mp4/box_definitions.h',
-            'mp4/box_reader.cc',
-            'mp4/box_reader.h',
-            'mp4/cenc.cc',
-            'mp4/cenc.h',
-            'mp4/mp4_stream_parser.cc',
-            'mp4/mp4_stream_parser.h',
-            'mp4/offset_byte_queue.cc',
-            'mp4/offset_byte_queue.h',
-            'mp4/track_run_iterator.cc',
-            'mp4/track_run_iterator.h',
-          ],
-        }],
-      ],
-    },
-    {
-      'target_name': 'yuv_convert',
-      'type': 'static_library',
-      'include_dirs': [
-        '..',
-      ],
-      'conditions': [
-        ['order_profiling != 0', {
-          'target_conditions' : [
-            ['_toolset=="target"', {
-              'cflags!': [ '-finstrument-functions' ],
-            }],
-          ],
-        }],
-        [ 'target_arch == "ia32" or target_arch == "x64"', {
-          'dependencies': [
-            'yuv_convert_simd_x86',
-          ],
-        }],
-        [ 'target_arch == "arm"', {
-          'dependencies': [
-            'yuv_convert_simd_arm',
-          ],
-        }],
-      ],
-      'sources': [
-        'base/yuv_convert.cc',
-        'base/yuv_convert.h',
-      ],
-    },
-    {
-      'target_name': 'yuv_convert_simd_x86',
-      'type': 'static_library',
-      'include_dirs': [
-        '..',
-      ],
-      'sources': [
-        'base/simd/convert_rgb_to_yuv_c.cc',
-        'base/simd/convert_rgb_to_yuv_sse2.cc',
-        'base/simd/convert_rgb_to_yuv_ssse3.asm',
-        'base/simd/convert_rgb_to_yuv_ssse3.cc',
-        'base/simd/convert_rgb_to_yuv_ssse3.inc',
-        'base/simd/convert_yuv_to_rgb_c.cc',
-        'base/simd/convert_yuv_to_rgb_x86.cc',
-        'base/simd/convert_yuv_to_rgb_mmx.asm',
-        'base/simd/convert_yuv_to_rgb_mmx.inc',
-        'base/simd/convert_yuv_to_rgb_sse.asm',
-        'base/simd/filter_yuv.h',
-        'base/simd/filter_yuv_c.cc',
-        'base/simd/filter_yuv_mmx.cc',
-        'base/simd/filter_yuv_sse2.cc',
-        'base/simd/linear_scale_yuv_to_rgb_mmx.asm',
-        'base/simd/linear_scale_yuv_to_rgb_mmx.inc',
-        'base/simd/linear_scale_yuv_to_rgb_sse.asm',
-        'base/simd/scale_yuv_to_rgb_mmx.asm',
-        'base/simd/scale_yuv_to_rgb_mmx.inc',
-        'base/simd/scale_yuv_to_rgb_sse.asm',
-        'base/simd/yuv_to_rgb_table.cc',
-        'base/simd/yuv_to_rgb_table.h',
-      ],
-      'conditions': [
-        ['order_profiling != 0', {
-          'target_conditions' : [
-            ['_toolset=="target"', {
-              'cflags!': [ '-finstrument-functions' ],
-            }],
-          ],
-        }],
-        [ 'target_arch == "x64"', {
-          # Source files optimized for X64 systems.
-          'sources': [
-            'base/simd/linear_scale_yuv_to_rgb_mmx_x64.asm',
-            'base/simd/scale_yuv_to_rgb_sse2_x64.asm',
-          ],
-        }],
-        [ 'os_posix == 1 and OS != "mac" and OS != "android"', {
-          'cflags': [
-            '-msse2',
-          ],
-        }],
-        [ 'OS == "mac"', {
-          'configurations': {
-            'Debug': {
-              'xcode_settings': {
-                # gcc on the mac builds horribly unoptimized sse code in debug
-                # mode. Since this is rarely going to be debugged, run with full
-                # optimizations in Debug as well as Release.
-                'GCC_OPTIMIZATION_LEVEL': '3',  # -O3
-               },
-             },
-          },
-        }],
-        [ 'OS=="win"', {
-          'variables': {
-            'yasm_flags': [
-              '-DWIN32',
-              '-DMSVC',
-              '-DCHROMIUM',
-              '-Isimd',
-            ],
-          },
-        }],
-        [ 'OS=="mac"', {
-          'variables': {
-            'yasm_flags': [
-              '-DPREFIX',
-              '-DMACHO',
-              '-DCHROMIUM',
-              '-Isimd',
-            ],
-          },
-        }],
-        [ 'os_posix==1 and OS!="mac"', {
-          'variables': {
-            'conditions': [
-              [ 'target_arch=="ia32"', {
-                'yasm_flags': [
-                  '-DX86_32',
-                  '-DELF',
-                  '-DCHROMIUM',
-                  '-Isimd',
-                ],
-              }, {
-                'yasm_flags': [
-                  '-DARCH_X86_64',
-                  '-DELF',
-                  '-DPIC',
-                  '-DCHROMIUM',
-                  '-Isimd',
-                ],
-              }],
-            ],
-          },
-        }],
-      ],
-      'variables': {
-        'yasm_output_path': '<(SHARED_INTERMEDIATE_DIR)/media',
-      },
-      'msvs_2010_disable_uldi_when_referenced': 1,
-      'includes': [
-        '../third_party/yasm/yasm_compile.gypi',
-      ],
-    },
-    {
-      'target_name': 'yuv_convert_simd_arm',
-      'type': 'static_library',
-      'include_dirs': [
-        '..',
-      ],
-      'sources': [
-        'base/simd/convert_rgb_to_yuv_c.cc',
-        'base/simd/convert_rgb_to_yuv.h',
-        'base/simd/convert_yuv_to_rgb_c.cc',
-        'base/simd/convert_yuv_to_rgb.h',
-        'base/simd/filter_yuv.h',
-        'base/simd/filter_yuv_c.cc',
-        'base/simd/yuv_to_rgb_table.cc',
-        'base/simd/yuv_to_rgb_table.h',
-      ],
-    },
-    {
-      'target_name': 'media_unittests',
-      'type': 'executable',
-      'dependencies': [
-        'media',
-        'media_test_support',
-        'yuv_convert',
-        '../base/base.gyp:base',
-        '../base/base.gyp:base_i18n',
-        '../base/base.gyp:test_support_base',
-        '../testing/gmock.gyp:gmock',
-        '../testing/gtest.gyp:gtest',
-        '../ui/ui.gyp:ui',
-      ],
-      'sources': [
-        'audio/async_socket_io_handler_unittest.cc',
-        'audio/audio_input_controller_unittest.cc',
-        'audio/audio_input_device_unittest.cc',
-        'audio/audio_input_unittest.cc',
-        'audio/audio_input_volume_unittest.cc',
-        'audio/audio_low_latency_input_output_unittest.cc',
-        'audio/audio_output_controller_unittest.cc',
-        'audio/audio_output_proxy_unittest.cc',
-        'audio/audio_parameters_unittest.cc',
-        'audio/audio_util_unittest.cc',
-        'audio/cross_process_notification_unittest.cc',
-        'audio/linux/alsa_output_unittest.cc',
-        'audio/mac/audio_low_latency_input_mac_unittest.cc',
-        'audio/mac/audio_output_mac_unittest.cc',
-        'audio/simple_sources_unittest.cc',
-        'audio/win/audio_low_latency_input_win_unittest.cc',
-        'audio/win/audio_low_latency_output_win_unittest.cc',
-        'audio/win/audio_output_win_unittest.cc',
-        'base/audio_renderer_mixer_unittest.cc',
-        'base/audio_renderer_mixer_input_unittest.cc',
-        'base/buffers_unittest.cc',
-        'base/clock_unittest.cc',
-        'base/composite_filter_unittest.cc',
-        'base/data_buffer_unittest.cc',
-        'base/decoder_buffer_unittest.cc',
-        'base/djb2_unittest.cc',
-        'base/fake_audio_render_callback.cc',
-        'base/fake_audio_render_callback.h',
-        'base/filter_collection_unittest.cc',
-        'base/h264_bitstream_converter_unittest.cc',
-        'base/pipeline_unittest.cc',
-        'base/ranges_unittest.cc',
-        'base/run_all_unittests.cc',
-        'base/seekable_buffer_unittest.cc',
-        'base/state_matrix_unittest.cc',
-        'base/test_data_util.cc',
-        'base/test_data_util.h',
-        'base/video_frame_unittest.cc',
-        'base/video_util_unittest.cc',
-        'base/yuv_convert_unittest.cc',
-        'crypto/aes_decryptor_unittest.cc',
-        'ffmpeg/ffmpeg_common_unittest.cc',
-        'filters/audio_renderer_algorithm_unittest.cc',
-        'filters/audio_renderer_impl_unittest.cc',
-        'filters/bitstream_converter_unittest.cc',
-        'filters/chunk_demuxer_unittest.cc',
-        'filters/ffmpeg_audio_decoder_unittest.cc',
-        'filters/ffmpeg_decoder_unittest.h',
-        'filters/ffmpeg_demuxer_unittest.cc',
-        'filters/ffmpeg_glue_unittest.cc',
-        'filters/ffmpeg_h264_bitstream_converter_unittest.cc',
-        'filters/ffmpeg_video_decoder_unittest.cc',
-        'filters/file_data_source_unittest.cc',
-        'filters/pipeline_integration_test.cc',
-        'filters/pipeline_integration_test_base.cc',
-        'filters/source_buffer_stream_unittest.cc',
-        'filters/video_renderer_base_unittest.cc',
-        'video/capture/video_capture_device_unittest.cc',
-        'webm/cluster_builder.cc',
-        'webm/cluster_builder.h',
-        'webm/webm_cluster_parser_unittest.cc',
-        'webm/webm_content_encodings_client_unittest.cc',
-        'webm/webm_parser_unittest.cc',
-      ],
-      'conditions': [
-        ['os_posix==1 and OS!="mac"', {
-          'conditions': [
-            ['linux_use_tcmalloc==1', {
-              'dependencies': [
-                '../base/allocator/allocator.gyp:allocator',
-              ],
-            }],
-          ],
-        }],
-        ['OS != "android"', {
-          'dependencies': [
-            '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg',
-          ],
-        }],
-        ['OS == "android"', {
-          'sources!': [
-            'audio/audio_input_volume_unittest.cc',
-            'base/test_data_util.cc',
-            'base/test_data_util.h',
-            'ffmpeg/ffmpeg_common_unittest.cc',
-            'filters/ffmpeg_audio_decoder_unittest.cc',
-            'filters/bitstream_converter_unittest.cc',
-            'filters/chunk_demuxer_unittest.cc',
-            'filters/ffmpeg_demuxer_unittest.cc',
-            'filters/ffmpeg_glue_unittest.cc',
-            'filters/ffmpeg_h264_bitstream_converter_unittest.cc',
-            'filters/ffmpeg_video_decoder_unittest.cc',
-            'filters/pipeline_integration_test.cc',
-            'filters/pipeline_integration_test_base.cc',
-            'mp4/mp4_stream_parser_unittest.cc',
-            'webm/webm_cluster_parser_unittest.cc',
-          ],
-        }],
-        ['OS == "linux"', {
-          'conditions': [
-            ['use_cras == 1', {
-              'sources': [
-                'audio/linux/cras_output_unittest.cc',
-              ],
-              'defines': [
-                'USE_CRAS',
-              ],
-            }],
-          ],
-        }],
-        [ 'target_arch=="ia32" or target_arch=="x64"', {
-          'sources': [
-            'base/simd/convert_rgb_to_yuv_unittest.cc',
-          ],
-        }],
-        ['proprietary_codecs==1 or branding=="Chrome"', {
-          'sources': [
-            'mp4/avc_unittest.cc',
-            'mp4/box_reader_unittest.cc',
-            'mp4/mp4_stream_parser_unittest.cc',
-            'mp4/offset_byte_queue_unittest.cc',
-          ],
-        }],
-      ],
-    },
-    {
-      'target_name': 'media_test_support',
-      'type': 'static_library',
-      'dependencies': [
-        'media',
-        '../base/base.gyp:base',
-        '../testing/gmock.gyp:gmock',
-        '../testing/gtest.gyp:gtest',
-      ],
-      'sources': [
-        'audio/test_audio_input_controller_factory.cc',
-        'audio/test_audio_input_controller_factory.h',
-        'base/mock_callback.cc',
-        'base/mock_callback.h',
-        'base/mock_data_source_host.cc',
-        'base/mock_data_source_host.h',
-        'base/mock_demuxer_host.cc',
-        'base/mock_demuxer_host.h',
-        'base/mock_filter_host.cc',
-        'base/mock_filter_host.h',
-        'base/mock_filters.cc',
-        'base/mock_filters.h',
-      ],
-    },
-    {
-      'target_name': 'scaler_bench',
-      'type': 'executable',
-      'dependencies': [
-        'media',
-        'yuv_convert',
-        '../base/base.gyp:base',
-        '../skia/skia.gyp:skia',
-      ],
-      'sources': [
-        'tools/scaler_bench/scaler_bench.cc',
-      ],
-    },
-    {
-      'target_name': 'qt_faststart',
-      'type': 'executable',
-      'sources': [
-        'tools/qt_faststart/qt_faststart.c'
-      ],
-    },
-    {
-      'target_name': 'seek_tester',
-      'type': 'executable',
-      'dependencies': [
-        'media',
-        '../base/base.gyp:base',
-      ],
-      'sources': [
-        'tools/seek_tester/seek_tester.cc',
-      ],
-    },
-  ],
-  'conditions': [
-    ['OS=="win"', {
-      'targets': [
-        {
-          'target_name': 'player_wtl',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            'yuv_convert',
-            '../base/base.gyp:base',
-            '../base/third_party/dynamic_annotations/dynamic_annotations.gyp:dynamic_annotations',
-            '../ui/ui.gyp:ui',
-          ],
-          'include_dirs': [
-            '<(DEPTH)/third_party/wtl/include',
-          ],
-          'sources': [
-            'tools/player_wtl/list.h',
-            'tools/player_wtl/mainfrm.h',
-            'tools/player_wtl/movie.cc',
-            'tools/player_wtl/movie.h',
-            'tools/player_wtl/player_wtl.cc',
-            'tools/player_wtl/player_wtl.rc',
-            'tools/player_wtl/props.h',
-            'tools/player_wtl/seek.h',
-            'tools/player_wtl/resource.h',
-            'tools/player_wtl/view.h',
-          ],
-          'msvs_settings': {
-            'VCLinkerTool': {
-              'SubSystem': '2',         # Set /SUBSYSTEM:WINDOWS
-            },
-          },
-          'defines': [
-            '_CRT_SECURE_NO_WARNINGS=1',
-          ],
-        },
-      ],
-    }],
-    ['OS == "win" or toolkit_uses_gtk == 1', {
-      'targets': [
-        {
-          'target_name': 'shader_bench',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            'yuv_convert',
-            '../base/base.gyp:base',
-            '../ui/gl/gl.gyp:gl',
-          ],
-          'sources': [
-            'tools/shader_bench/shader_bench.cc',
-            'tools/shader_bench/cpu_color_painter.cc',
-            'tools/shader_bench/cpu_color_painter.h',
-            'tools/shader_bench/gpu_color_painter.cc',
-            'tools/shader_bench/gpu_color_painter.h',
-            'tools/shader_bench/gpu_painter.cc',
-            'tools/shader_bench/gpu_painter.h',
-            'tools/shader_bench/painter.cc',
-            'tools/shader_bench/painter.h',
-            'tools/shader_bench/window.cc',
-            'tools/shader_bench/window.h',
-          ],
-          'conditions': [
-            ['toolkit_uses_gtk == 1', {
-              'dependencies': [
-                '../build/linux/system.gyp:gtk',
-              ],
-              'sources': [
-                'tools/shader_bench/window_linux.cc',
-              ],
-            }],
-            ['OS=="win"', {
-              'dependencies': [
-                '../third_party/angle/src/build_angle.gyp:libEGL',
-                '../third_party/angle/src/build_angle.gyp:libGLESv2',
-              ],
-              'sources': [
-                'tools/shader_bench/window_win.cc',
-              ],
-            }],
-          ],
-        },
-      ],
-    }],
-    ['OS == "linux" and target_arch != "arm"', {
-      'targets': [
-        {
-          'target_name': 'tile_render_bench',
-          'type': 'executable',
-          'dependencies': [
-            '../base/base.gyp:base',
-            '../ui/gl/gl.gyp:gl',
-          ],
-          'libraries': [
-            '-lGL',
-            '-ldl',
-          ],
-          'sources': [
-            'tools/tile_render_bench/tile_render_bench.cc',
-          ],
-        },
-      ],
-    }],
-    ['os_posix == 1 and OS != "mac" and OS != "android"', {
-      'targets': [
-        {
-          'target_name': 'player_x11',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            'yuv_convert',
-            '../base/base.gyp:base',
-            '../ui/gl/gl.gyp:gl',
-          ],
-          'link_settings': {
-            'libraries': [
-              '-ldl',
-              '-lX11',
-              '-lXrender',
-              '-lXext',
-            ],
-          },
-          'sources': [
-            'tools/player_x11/data_source_logger.cc',
-            'tools/player_x11/data_source_logger.h',
-            'tools/player_x11/gl_video_renderer.cc',
-            'tools/player_x11/gl_video_renderer.h',
-            'tools/player_x11/player_x11.cc',
-            'tools/player_x11/x11_video_renderer.cc',
-            'tools/player_x11/x11_video_renderer.h',
-          ],
-        },
-      ],
-    }],
-    ['OS == "android"', {
-      'targets': [
-        {
-          'target_name': 'player_android',
-          'type': 'static_library',
-          'sources': [
-            'base/android/media_player_bridge.cc',
-            'base/android/media_player_bridge.h',
-          ],
-          'dependencies': [
-            '../base/base.gyp:base',
-          ],
-          'include_dirs': [
-            '<(SHARED_INTERMEDIATE_DIR)/media',
-          ],
-          'actions': [
-            {
-              'action_name': 'generate-jni-headers',
-              'inputs': [
-                '../base/android/jni_generator/jni_generator.py',
-                'base/android/java/src/org/chromium/media/MediaPlayerListener.java',
-              ],
-              'outputs': [
-                '<(SHARED_INTERMEDIATE_DIR)/media/jni/media_player_listener_jni.h',
-              ],
-              'action': [
-                'python',
-                '<(DEPTH)/base/android/jni_generator/jni_generator.py',
-                '-o',
-                '<@(_inputs)',
-                '<@(_outputs)',
-              ],
-            },
-          ],
-        },
-        {
-          'target_name': 'media_java',
-          'type': 'none',
-          'dependencies': [ '../base/base.gyp:base_java' ],
-          'variables': {
-            'package_name': 'media',
-            'java_in_dir': 'base/android/java',
-          },
-          'includes': [ '../build/java.gypi' ],
-        },
-
-      ],
-    }, { # OS != "android"'
-      # Android does not use ffmpeg, so disable the targets which require it.
-      'targets': [
-        {
-          'target_name': 'ffmpeg_unittests',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            'media_test_support',
-            '../base/base.gyp:base',
-            '../base/base.gyp:base_i18n',
-            '../base/base.gyp:test_support_base',
-            '../base/base.gyp:test_support_perf',
-            '../testing/gtest.gyp:gtest',
-            '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg',
-          ],
-          'sources': [
-            'ffmpeg/ffmpeg_unittest.cc',
-          ],
-          'conditions': [
-            ['toolkit_uses_gtk == 1', {
-              'dependencies': [
-                # Needed for the following #include chain:
-                #   base/run_all_unittests.cc
-                #   ../base/test_suite.h
-                #   gtk/gtk.h
-                '../build/linux/system.gyp:gtk',
-              ],
-              'conditions': [
-                ['linux_use_tcmalloc==1', {
-                  'dependencies': [
-                    '../base/allocator/allocator.gyp:allocator',
-                  ],
-                }],
-              ],
-            }],
-          ],
-        },
-        {
-          'target_name': 'ffmpeg_regression_tests',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            'media_test_support',
-            '../base/base.gyp:test_support_base',
-            '../testing/gmock.gyp:gmock',
-            '../testing/gtest.gyp:gtest',
-            '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg',
-          ],
-          'sources': [
-            'base/test_data_util.cc',
-            'base/run_all_unittests.cc',
-            'ffmpeg/ffmpeg_regression_tests.cc',
-            'filters/pipeline_integration_test_base.cc',
-          ],
-          'conditions': [
-            ['os_posix==1 and OS!="mac"', {
-              'conditions': [
-                ['linux_use_tcmalloc==1', {
-                  'dependencies': [
-                    '../base/allocator/allocator.gyp:allocator',
-                  ],
-                }],
-              ],
-            }],
-          ],
-        },
-        {
-          'target_name': 'ffmpeg_tests',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            '../base/base.gyp:base',
-            '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg',
-          ],
-          'sources': [
-            'test/ffmpeg_tests/ffmpeg_tests.cc',
-          ],
-        },
-        {
-          'target_name': 'media_bench',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            '../base/base.gyp:base',
-            '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg',
-          ],
-          'sources': [
-            'tools/media_bench/media_bench.cc',
-          ],
-        },
-      ],
-    }]
-  ],
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/testdata/media.gyp.fontified	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1107 +0,0 @@
-
-#("# Copyright (c) 2012 The Chromium Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-{
-  'variables': {
-    'chromium_code': 1,
-    # Override to dynamically link the PulseAudio library.
-    'use_pulseaudio%': 0,
-    # Override to dynamically link the cras (ChromeOS audio) library.
-    'use_cras%': 0,
-  },
-  'targets': [
-    {
-      'target_name': 'media',
-      'type': '<(component)',
-      'dependencies': [
-        'yuv_convert',
-        '../base/base.gyp:base',
-        '../base/third_party/dynamic_annotations/dynamic_annotations.gyp:dynamic_annotations',
-        '../build/temp_gyp/googleurl.gyp:googleurl',
-        '../crypto/crypto.gyp:crypto',
-        '../third_party/openmax/openmax.gyp:il',
-        '../ui/ui.gyp:ui',
-      ],
-      'defines': [
-        'MEDIA_IMPLEMENTATION',
-      ],
-      'include_dirs': [
-        '..',
-      ],
-      'sources': [
-        'audio/android/audio_manager_android.cc',
-        'audio/android/audio_manager_android.h',
-        'audio/android/audio_track_output_android.cc',
-        'audio/android/audio_track_output_android.h',
-        'audio/android/opensles_input.cc',
-        'audio/android/opensles_input.h',
-        'audio/android/opensles_output.cc',
-        'audio/android/opensles_output.h',
-        'audio/async_socket_io_handler.h',
-        'audio/async_socket_io_handler_posix.cc',
-        'audio/async_socket_io_handler_win.cc',
-        'audio/audio_buffers_state.cc',
-        'audio/audio_buffers_state.h',
-        'audio/audio_io.h',
-        'audio/audio_input_controller.cc',
-        'audio/audio_input_controller.h',
-        'audio/audio_input_stream_impl.cc',
-        'audio/audio_input_stream_impl.h',
-        'audio/audio_device_name.cc',
-        'audio/audio_device_name.h',
-        'audio/audio_manager.cc',
-        'audio/audio_manager.h',
-        'audio/audio_manager_base.cc',
-        'audio/audio_manager_base.h',
-        'audio/audio_output_controller.cc',
-        'audio/audio_output_controller.h',
-        'audio/audio_output_dispatcher.cc',
-        'audio/audio_output_dispatcher.h',
-        'audio/audio_output_dispatcher_impl.cc',
-        'audio/audio_output_dispatcher_impl.h',
-        'audio/audio_output_mixer.cc',
-        'audio/audio_output_mixer.h',
-        'audio/audio_output_proxy.cc',
-        'audio/audio_output_proxy.h',
-        'audio/audio_parameters.cc',
-        'audio/audio_parameters.h',
-        'audio/audio_util.cc',
-        'audio/audio_util.h',
-        'audio/cross_process_notification.cc',
-        'audio/cross_process_notification.h',
-        'audio/cross_process_notification_win.cc',
-        'audio/cross_process_notification_posix.cc',
-        'audio/fake_audio_input_stream.cc',
-        'audio/fake_audio_input_stream.h',
-        'audio/fake_audio_output_stream.cc',
-        'audio/fake_audio_output_stream.h',
-        'audio/linux/audio_manager_linux.cc',
-        'audio/linux/audio_manager_linux.h',
-        'audio/linux/alsa_input.cc',
-        'audio/linux/alsa_input.h',
-        'audio/linux/alsa_output.cc',
-        'audio/linux/alsa_output.h',
-        'audio/linux/alsa_util.cc',
-        'audio/linux/alsa_util.h',
-        'audio/linux/alsa_wrapper.cc',
-        'audio/linux/alsa_wrapper.h',
-        'audio/linux/cras_output.cc',
-        'audio/linux/cras_output.h',
-        'audio/openbsd/audio_manager_openbsd.cc',
-        'audio/openbsd/audio_manager_openbsd.h',
-        'audio/mac/audio_input_mac.cc',
-        'audio/mac/audio_input_mac.h',
-        'audio/mac/audio_low_latency_input_mac.cc',
-        'audio/mac/audio_low_latency_input_mac.h',
-        'audio/mac/audio_low_latency_output_mac.cc',
-        'audio/mac/audio_low_latency_output_mac.h',
-        'audio/mac/audio_manager_mac.cc',
-        'audio/mac/audio_manager_mac.h',
-        'audio/mac/audio_output_mac.cc',
-        'audio/mac/audio_output_mac.h',
-        'audio/null_audio_sink.cc',
-        'audio/null_audio_sink.h',
-        'audio/pulse/pulse_output.cc',
-        'audio/pulse/pulse_output.h',
-        'audio/sample_rates.cc',
-        'audio/sample_rates.h',
-        'audio/simple_sources.cc',
-        'audio/simple_sources.h',
-        'audio/win/audio_low_latency_input_win.cc',
-        'audio/win/audio_low_latency_input_win.h',
-        'audio/win/audio_low_latency_output_win.cc',
-        'audio/win/audio_low_latency_output_win.h',
-        'audio/win/audio_manager_win.cc',
-        'audio/win/audio_manager_win.h',
-        'audio/win/avrt_wrapper_win.cc',
-        'audio/win/avrt_wrapper_win.h',
-        'audio/win/device_enumeration_win.cc',
-        'audio/win/device_enumeration_win.h',
-        'audio/win/wavein_input_win.cc',
-        'audio/win/wavein_input_win.h',
-        'audio/win/waveout_output_win.cc',
-        'audio/win/waveout_output_win.h',
-        'base/android/media_jni_registrar.cc',
-        'base/android/media_jni_registrar.h',
-        'base/audio_decoder.cc',
-        'base/audio_decoder.h',
-        'base/audio_decoder_config.cc',
-        'base/audio_decoder_config.h',
-        'base/audio_renderer.h',
-        'base/audio_renderer_mixer.cc',
-        'base/audio_renderer_mixer.h',
-        'base/audio_renderer_mixer_input.cc',
-        'base/audio_renderer_mixer_input.h',
-        'base/bitstream_buffer.h',
-        'base/buffers.cc',
-        'base/buffers.h',
-        'base/byte_queue.cc',
-        'base/byte_queue.h',
-        'base/channel_layout.cc',
-        'base/channel_layout.h',
-        'base/clock.cc',
-        'base/clock.h',
-        'base/composite_filter.cc',
-        'base/composite_filter.h',
-        'base/data_buffer.cc',
-        'base/data_buffer.h',
-        'base/data_source.cc',
-        'base/data_source.h',
-        'base/decoder_buffer.cc',
-        'base/decoder_buffer.h',
-        'base/decrypt_config.cc',
-        'base/decrypt_config.h',
-        'base/decryptor.h',
-        'base/decryptor_client.h',
-        'base/demuxer.cc',
-        'base/demuxer.h',
-        'base/demuxer_stream.cc',
-        'base/demuxer_stream.h',
-        'base/djb2.cc',
-        'base/djb2.h',
-        'base/filter_collection.cc',
-        'base/filter_collection.h',
-        'base/filter_host.h',
-        'base/filters.cc',
-        'base/filters.h',
-        'base/h264_bitstream_converter.cc',
-        'base/h264_bitstream_converter.h',
-        'base/media.h',
-        'base/media_android.cc',
-        'base/media_export.h',
-        'base/media_log.cc',
-        'base/media_log.h',
-        'base/media_log_event.h',
-        'base/media_posix.cc',
-        'base/media_switches.cc',
-        'base/media_switches.h',
-        'base/media_win.cc',
-        'base/message_loop_factory.cc',
-        'base/message_loop_factory.h',
-        'base/pipeline.cc',
-        'base/pipeline.h',
-        'base/pipeline_status.cc',
-        'base/pipeline_status.h',
-        'base/ranges.cc',
-        'base/ranges.h',
-        'base/seekable_buffer.cc',
-        'base/seekable_buffer.h',
-        'base/state_matrix.cc',
-        'base/state_matrix.h',
-        'base/stream_parser.cc',
-        'base/stream_parser.h',
-        'base/stream_parser_buffer.cc',
-        'base/stream_parser_buffer.h',
-        'base/video_decoder.cc',
-        'base/video_decoder.h',
-        'base/video_decoder_config.cc',
-        'base/video_decoder_config.h',
-        'base/video_frame.cc',
-        'base/video_frame.h',
-        'base/video_renderer.h',
-        'base/video_util.cc',
-        'base/video_util.h',
-        'crypto/aes_decryptor.cc',
-        'crypto/aes_decryptor.h',
-        'ffmpeg/ffmpeg_common.cc',
-        'ffmpeg/ffmpeg_common.h',
-        'ffmpeg/file_protocol.cc',
-        'ffmpeg/file_protocol.h',
-        'filters/audio_file_reader.cc',
-        'filters/audio_file_reader.h',
-        'filters/audio_renderer_algorithm.cc',
-        'filters/audio_renderer_algorithm.h',
-        'filters/audio_renderer_impl.cc',
-        'filters/audio_renderer_impl.h',
-        'filters/bitstream_converter.cc',
-        'filters/bitstream_converter.h',
-        'filters/chunk_demuxer.cc',
-        'filters/chunk_demuxer.h',
-        'filters/chunk_demuxer_client.h',
-        'filters/dummy_demuxer.cc',
-        'filters/dummy_demuxer.h',
-        'filters/ffmpeg_audio_decoder.cc',
-        'filters/ffmpeg_audio_decoder.h',
-        'filters/ffmpeg_demuxer.cc',
-        'filters/ffmpeg_demuxer.h',
-        'filters/ffmpeg_h264_bitstream_converter.cc',
-        'filters/ffmpeg_h264_bitstream_converter.h',
-        'filters/ffmpeg_glue.cc',
-        'filters/ffmpeg_glue.h',
-        'filters/ffmpeg_video_decoder.cc',
-        'filters/ffmpeg_video_decoder.h',
-        'filters/file_data_source.cc',
-        'filters/file_data_source.h',
-        'filters/gpu_video_decoder.cc',
-        'filters/gpu_video_decoder.h',
-        'filters/in_memory_url_protocol.cc',
-        'filters/in_memory_url_protocol.h',
-        'filters/source_buffer_stream.cc',
-        'filters/source_buffer_stream.h',
-        'filters/video_frame_generator.cc',
-        'filters/video_frame_generator.h',
-        'filters/video_renderer_base.cc',
-        'filters/video_renderer_base.h',
-        'video/capture/fake_video_capture_device.cc',
-        'video/capture/fake_video_capture_device.h',
-        'video/capture/linux/video_capture_device_linux.cc',
-        'video/capture/linux/video_capture_device_linux.h',
-        'video/capture/mac/video_capture_device_mac.h',
-        'video/capture/mac/video_capture_device_mac.mm',
-        'video/capture/mac/video_capture_device_qtkit_mac.h',
-        'video/capture/mac/video_capture_device_qtkit_mac.mm',
-        'video/capture/video_capture.h',
-        'video/capture/video_capture_device.h',
-        'video/capture/video_capture_device_dummy.cc',
-        'video/capture/video_capture_device_dummy.h',
-        'video/capture/video_capture_proxy.cc',
-        'video/capture/video_capture_proxy.h',
-        'video/capture/video_capture_types.h',
-        'video/capture/win/filter_base_win.cc',
-        'video/capture/win/filter_base_win.h',
-        'video/capture/win/pin_base_win.cc',
-        'video/capture/win/pin_base_win.h',
-        'video/capture/win/sink_filter_observer_win.h',
-        'video/capture/win/sink_filter_win.cc',
-        'video/capture/win/sink_filter_win.h',
-        'video/capture/win/sink_input_pin_win.cc',
-        'video/capture/win/sink_input_pin_win.h',
-        'video/capture/win/video_capture_device_win.cc',
-        'video/capture/win/video_capture_device_win.h',
-        'video/picture.cc',
-        'video/picture.h',
-        'video/video_decode_accelerator.cc',
-        'video/video_decode_accelerator.h',
-        'webm/webm_constants.h',
-        'webm/webm_cluster_parser.cc',
-        'webm/webm_cluster_parser.h',
-        'webm/webm_content_encodings.cc',
-        'webm/webm_content_encodings.h',
-        'webm/webm_content_encodings_client.cc',
-        'webm/webm_content_encodings_client.h',
-        'webm/webm_info_parser.cc',
-        'webm/webm_info_parser.h',
-        'webm/webm_parser.cc',
-        'webm/webm_parser.h',
-        'webm/webm_stream_parser.cc',
-        'webm/webm_stream_parser.h',
-        'webm/webm_tracks_parser.cc',
-        'webm/webm_tracks_parser.h',
-      ],
-      'direct_dependent_settings': {
-        'include_dirs': [
-          '..',
-        ],
-      },
-      'conditions': [
-        # Android doesn't use ffmpeg, so make the dependency conditional
-        # and exclude the sources which depend on ffmpeg.
-        ['OS != \"android\"', {
-          'dependencies': [
-            '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg',
-          ],
-        }],
-        ['OS == \"android\"', {
-          'sources!': [
-            'base/media_posix.cc',
-            'ffmpeg/ffmpeg_common.cc',
-            'ffmpeg/ffmpeg_common.h',
-            'ffmpeg/file_protocol.cc',
-            'ffmpeg/file_protocol.h',
-            'filters/audio_file_reader.cc',
-            'filters/audio_file_reader.h',
-            'filters/bitstream_converter.cc',
-            'filters/bitstream_converter.h',
-            'filters/chunk_demuxer.cc',
-            'filters/chunk_demuxer.h',
-            'filters/chunk_demuxer_client.h',
-            'filters/ffmpeg_audio_decoder.cc',
-            'filters/ffmpeg_audio_decoder.h',
-            'filters/ffmpeg_demuxer.cc',
-            'filters/ffmpeg_demuxer.h',
-            'filters/ffmpeg_h264_bitstream_converter.cc',
-            'filters/ffmpeg_h264_bitstream_converter.h',
-            'filters/ffmpeg_glue.cc',
-            'filters/ffmpeg_glue.h',
-            'filters/ffmpeg_video_decoder.cc',
-            'filters/ffmpeg_video_decoder.h',
-            'filters/gpu_video_decoder.cc',
-            'filters/gpu_video_decoder.h',
-            'webm/webm_cluster_parser.cc',
-            'webm/webm_cluster_parser.h',
-            'webm/webm_stream_parser.cc',
-            'webm/webm_stream_parser.h',
-          ],
-        }],
-        # The below 'android' condition were added temporarily and should be
-        # removed in downstream, because there is no Java environment setup in
-        # upstream yet.
-        ['OS == \"android\"', {
-          'sources!':[
-            'audio/android/audio_track_output_android.cc',
-          ],
-          'sources':[
-            'audio/android/audio_track_output_stub_android.cc',
-          ],
-          'link_settings': {
-            'libraries': [
-              '-lOpenSLES',
-            ],
-          },
-        }],
-        ['OS==\"linux\" or OS==\"freebsd\" or OS==\"solaris\"', {
-          'link_settings': {
-            'libraries': [
-              '-lasound',
-            ],
-          },
-        }],
-        ['OS==\"openbsd\"', {
-          'sources/': [ ['exclude', '/alsa_' ],
-                        ['exclude', '/audio_manager_linux' ] ],
-          'link_settings': {
-            'libraries': [
-            ],
-          },
-        }],
-        ['OS!=\"openbsd\"', {
-          'sources!': [
-            'audio/openbsd/audio_manager_openbsd.cc',
-            'audio/openbsd/audio_manager_openbsd.h',
-          ],
-        }],
-        ['OS==\"linux\"', {
-          'variables': {
-            'conditions': [
-              ['sysroot!=\"\"', {
-                'pkg-config': '../build/linux/pkg-config-wrapper \"<(sysroot)\" \"<(target_arch)\"',
-              }, {
-                'pkg-config': 'pkg-config'
-              }],
-            ],
-          },
-          'conditions': [
-            ['use_cras == 1', {
-              'cflags': [
-                '<!@(<(pkg-config) --cflags libcras)',
-              ],
-              'link_settings': {
-                'libraries': [
-                  '<!@(<(pkg-config) --libs libcras)',
-                ],
-              },
-              'defines': [
-                'USE_CRAS',
-              ],
-            }, {  # else: use_cras == 0
-              'sources!': [
-                'audio/linux/cras_output.cc',
-                'audio/linux/cras_output.h',
-              ],
-            }],
-          ],
-        }],
-        ['os_posix == 1', {
-          'conditions': [
-            ['use_pulseaudio == 1', {
-              'cflags': [
-                '<!@(pkg-config --cflags libpulse)',
-              ],
-              'link_settings': {
-                'libraries': [
-                  '<!@(pkg-config --libs-only-l libpulse)',
-                ],
-              },
-              'defines': [
-                'USE_PULSEAUDIO',
-              ],
-            }, {  # else: use_pulseaudio == 0
-              'sources!': [
-                'audio/pulse/pulse_output.cc',
-                'audio/pulse/pulse_output.h',
-              ],
-            }],
-          ],
-        }],
-        ['os_posix == 1 and OS != \"android\"', {
-          # Video capture isn't supported in Android yet.
-          'sources!': [
-            'video/capture/video_capture_device_dummy.cc',
-            'video/capture/video_capture_device_dummy.h',
-          ],
-        }],
-        ['OS==\"mac\"', {
-          'link_settings': {
-            'libraries': [
-              '$(SDKROOT)/System/Library/Frameworks/AudioUnit.framework',
-              '$(SDKROOT)/System/Library/Frameworks/AudioToolbox.framework',
-              '$(SDKROOT)/System/Library/Frameworks/CoreAudio.framework',
-              '$(SDKROOT)/System/Library/Frameworks/CoreVideo.framework',
-              '$(SDKROOT)/System/Library/Frameworks/QTKit.framework',
-            ],
-          },
-        }],
-        ['OS==\"win\"', {
-          'sources!': [
-            'audio/pulse/pulse_output.cc',
-            'audio/pulse/pulse_output.h',
-            'video/capture/video_capture_device_dummy.cc',
-            'video/capture/video_capture_device_dummy.h',
-          ],
-        }],
-        ['proprietary_codecs==1 or branding==\"Chrome\"', {
-          'sources': [
-            'mp4/avc.cc',
-            'mp4/avc.h',
-            'mp4/box_definitions.cc',
-            'mp4/box_definitions.h',
-            'mp4/box_reader.cc',
-            'mp4/box_reader.h',
-            'mp4/cenc.cc',
-            'mp4/cenc.h',
-            'mp4/mp4_stream_parser.cc',
-            'mp4/mp4_stream_parser.h',
-            'mp4/offset_byte_queue.cc',
-            'mp4/offset_byte_queue.h',
-            'mp4/track_run_iterator.cc',
-            'mp4/track_run_iterator.h',
-          ],
-        }],
-      ],
-    },
-    {
-      'target_name': 'yuv_convert',
-      'type': 'static_library',
-      'include_dirs': [
-        '..',
-      ],
-      'conditions': [
-        ['order_profiling != 0', {
-          'target_conditions' : [
-            ['_toolset==\"target\"', {
-              'cflags!': [ '-finstrument-functions' ],
-            }],
-          ],
-        }],
-        [ 'target_arch == \"ia32\" or target_arch == \"x64\"', {
-          'dependencies': [
-            'yuv_convert_simd_x86',
-          ],
-        }],
-        [ 'target_arch == \"arm\"', {
-          'dependencies': [
-            'yuv_convert_simd_arm',
-          ],
-        }],
-      ],
-      'sources': [
-        'base/yuv_convert.cc',
-        'base/yuv_convert.h',
-      ],
-    },
-    {
-      'target_name': 'yuv_convert_simd_x86',
-      'type': 'static_library',
-      'include_dirs': [
-        '..',
-      ],
-      'sources': [
-        'base/simd/convert_rgb_to_yuv_c.cc',
-        'base/simd/convert_rgb_to_yuv_sse2.cc',
-        'base/simd/convert_rgb_to_yuv_ssse3.asm',
-        'base/simd/convert_rgb_to_yuv_ssse3.cc',
-        'base/simd/convert_rgb_to_yuv_ssse3.inc',
-        'base/simd/convert_yuv_to_rgb_c.cc',
-        'base/simd/convert_yuv_to_rgb_x86.cc',
-        'base/simd/convert_yuv_to_rgb_mmx.asm',
-        'base/simd/convert_yuv_to_rgb_mmx.inc',
-        'base/simd/convert_yuv_to_rgb_sse.asm',
-        'base/simd/filter_yuv.h',
-        'base/simd/filter_yuv_c.cc',
-        'base/simd/filter_yuv_mmx.cc',
-        'base/simd/filter_yuv_sse2.cc',
-        'base/simd/linear_scale_yuv_to_rgb_mmx.asm',
-        'base/simd/linear_scale_yuv_to_rgb_mmx.inc',
-        'base/simd/linear_scale_yuv_to_rgb_sse.asm',
-        'base/simd/scale_yuv_to_rgb_mmx.asm',
-        'base/simd/scale_yuv_to_rgb_mmx.inc',
-        'base/simd/scale_yuv_to_rgb_sse.asm',
-        'base/simd/yuv_to_rgb_table.cc',
-        'base/simd/yuv_to_rgb_table.h',
-      ],
-      'conditions': [
-        ['order_profiling != 0', {
-          'target_conditions' : [
-            ['_toolset==\"target\"', {
-              'cflags!': [ '-finstrument-functions' ],
-            }],
-          ],
-        }],
-        [ 'target_arch == \"x64\"', {
-          # Source files optimized for X64 systems.
-          'sources': [
-            'base/simd/linear_scale_yuv_to_rgb_mmx_x64.asm',
-            'base/simd/scale_yuv_to_rgb_sse2_x64.asm',
-          ],
-        }],
-        [ 'os_posix == 1 and OS != \"mac\" and OS != \"android\"', {
-          'cflags': [
-            '-msse2',
-          ],
-        }],
-        [ 'OS == \"mac\"', {
-          'configurations': {
-            'Debug': {
-              'xcode_settings': {
-                # gcc on the mac builds horribly unoptimized sse code in debug
-                # mode. Since this is rarely going to be debugged, run with full
-                # optimizations in Debug as well as Release.
-                'GCC_OPTIMIZATION_LEVEL': '3',  # -O3
-               },
-             },
-          },
-        }],
-        [ 'OS==\"win\"', {
-          'variables': {
-            'yasm_flags': [
-              '-DWIN32',
-              '-DMSVC',
-              '-DCHROMIUM',
-              '-Isimd',
-            ],
-          },
-        }],
-        [ 'OS==\"mac\"', {
-          'variables': {
-            'yasm_flags': [
-              '-DPREFIX',
-              '-DMACHO',
-              '-DCHROMIUM',
-              '-Isimd',
-            ],
-          },
-        }],
-        [ 'os_posix==1 and OS!=\"mac\"', {
-          'variables': {
-            'conditions': [
-              [ 'target_arch==\"ia32\"', {
-                'yasm_flags': [
-                  '-DX86_32',
-                  '-DELF',
-                  '-DCHROMIUM',
-                  '-Isimd',
-                ],
-              }, {
-                'yasm_flags': [
-                  '-DARCH_X86_64',
-                  '-DELF',
-                  '-DPIC',
-                  '-DCHROMIUM',
-                  '-Isimd',
-                ],
-              }],
-            ],
-          },
-        }],
-      ],
-      'variables': {
-        'yasm_output_path': '<(SHARED_INTERMEDIATE_DIR)/media',
-      },
-      'msvs_2010_disable_uldi_when_referenced': 1,
-      'includes': [
-        '../third_party/yasm/yasm_compile.gypi',
-      ],
-    },
-    {
-      'target_name': 'yuv_convert_simd_arm',
-      'type': 'static_library',
-      'include_dirs': [
-        '..',
-      ],
-      'sources': [
-        'base/simd/convert_rgb_to_yuv_c.cc',
-        'base/simd/convert_rgb_to_yuv.h',
-        'base/simd/convert_yuv_to_rgb_c.cc',
-        'base/simd/convert_yuv_to_rgb.h',
-        'base/simd/filter_yuv.h',
-        'base/simd/filter_yuv_c.cc',
-        'base/simd/yuv_to_rgb_table.cc',
-        'base/simd/yuv_to_rgb_table.h',
-      ],
-    },
-    {
-      'target_name': 'media_unittests',
-      'type': 'executable',
-      'dependencies': [
-        'media',
-        'media_test_support',
-        'yuv_convert',
-        '../base/base.gyp:base',
-        '../base/base.gyp:base_i18n',
-        '../base/base.gyp:test_support_base',
-        '../testing/gmock.gyp:gmock',
-        '../testing/gtest.gyp:gtest',
-        '../ui/ui.gyp:ui',
-      ],
-      'sources': [
-        'audio/async_socket_io_handler_unittest.cc',
-        'audio/audio_input_controller_unittest.cc',
-        'audio/audio_input_device_unittest.cc',
-        'audio/audio_input_unittest.cc',
-        'audio/audio_input_volume_unittest.cc',
-        'audio/audio_low_latency_input_output_unittest.cc',
-        'audio/audio_output_controller_unittest.cc',
-        'audio/audio_output_proxy_unittest.cc',
-        'audio/audio_parameters_unittest.cc',
-        'audio/audio_util_unittest.cc',
-        'audio/cross_process_notification_unittest.cc',
-        'audio/linux/alsa_output_unittest.cc',
-        'audio/mac/audio_low_latency_input_mac_unittest.cc',
-        'audio/mac/audio_output_mac_unittest.cc',
-        'audio/simple_sources_unittest.cc',
-        'audio/win/audio_low_latency_input_win_unittest.cc',
-        'audio/win/audio_low_latency_output_win_unittest.cc',
-        'audio/win/audio_output_win_unittest.cc',
-        'base/audio_renderer_mixer_unittest.cc',
-        'base/audio_renderer_mixer_input_unittest.cc',
-        'base/buffers_unittest.cc',
-        'base/clock_unittest.cc',
-        'base/composite_filter_unittest.cc',
-        'base/data_buffer_unittest.cc',
-        'base/decoder_buffer_unittest.cc',
-        'base/djb2_unittest.cc',
-        'base/fake_audio_render_callback.cc',
-        'base/fake_audio_render_callback.h',
-        'base/filter_collection_unittest.cc',
-        'base/h264_bitstream_converter_unittest.cc',
-        'base/pipeline_unittest.cc',
-        'base/ranges_unittest.cc',
-        'base/run_all_unittests.cc',
-        'base/seekable_buffer_unittest.cc',
-        'base/state_matrix_unittest.cc',
-        'base/test_data_util.cc',
-        'base/test_data_util.h',
-        'base/video_frame_unittest.cc',
-        'base/video_util_unittest.cc',
-        'base/yuv_convert_unittest.cc',
-        'crypto/aes_decryptor_unittest.cc',
-        'ffmpeg/ffmpeg_common_unittest.cc',
-        'filters/audio_renderer_algorithm_unittest.cc',
-        'filters/audio_renderer_impl_unittest.cc',
-        'filters/bitstream_converter_unittest.cc',
-        'filters/chunk_demuxer_unittest.cc',
-        'filters/ffmpeg_audio_decoder_unittest.cc',
-        'filters/ffmpeg_decoder_unittest.h',
-        'filters/ffmpeg_demuxer_unittest.cc',
-        'filters/ffmpeg_glue_unittest.cc',
-        'filters/ffmpeg_h264_bitstream_converter_unittest.cc',
-        'filters/ffmpeg_video_decoder_unittest.cc',
-        'filters/file_data_source_unittest.cc',
-        'filters/pipeline_integration_test.cc',
-        'filters/pipeline_integration_test_base.cc',
-        'filters/source_buffer_stream_unittest.cc',
-        'filters/video_renderer_base_unittest.cc',
-        'video/capture/video_capture_device_unittest.cc',
-        'webm/cluster_builder.cc',
-        'webm/cluster_builder.h',
-        'webm/webm_cluster_parser_unittest.cc',
-        'webm/webm_content_encodings_client_unittest.cc',
-        'webm/webm_parser_unittest.cc',
-      ],
-      'conditions': [
-        ['os_posix==1 and OS!=\"mac\"', {
-          'conditions': [
-            ['linux_use_tcmalloc==1', {
-              'dependencies': [
-                '../base/allocator/allocator.gyp:allocator',
-              ],
-            }],
-          ],
-        }],
-        ['OS != \"android\"', {
-          'dependencies': [
-            '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg',
-          ],
-        }],
-        ['OS == \"android\"', {
-          'sources!': [
-            'audio/audio_input_volume_unittest.cc',
-            'base/test_data_util.cc',
-            'base/test_data_util.h',
-            'ffmpeg/ffmpeg_common_unittest.cc',
-            'filters/ffmpeg_audio_decoder_unittest.cc',
-            'filters/bitstream_converter_unittest.cc',
-            'filters/chunk_demuxer_unittest.cc',
-            'filters/ffmpeg_demuxer_unittest.cc',
-            'filters/ffmpeg_glue_unittest.cc',
-            'filters/ffmpeg_h264_bitstream_converter_unittest.cc',
-            'filters/ffmpeg_video_decoder_unittest.cc',
-            'filters/pipeline_integration_test.cc',
-            'filters/pipeline_integration_test_base.cc',
-            'mp4/mp4_stream_parser_unittest.cc',
-            'webm/webm_cluster_parser_unittest.cc',
-          ],
-        }],
-        ['OS == \"linux\"', {
-          'conditions': [
-            ['use_cras == 1', {
-              'sources': [
-                'audio/linux/cras_output_unittest.cc',
-              ],
-              'defines': [
-                'USE_CRAS',
-              ],
-            }],
-          ],
-        }],
-        [ 'target_arch==\"ia32\" or target_arch==\"x64\"', {
-          'sources': [
-            'base/simd/convert_rgb_to_yuv_unittest.cc',
-          ],
-        }],
-        ['proprietary_codecs==1 or branding==\"Chrome\"', {
-          'sources': [
-            'mp4/avc_unittest.cc',
-            'mp4/box_reader_unittest.cc',
-            'mp4/mp4_stream_parser_unittest.cc',
-            'mp4/offset_byte_queue_unittest.cc',
-          ],
-        }],
-      ],
-    },
-    {
-      'target_name': 'media_test_support',
-      'type': 'static_library',
-      'dependencies': [
-        'media',
-        '../base/base.gyp:base',
-        '../testing/gmock.gyp:gmock',
-        '../testing/gtest.gyp:gtest',
-      ],
-      'sources': [
-        'audio/test_audio_input_controller_factory.cc',
-        'audio/test_audio_input_controller_factory.h',
-        'base/mock_callback.cc',
-        'base/mock_callback.h',
-        'base/mock_data_source_host.cc',
-        'base/mock_data_source_host.h',
-        'base/mock_demuxer_host.cc',
-        'base/mock_demuxer_host.h',
-        'base/mock_filter_host.cc',
-        'base/mock_filter_host.h',
-        'base/mock_filters.cc',
-        'base/mock_filters.h',
-      ],
-    },
-    {
-      'target_name': 'scaler_bench',
-      'type': 'executable',
-      'dependencies': [
-        'media',
-        'yuv_convert',
-        '../base/base.gyp:base',
-        '../skia/skia.gyp:skia',
-      ],
-      'sources': [
-        'tools/scaler_bench/scaler_bench.cc',
-      ],
-    },
-    {
-      'target_name': 'qt_faststart',
-      'type': 'executable',
-      'sources': [
-        'tools/qt_faststart/qt_faststart.c'
-      ],
-    },
-    {
-      'target_name': 'seek_tester',
-      'type': 'executable',
-      'dependencies': [
-        'media',
-        '../base/base.gyp:base',
-      ],
-      'sources': [
-        'tools/seek_tester/seek_tester.cc',
-      ],
-    },
-  ],
-  'conditions': [
-    ['OS==\"win\"', {
-      'targets': [
-        {
-          'target_name': 'player_wtl',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            'yuv_convert',
-            '../base/base.gyp:base',
-            '../base/third_party/dynamic_annotations/dynamic_annotations.gyp:dynamic_annotations',
-            '../ui/ui.gyp:ui',
-          ],
-          'include_dirs': [
-            '<(DEPTH)/third_party/wtl/include',
-          ],
-          'sources': [
-            'tools/player_wtl/list.h',
-            'tools/player_wtl/mainfrm.h',
-            'tools/player_wtl/movie.cc',
-            'tools/player_wtl/movie.h',
-            'tools/player_wtl/player_wtl.cc',
-            'tools/player_wtl/player_wtl.rc',
-            'tools/player_wtl/props.h',
-            'tools/player_wtl/seek.h',
-            'tools/player_wtl/resource.h',
-            'tools/player_wtl/view.h',
-          ],
-          'msvs_settings': {
-            'VCLinkerTool': {
-              'SubSystem': '2',         # Set /SUBSYSTEM:WINDOWS
-            },
-          },
-          'defines': [
-            '_CRT_SECURE_NO_WARNINGS=1',
-          ],
-        },
-      ],
-    }],
-    ['OS == \"win\" or toolkit_uses_gtk == 1', {
-      'targets': [
-        {
-          'target_name': 'shader_bench',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            'yuv_convert',
-            '../base/base.gyp:base',
-            '../ui/gl/gl.gyp:gl',
-          ],
-          'sources': [
-            'tools/shader_bench/shader_bench.cc',
-            'tools/shader_bench/cpu_color_painter.cc',
-            'tools/shader_bench/cpu_color_painter.h',
-            'tools/shader_bench/gpu_color_painter.cc',
-            'tools/shader_bench/gpu_color_painter.h',
-            'tools/shader_bench/gpu_painter.cc',
-            'tools/shader_bench/gpu_painter.h',
-            'tools/shader_bench/painter.cc',
-            'tools/shader_bench/painter.h',
-            'tools/shader_bench/window.cc',
-            'tools/shader_bench/window.h',
-          ],
-          'conditions': [
-            ['toolkit_uses_gtk == 1', {
-              'dependencies': [
-                '../build/linux/system.gyp:gtk',
-              ],
-              'sources': [
-                'tools/shader_bench/window_linux.cc',
-              ],
-            }],
-            ['OS==\"win\"', {
-              'dependencies': [
-                '../third_party/angle/src/build_angle.gyp:libEGL',
-                '../third_party/angle/src/build_angle.gyp:libGLESv2',
-              ],
-              'sources': [
-                'tools/shader_bench/window_win.cc',
-              ],
-            }],
-          ],
-        },
-      ],
-    }],
-    ['OS == \"linux\" and target_arch != \"arm\"', {
-      'targets': [
-        {
-          'target_name': 'tile_render_bench',
-          'type': 'executable',
-          'dependencies': [
-            '../base/base.gyp:base',
-            '../ui/gl/gl.gyp:gl',
-          ],
-          'libraries': [
-            '-lGL',
-            '-ldl',
-          ],
-          'sources': [
-            'tools/tile_render_bench/tile_render_bench.cc',
-          ],
-        },
-      ],
-    }],
-    ['os_posix == 1 and OS != \"mac\" and OS != \"android\"', {
-      'targets': [
-        {
-          'target_name': 'player_x11',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            'yuv_convert',
-            '../base/base.gyp:base',
-            '../ui/gl/gl.gyp:gl',
-          ],
-          'link_settings': {
-            'libraries': [
-              '-ldl',
-              '-lX11',
-              '-lXrender',
-              '-lXext',
-            ],
-          },
-          'sources': [
-            'tools/player_x11/data_source_logger.cc',
-            'tools/player_x11/data_source_logger.h',
-            'tools/player_x11/gl_video_renderer.cc',
-            'tools/player_x11/gl_video_renderer.h',
-            'tools/player_x11/player_x11.cc',
-            'tools/player_x11/x11_video_renderer.cc',
-            'tools/player_x11/x11_video_renderer.h',
-          ],
-        },
-      ],
-    }],
-    ['OS == \"android\"', {
-      'targets': [
-        {
-          'target_name': 'player_android',
-          'type': 'static_library',
-          'sources': [
-            'base/android/media_player_bridge.cc',
-            'base/android/media_player_bridge.h',
-          ],
-          'dependencies': [
-            '../base/base.gyp:base',
-          ],
-          'include_dirs': [
-            '<(SHARED_INTERMEDIATE_DIR)/media',
-          ],
-          'actions': [
-            {
-              'action_name': 'generate-jni-headers',
-              'inputs': [
-                '../base/android/jni_generator/jni_generator.py',
-                'base/android/java/src/org/chromium/media/MediaPlayerListener.java',
-              ],
-              'outputs': [
-                '<(SHARED_INTERMEDIATE_DIR)/media/jni/media_player_listener_jni.h',
-              ],
-              'action': [
-                'python',
-                '<(DEPTH)/base/android/jni_generator/jni_generator.py',
-                '-o',
-                '<@(_inputs)',
-                '<@(_outputs)',
-              ],
-            },
-          ],
-        },
-        {
-          'target_name': 'media_java',
-          'type': 'none',
-          'dependencies': [ '../base/base.gyp:base_java' ],
-          'variables': {
-            'package_name': 'media',
-            'java_in_dir': 'base/android/java',
-          },
-          'includes': [ '../build/java.gypi' ],
-        },
-
-      ],
-    }, { # OS != \"android\"'
-      # Android does not use ffmpeg, so disable the targets which require it.
-      'targets': [
-        {
-          'target_name': 'ffmpeg_unittests',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            'media_test_support',
-            '../base/base.gyp:base',
-            '../base/base.gyp:base_i18n',
-            '../base/base.gyp:test_support_base',
-            '../base/base.gyp:test_support_perf',
-            '../testing/gtest.gyp:gtest',
-            '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg',
-          ],
-          'sources': [
-            'ffmpeg/ffmpeg_unittest.cc',
-          ],
-          'conditions': [
-            ['toolkit_uses_gtk == 1', {
-              'dependencies': [
-                # Needed for the following #include chain:
-                #   base/run_all_unittests.cc
-                #   ../base/test_suite.h
-                #   gtk/gtk.h
-                '../build/linux/system.gyp:gtk',
-              ],
-              'conditions': [
-                ['linux_use_tcmalloc==1', {
-                  'dependencies': [
-                    '../base/allocator/allocator.gyp:allocator',
-                  ],
-                }],
-              ],
-            }],
-          ],
-        },
-        {
-          'target_name': 'ffmpeg_regression_tests',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            'media_test_support',
-            '../base/base.gyp:test_support_base',
-            '../testing/gmock.gyp:gmock',
-            '../testing/gtest.gyp:gtest',
-            '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg',
-          ],
-          'sources': [
-            'base/test_data_util.cc',
-            'base/run_all_unittests.cc',
-            'ffmpeg/ffmpeg_regression_tests.cc',
-            'filters/pipeline_integration_test_base.cc',
-          ],
-          'conditions': [
-            ['os_posix==1 and OS!=\"mac\"', {
-              'conditions': [
-                ['linux_use_tcmalloc==1', {
-                  'dependencies': [
-                    '../base/allocator/allocator.gyp:allocator',
-                  ],
-                }],
-              ],
-            }],
-          ],
-        },
-        {
-          'target_name': 'ffmpeg_tests',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            '../base/base.gyp:base',
-            '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg',
-          ],
-          'sources': [
-            'test/ffmpeg_tests/ffmpeg_tests.cc',
-          ],
-        },
-        {
-          'target_name': 'media_bench',
-          'type': 'executable',
-          'dependencies': [
-            'media',
-            '../base/base.gyp:base',
-            '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg',
-          ],
-          'sources': [
-            'tools/media_bench/media_bench.cc',
-          ],
-        },
-      ],
-    }]
-  ],
-}
font-lock-string-face) 7470 7493 (face font-lock-constant-face) 7493 7494 (face font-lock-string-face) 7494 7504 nil 7504 7505 (face font-lock-string-face) 7505 7527 (face font-lock-constant-face) 7527 7528 (face font-lock-string-face) 7528 7538 nil 7538 7539 (face font-lock-string-face) 7539 7562 (face font-lock-constant-face) 7562 7563 (face font-lock-string-face) 7563 7573 nil 7573 7574 (face font-lock-string-face) 7574 7596 (face font-lock-constant-face) 7596 7597 (face font-lock-string-face) 7597 7607 nil 7607 7608 (face font-lock-string-face) 7608 7631 (face font-lock-constant-face) 7631 7632 (face font-lock-string-face) 7632 7642 nil 7642 7643 (face font-lock-string-face) 7643 7665 (face font-lock-constant-face) 7665 7666 (face font-lock-string-face) 7666 7676 nil 7676 7677 (face font-lock-string-face) 7677 7705 (face font-lock-constant-face) 7705 7706 (face font-lock-string-face) 7706 7716 nil 7716 7717 (face font-lock-string-face) 7717 7744 (face font-lock-constant-face) 7744 7745 (face font-lock-string-face) 7745 7755 nil 7755 7756 (face font-lock-string-face) 7756 7791 (face font-lock-constant-face) 7791 7792 (face font-lock-string-face) 7792 7802 nil 7802 7803 (face font-lock-string-face) 7803 7837 (face font-lock-constant-face) 7837 7838 (face font-lock-string-face) 7838 7848 nil 7848 7849 (face font-lock-string-face) 7849 7879 (face font-lock-constant-face) 7879 7880 (face font-lock-string-face) 7880 7890 nil 7890 7891 (face font-lock-string-face) 7891 7920 (face font-lock-constant-face) 7920 7921 (face font-lock-string-face) 7921 7931 nil 7931 7932 (face font-lock-string-face) 7932 7962 (face font-lock-constant-face) 7962 7963 (face font-lock-string-face) 7963 7973 nil 7973 7974 (face font-lock-string-face) 7974 8003 (face font-lock-constant-face) 8003 8004 (face font-lock-string-face) 8004 8014 nil 8014 8015 (face font-lock-string-face) 8015 8039 (face font-lock-constant-face) 8039 8040 (face font-lock-string-face) 8040 8050 nil 8050 8051 (face 
font-lock-string-face) 8051 8074 (face font-lock-constant-face) 8074 8075 (face font-lock-string-face) 8075 8085 nil 8085 8086 (face font-lock-string-face) 8086 8116 (face font-lock-constant-face) 8116 8117 (face font-lock-string-face) 8117 8127 nil 8127 8128 (face font-lock-string-face) 8128 8152 (face font-lock-constant-face) 8152 8153 (face font-lock-string-face) 8153 8163 nil 8163 8164 (face font-lock-string-face) 8164 8187 (face font-lock-constant-face) 8187 8188 (face font-lock-string-face) 8188 8198 nil 8198 8199 (face font-lock-string-face) 8199 8230 (face font-lock-constant-face) 8230 8231 (face font-lock-string-face) 8231 8241 nil 8241 8242 (face font-lock-string-face) 8242 8272 (face font-lock-constant-face) 8272 8273 (face font-lock-string-face) 8273 8283 nil 8283 8284 (face font-lock-string-face) 8284 8309 (face font-lock-constant-face) 8309 8310 (face font-lock-string-face) 8310 8320 nil 8320 8321 (face font-lock-string-face) 8321 8345 (face font-lock-constant-face) 8345 8346 (face font-lock-string-face) 8346 8356 nil 8356 8357 (face font-lock-string-face) 8357 8399 (face font-lock-constant-face) 8399 8400 (face font-lock-string-face) 8400 8410 nil 8410 8411 (face font-lock-string-face) 8411 8452 (face font-lock-constant-face) 8452 8453 (face font-lock-string-face) 8453 8463 nil 8463 8464 (face font-lock-string-face) 8464 8486 (face font-lock-constant-face) 8486 8487 (face font-lock-string-face) 8487 8497 nil 8497 8498 (face font-lock-string-face) 8498 8519 (face font-lock-constant-face) 8519 8520 (face font-lock-string-face) 8520 8530 nil 8530 8531 (face font-lock-string-face) 8531 8562 (face font-lock-constant-face) 8562 8563 (face font-lock-string-face) 8563 8573 nil 8573 8574 (face font-lock-string-face) 8574 8604 (face font-lock-constant-face) 8604 8605 (face font-lock-string-face) 8605 8615 nil 8615 8616 (face font-lock-string-face) 8616 8643 (face font-lock-constant-face) 8643 8644 (face font-lock-string-face) 8644 8654 nil 8654 8655 (face 
font-lock-string-face) 8655 8681 (face font-lock-constant-face) 8681 8682 (face font-lock-string-face) 8682 8692 nil 8692 8693 (face font-lock-string-face) 8693 8721 (face font-lock-constant-face) 8721 8722 (face font-lock-string-face) 8722 8732 nil 8732 8733 (face font-lock-string-face) 8733 8760 (face font-lock-constant-face) 8760 8761 (face font-lock-string-face) 8761 8771 nil 8771 8772 (face font-lock-string-face) 8772 8805 (face font-lock-constant-face) 8805 8806 (face font-lock-string-face) 8806 8816 nil 8816 8817 (face font-lock-string-face) 8817 8849 (face font-lock-constant-face) 8849 8850 (face font-lock-string-face) 8850 8860 nil 8860 8861 (face font-lock-string-face) 8861 8892 (face font-lock-constant-face) 8892 8893 (face font-lock-string-face) 8893 8903 nil 8903 8904 (face font-lock-string-face) 8904 8934 (face font-lock-constant-face) 8934 8935 (face font-lock-string-face) 8935 8945 nil 8945 8946 (face font-lock-string-face) 8946 8978 (face font-lock-constant-face) 8978 8979 (face font-lock-string-face) 8979 8989 nil 8989 8990 (face font-lock-string-face) 8990 9021 (face font-lock-constant-face) 9021 9022 (face font-lock-string-face) 9022 9032 nil 9032 9033 (face font-lock-string-face) 9033 9063 (face font-lock-constant-face) 9063 9064 (face font-lock-string-face) 9064 9074 nil 9074 9075 (face font-lock-string-face) 9075 9104 (face font-lock-constant-face) 9104 9105 (face font-lock-string-face) 9105 9115 nil 9115 9116 (face font-lock-string-face) 9116 9158 (face font-lock-constant-face) 9158 9159 (face font-lock-string-face) 9159 9169 nil 9169 9170 (face font-lock-string-face) 9170 9211 (face font-lock-constant-face) 9211 9212 (face font-lock-string-face) 9212 9222 nil 9222 9223 (face font-lock-string-face) 9223 9272 (face font-lock-constant-face) 9272 9273 (face font-lock-string-face) 9273 9283 nil 9283 9284 (face font-lock-string-face) 9284 9332 (face font-lock-constant-face) 9332 9333 (face font-lock-string-face) 9333 9343 nil 9343 9344 (face 
font-lock-string-face) 9344 9388 (face font-lock-constant-face) 9388 9389 (face font-lock-string-face) 9389 9399 nil 9399 9400 (face font-lock-string-face) 9400 9445 (face font-lock-constant-face) 9445 9446 (face font-lock-string-face) 9446 9456 nil 9456 9457 (face font-lock-string-face) 9457 9507 (face font-lock-constant-face) 9507 9508 (face font-lock-string-face) 9508 9518 nil 9518 9519 (face font-lock-string-face) 9519 9570 (face font-lock-constant-face) 9570 9571 (face font-lock-string-face) 9571 9581 nil 9581 9582 (face font-lock-string-face) 9582 9611 (face font-lock-constant-face) 9611 9612 (face font-lock-string-face) 9612 9622 nil 9622 9623 (face font-lock-string-face) 9623 9659 (face font-lock-constant-face) 9659 9660 (face font-lock-string-face) 9660 9670 nil 9670 9671 (face font-lock-string-face) 9671 9714 (face font-lock-constant-face) 9714 9715 (face font-lock-string-face) 9715 9725 nil 9725 9726 (face font-lock-string-face) 9726 9768 (face font-lock-constant-face) 9768 9769 (face font-lock-string-face) 9769 9779 nil 9779 9780 (face font-lock-string-face) 9780 9816 (face font-lock-constant-face) 9816 9817 (face font-lock-string-face) 9817 9827 nil 9827 9828 (face font-lock-string-face) 9828 9863 (face font-lock-constant-face) 9863 9864 (face font-lock-string-face) 9864 9874 nil 9874 9875 (face font-lock-string-face) 9875 9910 (face font-lock-constant-face) 9910 9911 (face font-lock-string-face) 9911 9921 nil 9921 9922 (face font-lock-string-face) 9922 9958 (face font-lock-constant-face) 9958 9959 (face font-lock-string-face) 9959 9969 nil 9969 9970 (face font-lock-string-face) 9970 10005 (face font-lock-constant-face) 10005 10006 (face font-lock-string-face) 10006 10016 nil 10016 10017 (face font-lock-string-face) 10017 10050 (face font-lock-constant-face) 10050 10051 (face font-lock-string-face) 10051 10061 nil 10061 10062 (face font-lock-string-face) 10062 10094 (face font-lock-constant-face) 10094 10095 (face font-lock-string-face) 10095 10105 nil 
10105 10106 (face font-lock-string-face) 10106 10150 (face font-lock-constant-face) 10150 10151 (face font-lock-string-face) 10151 10161 nil 10161 10162 (face font-lock-string-face) 10162 10198 (face font-lock-constant-face) 10198 10199 (face font-lock-string-face) 10199 10209 nil 10209 10210 (face font-lock-string-face) 10210 10245 (face font-lock-constant-face) 10245 10246 (face font-lock-string-face) 10246 10256 nil 10256 10257 (face font-lock-string-face) 10257 10296 (face font-lock-constant-face) 10296 10297 (face font-lock-string-face) 10297 10307 nil 10307 10308 (face font-lock-string-face) 10308 10346 (face font-lock-constant-face) 10346 10347 (face font-lock-string-face) 10347 10357 nil 10357 10358 (face font-lock-string-face) 10358 10403 (face font-lock-constant-face) 10403 10404 (face font-lock-string-face) 10404 10414 nil 10414 10415 (face font-lock-string-face) 10415 10459 (face font-lock-constant-face) 10459 10460 (face font-lock-string-face) 10460 10470 nil 10470 10471 (face font-lock-string-face) 10471 10487 (face font-lock-constant-face) 10487 10488 (face font-lock-string-face) 10488 10498 nil 10498 10499 (face font-lock-string-face) 10499 10514 (face font-lock-constant-face) 10514 10515 (face font-lock-string-face) 10515 10525 nil 10525 10526 (face font-lock-string-face) 10526 10559 (face font-lock-constant-face) 10559 10560 (face font-lock-string-face) 10560 10570 nil 10570 10571 (face font-lock-string-face) 10571 10603 (face font-lock-constant-face) 10603 10604 (face font-lock-string-face) 10604 10614 nil 10614 10615 (face font-lock-string-face) 10615 10636 (face font-lock-constant-face) 10636 10637 (face font-lock-string-face) 10637 10647 nil 10647 10648 (face font-lock-string-face) 10648 10675 (face font-lock-constant-face) 10675 10676 (face font-lock-string-face) 10676 10686 nil 10686 10687 (face font-lock-string-face) 10687 10713 (face font-lock-constant-face) 10713 10714 (face font-lock-string-face) 10714 10724 nil 10724 10725 (face 
font-lock-string-face) 10725 10755 (face font-lock-constant-face) 10755 10756 (face font-lock-string-face) 10756 10766 nil 10766 10767 (face font-lock-string-face) 10767 10796 (face font-lock-constant-face) 10796 10797 (face font-lock-string-face) 10797 10807 nil 10807 10808 (face font-lock-string-face) 10808 10845 (face font-lock-constant-face) 10845 10846 (face font-lock-string-face) 10846 10856 nil 10856 10857 (face font-lock-string-face) 10857 10893 (face font-lock-constant-face) 10893 10894 (face font-lock-string-face) 10894 10904 nil 10904 10905 (face font-lock-string-face) 10905 10929 (face font-lock-constant-face) 10929 10930 (face font-lock-string-face) 10930 10940 nil 10940 10941 (face font-lock-string-face) 10941 10964 (face font-lock-constant-face) 10964 10965 (face font-lock-string-face) 10965 10975 nil 10975 10976 (face font-lock-string-face) 10976 10995 (face font-lock-constant-face) 10995 10996 (face font-lock-string-face) 10996 11006 nil 11006 11007 (face font-lock-string-face) 11007 11025 (face font-lock-constant-face) 11025 11026 (face font-lock-string-face) 11026 11036 nil 11036 11037 (face font-lock-string-face) 11037 11063 (face font-lock-constant-face) 11063 11064 (face font-lock-string-face) 11064 11074 nil 11074 11075 (face font-lock-string-face) 11075 11100 (face font-lock-constant-face) 11100 11101 (face font-lock-string-face) 11101 11111 nil 11111 11112 (face font-lock-string-face) 11112 11138 (face font-lock-constant-face) 11138 11139 (face font-lock-string-face) 11139 11149 nil 11149 11150 (face font-lock-string-face) 11150 11175 (face font-lock-constant-face) 11175 11176 (face font-lock-string-face) 11176 11193 nil 11193 11194 (face font-lock-string-face) 11194 11219 (face font-lock-keyword-face) 11219 11220 (face font-lock-string-face) 11220 11232 nil 11232 11233 (face font-lock-string-face) 11233 11245 (face font-lock-keyword-face) 11245 11246 (face font-lock-string-face) 11246 11260 nil 11260 11261 (face font-lock-string-face) 
11261 11263 (face font-lock-constant-face) 11263 11264 (face font-lock-string-face) 11264 11292 nil 11292 11293 (face font-lock-string-face) 11293 11303 (face font-lock-keyword-face) 11303 11304 (face font-lock-string-face) 11304 11316 nil 11316 11381 (face font-lock-comment-face) 11381 11389 nil 11389 11439 (face font-lock-comment-face) 11439 11448 nil 11448 11449 (face font-lock-string-face) 11449 11464 (face font-lock-variable-name-face) 11464 11465 (face font-lock-string-face) 11465 11479 nil 11479 11480 (face font-lock-string-face) 11480 11492 (face font-lock-keyword-face) 11492 11493 (face font-lock-string-face) 11493 11509 nil 11509 11510 (face font-lock-string-face) 11510 11549 (face font-lock-function-name-face) 11549 11550 (face font-lock-string-face) 11550 11586 nil 11586 11587 (face font-lock-string-face) 11587 11602 (face font-lock-variable-name-face) 11602 11603 (face font-lock-string-face) 11603 11617 nil 11617 11618 (face font-lock-string-face) 11618 11626 (face font-lock-keyword-face) 11626 11627 (face font-lock-string-face) 11627 11643 nil 11643 11644 (face font-lock-string-face) 11644 11663 (face font-lock-constant-face) 11663 11664 (face font-lock-string-face) 11664 11678 nil 11678 11679 (face font-lock-string-face) 11679 11702 (face font-lock-constant-face) 11702 11703 (face font-lock-string-face) 11703 11717 nil 11717 11718 (face font-lock-string-face) 11718 11740 (face font-lock-constant-face) 11740 11741 (face font-lock-string-face) 11741 11755 nil 11755 11756 (face font-lock-string-face) 11756 11779 (face font-lock-constant-face) 11779 11780 (face font-lock-string-face) 11780 11794 nil 11794 11795 (face font-lock-string-face) 11795 11817 (face font-lock-constant-face) 11817 11818 (face font-lock-string-face) 11818 11832 nil 11832 11833 (face font-lock-string-face) 11833 11861 (face font-lock-constant-face) 11861 11862 (face font-lock-string-face) 11862 11876 nil 11876 11877 (face font-lock-string-face) 11877 11904 (face 
font-lock-constant-face) 11904 11905 (face font-lock-string-face) 11905 11919 nil 11919 11920 (face font-lock-string-face) 11920 11950 (face font-lock-constant-face) 11950 11951 (face font-lock-string-face) 11951 11965 nil 11965 11966 (face font-lock-string-face) 11966 11995 (face font-lock-constant-face) 11995 11996 (face font-lock-string-face) 11996 12010 nil 12010 12011 (face font-lock-string-face) 12011 12035 (face font-lock-constant-face) 12035 12036 (face font-lock-string-face) 12036 12050 nil 12050 12051 (face font-lock-string-face) 12051 12074 (face font-lock-constant-face) 12074 12075 (face font-lock-string-face) 12075 12089 nil 12089 12090 (face font-lock-string-face) 12090 12120 (face font-lock-constant-face) 12120 12121 (face font-lock-string-face) 12121 12135 nil 12135 12136 (face font-lock-string-face) 12136 12167 (face font-lock-constant-face) 12167 12168 (face font-lock-string-face) 12168 12182 nil 12182 12183 (face font-lock-string-face) 12183 12213 (face font-lock-constant-face) 12213 12214 (face font-lock-string-face) 12214 12228 nil 12228 12229 (face font-lock-string-face) 12229 12254 (face font-lock-constant-face) 12254 12255 (face font-lock-string-face) 12255 12269 nil 12269 12270 (face font-lock-string-face) 12270 12294 (face font-lock-constant-face) 12294 12295 (face font-lock-string-face) 12295 12309 nil 12309 12310 (face font-lock-string-face) 12310 12352 (face font-lock-constant-face) 12352 12353 (face font-lock-string-face) 12353 12367 nil 12367 12368 (face font-lock-string-face) 12368 12409 (face font-lock-constant-face) 12409 12410 (face font-lock-string-face) 12410 12424 nil 12424 12425 (face font-lock-string-face) 12425 12447 (face font-lock-constant-face) 12447 12448 (face font-lock-string-face) 12448 12462 nil 12462 12463 (face font-lock-string-face) 12463 12484 (face font-lock-constant-face) 12484 12485 (face font-lock-string-face) 12485 12499 nil 12499 12500 (face font-lock-string-face) 12500 12531 (face font-lock-constant-face) 
12531 12532 (face font-lock-string-face) 12532 12546 nil 12546 12547 (face font-lock-string-face) 12547 12577 (face font-lock-constant-face) 12577 12578 (face font-lock-string-face) 12578 12592 nil 12592 12593 (face font-lock-string-face) 12593 12621 (face font-lock-constant-face) 12621 12622 (face font-lock-string-face) 12622 12636 nil 12636 12637 (face font-lock-string-face) 12637 12664 (face font-lock-constant-face) 12664 12665 (face font-lock-string-face) 12665 12679 nil 12679 12680 (face font-lock-string-face) 12680 12707 (face font-lock-constant-face) 12707 12708 (face font-lock-string-face) 12708 12722 nil 12722 12723 (face font-lock-string-face) 12723 12749 (face font-lock-constant-face) 12749 12750 (face font-lock-string-face) 12750 12764 nil 12764 12765 (face font-lock-string-face) 12765 12791 (face font-lock-constant-face) 12791 12792 (face font-lock-string-face) 12792 12806 nil 12806 12807 (face font-lock-string-face) 12807 12832 (face font-lock-constant-face) 12832 12833 (face font-lock-string-face) 12833 12868 nil 12868 12937 (face font-lock-comment-face) 12937 12945 nil 12945 13016 (face font-lock-comment-face) 13016 13024 nil 13024 13040 (face font-lock-comment-face) 13040 13049 nil 13049 13050 (face font-lock-string-face) 13050 13065 (face font-lock-variable-name-face) 13065 13066 (face font-lock-string-face) 13066 13080 nil 13080 13081 (face font-lock-string-face) 13081 13089 (face font-lock-keyword-face) 13089 13090 (face font-lock-string-face) 13090 13105 nil 13105 13106 (face font-lock-string-face) 13106 13149 (face font-lock-constant-face) 13149 13150 (face font-lock-string-face) 13150 13175 nil 13175 13176 (face font-lock-string-face) 13176 13183 (face font-lock-keyword-face) 13183 13184 (face font-lock-string-face) 13184 13199 nil 13199 13200 (face font-lock-string-face) 13200 13248 (face font-lock-constant-face) 13248 13249 (face font-lock-string-face) 13249 13274 nil 13274 13275 (face font-lock-string-face) 13275 13288 (face 
font-lock-keyword-face) 13288 13289 (face font-lock-string-face) 13289 13305 nil 13305 13306 (face font-lock-string-face) 13306 13315 (face font-lock-keyword-face) 13315 13316 (face font-lock-string-face) 13316 13334 nil 13334 13335 (face font-lock-string-face) 13335 13345 (face font-lock-constant-face) 13345 13346 (face font-lock-string-face) 13346 13397 nil 13397 13398 (face font-lock-string-face) 13398 13443 (face font-lock-variable-name-face) 13443 13444 (face font-lock-string-face) 13444 13458 nil 13458 13459 (face font-lock-string-face) 13459 13472 (face font-lock-keyword-face) 13472 13473 (face font-lock-string-face) 13473 13489 nil 13489 13490 (face font-lock-string-face) 13490 13499 (face font-lock-keyword-face) 13499 13500 (face font-lock-string-face) 13500 13518 nil 13518 13519 (face font-lock-string-face) 13519 13527 (face font-lock-constant-face) 13527 13528 (face font-lock-string-face) 13528 13579 nil 13579 13580 (face font-lock-string-face) 13580 13593 (face font-lock-variable-name-face) 13593 13594 (face font-lock-string-face) 13594 13608 nil 13608 13609 (face font-lock-string-face) 13609 13617 (face font-lock-keyword-face) 13617 13618 (face font-lock-string-face) 13618 13623 nil 13623 13624 (face font-lock-string-face) 13624 13631 (face font-lock-constant-face) 13631 13632 (face font-lock-string-face) 13632 13634 nil 13634 13635 (face font-lock-string-face) 13635 13641 (face font-lock-constant-face) 13641 13642 (face font-lock-string-face) 13642 13671 nil 13671 13672 (face font-lock-string-face) 13672 13679 (face font-lock-constant-face) 13679 13680 (face font-lock-string-face) 13680 13682 nil 13682 13683 (face font-lock-string-face) 13683 13703 (face font-lock-constant-face) 13703 13704 (face font-lock-string-face) 13704 13720 nil 13720 13721 (face font-lock-string-face) 13721 13734 (face font-lock-keyword-face) 13734 13735 (face font-lock-string-face) 13735 13751 nil 13751 13752 (face font-lock-string-face) 13752 13761 (face 
font-lock-keyword-face) 13761 13762 (face font-lock-string-face) 13762 13815 nil 13815 13816 (face font-lock-string-face) 13816 13829 (face font-lock-variable-name-face) 13829 13830 (face font-lock-string-face) 13830 13844 nil 13844 13845 (face font-lock-string-face) 13845 13853 (face font-lock-keyword-face) 13853 13854 (face font-lock-string-face) 13854 13870 nil 13870 13871 (face font-lock-string-face) 13871 13909 (face font-lock-constant-face) 13909 13910 (face font-lock-string-face) 13910 13924 nil 13924 13925 (face font-lock-string-face) 13925 13962 (face font-lock-constant-face) 13962 13963 (face font-lock-string-face) 13963 13999 nil 13999 14000 (face font-lock-string-face) 14000 14011 (face font-lock-variable-name-face) 14011 14012 (face font-lock-string-face) 14012 14026 nil 14026 14027 (face font-lock-string-face) 14027 14036 (face font-lock-keyword-face) 14036 14037 (face font-lock-string-face) 14037 14053 nil 14053 14054 (face font-lock-string-face) 14054 14064 (face font-lock-keyword-face) 14064 14065 (face font-lock-string-face) 14065 14084 nil 14084 14085 (face font-lock-string-face) 14085 14096 (face font-lock-variable-name-face) 14096 14097 (face font-lock-string-face) 14097 14117 nil 14117 14129 (face font-lock-string-face) 14129 14131 nil 14131 14169 (face font-lock-string-face) 14169 14176 (face font-lock-variable-name-face) 14176 14182 (face font-lock-string-face) 14182 14193 (face font-lock-variable-name-face) 14193 14196 (face font-lock-string-face) 14196 14233 nil 14233 14245 (face font-lock-string-face) 14245 14247 nil 14247 14259 (face font-lock-string-face) 14259 14316 nil 14316 14317 (face font-lock-string-face) 14317 14327 (face font-lock-keyword-face) 14327 14328 (face font-lock-string-face) 14328 14345 nil 14345 14346 (face font-lock-string-face) 14346 14359 (face font-lock-variable-name-face) 14359 14360 (face font-lock-string-face) 14360 14378 nil 14378 14379 (face font-lock-string-face) 14379 14385 (face font-lock-keyword-face) 
14385 14386 (face font-lock-string-face) 14386 14406 nil 14406 14411 (face font-lock-string-face) 14411 14413 (face font-lock-variable-name-face) 14413 14423 (face font-lock-variable-name-face) 14423 14443 (face font-lock-string-face) 14443 14476 nil 14476 14477 (face font-lock-string-face) 14477 14490 (face font-lock-keyword-face) 14490 14491 (face font-lock-string-face) 14491 14511 nil 14511 14512 (face font-lock-string-face) 14512 14521 (face font-lock-keyword-face) 14521 14522 (face font-lock-string-face) 14522 14544 nil 14544 14545 (face font-lock-string-face) 14545 14549 (face font-lock-constant-face) 14549 14551 (face font-lock-variable-name-face) 14551 14561 (face font-lock-variable-name-face) 14561 14578 (face font-lock-constant-face) 14578 14579 (face font-lock-string-face) 14579 14631 nil 14631 14632 (face font-lock-string-face) 14632 14639 (face font-lock-keyword-face) 14639 14640 (face font-lock-string-face) 14640 14660 nil 14660 14661 (face font-lock-string-face) 14661 14669 (face font-lock-preprocessor-face) 14669 14670 (face font-lock-string-face) 14670 14707 nil 14707 14729 (face font-lock-comment-face) 14729 14743 nil 14743 14744 (face font-lock-string-face) 14744 14752 (face font-lock-keyword-face) 14752 14753 (face font-lock-string-face) 14753 14773 nil 14773 14774 (face font-lock-string-face) 14774 14800 (face font-lock-constant-face) 14800 14801 (face font-lock-string-face) 14801 14819 nil 14819 14820 (face font-lock-string-face) 14820 14845 (face font-lock-constant-face) 14845 14846 (face font-lock-string-face) 14846 14915 nil 14915 14916 (face font-lock-string-face) 14916 14929 (face font-lock-variable-name-face) 14929 14930 (face font-lock-string-face) 14930 14944 nil 14944 14945 (face font-lock-string-face) 14945 14955 (face font-lock-keyword-face) 14955 14956 (face font-lock-string-face) 14956 14973 nil 14973 14974 (face font-lock-string-face) 14974 14993 (face font-lock-variable-name-face) 14993 14994 (face font-lock-string-face) 14994 
15012 nil 15012 15013 (face font-lock-string-face) 15013 15019 (face font-lock-keyword-face) 15019 15020 (face font-lock-string-face) 15020 15040 nil 15040 15075 (face font-lock-string-face) 15075 15108 nil 15108 15109 (face font-lock-string-face) 15109 15122 (face font-lock-keyword-face) 15122 15123 (face font-lock-string-face) 15123 15143 nil 15143 15144 (face font-lock-string-face) 15144 15153 (face font-lock-keyword-face) 15153 15154 (face font-lock-string-face) 15154 15176 nil 15176 15177 (face font-lock-string-face) 15177 15215 (face font-lock-constant-face) 15215 15216 (face font-lock-string-face) 15216 15268 nil 15268 15269 (face font-lock-string-face) 15269 15276 (face font-lock-keyword-face) 15276 15277 (face font-lock-string-face) 15277 15297 nil 15297 15298 (face font-lock-string-face) 15298 15312 (face font-lock-preprocessor-face) 15312 15313 (face font-lock-string-face) 15313 15350 nil 15350 15378 (face font-lock-comment-face) 15378 15392 nil 15392 15393 (face font-lock-string-face) 15393 15401 (face font-lock-keyword-face) 15401 15402 (face font-lock-string-face) 15402 15422 nil 15422 15423 (face font-lock-string-face) 15423 15450 (face font-lock-constant-face) 15450 15451 (face font-lock-string-face) 15451 15469 nil 15469 15470 (face font-lock-string-face) 15470 15496 (face font-lock-constant-face) 15496 15497 (face font-lock-string-face) 15497 15566 nil 15566 15567 (face font-lock-string-face) 15567 15600 (face font-lock-variable-name-face) 15600 15601 (face font-lock-string-face) 15601 15615 nil 15615 15663 (face font-lock-comment-face) 15663 15673 nil 15673 15674 (face font-lock-string-face) 15674 15682 (face font-lock-keyword-face) 15682 15683 (face font-lock-string-face) 15683 15699 nil 15699 15700 (face font-lock-string-face) 15700 15743 (face font-lock-constant-face) 15743 15744 (face font-lock-string-face) 15744 15758 nil 15758 15759 (face font-lock-string-face) 15759 15801 (face font-lock-constant-face) 15801 15802 (face 
font-lock-string-face) 15802 15838 nil 15838 15839 (face font-lock-string-face) 15839 15848 (face font-lock-variable-name-face) 15848 15849 (face font-lock-string-face) 15849 15863 nil 15863 15864 (face font-lock-string-face) 15864 15877 (face font-lock-keyword-face) 15877 15878 (face font-lock-string-face) 15878 15894 nil 15894 15895 (face font-lock-string-face) 15895 15904 (face font-lock-keyword-face) 15904 15905 (face font-lock-string-face) 15905 15923 nil 15923 15924 (face font-lock-string-face) 15924 15980 (face font-lock-constant-face) 15980 15981 (face font-lock-string-face) 15981 15997 nil 15997 15998 (face font-lock-string-face) 15998 16057 (face font-lock-constant-face) 16057 16058 (face font-lock-string-face) 16058 16074 nil 16074 16075 (face font-lock-string-face) 16075 16131 (face font-lock-constant-face) 16131 16132 (face font-lock-string-face) 16132 16148 nil 16148 16149 (face font-lock-string-face) 16149 16205 (face font-lock-constant-face) 16205 16206 (face font-lock-string-face) 16206 16222 nil 16222 16223 (face font-lock-string-face) 16223 16275 (face font-lock-constant-face) 16275 16276 (face font-lock-string-face) 16276 16327 nil 16327 16328 (face font-lock-string-face) 16328 16337 (face font-lock-variable-name-face) 16337 16338 (face font-lock-string-face) 16338 16352 nil 16352 16353 (face font-lock-string-face) 16353 16361 (face font-lock-keyword-face) 16361 16362 (face font-lock-string-face) 16362 16378 nil 16378 16379 (face font-lock-string-face) 16379 16406 (face font-lock-constant-face) 16406 16407 (face font-lock-string-face) 16407 16421 nil 16421 16422 (face font-lock-string-face) 16422 16448 (face font-lock-constant-face) 16448 16449 (face font-lock-string-face) 16449 16463 nil 16463 16464 (face font-lock-string-face) 16464 16507 (face font-lock-constant-face) 16507 16508 (face font-lock-string-face) 16508 16522 nil 16522 16523 (face font-lock-string-face) 16523 16565 (face font-lock-constant-face) 16565 16566 (face 
font-lock-string-face) 16566 16602 nil 16602 16603 (face font-lock-string-face) 16603 16646 (face font-lock-variable-name-face) 16646 16647 (face font-lock-string-face) 16647 16661 nil 16661 16662 (face font-lock-string-face) 16662 16669 (face font-lock-keyword-face) 16669 16670 (face font-lock-string-face) 16670 16686 nil 16686 16687 (face font-lock-string-face) 16687 16697 (face font-lock-constant-face) 16697 16698 (face font-lock-string-face) 16698 16712 nil 16712 16713 (face font-lock-string-face) 16713 16722 (face font-lock-constant-face) 16722 16723 (face font-lock-string-face) 16723 16737 nil 16737 16738 (face font-lock-string-face) 16738 16760 (face font-lock-constant-face) 16760 16761 (face font-lock-string-face) 16761 16775 nil 16775 16776 (face font-lock-string-face) 16776 16797 (face font-lock-constant-face) 16797 16798 (face font-lock-string-face) 16798 16812 nil 16812 16813 (face font-lock-string-face) 16813 16830 (face font-lock-constant-face) 16830 16831 (face font-lock-string-face) 16831 16845 nil 16845 16846 (face font-lock-string-face) 16846 16862 (face font-lock-constant-face) 16862 16863 (face font-lock-string-face) 16863 16877 nil 16877 16878 (face font-lock-string-face) 16878 16889 (face font-lock-constant-face) 16889 16890 (face font-lock-string-face) 16890 16904 nil 16904 16905 (face font-lock-string-face) 16905 16915 (face font-lock-constant-face) 16915 16916 (face font-lock-string-face) 16916 16930 nil 16930 16931 (face font-lock-string-face) 16931 16955 (face font-lock-constant-face) 16955 16956 (face font-lock-string-face) 16956 16970 nil 16970 16971 (face font-lock-string-face) 16971 16994 (face font-lock-constant-face) 16994 16995 (face font-lock-string-face) 16995 17009 nil 17009 17010 (face font-lock-string-face) 17010 17034 (face font-lock-constant-face) 17034 17035 (face font-lock-string-face) 17035 17049 nil 17049 17050 (face font-lock-string-face) 17050 17073 (face font-lock-constant-face) 17073 17074 (face 
font-lock-string-face) 17074 17088 nil 17088 17089 (face font-lock-string-face) 17089 17114 (face font-lock-constant-face) 17114 17115 (face font-lock-string-face) 17115 17129 nil 17129 17130 (face font-lock-string-face) 17130 17154 (face font-lock-constant-face) 17154 17155 (face font-lock-string-face) 17155 17210 nil 17210 17211 (face font-lock-string-face) 17211 17222 (face font-lock-keyword-face) 17222 17223 (face font-lock-string-face) 17223 17225 nil 17225 17226 (face font-lock-string-face) 17226 17237 (face font-lock-function-name-face) 17237 17238 (face font-lock-string-face) 17238 17246 nil 17246 17247 (face font-lock-string-face) 17247 17251 (face font-lock-keyword-face) 17251 17252 (face font-lock-string-face) 17252 17254 nil 17254 17255 (face font-lock-string-face) 17255 17269 (face font-lock-type-face) 17269 17270 (face font-lock-string-face) 17270 17278 nil 17278 17279 (face font-lock-string-face) 17279 17291 (face font-lock-keyword-face) 17291 17292 (face font-lock-string-face) 17292 17304 nil 17304 17305 (face font-lock-string-face) 17305 17307 (face font-lock-constant-face) 17307 17308 (face font-lock-string-face) 17308 17325 nil 17325 17326 (face font-lock-string-face) 17326 17336 (face font-lock-keyword-face) 17336 17337 (face font-lock-string-face) 17337 17350 nil 17350 17351 (face font-lock-string-face) 17351 17371 (face font-lock-variable-name-face) 17371 17372 (face font-lock-string-face) 17372 17386 nil 17386 17387 (face font-lock-string-face) 17387 17404 (face font-lock-keyword-face) 17404 17405 (face font-lock-string-face) 17405 17423 nil 17423 17424 (face font-lock-string-face) 17424 17442 (face font-lock-variable-name-face) 17442 17443 (face font-lock-string-face) 17443 17461 nil 17461 17462 (face font-lock-string-face) 17462 17469 (face font-lock-keyword-face) 17469 17470 (face font-lock-string-face) 17470 17474 nil 17474 17498 (face font-lock-string-face) 17498 17553 nil 17553 17554 (face font-lock-string-face) 17554 17599 (face 
font-lock-variable-name-face) 17599 17600 (face font-lock-string-face) 17600 17614 nil 17614 17615 (face font-lock-string-face) 17615 17627 (face font-lock-keyword-face) 17627 17628 (face font-lock-string-face) 17628 17644 nil 17644 17645 (face font-lock-string-face) 17645 17665 (face font-lock-function-name-face) 17665 17666 (face font-lock-string-face) 17666 17703 nil 17703 17704 (face font-lock-string-face) 17704 17724 (face font-lock-variable-name-face) 17724 17725 (face font-lock-string-face) 17725 17739 nil 17739 17740 (face font-lock-string-face) 17740 17752 (face font-lock-keyword-face) 17752 17753 (face font-lock-string-face) 17753 17769 nil 17769 17770 (face font-lock-string-face) 17770 17790 (face font-lock-function-name-face) 17790 17791 (face font-lock-string-face) 17791 17833 nil 17833 17834 (face font-lock-string-face) 17834 17841 (face font-lock-keyword-face) 17841 17842 (face font-lock-string-face) 17842 17854 nil 17854 17855 (face font-lock-string-face) 17855 17874 (face font-lock-constant-face) 17874 17875 (face font-lock-string-face) 17875 17885 nil 17885 17886 (face font-lock-string-face) 17886 17904 (face font-lock-constant-face) 17904 17905 (face font-lock-string-face) 17905 17935 nil 17935 17936 (face font-lock-string-face) 17936 17947 (face font-lock-keyword-face) 17947 17948 (face font-lock-string-face) 17948 17950 nil 17950 17951 (face font-lock-string-face) 17951 17971 (face font-lock-function-name-face) 17971 17972 (face font-lock-string-face) 17972 17980 nil 17980 17981 (face font-lock-string-face) 17981 17985 (face font-lock-keyword-face) 17985 17986 (face font-lock-string-face) 17986 17988 nil 17988 17989 (face font-lock-string-face) 17989 18003 (face font-lock-type-face) 18003 18004 (face font-lock-string-face) 18004 18012 nil 18012 18013 (face font-lock-string-face) 18013 18025 (face font-lock-keyword-face) 18025 18026 (face font-lock-string-face) 18026 18038 nil 18038 18039 (face font-lock-string-face) 18039 18041 (face 
font-lock-constant-face) 18041 18042 (face font-lock-string-face) 18042 18059 nil 18059 18060 (face font-lock-string-face) 18060 18067 (face font-lock-keyword-face) 18067 18068 (face font-lock-string-face) 18068 18080 nil 18080 18081 (face font-lock-string-face) 18081 18114 (face font-lock-constant-face) 18114 18115 (face font-lock-string-face) 18115 18125 nil 18125 18126 (face font-lock-string-face) 18126 18162 (face font-lock-constant-face) 18162 18163 (face font-lock-string-face) 18163 18173 nil 18173 18174 (face font-lock-string-face) 18174 18212 (face font-lock-constant-face) 18212 18213 (face font-lock-string-face) 18213 18223 nil 18223 18224 (face font-lock-string-face) 18224 18261 (face font-lock-constant-face) 18261 18262 (face font-lock-string-face) 18262 18272 nil 18272 18273 (face font-lock-string-face) 18273 18311 (face font-lock-constant-face) 18311 18312 (face font-lock-string-face) 18312 18322 nil 18322 18323 (face font-lock-string-face) 18323 18356 (face font-lock-constant-face) 18356 18357 (face font-lock-string-face) 18357 18367 nil 18367 18368 (face font-lock-string-face) 18368 18403 (face font-lock-constant-face) 18403 18404 (face font-lock-string-face) 18404 18414 nil 18414 18415 (face font-lock-string-face) 18415 18451 (face font-lock-constant-face) 18451 18452 (face font-lock-string-face) 18452 18462 nil 18462 18463 (face font-lock-string-face) 18463 18499 (face font-lock-constant-face) 18499 18500 (face font-lock-string-face) 18500 18510 nil 18510 18511 (face font-lock-string-face) 18511 18547 (face font-lock-constant-face) 18547 18548 (face font-lock-string-face) 18548 18558 nil 18558 18559 (face font-lock-string-face) 18559 18581 (face font-lock-constant-face) 18581 18582 (face font-lock-string-face) 18582 18592 nil 18592 18593 (face font-lock-string-face) 18593 18618 (face font-lock-constant-face) 18618 18619 (face font-lock-string-face) 18619 18629 nil 18629 18630 (face font-lock-string-face) 18630 18657 (face font-lock-constant-face) 
18657 18658 (face font-lock-string-face) 18658 18668 nil 18668 18669 (face font-lock-string-face) 18669 18697 (face font-lock-constant-face) 18697 18698 (face font-lock-string-face) 18698 18708 nil 18708 18709 (face font-lock-string-face) 18709 18750 (face font-lock-constant-face) 18750 18751 (face font-lock-string-face) 18751 18761 nil 18761 18762 (face font-lock-string-face) 18762 18803 (face font-lock-constant-face) 18803 18804 (face font-lock-string-face) 18804 18814 nil 18814 18815 (face font-lock-string-face) 18815 18856 (face font-lock-constant-face) 18856 18857 (face font-lock-string-face) 18857 18867 nil 18867 18868 (face font-lock-string-face) 18868 18902 (face font-lock-constant-face) 18902 18903 (face font-lock-string-face) 18903 18913 nil 18913 18914 (face font-lock-string-face) 18914 18948 (face font-lock-constant-face) 18948 18949 (face font-lock-string-face) 18949 18959 nil 18959 18960 (face font-lock-string-face) 18960 18994 (face font-lock-constant-face) 18994 18995 (face font-lock-string-face) 18995 19005 nil 19005 19006 (face font-lock-string-face) 19006 19035 (face font-lock-constant-face) 19035 19036 (face font-lock-string-face) 19036 19046 nil 19046 19047 (face font-lock-string-face) 19047 19075 (face font-lock-constant-face) 19075 19076 (face font-lock-string-face) 19076 19093 nil 19093 19094 (face font-lock-string-face) 19094 19104 (face font-lock-keyword-face) 19104 19105 (face font-lock-string-face) 19105 19118 nil 19118 19119 (face font-lock-string-face) 19119 19139 (face font-lock-variable-name-face) 19139 19140 (face font-lock-string-face) 19140 19154 nil 19154 19155 (face font-lock-string-face) 19155 19172 (face font-lock-keyword-face) 19172 19173 (face font-lock-string-face) 19173 19191 nil 19191 19192 (face font-lock-string-face) 19192 19210 (face font-lock-variable-name-face) 19210 19211 (face font-lock-string-face) 19211 19229 nil 19229 19230 (face font-lock-string-face) 19230 19237 (face font-lock-keyword-face) 19237 19238 (face 
font-lock-string-face) 19238 19242 nil 19242 19266 (face font-lock-string-face) 19266 19321 nil 19321 19322 (face font-lock-string-face) 19322 19342 (face font-lock-variable-name-face) 19342 19343 (face font-lock-string-face) 19343 19357 nil 19357 19399 (face font-lock-comment-face) 19399 19409 nil 19409 19410 (face font-lock-string-face) 19410 19417 (face font-lock-keyword-face) 19417 19418 (face font-lock-string-face) 19418 19434 nil 19434 19435 (face font-lock-string-face) 19435 19480 (face font-lock-constant-face) 19480 19481 (face font-lock-string-face) 19481 19495 nil 19495 19496 (face font-lock-string-face) 19496 19535 (face font-lock-constant-face) 19535 19536 (face font-lock-string-face) 19536 19573 nil 19573 19574 (face font-lock-string-face) 19574 19623 (face font-lock-variable-name-face) 19623 19624 (face font-lock-string-face) 19624 19638 nil 19638 19639 (face font-lock-string-face) 19639 19645 (face font-lock-keyword-face) 19645 19646 (face font-lock-string-face) 19646 19662 nil 19662 19670 (face font-lock-string-face) 19670 19707 nil 19707 19708 (face font-lock-string-face) 19708 19719 (face font-lock-variable-name-face) 19719 19720 (face font-lock-string-face) 19720 19734 nil 19734 19735 (face font-lock-string-face) 19735 19749 (face font-lock-keyword-face) 19749 19750 (face font-lock-string-face) 19750 19766 nil 19766 19773 (face font-lock-string-face) 19773 19791 nil 19791 19792 (face font-lock-string-face) 19792 19806 (face font-lock-keyword-face) 19806 19807 (face font-lock-string-face) 19807 19827 nil 19827 19890 (face font-lock-comment-face) 19890 19906 nil 19906 19971 (face font-lock-comment-face) 19971 19987 nil 19987 20032 (face font-lock-comment-face) 20032 20048 nil 20048 20072 (face font-lock-string-face) 20072 20074 nil 20074 20077 (face font-lock-string-face) 20077 20080 nil 20080 20086 (face font-lock-comment-face) 20086 20155 nil 20155 20156 (face font-lock-string-face) 20156 20165 (face font-lock-variable-name-face) 20165 20166 
(face font-lock-string-face) 20166 20180 nil 20180 20181 (face font-lock-string-face) 20181 20190 (face font-lock-keyword-face) 20190 20191 (face font-lock-string-face) 20191 20207 nil 20207 20208 (face font-lock-string-face) 20208 20218 (face font-lock-variable-name-face) 20218 20219 (face font-lock-string-face) 20219 20237 nil 20237 20246 (face font-lock-string-face) 20246 20262 nil 20262 20270 (face font-lock-string-face) 20270 20286 nil 20286 20298 (face font-lock-string-face) 20298 20314 nil 20314 20322 (face font-lock-string-face) 20322 20374 nil 20374 20375 (face font-lock-string-face) 20375 20384 (face font-lock-variable-name-face) 20384 20385 (face font-lock-string-face) 20385 20399 nil 20399 20400 (face font-lock-string-face) 20400 20409 (face font-lock-keyword-face) 20409 20410 (face font-lock-string-face) 20410 20426 nil 20426 20427 (face font-lock-string-face) 20427 20437 (face font-lock-variable-name-face) 20437 20438 (face font-lock-string-face) 20438 20456 nil 20456 20466 (face font-lock-string-face) 20466 20482 nil 20482 20491 (face font-lock-string-face) 20491 20507 nil 20507 20519 (face font-lock-string-face) 20519 20535 nil 20535 20543 (face font-lock-string-face) 20543 20595 nil 20595 20596 (face font-lock-string-face) 20596 20621 (face font-lock-variable-name-face) 20621 20622 (face font-lock-string-face) 20622 20636 nil 20636 20637 (face font-lock-string-face) 20637 20646 (face font-lock-keyword-face) 20646 20647 (face font-lock-string-face) 20647 20663 nil 20663 20664 (face font-lock-string-face) 20664 20674 (face font-lock-keyword-face) 20674 20675 (face font-lock-string-face) 20675 20695 nil 20695 20696 (face font-lock-string-face) 20696 20715 (face font-lock-variable-name-face) 20715 20716 (face font-lock-string-face) 20716 20736 nil 20736 20748 (face font-lock-string-face) 20748 20770 nil 20770 20780 (face font-lock-string-face) 20780 20800 nil 20800 20807 (face font-lock-string-face) 20807 20827 nil 20827 20839 (face 
font-lock-string-face) 20839 20859 nil 20859 20867 (face font-lock-string-face) 20867 20923 nil 20923 20935 (face font-lock-string-face) 20935 20957 nil 20957 20972 (face font-lock-string-face) 20972 20992 nil 20992 20999 (face font-lock-string-face) 20999 21019 nil 21019 21026 (face font-lock-string-face) 21026 21046 nil 21046 21058 (face font-lock-string-face) 21058 21078 nil 21078 21086 (face font-lock-string-face) 21086 21180 nil 21180 21181 (face font-lock-string-face) 21181 21190 (face font-lock-keyword-face) 21190 21191 (face font-lock-string-face) 21191 21203 nil 21203 21204 (face font-lock-string-face) 21204 21220 (face font-lock-variable-name-face) 21220 21221 (face font-lock-string-face) 21221 21223 nil 21223 21224 (face font-lock-string-face) 21224 21256 (face font-lock-variable-name-face) 21256 21257 (face font-lock-string-face) 21257 21274 nil 21274 21314 (face font-lock-string-face) 21314 21325 nil 21325 21326 (face font-lock-string-face) 21326 21334 (face font-lock-keyword-face) 21334 21335 (face font-lock-string-face) 21335 21347 nil 21347 21348 (face font-lock-string-face) 21348 21385 (face font-lock-constant-face) 21385 21386 (face font-lock-string-face) 21386 21416 nil 21416 21417 (face font-lock-string-face) 21417 21428 (face font-lock-keyword-face) 21428 21429 (face font-lock-string-face) 21429 21431 nil 21431 21432 (face font-lock-string-face) 21432 21452 (face font-lock-function-name-face) 21452 21453 (face font-lock-string-face) 21453 21461 nil 21461 21462 (face font-lock-string-face) 21462 21466 (face font-lock-keyword-face) 21466 21467 (face font-lock-string-face) 21467 21469 nil 21469 21470 (face font-lock-string-face) 21470 21484 (face font-lock-type-face) 21484 21485 (face font-lock-string-face) 21485 21493 nil 21493 21494 (face font-lock-string-face) 21494 21506 (face font-lock-keyword-face) 21506 21507 (face font-lock-string-face) 21507 21519 nil 21519 21520 (face font-lock-string-face) 21520 21522 (face font-lock-constant-face) 
21522 21523 (face font-lock-string-face) 21523 21540 nil 21540 21541 (face font-lock-string-face) 21541 21548 (face font-lock-keyword-face) 21548 21549 (face font-lock-string-face) 21549 21561 nil 21561 21562 (face font-lock-string-face) 21562 21595 (face font-lock-constant-face) 21595 21596 (face font-lock-string-face) 21596 21606 nil 21606 21607 (face font-lock-string-face) 21607 21637 (face font-lock-constant-face) 21637 21638 (face font-lock-string-face) 21638 21648 nil 21648 21649 (face font-lock-string-face) 21649 21682 (face font-lock-constant-face) 21682 21683 (face font-lock-string-face) 21683 21693 nil 21693 21694 (face font-lock-string-face) 21694 21724 (face font-lock-constant-face) 21724 21725 (face font-lock-string-face) 21725 21735 nil 21735 21736 (face font-lock-string-face) 21736 21758 (face font-lock-constant-face) 21758 21759 (face font-lock-string-face) 21759 21769 nil 21769 21770 (face font-lock-string-face) 21770 21795 (face font-lock-constant-face) 21795 21796 (face font-lock-string-face) 21796 21806 nil 21806 21807 (face font-lock-string-face) 21807 21836 (face font-lock-constant-face) 21836 21837 (face font-lock-string-face) 21837 21847 nil 21847 21848 (face font-lock-string-face) 21848 21876 (face font-lock-constant-face) 21876 21877 (face font-lock-string-face) 21877 21907 nil 21907 21908 (face font-lock-string-face) 21908 21919 (face font-lock-keyword-face) 21919 21920 (face font-lock-string-face) 21920 21922 nil 21922 21923 (face font-lock-string-face) 21923 21938 (face font-lock-function-name-face) 21938 21939 (face font-lock-string-face) 21939 21947 nil 21947 21948 (face font-lock-string-face) 21948 21952 (face font-lock-keyword-face) 21952 21953 (face font-lock-string-face) 21953 21955 nil 21955 21956 (face font-lock-string-face) 21956 21966 (face font-lock-type-face) 21966 21967 (face font-lock-string-face) 21967 21975 nil 21975 21976 (face font-lock-string-face) 21976 21988 (face font-lock-keyword-face) 21988 21989 (face 
font-lock-string-face) 21989 22001 nil 22001 22002 (face font-lock-string-face) 22002 22007 (face font-lock-function-name-face) 22007 22008 (face font-lock-string-face) 22008 22018 nil 22018 22019 (face font-lock-string-face) 22019 22037 (face font-lock-function-name-face) 22037 22038 (face font-lock-string-face) 22038 22048 nil 22048 22049 (face font-lock-string-face) 22049 22060 (face font-lock-function-name-face) 22060 22061 (face font-lock-string-face) 22061 22071 nil 22071 22072 (face font-lock-string-face) 22072 22093 (face font-lock-function-name-face) 22093 22094 (face font-lock-string-face) 22094 22104 nil 22104 22105 (face font-lock-string-face) 22105 22131 (face font-lock-function-name-face) 22131 22132 (face font-lock-string-face) 22132 22142 nil 22142 22143 (face font-lock-string-face) 22143 22177 (face font-lock-function-name-face) 22177 22178 (face font-lock-string-face) 22178 22188 nil 22188 22189 (face font-lock-string-face) 22189 22215 (face font-lock-function-name-face) 22215 22216 (face font-lock-string-face) 22216 22226 nil 22226 22227 (face font-lock-string-face) 22227 22253 (face font-lock-function-name-face) 22253 22254 (face font-lock-string-face) 22254 22264 nil 22264 22265 (face font-lock-string-face) 22265 22280 (face font-lock-function-name-face) 22280 22281 (face font-lock-string-face) 22281 22298 nil 22298 22299 (face font-lock-string-face) 22299 22306 (face font-lock-keyword-face) 22306 22307 (face font-lock-string-face) 22307 22319 nil 22319 22320 (face font-lock-string-face) 22320 22361 (face font-lock-constant-face) 22361 22362 (face font-lock-string-face) 22362 22372 nil 22372 22373 (face font-lock-string-face) 22373 22413 (face font-lock-constant-face) 22413 22414 (face font-lock-string-face) 22414 22424 nil 22424 22425 (face font-lock-string-face) 22425 22461 (face font-lock-constant-face) 22461 22462 (face font-lock-string-face) 22462 22472 nil 22472 22473 (face font-lock-string-face) 22473 22502 (face font-lock-constant-face) 
22502 22503 (face font-lock-string-face) 22503 22513 nil 22513 22514 (face font-lock-string-face) 22514 22550 (face font-lock-constant-face) 22550 22551 (face font-lock-string-face) 22551 22561 nil 22561 22562 (face font-lock-string-face) 22562 22610 (face font-lock-constant-face) 22610 22611 (face font-lock-string-face) 22611 22621 nil 22621 22622 (face font-lock-string-face) 22622 22663 (face font-lock-constant-face) 22663 22664 (face font-lock-string-face) 22664 22674 nil 22674 22675 (face font-lock-string-face) 22675 22711 (face font-lock-constant-face) 22711 22712 (face font-lock-string-face) 22712 22722 nil 22722 22723 (face font-lock-string-face) 22723 22757 (face font-lock-constant-face) 22757 22758 (face font-lock-string-face) 22758 22768 nil 22768 22769 (face font-lock-string-face) 22769 22797 (face font-lock-constant-face) 22797 22798 (face font-lock-string-face) 22798 22808 nil 22808 22809 (face font-lock-string-face) 22809 22853 (face font-lock-constant-face) 22853 22854 (face font-lock-string-face) 22854 22864 nil 22864 22865 (face font-lock-string-face) 22865 22900 (face font-lock-constant-face) 22900 22901 (face font-lock-string-face) 22901 22911 nil 22911 22912 (face font-lock-string-face) 22912 22961 (face font-lock-constant-face) 22961 22962 (face font-lock-string-face) 22962 22972 nil 22972 22973 (face font-lock-string-face) 22973 23011 (face font-lock-constant-face) 23011 23012 (face font-lock-string-face) 23012 23022 nil 23022 23023 (face font-lock-string-face) 23023 23055 (face font-lock-constant-face) 23055 23056 (face font-lock-string-face) 23056 23066 nil 23066 23067 (face font-lock-string-face) 23067 23116 (face font-lock-constant-face) 23116 23117 (face font-lock-string-face) 23117 23127 nil 23127 23128 (face font-lock-string-face) 23128 23178 (face font-lock-constant-face) 23178 23179 (face font-lock-string-face) 23179 23189 nil 23189 23190 (face font-lock-string-face) 23190 23228 (face font-lock-constant-face) 23228 23229 (face 
font-lock-string-face) 23229 23239 nil 23239 23240 (face font-lock-string-face) 23240 23277 (face font-lock-constant-face) 23277 23278 (face font-lock-string-face) 23278 23288 nil 23288 23289 (face font-lock-string-face) 23289 23332 (face font-lock-constant-face) 23332 23333 (face font-lock-string-face) 23333 23343 nil 23343 23344 (face font-lock-string-face) 23344 23368 (face font-lock-constant-face) 23368 23369 (face font-lock-string-face) 23369 23379 nil 23379 23380 (face font-lock-string-face) 23380 23402 (face font-lock-constant-face) 23402 23403 (face font-lock-string-face) 23403 23413 nil 23413 23414 (face font-lock-string-face) 23414 23447 (face font-lock-constant-face) 23447 23448 (face font-lock-string-face) 23448 23458 nil 23458 23459 (face font-lock-string-face) 23459 23487 (face font-lock-constant-face) 23487 23488 (face font-lock-string-face) 23488 23498 nil 23498 23499 (face font-lock-string-face) 23499 23530 (face font-lock-constant-face) 23530 23531 (face font-lock-string-face) 23531 23541 nil 23541 23542 (face font-lock-string-face) 23542 23563 (face font-lock-constant-face) 23563 23564 (face font-lock-string-face) 23564 23574 nil 23574 23575 (face font-lock-string-face) 23575 23609 (face font-lock-constant-face) 23609 23610 (face font-lock-string-face) 23610 23620 nil 23620 23621 (face font-lock-string-face) 23621 23654 (face font-lock-constant-face) 23654 23655 (face font-lock-string-face) 23655 23665 nil 23665 23666 (face font-lock-string-face) 23666 23700 (face font-lock-constant-face) 23700 23701 (face font-lock-string-face) 23701 23711 nil 23711 23712 (face font-lock-string-face) 23712 23753 (face font-lock-constant-face) 23753 23754 (face font-lock-string-face) 23754 23764 nil 23764 23765 (face font-lock-string-face) 23765 23790 (face font-lock-constant-face) 23790 23791 (face font-lock-string-face) 23791 23801 nil 23801 23802 (face font-lock-string-face) 23802 23825 (face font-lock-constant-face) 23825 23826 (face font-lock-string-face) 
23826 23836 nil 23836 23837 (face font-lock-string-face) 23837 23862 (face font-lock-constant-face) 23862 23863 (face font-lock-string-face) 23863 23873 nil 23873 23874 (face font-lock-string-face) 23874 23906 (face font-lock-constant-face) 23906 23907 (face font-lock-string-face) 23907 23917 nil 23917 23918 (face font-lock-string-face) 23918 23947 (face font-lock-constant-face) 23947 23948 (face font-lock-string-face) 23948 23958 nil 23958 23959 (face font-lock-string-face) 23959 23981 (face font-lock-constant-face) 23981 23982 (face font-lock-string-face) 23982 23992 nil 23992 23993 (face font-lock-string-face) 23993 24014 (face font-lock-constant-face) 24014 24015 (face font-lock-string-face) 24015 24025 nil 24025 24026 (face font-lock-string-face) 24026 24054 (face font-lock-constant-face) 24054 24055 (face font-lock-string-face) 24055 24065 nil 24065 24066 (face font-lock-string-face) 24066 24093 (face font-lock-constant-face) 24093 24094 (face font-lock-string-face) 24094 24104 nil 24104 24105 (face font-lock-string-face) 24105 24133 (face font-lock-constant-face) 24133 24134 (face font-lock-string-face) 24134 24144 nil 24144 24145 (face font-lock-string-face) 24145 24177 (face font-lock-constant-face) 24177 24178 (face font-lock-string-face) 24178 24188 nil 24188 24189 (face font-lock-string-face) 24189 24221 (face font-lock-constant-face) 24221 24222 (face font-lock-string-face) 24222 24232 nil 24232 24233 (face font-lock-string-face) 24233 24277 (face font-lock-constant-face) 24277 24278 (face font-lock-string-face) 24278 24288 nil 24288 24289 (face font-lock-string-face) 24289 24328 (face font-lock-constant-face) 24328 24329 (face font-lock-string-face) 24329 24339 nil 24339 24340 (face font-lock-string-face) 24340 24379 (face font-lock-constant-face) 24379 24380 (face font-lock-string-face) 24380 24390 nil 24390 24391 (face font-lock-string-face) 24391 24424 (face font-lock-constant-face) 24424 24425 (face font-lock-string-face) 24425 24435 nil 24435 
24436 (face font-lock-string-face) 24436 24476 (face font-lock-constant-face) 24476 24477 (face font-lock-string-face) 24477 24487 nil 24487 24488 (face font-lock-string-face) 24488 24521 (face font-lock-constant-face) 24521 24522 (face font-lock-string-face) 24522 24532 nil 24532 24533 (face font-lock-string-face) 24533 24567 (face font-lock-constant-face) 24567 24568 (face font-lock-string-face) 24568 24578 nil 24578 24579 (face font-lock-string-face) 24579 24610 (face font-lock-constant-face) 24610 24611 (face font-lock-string-face) 24611 24621 nil 24621 24622 (face font-lock-string-face) 24622 24673 (face font-lock-constant-face) 24673 24674 (face font-lock-string-face) 24674 24684 nil 24684 24685 (face font-lock-string-face) 24685 24725 (face font-lock-constant-face) 24725 24726 (face font-lock-string-face) 24726 24736 nil 24736 24737 (face font-lock-string-face) 24737 24773 (face font-lock-constant-face) 24773 24774 (face font-lock-string-face) 24774 24784 nil 24784 24785 (face font-lock-string-face) 24785 24821 (face font-lock-constant-face) 24821 24822 (face font-lock-string-face) 24822 24832 nil 24832 24833 (face font-lock-string-face) 24833 24874 (face font-lock-constant-face) 24874 24875 (face font-lock-string-face) 24875 24885 nil 24885 24886 (face font-lock-string-face) 24886 24926 (face font-lock-constant-face) 24926 24927 (face font-lock-string-face) 24927 24937 nil 24937 24938 (face font-lock-string-face) 24938 24977 (face font-lock-constant-face) 24977 24978 (face font-lock-string-face) 24978 24988 nil 24988 24989 (face font-lock-string-face) 24989 25035 (face font-lock-constant-face) 25035 25036 (face font-lock-string-face) 25036 25046 nil 25046 25047 (face font-lock-string-face) 25047 25070 (face font-lock-constant-face) 25070 25071 (face font-lock-string-face) 25071 25081 nil 25081 25082 (face font-lock-string-face) 25082 25104 (face font-lock-constant-face) 25104 25105 (face font-lock-string-face) 25105 25115 nil 25115 25116 (face 
font-lock-string-face) 25116 25152 (face font-lock-constant-face) 25152 25153 (face font-lock-string-face) 25153 25163 nil 25163 25164 (face font-lock-string-face) 25164 25210 (face font-lock-constant-face) 25210 25211 (face font-lock-string-face) 25211 25221 nil 25221 25222 (face font-lock-string-face) 25222 25250 (face font-lock-constant-face) 25250 25251 (face font-lock-string-face) 25251 25268 nil 25268 25269 (face font-lock-string-face) 25269 25279 (face font-lock-keyword-face) 25279 25280 (face font-lock-string-face) 25280 25293 nil 25293 25294 (face font-lock-string-face) 25294 25319 (face font-lock-variable-name-face) 25319 25320 (face font-lock-string-face) 25320 25334 nil 25334 25335 (face font-lock-string-face) 25335 25345 (face font-lock-keyword-face) 25345 25346 (face font-lock-string-face) 25346 25363 nil 25363 25364 (face font-lock-string-face) 25364 25385 (face font-lock-variable-name-face) 25385 25386 (face font-lock-string-face) 25386 25404 nil 25404 25405 (face font-lock-string-face) 25405 25417 (face font-lock-keyword-face) 25417 25418 (face font-lock-string-face) 25418 25438 nil 25438 25439 (face font-lock-string-face) 25439 25480 (face font-lock-function-name-face) 25480 25481 (face font-lock-string-face) 25481 25550 nil 25550 25551 (face font-lock-string-face) 25551 25566 (face font-lock-variable-name-face) 25566 25567 (face font-lock-string-face) 25567 25581 nil 25581 25582 (face font-lock-string-face) 25582 25594 (face font-lock-keyword-face) 25594 25595 (face font-lock-string-face) 25595 25611 nil 25611 25612 (face font-lock-string-face) 25612 25651 (face font-lock-function-name-face) 25651 25652 (face font-lock-string-face) 25652 25688 nil 25688 25689 (face font-lock-string-face) 25689 25704 (face font-lock-variable-name-face) 25704 25705 (face font-lock-string-face) 25705 25719 nil 25719 25720 (face font-lock-string-face) 25720 25728 (face font-lock-keyword-face) 25728 25729 (face font-lock-string-face) 25729 25745 nil 25745 25746 (face 
font-lock-string-face) 25746 25782 (face font-lock-constant-face) 25782 25783 (face font-lock-string-face) 25783 25797 nil 25797 25798 (face font-lock-string-face) 25798 25820 (face font-lock-constant-face) 25820 25821 (face font-lock-string-face) 25821 25835 nil 25835 25836 (face font-lock-string-face) 25836 25857 (face font-lock-constant-face) 25857 25858 (face font-lock-string-face) 25858 25872 nil 25872 25873 (face font-lock-string-face) 25873 25905 (face font-lock-constant-face) 25905 25906 (face font-lock-string-face) 25906 25920 nil 25920 25921 (face font-lock-string-face) 25921 25961 (face font-lock-constant-face) 25961 25962 (face font-lock-string-face) 25962 25976 nil 25976 25977 (face font-lock-string-face) 25977 26016 (face font-lock-constant-face) 26016 26017 (face font-lock-string-face) 26017 26031 nil 26031 26032 (face font-lock-string-face) 26032 26065 (face font-lock-constant-face) 26065 26066 (face font-lock-string-face) 26066 26080 nil 26080 26081 (face font-lock-string-face) 26081 26115 (face font-lock-constant-face) 26115 26116 (face font-lock-string-face) 26116 26130 nil 26130 26131 (face font-lock-string-face) 26131 26162 (face font-lock-constant-face) 26162 26163 (face font-lock-string-face) 26163 26177 nil 26177 26178 (face font-lock-string-face) 26178 26229 (face font-lock-constant-face) 26229 26230 (face font-lock-string-face) 26230 26244 nil 26244 26245 (face font-lock-string-face) 26245 26285 (face font-lock-constant-face) 26285 26286 (face font-lock-string-face) 26286 26300 nil 26300 26301 (face font-lock-string-face) 26301 26337 (face font-lock-constant-face) 26337 26338 (face font-lock-string-face) 26338 26352 nil 26352 26353 (face font-lock-string-face) 26353 26394 (face font-lock-constant-face) 26394 26395 (face font-lock-string-face) 26395 26409 nil 26409 26410 (face font-lock-string-face) 26410 26443 (face font-lock-constant-face) 26443 26444 (face font-lock-string-face) 26444 26458 nil 26458 26459 (face font-lock-string-face) 
26459 26495 (face font-lock-constant-face) 26495 26496 (face font-lock-string-face) 26496 26532 nil 26532 26533 (face font-lock-string-face) 26533 26546 (face font-lock-variable-name-face) 26546 26547 (face font-lock-string-face) 26547 26561 nil 26561 26562 (face font-lock-string-face) 26562 26572 (face font-lock-keyword-face) 26572 26573 (face font-lock-string-face) 26573 26590 nil 26590 26591 (face font-lock-string-face) 26591 26604 (face font-lock-variable-name-face) 26604 26605 (face font-lock-string-face) 26605 26623 nil 26623 26624 (face font-lock-string-face) 26624 26631 (face font-lock-keyword-face) 26631 26632 (face font-lock-string-face) 26632 26652 nil 26652 26653 (face font-lock-string-face) 26653 26688 (face font-lock-constant-face) 26688 26689 (face font-lock-string-face) 26689 26722 nil 26722 26723 (face font-lock-string-face) 26723 26730 (face font-lock-keyword-face) 26730 26731 (face font-lock-string-face) 26731 26751 nil 26751 26752 (face font-lock-string-face) 26752 26760 (face font-lock-preprocessor-face) 26760 26761 (face font-lock-string-face) 26761 26831 nil 26831 26832 (face font-lock-string-face) 26832 26873 (face font-lock-variable-name-face) 26873 26874 (face font-lock-string-face) 26874 26888 nil 26888 26889 (face font-lock-string-face) 26889 26896 (face font-lock-keyword-face) 26896 26897 (face font-lock-string-face) 26897 26913 nil 26913 26914 (face font-lock-string-face) 26914 26954 (face font-lock-constant-face) 26954 26955 (face font-lock-string-face) 26955 26991 nil 26991 26992 (face font-lock-string-face) 26992 27035 (face font-lock-variable-name-face) 27035 27036 (face font-lock-string-face) 27036 27050 nil 27050 27051 (face font-lock-string-face) 27051 27058 (face font-lock-keyword-face) 27058 27059 (face font-lock-string-face) 27059 27075 nil 27075 27076 (face font-lock-string-face) 27076 27095 (face font-lock-constant-face) 27095 27096 (face font-lock-string-face) 27096 27110 nil 27110 27111 (face font-lock-string-face) 27111 
27137 (face font-lock-constant-face) 27137 27138 (face font-lock-string-face) 27138 27152 nil 27152 27153 (face font-lock-string-face) 27153 27186 (face font-lock-constant-face) 27186 27187 (face font-lock-string-face) 27187 27201 nil 27201 27202 (face font-lock-string-face) 27202 27235 (face font-lock-constant-face) 27235 27236 (face font-lock-string-face) 27236 27291 nil 27291 27292 (face font-lock-string-face) 27292 27303 (face font-lock-keyword-face) 27303 27304 (face font-lock-string-face) 27304 27306 nil 27306 27307 (face font-lock-string-face) 27307 27325 (face font-lock-function-name-face) 27325 27326 (face font-lock-string-face) 27326 27334 nil 27334 27335 (face font-lock-string-face) 27335 27339 (face font-lock-keyword-face) 27339 27340 (face font-lock-string-face) 27340 27342 nil 27342 27343 (face font-lock-string-face) 27343 27357 (face font-lock-type-face) 27357 27358 (face font-lock-string-face) 27358 27366 nil 27366 27367 (face font-lock-string-face) 27367 27379 (face font-lock-keyword-face) 27379 27380 (face font-lock-string-face) 27380 27392 nil 27392 27393 (face font-lock-string-face) 27393 27398 (face font-lock-function-name-face) 27398 27399 (face font-lock-string-face) 27399 27409 nil 27409 27410 (face font-lock-string-face) 27410 27431 (face font-lock-function-name-face) 27431 27432 (face font-lock-string-face) 27432 27442 nil 27442 27443 (face font-lock-string-face) 27443 27469 (face font-lock-function-name-face) 27469 27470 (face font-lock-string-face) 27470 27480 nil 27480 27481 (face font-lock-string-face) 27481 27507 (face font-lock-function-name-face) 27507 27508 (face font-lock-string-face) 27508 27525 nil 27525 27526 (face font-lock-string-face) 27526 27533 (face font-lock-keyword-face) 27533 27534 (face font-lock-string-face) 27534 27546 nil 27546 27547 (face font-lock-string-face) 27547 27591 (face font-lock-constant-face) 27591 27592 (face font-lock-string-face) 27592 27602 nil 27602 27603 (face font-lock-string-face) 27603 27646 
(face font-lock-constant-face) 27646 27647 (face font-lock-string-face) 27647 27657 nil 27657 27658 (face font-lock-string-face) 27658 27679 (face font-lock-constant-face) 27679 27680 (face font-lock-string-face) 27680 27690 nil 27690 27691 (face font-lock-string-face) 27691 27711 (face font-lock-constant-face) 27711 27712 (face font-lock-string-face) 27712 27722 nil 27722 27723 (face font-lock-string-face) 27723 27752 (face font-lock-constant-face) 27752 27753 (face font-lock-string-face) 27753 27763 nil 27763 27764 (face font-lock-string-face) 27764 27792 (face font-lock-constant-face) 27792 27793 (face font-lock-string-face) 27793 27803 nil 27803 27804 (face font-lock-string-face) 27804 27829 (face font-lock-constant-face) 27829 27830 (face font-lock-string-face) 27830 27840 nil 27840 27841 (face font-lock-string-face) 27841 27865 (face font-lock-constant-face) 27865 27866 (face font-lock-string-face) 27866 27876 nil 27876 27877 (face font-lock-string-face) 27877 27901 (face font-lock-constant-face) 27901 27902 (face font-lock-string-face) 27902 27912 nil 27912 27913 (face font-lock-string-face) 27913 27936 (face font-lock-constant-face) 27936 27937 (face font-lock-string-face) 27937 27947 nil 27947 27948 (face font-lock-string-face) 27948 27968 (face font-lock-constant-face) 27968 27969 (face font-lock-string-face) 27969 27979 nil 27979 27980 (face font-lock-string-face) 27980 27999 (face font-lock-constant-face) 27999 28000 (face font-lock-string-face) 28000 28030 nil 28030 28031 (face font-lock-string-face) 28031 28042 (face font-lock-keyword-face) 28042 28043 (face font-lock-string-face) 28043 28045 nil 28045 28046 (face font-lock-string-face) 28046 28058 (face font-lock-function-name-face) 28058 28059 (face font-lock-string-face) 28059 28067 nil 28067 28068 (face font-lock-string-face) 28068 28072 (face font-lock-keyword-face) 28072 28073 (face font-lock-string-face) 28073 28075 nil 28075 28076 (face font-lock-string-face) 28076 28086 (face 
font-lock-type-face) 28086 28087 (face font-lock-string-face) 28087 28095 nil 28095 28096 (face font-lock-string-face) 28096 28108 (face font-lock-keyword-face) 28108 28109 (face font-lock-string-face) 28109 28121 nil 28121 28122 (face font-lock-string-face) 28122 28127 (face font-lock-function-name-face) 28127 28128 (face font-lock-string-face) 28128 28138 nil 28138 28139 (face font-lock-string-face) 28139 28150 (face font-lock-function-name-face) 28150 28151 (face font-lock-string-face) 28151 28161 nil 28161 28162 (face font-lock-string-face) 28162 28183 (face font-lock-function-name-face) 28183 28184 (face font-lock-string-face) 28184 28194 nil 28194 28195 (face font-lock-string-face) 28195 28216 (face font-lock-function-name-face) 28216 28217 (face font-lock-string-face) 28217 28234 nil 28234 28235 (face font-lock-string-face) 28235 28242 (face font-lock-keyword-face) 28242 28243 (face font-lock-string-face) 28243 28255 nil 28255 28256 (face font-lock-string-face) 28256 28290 (face font-lock-constant-face) 28290 28291 (face font-lock-string-face) 28291 28321 nil 28321 28322 (face font-lock-string-face) 28322 28333 (face font-lock-keyword-face) 28333 28334 (face font-lock-string-face) 28334 28336 nil 28336 28337 (face font-lock-string-face) 28337 28349 (face font-lock-function-name-face) 28349 28350 (face font-lock-string-face) 28350 28358 nil 28358 28359 (face font-lock-string-face) 28359 28363 (face font-lock-keyword-face) 28363 28364 (face font-lock-string-face) 28364 28366 nil 28366 28367 (face font-lock-string-face) 28367 28377 (face font-lock-type-face) 28377 28378 (face font-lock-string-face) 28378 28386 nil 28386 28387 (face font-lock-string-face) 28387 28394 (face font-lock-keyword-face) 28394 28395 (face font-lock-string-face) 28395 28407 nil 28407 28408 (face font-lock-string-face) 28408 28441 (face font-lock-constant-face) 28441 28442 (face font-lock-string-face) 28442 28471 nil 28471 28472 (face font-lock-string-face) 28472 28483 (face 
font-lock-keyword-face) 28483 28484 (face font-lock-string-face) 28484 28486 nil 28486 28487 (face font-lock-string-face) 28487 28498 (face font-lock-function-name-face) 28498 28499 (face font-lock-string-face) 28499 28507 nil 28507 28508 (face font-lock-string-face) 28508 28512 (face font-lock-keyword-face) 28512 28513 (face font-lock-string-face) 28513 28515 nil 28515 28516 (face font-lock-string-face) 28516 28526 (face font-lock-type-face) 28526 28527 (face font-lock-string-face) 28527 28535 nil 28535 28536 (face font-lock-string-face) 28536 28548 (face font-lock-keyword-face) 28548 28549 (face font-lock-string-face) 28549 28561 nil 28561 28562 (face font-lock-string-face) 28562 28567 (face font-lock-function-name-face) 28567 28568 (face font-lock-string-face) 28568 28578 nil 28578 28579 (face font-lock-string-face) 28579 28600 (face font-lock-function-name-face) 28600 28601 (face font-lock-string-face) 28601 28618 nil 28618 28619 (face font-lock-string-face) 28619 28626 (face font-lock-keyword-face) 28626 28627 (face font-lock-string-face) 28627 28639 nil 28639 28640 (face font-lock-string-face) 28640 28672 (face font-lock-constant-face) 28672 28673 (face font-lock-string-face) 28673 28698 nil 28698 28699 (face font-lock-string-face) 28699 28709 (face font-lock-keyword-face) 28709 28710 (face font-lock-string-face) 28710 28719 nil 28719 28720 (face font-lock-string-face) 28720 28729 (face font-lock-variable-name-face) 28729 28730 (face font-lock-string-face) 28730 28740 nil 28740 28741 (face font-lock-string-face) 28741 28748 (face font-lock-keyword-face) 28748 28749 (face font-lock-string-face) 28749 28773 nil 28773 28774 (face font-lock-string-face) 28774 28785 (face font-lock-keyword-face) 28785 28786 (face font-lock-string-face) 28786 28788 nil 28788 28789 (face font-lock-string-face) 28789 28799 (face font-lock-function-name-face) 28799 28800 (face font-lock-string-face) 28800 28812 nil 28812 28813 (face font-lock-string-face) 28813 28817 (face 
font-lock-keyword-face) 28817 28818 (face font-lock-string-face) 28818 28820 nil 28820 28821 (face font-lock-string-face) 28821 28831 (face font-lock-type-face) 28831 28832 (face font-lock-string-face) 28832 28844 nil 28844 28845 (face font-lock-string-face) 28845 28857 (face font-lock-keyword-face) 28857 28858 (face font-lock-string-face) 28858 28874 nil 28874 28875 (face font-lock-string-face) 28875 28880 (face font-lock-function-name-face) 28880 28881 (face font-lock-string-face) 28881 28895 nil 28895 28896 (face font-lock-string-face) 28896 28907 (face font-lock-function-name-face) 28907 28908 (face font-lock-string-face) 28908 28922 nil 28922 28923 (face font-lock-string-face) 28923 28944 (face font-lock-function-name-face) 28944 28945 (face font-lock-string-face) 28945 28959 nil 28959 28960 (face font-lock-string-face) 28960 29043 (face font-lock-function-name-face) 29043 29044 (face font-lock-string-face) 29044 29058 nil 29058 29059 (face font-lock-string-face) 29059 29074 (face font-lock-function-name-face) 29074 29075 (face font-lock-string-face) 29075 29100 nil 29100 29101 (face font-lock-string-face) 29101 29113 (face font-lock-keyword-face) 29113 29114 (face font-lock-string-face) 29114 29130 nil 29130 29131 (face font-lock-string-face) 29131 29133 (face font-lock-constant-face) 29133 29138 (face font-lock-variable-name-face) 29138 29163 (face font-lock-constant-face) 29163 29164 (face font-lock-string-face) 29164 29189 nil 29189 29190 (face font-lock-string-face) 29190 29197 (face font-lock-keyword-face) 29197 29198 (face font-lock-string-face) 29198 29214 nil 29214 29215 (face font-lock-string-face) 29215 29238 (face font-lock-constant-face) 29238 29239 (face font-lock-string-face) 29239 29253 nil 29253 29254 (face font-lock-string-face) 29254 29280 (face font-lock-constant-face) 29280 29281 (face font-lock-string-face) 29281 29295 nil 29295 29296 (face font-lock-string-face) 29296 29321 (face font-lock-constant-face) 29321 29322 (face 
font-lock-string-face) 29322 29336 nil 29336 29337 (face font-lock-string-face) 29337 29361 (face font-lock-constant-face) 29361 29362 (face font-lock-string-face) 29362 29376 nil 29376 29377 (face font-lock-string-face) 29377 29407 (face font-lock-constant-face) 29407 29408 (face font-lock-string-face) 29408 29422 nil 29422 29423 (face font-lock-string-face) 29423 29453 (face font-lock-constant-face) 29453 29454 (face font-lock-string-face) 29454 29468 nil 29468 29469 (face font-lock-string-face) 29469 29493 (face font-lock-constant-face) 29493 29494 (face font-lock-string-face) 29494 29508 nil 29508 29509 (face font-lock-string-face) 29509 29532 (face font-lock-constant-face) 29532 29533 (face font-lock-string-face) 29533 29547 nil 29547 29548 (face font-lock-string-face) 29548 29575 (face font-lock-constant-face) 29575 29576 (face font-lock-string-face) 29576 29590 nil 29590 29591 (face font-lock-string-face) 29591 29614 (face font-lock-constant-face) 29614 29615 (face font-lock-string-face) 29615 29640 nil 29640 29655 (face font-lock-string-face) 29655 29671 nil 29671 29685 (face font-lock-string-face) 29685 29703 nil 29703 29714 (face font-lock-string-face) 29714 29716 nil 29716 29719 (face font-lock-string-face) 29719 29729 nil 29729 29754 (face font-lock-comment-face) 29754 29792 nil 29792 29793 (face font-lock-string-face) 29793 29800 (face font-lock-keyword-face) 29800 29801 (face font-lock-string-face) 29801 29817 nil 29817 29818 (face font-lock-string-face) 29818 29843 (face font-lock-preprocessor-face) 29843 29844 (face font-lock-string-face) 29844 29892 nil 29892 29893 (face font-lock-string-face) 29893 29929 (face font-lock-variable-name-face) 29929 29930 (face font-lock-string-face) 29930 29940 nil 29940 29941 (face font-lock-string-face) 29941 29948 (face font-lock-keyword-face) 29948 29949 (face font-lock-string-face) 29949 29973 nil 29973 29974 (face font-lock-string-face) 29974 29985 (face font-lock-keyword-face) 29985 29986 (face 
font-lock-string-face) 29986 29988 nil 29988 29989 (face font-lock-string-face) 29989 30001 (face font-lock-function-name-face) 30001 30002 (face font-lock-string-face) 30002 30014 nil 30014 30015 (face font-lock-string-face) 30015 30019 (face font-lock-keyword-face) 30019 30020 (face font-lock-string-face) 30020 30022 nil 30022 30023 (face font-lock-string-face) 30023 30033 (face font-lock-type-face) 30033 30034 (face font-lock-string-face) 30034 30046 nil 30046 30047 (face font-lock-string-face) 30047 30059 (face font-lock-keyword-face) 30059 30060 (face font-lock-string-face) 30060 30076 nil 30076 30077 (face font-lock-string-face) 30077 30082 (face font-lock-function-name-face) 30082 30083 (face font-lock-string-face) 30083 30097 nil 30097 30098 (face font-lock-string-face) 30098 30109 (face font-lock-function-name-face) 30109 30110 (face font-lock-string-face) 30110 30124 nil 30124 30125 (face font-lock-string-face) 30125 30146 (face font-lock-function-name-face) 30146 30147 (face font-lock-string-face) 30147 30161 nil 30161 30162 (face font-lock-string-face) 30162 30180 (face font-lock-function-name-face) 30180 30181 (face font-lock-string-face) 30181 30206 nil 30206 30207 (face font-lock-string-face) 30207 30214 (face font-lock-keyword-face) 30214 30215 (face font-lock-string-face) 30215 30231 nil 30231 30232 (face font-lock-string-face) 30232 30266 (face font-lock-constant-face) 30266 30267 (face font-lock-string-face) 30267 30281 nil 30281 30282 (face font-lock-string-face) 30282 30321 (face font-lock-constant-face) 30321 30322 (face font-lock-string-face) 30322 30336 nil 30336 30337 (face font-lock-string-face) 30337 30375 (face font-lock-constant-face) 30375 30376 (face font-lock-string-face) 30376 30390 nil 30390 30391 (face font-lock-string-face) 30391 30430 (face font-lock-constant-face) 30430 30431 (face font-lock-string-face) 30431 30445 nil 30445 30446 (face font-lock-string-face) 30446 30484 (face font-lock-constant-face) 30484 30485 (face 
font-lock-string-face) 30485 30499 nil 30499 30500 (face font-lock-string-face) 30500 30533 (face font-lock-constant-face) 30533 30534 (face font-lock-string-face) 30534 30548 nil 30548 30549 (face font-lock-string-face) 30549 30581 (face font-lock-constant-face) 30581 30582 (face font-lock-string-face) 30582 30596 nil 30596 30597 (face font-lock-string-face) 30597 30626 (face font-lock-constant-face) 30626 30627 (face font-lock-string-face) 30627 30641 nil 30641 30642 (face font-lock-string-face) 30642 30670 (face font-lock-constant-face) 30670 30671 (face font-lock-string-face) 30671 30685 nil 30685 30686 (face font-lock-string-face) 30686 30714 (face font-lock-constant-face) 30714 30715 (face font-lock-string-face) 30715 30729 nil 30729 30730 (face font-lock-string-face) 30730 30757 (face font-lock-constant-face) 30757 30758 (face font-lock-string-face) 30758 30783 nil 30783 30784 (face font-lock-string-face) 30784 30794 (face font-lock-keyword-face) 30794 30795 (face font-lock-string-face) 30795 30812 nil 30812 30813 (face font-lock-string-face) 30813 30834 (face font-lock-variable-name-face) 30834 30835 (face font-lock-string-face) 30835 30853 nil 30853 30854 (face font-lock-string-face) 30854 30866 (face font-lock-keyword-face) 30866 30867 (face font-lock-string-face) 30867 30887 nil 30887 30888 (face font-lock-string-face) 30888 30917 (face font-lock-function-name-face) 30917 30918 (face font-lock-string-face) 30918 30951 nil 30951 30952 (face font-lock-string-face) 30952 30959 (face font-lock-keyword-face) 30959 30960 (face font-lock-string-face) 30960 30980 nil 30980 30981 (face font-lock-string-face) 30981 31015 (face font-lock-constant-face) 31015 31016 (face font-lock-string-face) 31016 31064 nil 31064 31065 (face font-lock-string-face) 31065 31074 (face font-lock-variable-name-face) 31074 31075 (face font-lock-string-face) 31075 31093 nil 31093 31094 (face font-lock-string-face) 31094 31106 (face font-lock-keyword-face) 31106 31107 (face 
font-lock-string-face) 31107 31127 nil 31127 31128 (face font-lock-string-face) 31128 31175 (face font-lock-function-name-face) 31175 31176 (face font-lock-string-face) 31176 31194 nil 31194 31195 (face font-lock-string-face) 31195 31245 (face font-lock-function-name-face) 31245 31246 (face font-lock-string-face) 31246 31279 nil 31279 31280 (face font-lock-string-face) 31280 31287 (face font-lock-keyword-face) 31287 31288 (face font-lock-string-face) 31288 31308 nil 31308 31309 (face font-lock-string-face) 31309 31341 (face font-lock-constant-face) 31341 31342 (face font-lock-string-face) 31342 31423 nil 31423 31424 (face font-lock-string-face) 31424 31462 (face font-lock-variable-name-face) 31462 31463 (face font-lock-string-face) 31463 31473 nil 31473 31474 (face font-lock-string-face) 31474 31481 (face font-lock-keyword-face) 31481 31482 (face font-lock-string-face) 31482 31506 nil 31506 31507 (face font-lock-string-face) 31507 31518 (face font-lock-keyword-face) 31518 31519 (face font-lock-string-face) 31519 31521 nil 31521 31522 (face font-lock-string-face) 31522 31539 (face font-lock-function-name-face) 31539 31540 (face font-lock-string-face) 31540 31552 nil 31552 31553 (face font-lock-string-face) 31553 31557 (face font-lock-keyword-face) 31557 31558 (face font-lock-string-face) 31558 31560 nil 31560 31561 (face font-lock-string-face) 31561 31571 (face font-lock-type-face) 31571 31572 (face font-lock-string-face) 31572 31584 nil 31584 31585 (face font-lock-string-face) 31585 31597 (face font-lock-keyword-face) 31597 31598 (face font-lock-string-face) 31598 31614 nil 31614 31615 (face font-lock-string-face) 31615 31636 (face font-lock-function-name-face) 31636 31637 (face font-lock-string-face) 31637 31651 nil 31651 31652 (face font-lock-string-face) 31652 31670 (face font-lock-function-name-face) 31670 31671 (face font-lock-string-face) 31671 31696 nil 31696 31697 (face font-lock-string-face) 31697 31706 (face font-lock-keyword-face) 31706 31707 (face 
font-lock-string-face) 31707 31723 nil 31723 31724 (face font-lock-string-face) 31724 31728 (face font-lock-constant-face) 31728 31729 (face font-lock-string-face) 31729 31743 nil 31743 31744 (face font-lock-string-face) 31744 31748 (face font-lock-constant-face) 31748 31749 (face font-lock-string-face) 31749 31774 nil 31774 31775 (face font-lock-string-face) 31775 31782 (face font-lock-keyword-face) 31782 31783 (face font-lock-string-face) 31783 31799 nil 31799 31800 (face font-lock-string-face) 31800 31844 (face font-lock-constant-face) 31844 31845 (face font-lock-string-face) 31845 31893 nil 31893 31894 (face font-lock-string-face) 31894 31943 (face font-lock-variable-name-face) 31943 31944 (face font-lock-string-face) 31944 31954 nil 31954 31955 (face font-lock-string-face) 31955 31962 (face font-lock-keyword-face) 31962 31963 (face font-lock-string-face) 31963 31987 nil 31987 31988 (face font-lock-string-face) 31988 31999 (face font-lock-keyword-face) 31999 32000 (face font-lock-string-face) 32000 32002 nil 32002 32003 (face font-lock-string-face) 32003 32013 (face font-lock-function-name-face) 32013 32014 (face font-lock-string-face) 32014 32026 nil 32026 32027 (face font-lock-string-face) 32027 32031 (face font-lock-keyword-face) 32031 32032 (face font-lock-string-face) 32032 32034 nil 32034 32035 (face font-lock-string-face) 32035 32045 (face font-lock-type-face) 32045 32046 (face font-lock-string-face) 32046 32058 nil 32058 32059 (face font-lock-string-face) 32059 32071 (face font-lock-keyword-face) 32071 32072 (face font-lock-string-face) 32072 32088 nil 32088 32089 (face font-lock-string-face) 32089 32094 (face font-lock-function-name-face) 32094 32095 (face font-lock-string-face) 32095 32109 nil 32109 32110 (face font-lock-string-face) 32110 32121 (face font-lock-function-name-face) 32121 32122 (face font-lock-string-face) 32122 32136 nil 32136 32137 (face font-lock-string-face) 32137 32158 (face font-lock-function-name-face) 32158 32159 (face 
font-lock-string-face) 32159 32173 nil 32173 32174 (face font-lock-string-face) 32174 32192 (face font-lock-function-name-face) 32192 32193 (face font-lock-string-face) 32193 32218 nil 32218 32219 (face font-lock-string-face) 32219 32232 (face font-lock-keyword-face) 32232 32233 (face font-lock-string-face) 32233 32249 nil 32249 32250 (face font-lock-string-face) 32250 32259 (face font-lock-keyword-face) 32259 32260 (face font-lock-string-face) 32260 32278 nil 32278 32279 (face font-lock-string-face) 32279 32283 (face font-lock-constant-face) 32283 32284 (face font-lock-string-face) 32284 32300 nil 32300 32301 (face font-lock-string-face) 32301 32306 (face font-lock-constant-face) 32306 32307 (face font-lock-string-face) 32307 32323 nil 32323 32324 (face font-lock-string-face) 32324 32333 (face font-lock-constant-face) 32333 32334 (face font-lock-string-face) 32334 32350 nil 32350 32351 (face font-lock-string-face) 32351 32357 (face font-lock-constant-face) 32357 32358 (face font-lock-string-face) 32358 32398 nil 32398 32399 (face font-lock-string-face) 32399 32406 (face font-lock-keyword-face) 32406 32407 (face font-lock-string-face) 32407 32423 nil 32423 32424 (face font-lock-string-face) 32424 32462 (face font-lock-constant-face) 32462 32463 (face font-lock-string-face) 32463 32477 nil 32477 32478 (face font-lock-string-face) 32478 32515 (face font-lock-constant-face) 32515 32516 (face font-lock-string-face) 32516 32530 nil 32530 32531 (face font-lock-string-face) 32531 32568 (face font-lock-constant-face) 32568 32569 (face font-lock-string-face) 32569 32583 nil 32583 32584 (face font-lock-string-face) 32584 32620 (face font-lock-constant-face) 32620 32621 (face font-lock-string-face) 32621 32635 nil 32635 32636 (face font-lock-string-face) 32636 32666 (face font-lock-constant-face) 32666 32667 (face font-lock-string-face) 32667 32681 nil 32681 32682 (face font-lock-string-face) 32682 32720 (face font-lock-constant-face) 32720 32721 (face font-lock-string-face) 
32721 32735 nil 32735 32736 (face font-lock-string-face) 32736 32773 (face font-lock-constant-face) 32773 32774 (face font-lock-string-face) 32774 32822 nil 32822 32823 (face font-lock-string-face) 32823 32838 (face font-lock-variable-name-face) 32838 32839 (face font-lock-string-face) 32839 32849 nil 32849 32850 (face font-lock-string-face) 32850 32857 (face font-lock-keyword-face) 32857 32858 (face font-lock-string-face) 32858 32882 nil 32882 32883 (face font-lock-string-face) 32883 32894 (face font-lock-keyword-face) 32894 32895 (face font-lock-string-face) 32895 32897 nil 32897 32898 (face font-lock-string-face) 32898 32912 (face font-lock-function-name-face) 32912 32913 (face font-lock-string-face) 32913 32925 nil 32925 32926 (face font-lock-string-face) 32926 32930 (face font-lock-keyword-face) 32930 32931 (face font-lock-string-face) 32931 32933 nil 32933 32934 (face font-lock-string-face) 32934 32948 (face font-lock-type-face) 32948 32949 (face font-lock-string-face) 32949 32961 nil 32961 32962 (face font-lock-string-face) 32962 32969 (face font-lock-keyword-face) 32969 32970 (face font-lock-string-face) 32970 32986 nil 32986 32987 (face font-lock-string-face) 32987 33022 (face font-lock-constant-face) 33022 33023 (face font-lock-string-face) 33023 33037 nil 33037 33038 (face font-lock-string-face) 33038 33072 (face font-lock-constant-face) 33072 33073 (face font-lock-string-face) 33073 33098 nil 33098 33099 (face font-lock-string-face) 33099 33111 (face font-lock-keyword-face) 33111 33112 (face font-lock-string-face) 33112 33128 nil 33128 33129 (face font-lock-string-face) 33129 33150 (face font-lock-function-name-face) 33150 33151 (face font-lock-string-face) 33151 33176 nil 33176 33177 (face font-lock-string-face) 33177 33189 (face font-lock-keyword-face) 33189 33190 (face font-lock-string-face) 33190 33206 nil 33206 33207 (face font-lock-string-face) 33207 33209 (face font-lock-constant-face) 33209 33232 (face font-lock-variable-name-face) 33232 33239 
(face font-lock-constant-face) 33239 33240 (face font-lock-string-face) 33240 33265 nil 33265 33266 (face font-lock-string-face) 33266 33273 (face font-lock-keyword-face) 33273 33274 (face font-lock-string-face) 33274 33306 nil 33306 33307 (face font-lock-string-face) 33307 33318 (face font-lock-keyword-face) 33318 33319 (face font-lock-string-face) 33319 33321 nil 33321 33322 (face font-lock-string-face) 33322 33342 (face font-lock-function-name-face) 33342 33343 (face font-lock-string-face) 33343 33359 nil 33359 33360 (face font-lock-string-face) 33360 33366 (face font-lock-keyword-face) 33366 33367 (face font-lock-string-face) 33367 33387 nil 33387 33388 (face font-lock-string-face) 33388 33434 (face font-lock-constant-face) 33434 33435 (face font-lock-string-face) 33435 33453 nil 33453 33454 (face font-lock-string-face) 33454 33519 (face font-lock-constant-face) 33519 33520 (face font-lock-string-face) 33520 33553 nil 33553 33554 (face font-lock-string-face) 33554 33561 (face font-lock-keyword-face) 33561 33562 (face font-lock-string-face) 33562 33582 nil 33582 33583 (face font-lock-string-face) 33583 33585 (face font-lock-constant-face) 33585 33608 (face font-lock-variable-name-face) 33608 33647 (face font-lock-constant-face) 33647 33648 (face font-lock-string-face) 33648 33681 nil 33681 33682 (face font-lock-string-face) 33682 33688 (face font-lock-keyword-face) 33688 33689 (face font-lock-string-face) 33689 33709 nil 33709 33710 (face font-lock-string-face) 33710 33716 (face font-lock-constant-face) 33716 33717 (face font-lock-string-face) 33717 33735 nil 33735 33736 (face font-lock-string-face) 33736 33738 (face font-lock-constant-face) 33738 33743 (face font-lock-variable-name-face) 33743 33788 (face font-lock-constant-face) 33788 33789 (face font-lock-string-face) 33789 33807 nil 33807 33808 (face font-lock-string-face) 33808 33810 (face font-lock-constant-face) 33810 33811 (face font-lock-string-face) 33811 33829 nil 33829 33830 (face 
font-lock-string-face) 33830 33833 (face font-lock-constant-face) 33833 33840 (face font-lock-variable-name-face) 33840 33841 (face font-lock-constant-face) 33841 33842 (face font-lock-string-face) 33842 33860 nil 33860 33861 (face font-lock-string-face) 33861 33864 (face font-lock-constant-face) 33864 33872 (face font-lock-variable-name-face) 33872 33873 (face font-lock-constant-face) 33873 33874 (face font-lock-string-face) 33874 33952 nil 33952 33953 (face font-lock-string-face) 33953 33964 (face font-lock-keyword-face) 33964 33965 (face font-lock-string-face) 33965 33967 nil 33967 33968 (face font-lock-string-face) 33968 33978 (face font-lock-function-name-face) 33978 33979 (face font-lock-string-face) 33979 33991 nil 33991 33992 (face font-lock-string-face) 33992 33996 (face font-lock-keyword-face) 33996 33997 (face font-lock-string-face) 33997 33999 nil 33999 34000 (face font-lock-string-face) 34000 34004 (face font-lock-type-face) 34004 34005 (face font-lock-string-face) 34005 34017 nil 34017 34018 (face font-lock-string-face) 34018 34030 (face font-lock-keyword-face) 34030 34031 (face font-lock-string-face) 34031 34035 nil 34035 34036 (face font-lock-string-face) 34036 34062 (face font-lock-function-name-face) 34062 34063 (face font-lock-string-face) 34063 34077 nil 34077 34078 (face font-lock-string-face) 34078 34087 (face font-lock-keyword-face) 34087 34088 (face font-lock-string-face) 34088 34104 nil 34104 34105 (face font-lock-string-face) 34105 34117 (face font-lock-variable-name-face) 34117 34118 (face font-lock-string-face) 34118 34120 nil 34120 34121 (face font-lock-string-face) 34121 34126 (face font-lock-variable-name-face) 34126 34127 (face font-lock-string-face) 34127 34141 nil 34141 34142 (face font-lock-string-face) 34142 34153 (face font-lock-variable-name-face) 34153 34154 (face font-lock-string-face) 34154 34156 nil 34156 34157 (face font-lock-string-face) 34157 34174 (face font-lock-variable-name-face) 34174 34175 (face 
font-lock-string-face) 34175 34200 nil 34200 34201 (face font-lock-string-face) 34201 34209 (face font-lock-keyword-face) 34209 34210 (face font-lock-string-face) 34210 34214 nil 34214 34215 (face font-lock-string-face) 34215 34233 (face font-lock-constant-face) 34233 34234 (face font-lock-string-face) 34234 34268 nil 34268 34287 (face font-lock-comment-face) 34287 34293 nil 34293 34365 (face font-lock-comment-face) 34365 34371 nil 34371 34372 (face font-lock-string-face) 34372 34379 (face font-lock-keyword-face) 34379 34380 (face font-lock-string-face) 34380 34404 nil 34404 34405 (face font-lock-string-face) 34405 34416 (face font-lock-keyword-face) 34416 34417 (face font-lock-string-face) 34417 34419 nil 34419 34420 (face font-lock-string-face) 34420 34436 (face font-lock-function-name-face) 34436 34437 (face font-lock-string-face) 34437 34449 nil 34449 34450 (face font-lock-string-face) 34450 34454 (face font-lock-keyword-face) 34454 34455 (face font-lock-string-face) 34455 34457 nil 34457 34458 (face font-lock-string-face) 34458 34468 (face font-lock-type-face) 34468 34469 (face font-lock-string-face) 34469 34481 nil 34481 34482 (face font-lock-string-face) 34482 34494 (face font-lock-keyword-face) 34494 34495 (face font-lock-string-face) 34495 34511 nil 34511 34512 (face font-lock-string-face) 34512 34517 (face font-lock-function-name-face) 34517 34518 (face font-lock-string-face) 34518 34532 nil 34532 34533 (face font-lock-string-face) 34533 34551 (face font-lock-function-name-face) 34551 34552 (face font-lock-string-face) 34552 34566 nil 34566 34567 (face font-lock-string-face) 34567 34588 (face font-lock-function-name-face) 34588 34589 (face font-lock-string-face) 34589 34603 nil 34603 34604 (face font-lock-string-face) 34604 34630 (face font-lock-function-name-face) 34630 34631 (face font-lock-string-face) 34631 34645 nil 34645 34646 (face font-lock-string-face) 34646 34680 (face font-lock-function-name-face) 34680 34681 (face font-lock-string-face) 34681 
34695 nil 34695 34696 (face font-lock-string-face) 34696 34730 (face font-lock-function-name-face) 34730 34731 (face font-lock-string-face) 34731 34745 nil 34745 34746 (face font-lock-string-face) 34746 34772 (face font-lock-function-name-face) 34772 34773 (face font-lock-string-face) 34773 34787 nil 34787 34788 (face font-lock-string-face) 34788 34827 (face font-lock-function-name-face) 34827 34828 (face font-lock-string-face) 34828 34853 nil 34853 34854 (face font-lock-string-face) 34854 34861 (face font-lock-keyword-face) 34861 34862 (face font-lock-string-face) 34862 34878 nil 34878 34879 (face font-lock-string-face) 34879 34904 (face font-lock-constant-face) 34904 34905 (face font-lock-string-face) 34905 34930 nil 34930 34931 (face font-lock-string-face) 34931 34941 (face font-lock-keyword-face) 34941 34942 (face font-lock-string-face) 34942 34959 nil 34959 34960 (face font-lock-string-face) 34960 34981 (face font-lock-variable-name-face) 34981 34982 (face font-lock-string-face) 34982 35000 nil 35000 35001 (face font-lock-string-face) 35001 35013 (face font-lock-keyword-face) 35013 35014 (face font-lock-string-face) 35014 35034 nil 35034 35077 (face font-lock-comment-face) 35077 35093 nil 35093 35123 (face font-lock-comment-face) 35123 35139 nil 35139 35164 (face font-lock-comment-face) 35164 35180 nil 35180 35194 (face font-lock-comment-face) 35194 35210 nil 35210 35211 (face font-lock-string-face) 35211 35240 (face font-lock-function-name-face) 35240 35241 (face font-lock-string-face) 35241 35274 nil 35274 35275 (face font-lock-string-face) 35275 35285 (face font-lock-keyword-face) 35285 35286 (face font-lock-string-face) 35286 35307 nil 35307 35308 (face font-lock-string-face) 35308 35329 (face font-lock-variable-name-face) 35329 35330 (face font-lock-string-face) 35330 35352 nil 35352 35353 (face font-lock-string-face) 35353 35365 (face font-lock-keyword-face) 35365 35366 (face font-lock-string-face) 35366 35390 nil 35390 35391 (face font-lock-string-face) 
35391 35432 (face font-lock-function-name-face) 35432 35433 (face font-lock-string-face) 35433 35553 nil 35553 35554 (face font-lock-string-face) 35554 35565 (face font-lock-keyword-face) 35565 35566 (face font-lock-string-face) 35566 35568 nil 35568 35569 (face font-lock-string-face) 35569 35592 (face font-lock-function-name-face) 35592 35593 (face font-lock-string-face) 35593 35605 nil 35605 35606 (face font-lock-string-face) 35606 35610 (face font-lock-keyword-face) 35610 35611 (face font-lock-string-face) 35611 35613 nil 35613 35614 (face font-lock-string-face) 35614 35624 (face font-lock-type-face) 35624 35625 (face font-lock-string-face) 35625 35637 nil 35637 35638 (face font-lock-string-face) 35638 35650 (face font-lock-keyword-face) 35650 35651 (face font-lock-string-face) 35651 35667 nil 35667 35668 (face font-lock-string-face) 35668 35673 (face font-lock-function-name-face) 35673 35674 (face font-lock-string-face) 35674 35688 nil 35688 35689 (face font-lock-string-face) 35689 35707 (face font-lock-function-name-face) 35707 35708 (face font-lock-string-face) 35708 35722 nil 35722 35723 (face font-lock-string-face) 35723 35757 (face font-lock-function-name-face) 35757 35758 (face font-lock-string-face) 35758 35772 nil 35772 35773 (face font-lock-string-face) 35773 35799 (face font-lock-function-name-face) 35799 35800 (face font-lock-string-face) 35800 35814 nil 35814 35815 (face font-lock-string-face) 35815 35841 (face font-lock-function-name-face) 35841 35842 (face font-lock-string-face) 35842 35856 nil 35856 35857 (face font-lock-string-face) 35857 35896 (face font-lock-function-name-face) 35896 35897 (face font-lock-string-face) 35897 35922 nil 35922 35923 (face font-lock-string-face) 35923 35930 (face font-lock-keyword-face) 35930 35931 (face font-lock-string-face) 35931 35947 nil 35947 35948 (face font-lock-string-face) 35948 35970 (face font-lock-constant-face) 35970 35971 (face font-lock-string-face) 35971 35985 nil 35985 35986 (face 
font-lock-string-face) 35986 36011 (face font-lock-constant-face) 36011 36012 (face font-lock-string-face) 36012 36026 nil 36026 36027 (face font-lock-string-face) 36027 36060 (face font-lock-constant-face) 36060 36061 (face font-lock-string-face) 36061 36075 nil 36075 36076 (face font-lock-string-face) 36076 36117 (face font-lock-constant-face) 36117 36118 (face font-lock-string-face) 36118 36143 nil 36143 36144 (face font-lock-string-face) 36144 36154 (face font-lock-keyword-face) 36154 36155 (face font-lock-string-face) 36155 36172 nil 36172 36173 (face font-lock-string-face) 36173 36198 (face font-lock-variable-name-face) 36198 36199 (face font-lock-string-face) 36199 36217 nil 36217 36218 (face font-lock-string-face) 36218 36228 (face font-lock-keyword-face) 36228 36229 (face font-lock-string-face) 36229 36250 nil 36250 36251 (face font-lock-string-face) 36251 36272 (face font-lock-variable-name-face) 36272 36273 (face font-lock-string-face) 36273 36295 nil 36295 36296 (face font-lock-string-face) 36296 36308 (face font-lock-keyword-face) 36308 36309 (face font-lock-string-face) 36309 36333 nil 36333 36334 (face font-lock-string-face) 36334 36375 (face font-lock-function-name-face) 36375 36376 (face font-lock-string-face) 36376 36496 nil 36496 36497 (face font-lock-string-face) 36497 36508 (face font-lock-keyword-face) 36508 36509 (face font-lock-string-face) 36509 36511 nil 36511 36512 (face font-lock-string-face) 36512 36524 (face font-lock-function-name-face) 36524 36525 (face font-lock-string-face) 36525 36537 nil 36537 36538 (face font-lock-string-face) 36538 36542 (face font-lock-keyword-face) 36542 36543 (face font-lock-string-face) 36543 36545 nil 36545 36546 (face font-lock-string-face) 36546 36556 (face font-lock-type-face) 36556 36557 (face font-lock-string-face) 36557 36569 nil 36569 36570 (face font-lock-string-face) 36570 36582 (face font-lock-keyword-face) 36582 36583 (face font-lock-string-face) 36583 36599 nil 36599 36600 (face 
font-lock-string-face) 36600 36605 (face font-lock-function-name-face) 36605 36606 (face font-lock-string-face) 36606 36620 nil 36620 36621 (face font-lock-string-face) 36621 36642 (face font-lock-function-name-face) 36642 36643 (face font-lock-string-face) 36643 36657 nil 36657 36658 (face font-lock-string-face) 36658 36697 (face font-lock-function-name-face) 36697 36698 (face font-lock-string-face) 36698 36723 nil 36723 36724 (face font-lock-string-face) 36724 36731 (face font-lock-keyword-face) 36731 36732 (face font-lock-string-face) 36732 36748 nil 36748 36749 (face font-lock-string-face) 36749 36782 (face font-lock-constant-face) 36782 36783 (face font-lock-string-face) 36783 36829 nil 36829 36830 (face font-lock-string-face) 36830 36841 (face font-lock-keyword-face) 36841 36842 (face font-lock-string-face) 36842 36844 nil 36844 36845 (face font-lock-string-face) 36845 36856 (face font-lock-function-name-face) 36856 36857 (face font-lock-string-face) 36857 36869 nil 36869 36870 (face font-lock-string-face) 36870 36874 (face font-lock-keyword-face) 36874 36875 (face font-lock-string-face) 36875 36877 nil 36877 36878 (face font-lock-string-face) 36878 36888 (face font-lock-type-face) 36888 36889 (face font-lock-string-face) 36889 36901 nil 36901 36902 (face font-lock-string-face) 36902 36914 (face font-lock-keyword-face) 36914 36915 (face font-lock-string-face) 36915 36931 nil 36931 36932 (face font-lock-string-face) 36932 36937 (face font-lock-function-name-face) 36937 36938 (face font-lock-string-face) 36938 36952 nil 36952 36953 (face font-lock-string-face) 36953 36974 (face font-lock-function-name-face) 36974 36975 (face font-lock-string-face) 36975 36989 nil 36989 36990 (face font-lock-string-face) 36990 37029 (face font-lock-function-name-face) 37029 37030 (face font-lock-string-face) 37030 37055 nil 37055 37056 (face font-lock-string-face) 37056 37063 (face font-lock-keyword-face) 37063 37064 (face font-lock-string-face) 37064 37080 nil 37080 37081 (face 
font-lock-string-face) 37081 37113 (face font-lock-constant-face) 37113 37114 (face font-lock-string-face) 37114 37163 nil)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/graphviz.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,100 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright (c) 2011 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Using the JSON dumped by the dump-dependency-json generator,
-generate input suitable for graphviz to render a dependency graph of
-targets."""
-
-import collections
-import json
-import sys
-
-
-def ParseTarget(target):
-  target, _, suffix = target.partition('#')
-  filename, _, target = target.partition(':')
-  return filename, target, suffix
-
-
-def LoadEdges(filename, targets):
-  """Load the edges map from the dump file, and filter it to only
-  show targets in |targets| and their depedendents."""
-
-  file = open(filename)
-  edges = json.load(file)
-  file.close()
-
-  # Copy out only the edges we're interested in from the full edge list.
-  target_edges = {}
-  to_visit = targets[:]
-  while to_visit:
-    src = to_visit.pop()
-    if src in target_edges:
-      continue
-    target_edges[src] = edges[src]
-    to_visit.extend(edges[src])
-
-  return target_edges
-
-
-def WriteGraph(edges):
-  """Print a graphviz graph to stdout.
-  |edges| is a map of target to a list of other targets it depends on."""
-
-  # Bucket targets by file.
-  files = collections.defaultdict(list)
-  for src, dst in edges.items():
-    build_file, target_name, toolset = ParseTarget(src)
-    files[build_file].append(src)
-
-  print 'digraph D {'
-  print '  fontsize=8'  # Used by subgraphs.
-  print '  node [fontsize=8]'
-
-  # Output nodes by file.  We must first write out each node within
-  # its file grouping before writing out any edges that may refer
-  # to those nodes.
-  for filename, targets in files.items():
-    if len(targets) == 1:
-      # If there's only one node for this file, simplify
-      # the display by making it a box without an internal node.
-      target = targets[0]
-      build_file, target_name, toolset = ParseTarget(target)
-      print '  "%s" [shape=box, label="%s\\n%s"]' % (target, filename,
-                                                     target_name)
-    else:
-      # Group multiple nodes together in a subgraph.
-      print '  subgraph "cluster_%s" {' % filename
-      print '    label = "%s"' % filename
-      for target in targets:
-        build_file, target_name, toolset = ParseTarget(target)
-        print '    "%s" [label="%s"]' % (target, target_name)
-      print '  }'
-
-  # Now that we've placed all the nodes within subgraphs, output all
-  # the edges between nodes.
-  for src, dsts in edges.items():
-    for dst in dsts:
-      print '  "%s" -> "%s"' % (src, dst)
-
-  print '}'
-
-
-def main():
-  if len(sys.argv) < 2:
-    print >>sys.stderr, __doc__
-    print >>sys.stderr
-    print >>sys.stderr, 'usage: %s target1 target2...' % (sys.argv[0])
-    return 1
-
-  edges = LoadEdges('dump.json', sys.argv[1:])
-
-  WriteGraph(edges)
-  return 0
-
-
-if __name__ == '__main__':
-  sys.exit(main())
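The deleted graphviz.py keeps only the seed targets and everything they transitively depend on before emitting the graph. A minimal Python 3 sketch of that traversal (the function name and shapes here are illustrative; the original is Python 2 and reads the map from dump.json):

```python
def filter_edges(edges, targets):
    """Keep the seed targets plus everything reachable from them.

    |edges| maps a target to the list of targets it depends on, as in the
    dump-dependency-json output consumed by the deleted graphviz.py.
    """
    target_edges = {}
    to_visit = list(targets)
    while to_visit:
        src = to_visit.pop()
        if src in target_edges:
            continue  # already copied this target's edges
        target_edges[src] = edges.get(src, [])
        to_visit.extend(target_edges[src])
    return target_edges
```

Unlike the original, a target missing from the map yields an empty edge list here instead of a KeyError.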
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/pretty_gyp.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,155 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Pretty-prints the contents of a GYP file."""
-
-import sys
-import re
-
-
-# Regex to remove comments when we're counting braces.
-COMMENT_RE = re.compile(r'\s*#.*')
-
-# Regex to remove quoted strings when we're counting braces.
-# It takes into account quoted quotes, and makes sure that the quotes match.
-# NOTE: It does not handle quotes that span more than one line, or
-# cases where an escaped quote is preceded by an escaped backslash.
-QUOTE_RE_STR = r'(?P<q>[\'"])(.*?)(?<![^\\][\\])(?P=q)'
-QUOTE_RE = re.compile(QUOTE_RE_STR)
-
-
-def comment_replace(matchobj):
-  return matchobj.group(1) + matchobj.group(2) + '#' * len(matchobj.group(3))
-
-
-def mask_comments(input):
-  """Mask the quoted strings so we skip braces inside quoted strings."""
-  search_re = re.compile(r'(.*?)(#)(.*)')
-  return [search_re.sub(comment_replace, line) for line in input]
-
-
-def quote_replace(matchobj):
-  return "%s%s%s%s" % (matchobj.group(1),
-                       matchobj.group(2),
-                       'x'*len(matchobj.group(3)),
-                       matchobj.group(2))
-
-
-def mask_quotes(input):
-  """Mask the quoted strings so we skip braces inside quoted strings."""
-  search_re = re.compile(r'(.*?)' + QUOTE_RE_STR)
-  return [search_re.sub(quote_replace, line) for line in input]
-
-
-def do_split(input, masked_input, search_re):
-  output = []
-  mask_output = []
-  for (line, masked_line) in zip(input, masked_input):
-    m = search_re.match(masked_line)
-    while m:
-      split = len(m.group(1))
-      line = line[:split] + r'\n' + line[split:]
-      masked_line = masked_line[:split] + r'\n' + masked_line[split:]
-      m = search_re.match(masked_line)
-    output.extend(line.split(r'\n'))
-    mask_output.extend(masked_line.split(r'\n'))
-  return (output, mask_output)
-
-
-def split_double_braces(input):
-  """Masks out the quotes and comments, and then splits appropriate
-  lines (lines that matche the double_*_brace re's above) before
-  indenting them below.
-
-  These are used to split lines which have multiple braces on them, so
-  that the indentation looks prettier when all laid out (e.g. closing
-  braces make a nice diagonal line).
-  """
-  double_open_brace_re = re.compile(r'(.*?[\[\{\(,])(\s*)([\[\{\(])')
-  double_close_brace_re = re.compile(r'(.*?[\]\}\)],?)(\s*)([\]\}\)])')
-
-  masked_input = mask_quotes(input)
-  masked_input = mask_comments(masked_input)
-
-  (output, mask_output) = do_split(input, masked_input, double_open_brace_re)
-  (output, mask_output) = do_split(output, mask_output, double_close_brace_re)
-
-  return output
-
-
-def count_braces(line):
-  """keeps track of the number of braces on a given line and returns the result.
-
-  It starts at zero and subtracts for closed braces, and adds for open braces.
-  """
-  open_braces = ['[', '(', '{']
-  close_braces = [']', ')', '}']
-  closing_prefix_re = re.compile(r'(.*?[^\s\]\}\)]+.*?)([\]\}\)],?)\s*$')
-  cnt = 0
-  stripline = COMMENT_RE.sub(r'', line)
-  stripline = QUOTE_RE.sub(r"''", stripline)
-  for char in stripline:
-    for brace in open_braces:
-      if char == brace:
-        cnt += 1
-    for brace in close_braces:
-      if char == brace:
-        cnt -= 1
-
-  after = False
-  if cnt > 0:
-    after = True
-
-  # This catches the special case of a closing brace having something
-  # other than just whitespace ahead of it -- we don't want to
-  # unindent that until after this line is printed so it stays with
-  # the previous indentation level.
-  if cnt < 0 and closing_prefix_re.match(stripline):
-    after = True
-  return (cnt, after)
-
-
-def prettyprint_input(lines):
-  """Does the main work of indenting the input based on the brace counts."""
-  indent = 0
-  basic_offset = 2
-  last_line = ""
-  for line in lines:
-    if COMMENT_RE.match(line):
-      print line
-    else:
-      line = line.strip('\r\n\t ')  # Otherwise doesn't strip \r on Unix.
-      if len(line) > 0:
-        (brace_diff, after) = count_braces(line)
-        if brace_diff != 0:
-          if after:
-            print " " * (basic_offset * indent) + line
-            indent += brace_diff
-          else:
-            indent += brace_diff
-            print " " * (basic_offset * indent) + line
-        else:
-          print " " * (basic_offset * indent) + line
-      else:
-        print ""
-      last_line = line
-
-
-def main():
-  if len(sys.argv) > 1:
-    data = open(sys.argv[1]).read().splitlines()
-  else:
-    data = sys.stdin.read().splitlines()
-  # Split up the double braces.
-  lines = split_double_braces(data)
-
-  # Indent and print the output.
-  prettyprint_input(lines)
-  return 0
-
-
-if __name__ == '__main__':
-  sys.exit(main())
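pretty_gyp.py's indentation is driven by a per-line net brace count, taken after comments and quoted strings have been masked out. A stripped-down Python 3 version of just the counting step (the masking and the "unindent after printing" special case are omitted, and the name is illustrative):

```python
def net_brace_count(line):
    """Net change in nesting for one line: +1 per opener, -1 per closer."""
    openers, closers = '[({', '])}'
    return sum((ch in openers) - (ch in closers) for ch in line)
```

The sign of the result tells the pretty-printer whether to indent or unindent subsequent lines.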
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/pretty_sln.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,168 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Prints the information in a sln file in a diffable way.
-
-   It first outputs each project in alphabetical order with its
-   dependencies.
-
-   Then it outputs a possible build order.
-"""
-
-__author__ = 'nsylvain (Nicolas Sylvain)'
-
-import os
-import re
-import sys
-import pretty_vcproj
-
-def BuildProject(project, built, projects, deps):
-  # If all dependencies are built, we can build this project; otherwise we
-  # first build the missing dependencies.
-  # This is not infinite-recursion proof.
-  for dep in deps[project]:
-    if dep not in built:
-      BuildProject(dep, built, projects, deps)
-  print project
-  built.append(project)
-
-def ParseSolution(solution_file):
-  # All projects, their clsid and paths.
-  projects = dict()
-
-  # A list of dependencies associated with a project.
-  dependencies = dict()
-
-  # Regular expressions that match the SLN format.
-  # The first line of a project definition.
-  begin_project = re.compile(('^Project\("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942'
-                              '}"\) = "(.*)", "(.*)", "(.*)"$'))
-  # The last line of a project definition.
-  end_project = re.compile('^EndProject$')
-  # The first line of a dependency list.
-  begin_dep = re.compile('ProjectSection\(ProjectDependencies\) = postProject$')
-  # The last line of a dependency list.
-  end_dep = re.compile('EndProjectSection$')
-  # A line describing a dependency.
-  dep_line = re.compile(' *({.*}) = ({.*})$')
-
-  in_deps = False
-  solution = open(solution_file)
-  for line in solution:
-    results = begin_project.search(line)
-    if results:
-      # Hack to remove icu because the diff is too different.
-      if results.group(1).find('icu') != -1:
-        continue
-      # We remove "_gyp" from the names because it helps to diff them.
-      current_project = results.group(1).replace('_gyp', '')
-      projects[current_project] = [results.group(2).replace('_gyp', ''),
-                                   results.group(3),
-                                   results.group(2)]
-      dependencies[current_project] = []
-      continue
-
-    results = end_project.search(line)
-    if results:
-      current_project = None
-      continue
-
-    results = begin_dep.search(line)
-    if results:
-      in_deps = True
-      continue
-
-    results = end_dep.search(line)
-    if results:
-      in_deps = False
-      continue
-
-    results = dep_line.search(line)
-    if results and in_deps and current_project:
-      dependencies[current_project].append(results.group(1))
-      continue
-
-  # Change all dependency clsids to names instead.
-  for project in dependencies:
-    # For each dependency in this project
-    new_dep_array = []
-    for dep in dependencies[project]:
-      # Look for the project name matching this clsid
-      for project_info in projects:
-        if projects[project_info][1] == dep:
-          new_dep_array.append(project_info)
-    dependencies[project] = sorted(new_dep_array)
-
-  return (projects, dependencies)
-
-def PrintDependencies(projects, deps):
-  print "---------------------------------------"
-  print "Dependencies for all projects"
-  print "---------------------------------------"
-  print "--                                   --"
-
-  for (project, dep_list) in sorted(deps.items()):
-    print "Project : %s" % project
-    print "Path : %s" % projects[project][0]
-    if dep_list:
-      for dep in dep_list:
-        print "  - %s" % dep
-    print ""
-
-  print "--                                   --"
-
-def PrintBuildOrder(projects, deps):
-  print "---------------------------------------"
-  print "Build order                            "
-  print "---------------------------------------"
-  print "--                                   --"
-
-  built = []
-  for (project, _) in sorted(deps.items()):
-    if project not in built:
-      BuildProject(project, built, projects, deps)
-
-  print "--                                   --"
-
-def PrintVCProj(projects):
-
-  for project in projects:
-    print "-------------------------------------"
-    print "-------------------------------------"
-    print project
-    print project
-    print project
-    print "-------------------------------------"
-    print "-------------------------------------"
-
-    project_path = os.path.abspath(os.path.join(os.path.dirname(sys.argv[1]),
-                                                projects[project][2]))
-
-    pretty = pretty_vcproj
-    argv = [ '',
-             project_path,
-             '$(SolutionDir)=%s\\' % os.path.dirname(sys.argv[1]),
-           ]
-    argv.extend(sys.argv[3:])
-    pretty.main(argv)
-
-def main():
-  # Check that we have at least one parameter.
-  if len(sys.argv) < 2:
-    print 'Usage: %s "c:\\path\\to\\project.sln"' % sys.argv[0]
-    return 1
-
-  (projects, deps) = ParseSolution(sys.argv[1])
-  PrintDependencies(projects, deps)
-  PrintBuildOrder(projects, deps)
-
-  if '--recursive' in sys.argv:
-    PrintVCProj(projects)
-  return 0
-
-
-if __name__ == '__main__':
-  sys.exit(main())
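BuildProject in the deleted pretty_sln.py prints a build order by recursing into unbuilt dependencies first. The same idea as a Python 3 sketch that returns the order instead of printing it (like the original, it is not cycle-safe):

```python
def build_order(deps):
    """Return projects ordered so each comes after its dependencies.

    |deps| maps a project name to the list of projects it depends on.
    """
    built = []

    def visit(project):
        for dep in deps[project]:
            if dep not in built:
                visit(dep)  # build missing dependencies first
        built.append(project)

    for project in sorted(deps):
        if project not in built:
            visit(project)
    return built
```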
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/pretty_vcproj.py	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,329 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright (c) 2012 Google Inc. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Make the format of a vcproj really pretty.
-
-   This script normalizes and sorts an XML file. It also fetches all the
-   properties inside linked vsprops and includes them explicitly in the vcproj.
-
-   It outputs the resulting xml to stdout.
-"""
-
-__author__ = 'nsylvain (Nicolas Sylvain)'
-
-import os
-import sys
-
-from xml.dom.minidom import parse
-from xml.dom.minidom import Node
-
-REPLACEMENTS = dict()
-ARGUMENTS = None
-
-
-class CmpTuple(object):
-  """Compare function between 2 tuple."""
-  def __call__(self, x, y):
-    return cmp(x[0], y[0])
-
-
-class CmpNode(object):
-  """Compare function between 2 xml nodes."""
-
-  def __call__(self, x, y):
-    def get_string(node):
-      node_string = "node"
-      node_string += node.nodeName
-      if node.nodeValue:
-        node_string += node.nodeValue
-
-      if node.attributes:
-        # We first sort by name, if present.
-        node_string += node.getAttribute("Name")
-
-        all_nodes = []
-        for (name, value) in node.attributes.items():
-          all_nodes.append((name, value))
-
-        all_nodes.sort(CmpTuple())
-        for (name, value) in all_nodes:
-          node_string += name
-          node_string += value
-
-      return node_string
-
-    return cmp(get_string(x), get_string(y))
-
-
-def PrettyPrintNode(node, indent=0):
-  if node.nodeType == Node.TEXT_NODE:
-    if node.data.strip():
-      print '%s%s' % (' '*indent, node.data.strip())
-    return
-
-  if node.childNodes:
-    node.normalize()
-  # Get the number of attributes
-  attr_count = 0
-  if node.attributes:
-    attr_count = node.attributes.length
-
-  # Print the main tag
-  if attr_count == 0:
-    print '%s<%s>' % (' '*indent, node.nodeName)
-  else:
-    print '%s<%s' % (' '*indent, node.nodeName)
-
-    all_attributes = []
-    for (name, value) in node.attributes.items():
-      all_attributes.append((name, value))
-    all_attributes.sort(CmpTuple())
-    for (name, value) in all_attributes:
-      print '%s  %s="%s"' % (' '*indent, name, value)
-    print '%s>' % (' '*indent)
-  if node.nodeValue:
-    print '%s  %s' % (' '*indent, node.nodeValue)
-
-  for sub_node in node.childNodes:
-    PrettyPrintNode(sub_node, indent=indent+2)
-  print '%s</%s>' % (' '*indent, node.nodeName)
-
-
-def FlattenFilter(node):
-  """Returns a list of all the node and sub nodes."""
-  node_list = []
-
-  if (node.attributes and
-      node.getAttribute('Name') == '_excluded_files'):
-    # We don't add the "_excluded_files" filter.
-    return []
-
-  for current in node.childNodes:
-    if current.nodeName == 'Filter':
-      node_list.extend(FlattenFilter(current))
-    else:
-      node_list.append(current)
-
-  return node_list
-
-
-def FixFilenames(filenames, current_directory):
-  new_list = []
-  for filename in filenames:
-    if filename:
-      for key in REPLACEMENTS:
-        filename = filename.replace(key, REPLACEMENTS[key])
-      os.chdir(current_directory)
-      filename = filename.strip('"\' ')
-      if filename.startswith('$'):
-        new_list.append(filename)
-      else:
-        new_list.append(os.path.abspath(filename))
-  return new_list
-
-
-def AbsoluteNode(node):
-  """Makes all the properties we know about in this node absolute."""
-  if node.attributes:
-    for (name, value) in node.attributes.items():
-      if name in ['InheritedPropertySheets', 'RelativePath',
-                  'AdditionalIncludeDirectories',
-                  'IntermediateDirectory', 'OutputDirectory',
-                  'AdditionalLibraryDirectories']:
-        # We want to fix up these paths
-        path_list = value.split(';')
-        new_list = FixFilenames(path_list, os.path.dirname(ARGUMENTS[1]))
-        node.setAttribute(name, ';'.join(new_list))
-      if not value:
-        node.removeAttribute(name)
-
-
-def CleanupVcproj(node):
-  """For each sub node, we call recursively this function."""
-  for sub_node in node.childNodes:
-    AbsoluteNode(sub_node)
-    CleanupVcproj(sub_node)
-
-  # Normalize the node, and remove all extraneous whitespace.
-  for sub_node in node.childNodes:
-    if sub_node.nodeType == Node.TEXT_NODE:
-      sub_node.data = sub_node.data.replace("\r", "")
-      sub_node.data = sub_node.data.replace("\n", "")
-      sub_node.data = sub_node.data.rstrip()
-
-  # Sort all the semicolon-separated attribute values, and also remove
-  # the duplicates.
-  if node.attributes:
-    for (name, value) in node.attributes.items():
-      sorted_list = sorted(value.split(';'))
-      unique_list = []
-      for i in sorted_list:
-        if not unique_list.count(i):
-          unique_list.append(i)
-      node.setAttribute(name, ';'.join(unique_list))
-      if not value:
-        node.removeAttribute(name)
-
-  if node.childNodes:
-    node.normalize()
-
-  # For each node, take a copy, and remove it from the list.
-  node_array = []
-  while node.childNodes and node.childNodes[0]:
-    # Take a copy of the node and remove it from the list.
-    current = node.childNodes[0]
-    node.removeChild(current)
-
-    # If the child is a filter, we want to append all its children
-    # to this same list.
-    if current.nodeName == 'Filter':
-      node_array.extend(FlattenFilter(current))
-    else:
-      node_array.append(current)
-
-
-  # Sort the list.
-  node_array.sort(CmpNode())
-
-  # Insert the nodes in the correct order.
-  for new_node in node_array:
-    # But don't append empty tool node.
-    if new_node.nodeName == 'Tool':
-      if new_node.attributes and new_node.attributes.length == 1:
-        # This one was empty.
-        continue
-    if new_node.nodeName == 'UserMacro':
-      continue
-    node.appendChild(new_node)
-
-
-def GetConfiguationNodes(vcproj):
-  #TODO(nsylvain): Find a better way to navigate the xml.
-  nodes = []
-  for node in vcproj.childNodes:
-    if node.nodeName == "Configurations":
-      for sub_node in node.childNodes:
-        if sub_node.nodeName == "Configuration":
-          nodes.append(sub_node)
-
-  return nodes
-
-
-def GetChildrenVsprops(filename):
-  dom = parse(filename)
-  if dom.documentElement.attributes:
-    vsprops = dom.documentElement.getAttribute('InheritedPropertySheets')
-    return FixFilenames(vsprops.split(';'), os.path.dirname(filename))
-  return []
-
-def SeekToNode(node1, child2):
-  # A text node does not have properties.
-  if child2.nodeType == Node.TEXT_NODE:
-    return None
-
-  # Get the name of the current node.
-  current_name = child2.getAttribute("Name")
-  if not current_name:
-    # There is no name. We don't know how to merge.
-    return None
-
-  # Look through all the nodes to find a match.
-  for sub_node in node1.childNodes:
-    if sub_node.nodeName == child2.nodeName:
-      name = sub_node.getAttribute("Name")
-      if name == current_name:
-        return sub_node
-
-  # No match. We give up.
-  return None
-
-
-def MergeAttributes(node1, node2):
-  # No attributes to merge?
-  if not node2.attributes:
-    return
-
-  for (name, value2) in node2.attributes.items():
-    # Don't merge the 'Name' attribute.
-    if name == 'Name':
-      continue
-    value1 = node1.getAttribute(name)
-    if value1:
-      # The attribute exists in the main node. If it's equal, we leave it
-      # untouched; otherwise we concatenate the values.
-      if value1 != value2:
-        node1.setAttribute(name, ';'.join([value1, value2]))
-    else:
-      # The attribute does not exist in the main node. We append this one.
-      node1.setAttribute(name, value2)
-
-    # If the attribute was a property sheet attribute, we remove it, since
-    # it is useless.
-    if name == 'InheritedPropertySheets':
-      node1.removeAttribute(name)
-
-
-def MergeProperties(node1, node2):
-  MergeAttributes(node1, node2)
-  for child2 in node2.childNodes:
-    child1 = SeekToNode(node1, child2)
-    if child1:
-      MergeProperties(child1, child2)
-    else:
-      node1.appendChild(child2.cloneNode(True))
-
-
-def main(argv):
-  """Main function of this vcproj prettifier."""
-  global ARGUMENTS
-  ARGUMENTS = argv
-
-  # Check that we have at least one parameter.
-  if len(argv) < 2:
-    print ('Usage: %s "c:\\path\\to\\vcproj.vcproj" [key1=value1] '
-           '[key2=value2]' % argv[0])
-    return 1
-
-  # Parse the keys
-  for i in range(2, len(argv)):
-    (key, value) = argv[i].split('=')
-    REPLACEMENTS[key] = value
-
-  # Open the vcproj and parse the xml.
-  dom = parse(argv[1])
-
-  # The first thing we need to do is find the Configuration nodes and merge
-  # with the vsprops they include.
-  for configuration_node in GetConfiguationNodes(dom.documentElement):
-    # Get the property sheets associated with this configuration.
-    vsprops = configuration_node.getAttribute('InheritedPropertySheets')
-
-    # Fix the filenames to be absolute.
-    vsprops_list = FixFilenames(vsprops.strip().split(';'),
-                                os.path.dirname(argv[1]))
-
-    # Extend the list of vsprops with all vsprops contained in the current
-    # vsprops.
-    for current_vsprops in vsprops_list:
-      vsprops_list.extend(GetChildrenVsprops(current_vsprops))
-
-    # Now that we have all the vsprops, we need to merge them.
-    for current_vsprops in vsprops_list:
-      MergeProperties(configuration_node,
-                      parse(current_vsprops).documentElement)
-
-  # Now that everything is merged, we need to cleanup the xml.
-  CleanupVcproj(dom.documentElement)
-
-  # Finally, we use the pretty-print function to print the vcproj back to the
-  # user.
-  #print dom.toprettyxml(newl="\n")
-  PrettyPrintNode(dom.documentElement)
-  return 0
-
-
-if __name__ == '__main__':
-  sys.exit(main(sys.argv))
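CleanupVcproj sorts each semicolon-separated attribute value and drops duplicates so the output diffs cleanly. The core of that normalization as a small Python 3 sketch (the function name is illustrative, not from the original):

```python
def normalize_attr(value):
    """Sort a semicolon-separated attribute value and remove duplicates."""
    unique = []
    for item in sorted(value.split(';')):
        if item not in unique:
            unique.append(item)
    return ';'.join(unique)
```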
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/build.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,275 +0,0 @@
-
-module.exports = exports = build
-
-/**
- * Module dependencies.
- */
-
-var fs = require('graceful-fs')
-  , rm = require('rimraf')
-  , path = require('path')
-  , glob = require('glob')
-  , log = require('npmlog')
-  , which = require('which')
-  , mkdirp = require('mkdirp')
-  , exec = require('child_process').exec
-  , win = process.platform == 'win32'
-
-exports.usage = 'Invokes `' + (win ? 'msbuild' : 'make') + '` and builds the module'
-
-function build (gyp, argv, callback) {
-
-  var makeCommand = gyp.opts.make || process.env.MAKE
-      || (process.platform.indexOf('bsd') != -1 && process.platform.indexOf('kfreebsd') == -1 ? 'gmake' : 'make')
-    , command = win ? 'msbuild' : makeCommand
-    , buildDir = path.resolve('build')
-    , configPath = path.resolve(buildDir, 'config.gypi')
-    , jobs = gyp.opts.jobs || process.env.JOBS
-    , buildType
-    , config
-    , arch
-    , nodeDir
-    , copyDevLib
-
-  loadConfigGypi()
-
-  /**
-   * Load the "config.gypi" file that was generated during "configure".
-   */
-
-  function loadConfigGypi () {
-    fs.readFile(configPath, 'utf8', function (err, data) {
-      if (err) {
-        if (err.code == 'ENOENT') {
-          callback(new Error('You must run `node-gyp configure` first!'))
-        } else {
-          callback(err)
-        }
-        return
-      }
-      config = JSON.parse(data.replace(/\#.+\n/, ''))
-
-      // get the 'arch', 'buildType', and 'nodeDir' vars from the config
-      buildType = config.target_defaults.default_configuration
-      arch = config.variables.target_arch
-      nodeDir = config.variables.nodedir
-      copyDevLib = config.variables.copy_dev_lib == 'true'
-
-      if ('debug' in gyp.opts) {
-        buildType = gyp.opts.debug ? 'Debug' : 'Release'
-      }
-      if (!buildType) {
-        buildType = 'Release'
-      }
-
-      log.verbose('build type', buildType)
-      log.verbose('architecture', arch)
-      log.verbose('node dev dir', nodeDir)
-
-      if (win) {
-        findSolutionFile()
-      } else {
-        doWhich()
-      }
-    })
-  }
-
-  /**
-   * On Windows, find the first build/*.sln file.
-   */
-
-  function findSolutionFile () {
-    glob('build/*.sln', function (err, files) {
-      if (err) return callback(err)
-      if (files.length === 0) {
-        return callback(new Error('Could not find *.sln file. Did you run "configure"?'))
-      }
-      guessedSolution = files[0]
-      log.verbose('found first Solution file', guessedSolution)
-      doWhich()
-    })
-  }
-
-  /**
-   * Uses node-which to locate the msbuild / make executable.
-   */
-
-  function doWhich () {
-    // First make sure we have the build command in the PATH
-    which(command, function (err, execPath) {
-      if (err) {
-        if (win && /not found/.test(err.message)) {
-          // On Windows and no 'msbuild' found. Let's guess where it is
-          findMsbuild()
-        } else {
-          // Some other error or 'make' not found on Unix, report that to the user
-          callback(err)
-        }
-        return
-      }
-      log.verbose('`which` succeeded for `' + command + '`', execPath)
-      copyNodeLib()
-    })
-  }
-
-  /**
-   * Search for the location of "msbuild.exe" file on Windows.
-   */
-
-  function findMsbuild () {
-    log.verbose('could not find "msbuild.exe" in PATH - finding location in registry')
-    var notfoundErr = new Error('Can\'t find "msbuild.exe". Do you have Microsoft Visual Studio C++ 2008+ installed?')
-    var cmd = 'reg query "HKLM\\Software\\Microsoft\\MSBuild\\ToolsVersions" /s'
-    if (process.arch !== 'ia32')
-      cmd += ' /reg:32'
-    exec(cmd, function (err, stdout, stderr) {
-      var reVers = /ToolsVersions\\([^\\]+)$/i
-        , rePath = /\r\n[ \t]+MSBuildToolsPath[ \t]+REG_SZ[ \t]+([^\r]+)/i
-        , msbuilds = []
-        , r
-        , msbuildPath
-      if (err) {
-        return callback(notfoundErr)
-      }
-      stdout.split('\r\n\r\n').forEach(function(l) {
-        if (!l) return
-        l = l.trim()
-        if (r = reVers.exec(l.substring(0, l.indexOf('\r\n')))) {
-          var ver = parseFloat(r[1])
-          if (ver >= 3.5) {
-            if (r = rePath.exec(l)) {
-              msbuilds.push({
-                version: ver,
-                path: r[1]
-              })
-            }
-          }
-        }
-      })
-      msbuilds.sort(function (x, y) {
-        return (x.version < y.version ? -1 : 1)
-      })
-      ;(function verifyMsbuild () {
-        if (!msbuilds.length) return callback(notfoundErr);
-        msbuildPath = path.resolve(msbuilds.pop().path, 'msbuild.exe')
-        fs.stat(msbuildPath, function (err, stat) {
-          if (err) {
-            if (err.code == 'ENOENT') {
-              if (msbuilds.length) {
-                return verifyMsbuild()
-              } else {
-                callback(notfoundErr)
-              }
-            } else {
-              callback(err)
-            }
-            return
-          }
-          command = msbuildPath
-          copyNodeLib()
-        })
-      })()
-    })
-  }
-
-  /**
-   * Copies the node.lib file for the current target architecture into the
- * proper dev dir location.
-   */
-
-  function copyNodeLib () {
-    if (!win || !copyDevLib) return doBuild()
-
-    var buildDir = path.resolve(nodeDir, buildType)
-      , archNodeLibPath = path.resolve(nodeDir, arch, 'node.lib')
-      , buildNodeLibPath = path.resolve(buildDir, 'node.lib')
-
-    mkdirp(buildDir, function (err, isNew) {
-      if (err) return callback(err)
-      log.verbose('"' + buildType + '" dir needed to be created?', isNew)
-      var rs = fs.createReadStream(archNodeLibPath)
-        , ws = fs.createWriteStream(buildNodeLibPath)
-      log.verbose('copying "node.lib" for ' + arch, buildNodeLibPath)
-      rs.pipe(ws)
-      rs.on('error', callback)
-      ws.on('error', callback)
-      rs.on('end', doBuild)
-    })
-  }
-
-  /**
-   * Actually spawn the process and compile the module.
-   */
-
-  function doBuild () {
-
-    // Enable Verbose build
-    var verbose = log.levels[log.level] <= log.levels.verbose
-    if (!win && verbose) {
-      argv.push('V=1')
-    }
-    if (win && !verbose) {
-      argv.push('/clp:Verbosity=minimal')
-    }
-
-    if (win) {
-      // Turn off the Microsoft logo on Windows
-      argv.push('/nologo')
-    }
-
-    // Specify the build type, Release by default
-    if (win) {
-      var p = arch === 'x64' ? 'x64' : 'Win32'
-      argv.push('/p:Configuration=' + buildType + ';Platform=' + p)
-      if (jobs) {
-        if (!isNaN(parseInt(jobs, 10))) {
-          argv.push('/m:' + parseInt(jobs, 10))
-        } else if (jobs.toUpperCase() === 'MAX') {
-          argv.push('/m:' + require('os').cpus().length)
-        }
-      }
-    } else {
-      argv.push('BUILDTYPE=' + buildType)
-      // Invoke the Makefile in the 'build' dir.
-      argv.push('-C')
-      argv.push('build')
-      if (jobs) {
-        if (!isNaN(parseInt(jobs, 10))) {
-          argv.push('--jobs')
-          argv.push(parseInt(jobs, 10))
-        } else if (jobs.toUpperCase() === 'MAX') {
-          argv.push('--jobs')
-          argv.push(require('os').cpus().length)
-        }
-      }
-    }
-
-    if (win) {
-      // did the user specify their own .sln file?
-      var hasSln = argv.some(function (arg) {
-        return path.extname(arg) == '.sln'
-      })
-      if (!hasSln) {
-        argv.unshift(gyp.opts.solution || guessedSolution)
-      }
-    }
-
-    var proc = gyp.spawn(command, argv)
-    proc.on('exit', onExit)
-  }
-
-  /**
-   * Invoked after the make/msbuild command exits.
-   */
-
-  function onExit (code, signal) {
-    if (code !== 0) {
-      return callback(new Error('`' + command + '` failed with exit code: ' + code))
-    }
-    if (signal) {
-      return callback(new Error('`' + command + '` got signal: ' + signal))
-    }
-    callback()
-  }
-
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/clean.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-
-module.exports = exports = clean
-
-exports.usage = 'Removes any generated build files and the "out" dir'
-
-/**
- * Module dependencies.
- */
-
-var rm = require('rimraf')
-var log = require('npmlog')
-
-
-function clean (gyp, argv, callback) {
-
-  // Remove the 'build' dir
-  var buildDir = 'build'
-
-  log.verbose('clean', 'removing "%s" directory', buildDir)
-  rm(buildDir, callback)
-
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/configure.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,474 +0,0 @@
-
-module.exports = exports = configure
-
-/**
- * Module dependencies.
- */
-
-var fs = require('graceful-fs')
-  , path = require('path')
-  , glob = require('glob')
-  , log = require('npmlog')
-  , osenv = require('osenv')
-  , which = require('which')
-  , semver = require('semver')
-  , mkdirp = require('mkdirp')
-  , cp = require('child_process')
-  , spawn = cp.spawn
-  , execFile = cp.execFile
-  , win = process.platform == 'win32'
-
-exports.usage = 'Generates ' + (win ? 'MSVC project files' : 'a Makefile') + ' for the current module'
-
-function configure (gyp, argv, callback) {
-
-  var python = gyp.opts.python || process.env.PYTHON || 'python'
-    , buildDir = path.resolve('build')
-    , hasVCExpress = false
-    , hasVC2012Express = false
-    , hasWin71SDK = false
-    , hasWin8SDK = false
-    , configNames = [ 'config.gypi', 'common.gypi' ]
-    , configs = []
-    , nodeDir
-
-
-  if (win) {
-    checkVCExpress(function () {
-      if (hasVCExpress || hasVC2012Express) {
-        checkWinSDK(function () {
-          checkVSPrompt(function() {
-            checkPython()
-          })
-        })
-      } else {
-        checkPython()
-      }
-    })
-  } else {
-    checkPython()
-  }
-
-  // Check if Python is in the $PATH
-  function checkPython () {
-    log.verbose('check python', 'checking for Python executable "%s" in the PATH', python)
-    which(python, function (err, execPath) {
-      if (err) {
-        log.verbose('`which` failed', python, err)
-        if (win) {
-          guessPython()
-        } else {
-          failNoPython()
-        }
-      } else {
-        log.verbose('`which` succeeded', python, execPath)
-        checkPythonVersion()
-      }
-    })
-  }
-
-  // Called on Windows when "python" isn't available in the current $PATH.
-  // We're gonna check if "%SystemDrive%\python27\python.exe" exists.
-  function guessPython () {
-    log.verbose('could not find "' + python + '". guessing location')
-    var rootDir = process.env.SystemDrive || 'C:\\'
-    if (rootDir[rootDir.length - 1] !== '\\') {
-      rootDir += '\\'
-    }
-    var pythonPath = path.resolve(rootDir, 'Python27', 'python.exe')
-    log.verbose('ensuring that file exists:', pythonPath)
-    fs.stat(pythonPath, function (err, stat) {
-      if (err) {
-        if (err.code == 'ENOENT') {
-          failNoPython()
-        } else {
-          callback(err)
-        }
-        return
-      }
-      python = pythonPath
-      checkPythonVersion()
-    })
-  }
-
-  function checkPythonVersion () {
-    var env = { TERM: 'dumb', PATH: process.env.PATH };
-    execFile(python, ['-c', 'import platform; print(platform.python_version());'], { env: env }, function (err, stdout) {
-      if (err) {
-        return callback(err)
-      }
-      log.verbose('check python version', '`%s -c "import platform; print(platform.python_version());"` returned: %j', python, stdout)
-      var version = stdout.trim()
-      if (~version.indexOf('+')) {
-        log.silly('stripping "+" sign(s) from version')
-        version = version.replace(/\+/g, '')
-      }
-      if (~version.indexOf('rc')) {
-        log.silly('stripping "rc" identifier from version')
-        version = version.replace(/rc(.*)$/ig, '')
-      }
-      var range = semver.Range('>=2.5.0 <3.0.0');
-      if (range.test(version)) {
-        getNodeDir()
-      } else {
-        failPythonVersion(version)
-      }
-    })
-  }
-
-  function failNoPython () {
-    callback(new Error('Can\'t find Python executable "' + python +
-          '", you can set the PYTHON env variable.'))
-  }
-
-  function failPythonVersion (badVersion) {
-    callback(new Error('Python executable "' + python +
-          '" is v' + badVersion + ', which is not supported by gyp.\n' +
-          'You can pass the --python switch to point to Python >= v2.5.0 & < 3.0.0.'))
-  }
-
-  function checkVSPrompt(cb) {
-    // in the event that they have both installed, see if they are using a particular command prompt
-    if (hasVCExpress && hasVC2012Express) {
-      if (process.env["VisualStudioVersion"] === "11.0") {
-        // they are using the VS 2012 command prompt, unset the VS 2010 variables
-        hasVCExpress = false
-        hasWin71SDK = false
-      } else {
-        // otherwise, unset the VS 2012 variables
-        hasVC2012Express = false
-        hasWin8SDK = false
-      }
-    }
-    cb()
-  }
-
-  function checkWinSDK(cb) {
-    checkWin71SDK(function() {
-      checkWin8SDK(cb)
-    })
-  }
-
-  function checkWin71SDK(cb) {
-    spawn('reg', ['query', 'HKLM\\Software\\Microsoft\\Microsoft SDKs\\Windows\\v7.1', '/v', 'InstallationFolder'])
-         .on('exit', function (code) {
-           hasWin71SDK = (code === 0)
-           cb()
-         })
-  }
-
-  function checkWin8SDK(cb) {
-    var cp = spawn('reg', ['query', 'HKLM\\Software\\Microsoft\\Windows Kits\\Installed Products', '/f', 'Windows Software Development Kit x86', '/reg:32'])
-    cp.on('exit', function (code) {
-      hasWin8SDK = (code === 0)
-      cb()
-    })
-  }
-    
-  function checkVC201264(cb) {
-    var cp = spawn('reg', ['query', 'HKLM\\SOFTWARE\\Wow6432Node\\Microsoft\\VisualStudio\\11.0\\Setup\\VC', '/v', 'ProductDir'])
-    cp.on('exit', function (code) {
-      hasVC2012Express = (code === 0)
-      cb()
-    })
-  }
-  
-  function checkVC2012(cb) {
-    var cp = spawn('reg', ['query', 'HKLM\\SOFTWARE\\Microsoft\\VisualStudio\\11.0\\Setup\\VC', '/v', 'ProductDir'])
-    cp.on('exit', function (code) {
-      hasVC2012Express = (code === 0)
-      if (code !== 0) {
-        checkVC201264(cb)
-      } else {
-        cb()
-      }
-    })
-  }
-
-  function checkVC2012Express64(cb) {
-    var cp = spawn('reg', ['query', 'HKLM\\SOFTWARE\\Wow6432Node\\Microsoft\\VCExpress\\11.0\\Setup\\VC', '/v', 'ProductDir'])
-    cp.on('exit', function (code) {
-        hasVC2012Express = (code === 0)
-        if (code !== 0) {
-            checkVC2012(cb)
-        } else {
-            cb()
-        }
-    })
-  }
-
-  function checkVC2012Express(cb) {
-    var cp = spawn('reg', ['query', 'HKLM\\SOFTWARE\\Microsoft\\VCExpress\\11.0\\Setup\\VC', '/v', 'ProductDir'])
-    cp.on('exit', function (code) {
-      hasVC2012Express = (code === 0)
-      if (code !== 0) {
-        checkVC2012Express64(cb)
-      } else {
-        cb()
-      }
-    })
-  }
-
-  function checkVCExpress64(cb) {
-    var cp = spawn('cmd', ['/C', '%WINDIR%\\SysWOW64\\reg', 'query', 'HKLM\\Software\\Microsoft\\VCExpress\\10.0\\Setup\\VC', '/v', 'ProductDir'])
-    cp.on('exit', function (code) {
-      hasVCExpress = (code === 0)
-      checkVC2012Express(cb)
-    })
-  }
-
-  function checkVCExpress(cb) {
-    spawn('reg', ['query', 'HKLM\\Software\\Microsoft\\VCExpress\\10.0\\Setup\\VC', '/v', 'ProductDir'])
-         .on('exit', function (code) {
-           hasVCExpress = (code === 0)
-           if (code !== 0) {
-             checkVCExpress64(cb)
-           } else {
-             checkVC2012Express(cb)
-           }
-         })
-  }
-
-  function getNodeDir () {
-
-    // 'python' should be set by now
-    process.env.PYTHON = python
-
-    if (gyp.opts.nodedir) {
-      // --nodedir was specified. use that for the dev files
-      nodeDir = gyp.opts.nodedir.replace(/^~/, osenv.home())
-
-      log.verbose('get node dir', 'compiling against specified --nodedir dev files: %s', nodeDir)
-      createBuildDir()
-
-    } else {
-      // if no --nodedir specified, ensure node dependencies are installed
-      var version
-      var versionStr
-
-      if (gyp.opts.target) {
-        // if --target was given, then determine a target version to compile for
-        versionStr = gyp.opts.target
-        log.verbose('get node dir', 'compiling against --target node version: %s', versionStr)
-      } else {
-        // if no --target was specified then use the current host node version
-        versionStr = process.version
-        log.verbose('get node dir', 'no --target version specified, falling back to host node version: %s', versionStr)
-      }
-
-      // make sure we have a valid version
-      try {
-        version = semver.parse(versionStr)
-      } catch (e) {
-        return callback(e)
-      }
-      if (!version) {
-        return callback(new Error('Invalid version number: ' + versionStr))
-      }
-
-      // ensure that the target node version's dev files are installed
-      gyp.opts.ensure = true
-      gyp.commands.install([ versionStr ], function (err, version) {
-        if (err) return callback(err)
-        log.verbose('get node dir', 'target node version installed:', version)
-        nodeDir = path.resolve(gyp.devDir, version)
-        createBuildDir()
-      })
-    }
-  }
-
-  function createBuildDir () {
-    log.verbose('build dir', 'attempting to create "build" dir: %s', buildDir)
-    mkdirp(buildDir, function (err, isNew) {
-      if (err) return callback(err)
-      log.verbose('build dir', '"build" dir needed to be created?', isNew)
-      createConfigFile()
-    })
-  }
-
-  function createConfigFile (err) {
-    if (err) return callback(err)
-
-    var configFilename = 'config.gypi'
-    var configPath = path.resolve(buildDir, configFilename)
-
-    log.verbose('build/' + configFilename, 'creating config file')
-
-    var config = process.config || {}
-      , defaults = config.target_defaults
-      , variables = config.variables
-
-    // default "config.variables"
-    if (!variables) variables = config.variables = {}
-
-    // default "config.defaults"
-    if (!defaults) defaults = config.target_defaults = {}
-
-    // don't inherit the "defaults" from node's `process.config` object.
-    // doing so could cause problems in cases where the `node` executable was
-    // compiled on a different machine (with different lib/include paths) than
-    // the machine where the addon is being built to
-    defaults.cflags = []
-    defaults.defines = []
-    defaults.include_dirs = []
-    defaults.libraries = []
-
-    // set the default_configuration prop
-    if ('debug' in gyp.opts) {
-      defaults.default_configuration = gyp.opts.debug ? 'Debug' : 'Release'
-    }
-    if (!defaults.default_configuration) {
-      defaults.default_configuration = 'Release'
-    }
-
-    // set the target_arch variable
-    variables.target_arch = gyp.opts.arch || process.arch || 'ia32'
-
-    // set the toolset for VCExpress users
-    if (win) {
-      if (hasVC2012Express && hasWin8SDK) {
-        defaults.msbuild_toolset = 'v110'
-      } else if (hasVCExpress && hasWin71SDK) {
-        defaults.msbuild_toolset = 'Windows7.1SDK'
-      }
-    }
-
-    // set the node development directory
-    variables.nodedir = nodeDir
-
-    // don't copy dev libraries with nodedir option
-    variables.copy_dev_lib = !gyp.opts.nodedir
-
-    // disable -T "thin" static archives by default
-    variables.standalone_static_library = gyp.opts.thin ? 0 : 1;
-
-    // loop through the rest of the opts and add the unknown ones as variables.
-    // this allows for module-specific configure flags like:
-    //
-    //   $ node-gyp configure --shared-libxml2
-    Object.keys(gyp.opts).forEach(function (opt) {
-      if (opt === 'argv') return
-      if (opt in gyp.configDefs) return
-      variables[opt.replace(/-/g, '_')] = gyp.opts[opt]
-    })
-
-    // ensures that any boolean values from `process.config` get stringified
-    function boolsToString (k, v) {
-      if (typeof v === 'boolean')
-        return String(v)
-      return v
-    }
-
-    log.silly('build/' + configFilename, config)
-
-    // now write out the config.gypi file to the build/ dir
-    var prefix = '# Do not edit. File was generated by node-gyp\'s "configure" step'
-      , json = JSON.stringify(config, boolsToString, 2)
-    log.verbose('build/' + configFilename, 'writing out config file: %s', configPath)
-    configs.push(configPath)
-    fs.writeFile(configPath, [prefix, json, ''].join('\n'), findConfigs)
-  }
-
-  function findConfigs (err) {
-    if (err) return callback(err)
-    var name = configNames.shift()
-    if (!name) return runGyp()
-    var fullPath = path.resolve(name)
-    log.verbose(name, 'checking for gypi file: %s', fullPath)
-    fs.stat(fullPath, function (err, stat) {
-      if (err) {
-        if (err.code == 'ENOENT') {
-          findConfigs() // check next gypi filename
-        } else {
-          callback(err)
-        }
-      } else {
-        log.verbose(name, 'found gypi file')
-        configs.push(fullPath)
-        findConfigs()
-      }
-    })
-  }
-
-  function runGyp (err) {
-    if (err) return callback(err)
-
-    if (!~argv.indexOf('-f') && !~argv.indexOf('--format')) {
-      if (win) {
-        log.verbose('gyp', 'gyp format was not specified; forcing "msvs"')
-        // force the 'msvs' target for Windows
-        argv.push('-f', 'msvs')
-      } else {
-        log.verbose('gyp', 'gyp format was not specified; forcing "make"')
-        // force the 'make' target for non-Windows
-        argv.push('-f', 'make')
-      }
-    }
-
-    function hasMsvsVersion () {
-      return argv.some(function (arg) {
-        return arg.indexOf('msvs_version') === 0
-      })
-    }
-
-    if (win && !hasMsvsVersion()) {
-      if ('msvs_version' in gyp.opts) {
-        argv.push('-G', 'msvs_version=' + gyp.opts.msvs_version)
-      } else {
-        argv.push('-G', 'msvs_version=auto')
-      }
-    }
-
-    // include all the ".gypi" files that were found
-    configs.forEach(function (config) {
-      argv.push('-I', config)
-    })
-
-    // this logic ported from the old `gyp_addon` python file
-    var gyp_script = path.resolve(__dirname, '..', 'gyp', 'gyp')
-    var addon_gypi = path.resolve(__dirname, '..', 'addon.gypi')
-    var common_gypi = path.resolve(nodeDir, 'common.gypi')
-    var output_dir = 'build'
-    if (win) {
-      // Windows expects an absolute path
-      output_dir = buildDir
-    }
-
-    argv.push('-I', addon_gypi)
-    argv.push('-I', common_gypi)
-    argv.push('-Dlibrary=shared_library')
-    argv.push('-Dvisibility=default')
-    argv.push('-Dnode_root_dir=' + nodeDir)
-    argv.push('-Dmodule_root_dir=' + process.cwd())
-    argv.push('--depth=.')
-
-    // tell gyp to write the Makefile/Solution files into output_dir
-    argv.push('--generator-output', output_dir)
-
-    // tell make to write its output into the same dir
-    argv.push('-Goutput_dir=.')
-
-    // enforce use of the "binding.gyp" file
-    argv.unshift('binding.gyp')
-
-    // execute `gyp` from the current target nodedir
-    argv.unshift(gyp_script)
-
-    var cp = gyp.spawn(python, argv)
-    cp.on('exit', onCpExit)
-  }
-
-  /**
-   * Called when the `gyp` child process exits.
-   */
-
-  function onCpExit (code, signal) {
-    if (code !== 0) {
-      callback(new Error('`gyp` failed with exit code: ' + code))
-    } else {
-      // we're done
-      callback()
-    }
-  }
-
-}
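The version massaging in `checkPythonVersion()` above (strip `+` suffixes and `rc` identifiers, then test the `>=2.5.0 <3.0.0` window) can be sketched without the semver module. `acceptablePython` is a hypothetical name; the original delegated the range test to `semver.Range`, so this is a simplified stand-in that only compares major/minor:

```javascript
// Hypothetical helper approximating checkPythonVersion()'s acceptance test:
// normalise the reported version string, then require 2.5 <= version < 3.0,
// which is the window gyp supported at the time.
function acceptablePython (raw) {
  var version = raw.trim()
    .replace(/\+/g, '')        // strip "+" sign(s), e.g. "2.7.3+"
    .replace(/rc(.*)$/ig, '')  // strip "rc" identifiers, e.g. "2.7.3rc1"
  var parts = version.split('.').map(Number)
  var major = parts[0]
  var minor = parts[1] || 0
  return major === 2 && minor >= 5
}
```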
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/install.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,362 +0,0 @@
-
-module.exports = exports = install
-
-exports.usage = 'Install node development files for the specified node version.'
-
-/**
- * Module dependencies.
- */
-
-var fs = require('graceful-fs')
-  , osenv = require('osenv')
-  , tar = require('tar')
-  , rm = require('rimraf')
-  , path = require('path')
-  , zlib = require('zlib')
-  , log = require('npmlog')
-  , semver = require('semver')
-  , fstream = require('fstream')
-  , request = require('request')
-  , minimatch = require('minimatch')
-  , mkdir = require('mkdirp')
-  , win = process.platform == 'win32'
-
-function install (gyp, argv, callback) {
-
-  // ensure no double-callbacks happen
-  function cb (err) {
-    if (cb.done) return
-    cb.done = true
-    if (err) {
-      log.warn('install', 'got an error, rolling back install')
-      // roll-back the install if anything went wrong
-      gyp.commands.remove([ version ], function (err2) {
-        callback(err)
-      })
-    } else {
-      callback(null, version)
-    }
-  }
-
-  var distUrl = gyp.opts['dist-url'] || gyp.opts.disturl || 'http://nodejs.org/dist'
-
-
-  // Determine which node dev files version we are installing
-  var versionStr = argv[0] || gyp.opts.target || process.version
-  log.verbose('install', 'input version string %j', versionStr)
-
-  // parse the version to normalize and ensure it's valid
-  var version = semver.parse(versionStr)
-  if (!version) {
-    return callback(new Error('Invalid version number: ' + versionStr))
-  }
-
-  if (semver.lt(versionStr, '0.8.0')) {
-    return callback(new Error('Minimum target version is `0.8.0` or greater. Got: ' + versionStr))
-  }
-
-  // 0.x.y-pre versions are not published yet and cannot be installed. Bail.
-  if (version.prerelease[0] === 'pre') {
-    log.verbose('detected "pre" node version', versionStr)
-    if (gyp.opts.nodedir) {
-      log.verbose('--nodedir flag was passed; skipping install', gyp.opts.nodedir)
-      callback()
-    } else {
-      callback(new Error('"pre" versions of node cannot be installed, use the --nodedir flag instead'))
-    }
-    return
-  }
-
-  // flatten version into String
-  version = version.version
-  log.verbose('install', 'installing version: %s', version)
-
-  // the directory where the dev files will be installed
-  var devDir = path.resolve(gyp.devDir, version)
-
-  // If '--ensure' was passed, then don't *always* install the version;
-  // check if it is already installed, and only install when needed
-  if (gyp.opts.ensure) {
-    log.verbose('install', '--ensure was passed, so won\'t reinstall if already installed')
-    fs.stat(devDir, function (err, stat) {
-      if (err) {
-        if (err.code == 'ENOENT') {
-          log.verbose('install', 'version not already installed, continuing with install', version)
-          go()
-        } else if (err.code == 'EACCES') {
-          eaccesFallback()
-        } else {
-          cb(err)
-        }
-        return
-      }
-      log.verbose('install', 'version is already installed, need to check "installVersion"')
-      var installVersionFile = path.resolve(devDir, 'installVersion')
-      fs.readFile(installVersionFile, 'ascii', function (err, ver) {
-        if (err && err.code != 'ENOENT') {
-          return cb(err)
-        }
-        var installVersion = parseInt(ver, 10) || 0
-        log.verbose('got "installVersion"', installVersion)
-        log.verbose('needs "installVersion"', gyp.package.installVersion)
-        if (installVersion < gyp.package.installVersion) {
-          log.verbose('install', 'version is no good; reinstalling')
-          go()
-        } else {
-          log.verbose('install', 'version is good')
-          cb()
-        }
-      })
-    })
-  } else {
-    go()
-  }
-
-  function download (url) {
-    log.http('GET', url)
-
-    var req = null
-    var requestOpts = {
-        uri: url
-      , headers: {
-          'User-Agent': 'node-gyp v' + gyp.version + ' (node ' + process.version + ')'
-        }
-    }
-
-    // basic support for a proxy server
-    var proxyUrl = gyp.opts.proxy
-                || process.env.http_proxy
-                || process.env.HTTP_PROXY
-                || process.env.npm_config_proxy
-    if (proxyUrl) {
-      if (/^https?:\/\//i.test(proxyUrl)) {
-        log.verbose('download', 'using proxy url: "%s"', proxyUrl)
-        requestOpts.proxy = proxyUrl
-      } else {
-        log.warn('download', 'ignoring invalid "proxy" config setting: "%s"', proxyUrl)
-      }
-    }
-    try {
-      // The "request" constructor can throw sometimes apparently :(
-      // See: https://github.com/TooTallNate/node-gyp/issues/114
-      req = request(requestOpts)
-    } catch (e) {
-      cb(e)
-    }
-    if (req) {
-      req.on('response', function (res) {
-        log.http(res.statusCode, url)
-      })
-    }
-    return req
-  }
-
-  function go () {
-
-    log.verbose('ensuring nodedir is created', devDir)
-
-    // first create the dir for the node dev files
-    mkdir(devDir, function (err, created) {
-      if (err) {
-        if (err.code == 'EACCES') {
-          eaccesFallback()
-        } else {
-          cb(err)
-        }
-        return
-      }
-
-      if (created) {
-        log.verbose('created nodedir', created)
-      }
-
-      // now download the node tarball
-      var tarPath = gyp.opts['tarball'];
-      var tarballUrl = tarPath ? tarPath : distUrl + '/v' + version + '/node-v' + version + '.tar.gz'
-        , badDownload = false
-        , extractCount = 0
-        , gunzip = zlib.createGunzip()
-        , extracter = tar.Extract({ path: devDir, strip: 1, filter: isValid })
-
-      // checks if a file to be extracted from the tarball is valid.
-      // only .h header files and the gyp files get extracted
-      function isValid () {
-        var name = this.path.substring(devDir.length + 1)
-        var isValid = valid(name)
-        if (name === '' && this.type === 'Directory') {
-          // the first directory entry is ok
-          return true
-        }
-        if (isValid) {
-          log.verbose('extracted file from tarball', name)
-          extractCount++
-        } else {
-          // invalid
-          log.silly('ignoring from tarball', name)
-        }
-        return isValid
-      }
-
-      gunzip.on('error', cb)
-      extracter.on('error', cb)
-      extracter.on('end', afterTarball)
-
-      // download the tarball, gunzip and extract!
-
-      if (tarPath) {
-        var input = fs.createReadStream(tarballUrl)
-        input.pipe(gunzip).pipe(extracter)
-        return
-      }
-
-      var req = download(tarballUrl)
-      if (!req) return
-
-      // something went wrong downloading the tarball?
-      req.on('error', function (err) {
-        badDownload = true
-        cb(err)
-      })
-
-      req.on('close', function () {
-        if (extractCount === 0) {
-          cb(new Error('Connection closed while downloading tarball file'))
-        }
-      })
-
-      req.on('response', function (res) {
-        if (res.statusCode !== 200) {
-          badDownload = true
-          cb(new Error(res.statusCode + ' status code downloading tarball'))
-          return
-        }
-        // start unzipping and untaring
-        req.pipe(gunzip).pipe(extracter)
-      })
-
-      // invoked after the tarball has finished being extracted
-      function afterTarball () {
-        if (badDownload) return
-        if (extractCount === 0) {
-          return cb(new Error('There was a fatal problem while downloading/extracting the tarball'))
-        }
-        log.verbose('tarball', 'done parsing tarball')
-        var async = 0
-
-        if (win) {
-          // need to download node.lib
-          async++
-          downloadNodeLib(deref)
-        }
-
-        // write the "installVersion" file
-        async++
-        var installVersionPath = path.resolve(devDir, 'installVersion')
-        fs.writeFile(installVersionPath, gyp.package.installVersion + '\n', deref)
-
-        if (async === 0) {
-          // no async tasks required
-          cb()
-        }
-
-        function deref (err) {
-          if (err) return cb(err)
-          --async || cb()
-        }
-      }
-
-      function downloadNodeLib (done) {
-        log.verbose('on Windows; need to download `node.lib`...')
-        var dir32 = path.resolve(devDir, 'ia32')
-          , dir64 = path.resolve(devDir, 'x64')
-          , nodeLibPath32 = path.resolve(dir32, 'node.lib')
-          , nodeLibPath64 = path.resolve(dir64, 'node.lib')
-          , nodeLibUrl32 = distUrl + '/v' + version + '/node.lib'
-          , nodeLibUrl64 = distUrl + '/v' + version + '/x64/node.lib'
-
-        log.verbose('32-bit node.lib dir', dir32)
-        log.verbose('64-bit node.lib dir', dir64)
-        log.verbose('`node.lib` 32-bit url', nodeLibUrl32)
-        log.verbose('`node.lib` 64-bit url', nodeLibUrl64)
-
-        var async = 2
-        mkdir(dir32, function (err) {
-          if (err) return done(err)
-          log.verbose('streaming 32-bit node.lib to:', nodeLibPath32)
-
-          var req = download(nodeLibUrl32)
-          if (!req) return
-          req.on('error', done)
-          req.on('response', function (res) {
-            if (res.statusCode !== 200) {
-              done(new Error(res.statusCode + ' status code downloading 32-bit node.lib'))
-              return
-            }
-
-            var ws = fs.createWriteStream(nodeLibPath32)
-            ws.on('error', cb)
-            req.pipe(ws)
-          })
-          req.on('end', function () {
-            --async || done()
-          })
-        })
-        mkdir(dir64, function (err) {
-          if (err) return done(err)
-          log.verbose('streaming 64-bit node.lib to:', nodeLibPath64)
-
-          var req = download(nodeLibUrl64)
-          if (!req) return
-          req.on('error', done)
-          req.on('response', function (res) {
-            if (res.statusCode !== 200) {
-              done(new Error(res.statusCode + ' status code downloading 64-bit node.lib'))
-              return
-            }
-
-            var ws = fs.createWriteStream(nodeLibPath64)
-            ws.on('error', cb)
-            req.pipe(ws)
-          })
-          req.on('end', function () {
-            --async || done()
-          })
-        })
-      } // downloadNodeLib()
-
-    }) // mkdir()
-
-  } // go()
-
-  /**
-   * Checks if a given filename is "valid" for this installation.
-   */
-
-  function valid (file) {
-    // header files
-    return minimatch(file, '*.h', { matchBase: true }) ||
-           minimatch(file, '*.gypi', { matchBase: true })
-  }
-
-  /**
-   * The EACCES fallback is a workaround for npm's `sudo` behavior, where
-   * it drops the permissions before invoking any child processes (like
-   * node-gyp). So what happens is the "nobody" user doesn't have
-   * permission to create the dev dir. As a fallback, make the tmpdir() be
-   * the dev dir for this installation. This is not ideal, but at least
-   * the compilation will succeed...
-   */
-
-  function eaccesFallback () {
-    var tmpdir = osenv.tmpdir()
-    gyp.devDir = path.resolve(tmpdir, '.node-gyp')
-    log.warn('EACCES', 'user "%s" does not have permission to access the dev dir "%s"', osenv.user(), devDir)
-    log.warn('EACCES', 'attempting to reinstall using temporary dev dir "%s"', gyp.devDir)
-    if (process.cwd() == tmpdir) {
-      log.verbose('tmpdir == cwd', 'automatically will remove dev files after to save disk space')
-      gyp.todo.push({ name: 'remove', args: argv })
-    }
-    gyp.commands.install(argv, cb)
-  }
-
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/list.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,33 +0,0 @@
-
-module.exports = exports = list
-
-exports.usage = 'Prints a listing of the currently installed node development files'
-
-/**
- * Module dependencies.
- */
-
-var fs = require('graceful-fs')
-  , path = require('path')
-  , log = require('npmlog')
-
-function list (gyp, args, callback) {
-
-  var devDir = gyp.devDir
-  log.verbose('list', 'using node-gyp dir:', devDir)
-
-  // readdir() the node-gyp dir
-  fs.readdir(devDir, onreaddir)
-
-  function onreaddir (err, versions) {
-    if (err && err.code != 'ENOENT') {
-      return callback(err)
-    }
-    if (Array.isArray(versions)) {
-      versions = versions.filter(function (v) { return v != 'current' })
-    } else {
-      versions = []
-    }
-    callback(null, versions)
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/node-gyp.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,225 +0,0 @@
-
-/**
- * Module exports.
- */
-
-module.exports = exports = gyp
-
-/**
- * Module dependencies.
- */
-
-var fs = require('graceful-fs')
-  , path = require('path')
-  , nopt = require('nopt')
-  , log = require('npmlog')
-  , child_process = require('child_process')
-  , EE = require('events').EventEmitter
-  , inherits = require('util').inherits
-  , commands = [
-      // Module build commands
-        'build'
-      , 'clean'
-      , 'configure'
-      , 'rebuild'
-      // Development Header File management commands
-      , 'install'
-      , 'list'
-      , 'remove'
-    ]
-  , aliases = {
-        'ls': 'list'
-      , 'rm': 'remove'
-    }
-
-// differentiate node-gyp's logs from npm's
-log.heading = 'gyp'
-
-/**
- * The `gyp` function.
- */
-
-function gyp () {
-  return new Gyp()
-}
-
-function Gyp () {
-  var self = this
-
-  // set the dir where node-gyp dev files get installed
-  // TODO: make this *more* configurable?
-  //       see: https://github.com/TooTallNate/node-gyp/issues/21
-  var homeDir = process.env.HOME || process.env.USERPROFILE
-  if (!homeDir) {
-    throw new Error(
-      "node-gyp requires that the user's home directory is specified " +
-      "in either of the environmental variables HOME or USERPROFILE"
-    );
-  }
-  this.devDir = path.resolve(homeDir, '.node-gyp')
-
-  this.commands = {}
-
-  commands.forEach(function (command) {
-    self.commands[command] = function (argv, callback) {
-      log.verbose('command', command, argv)
-      return require('./' + command)(self, argv, callback)
-    }
-  })
-}
-inherits(Gyp, EE)
-exports.Gyp = Gyp
-var proto = Gyp.prototype
-
-/**
- * Export the contents of the package.json.
- */
-
-proto.package = require('../package')
-
-/**
- * nopt configuration definitions
- */
-
-proto.configDefs = {
-    help: Boolean     // everywhere
-  , arch: String      // 'configure'
-  , debug: Boolean    // 'build'
-  , directory: String // bin
-  , make: String      // 'build'
-  , msvs_version: String // 'configure'
-  , ensure: Boolean   // 'install'
-  , solution: String  // 'build' (windows only)
-  , proxy: String     // 'install'
-  , nodedir: String   // 'configure'
-  , loglevel: String  // everywhere
-  , python: String    // 'configure'
-  , 'dist-url': String // 'install'
-  , 'tarball': String // 'install'
-  , jobs: String      // 'build'
-  , thin: String      // 'configure'
-}
-
-/**
- * nopt shorthands
- */
-
-proto.shorthands = {
-    release: '--no-debug'
-  , C: '--directory'
-  , debug: '--debug'
-  , j: '--jobs'
-  , silly: '--loglevel=silly'
-  , verbose: '--loglevel=verbose'
-}
-
-/**
- * expose the command aliases for the bin file to use.
- */
-
-proto.aliases = aliases
-
-/**
- * Parses the given argv array and sets the 'opts',
- * 'argv' and 'command' properties.
- */
-
-proto.parseArgv = function parseOpts (argv) {
-  this.opts = nopt(this.configDefs, this.shorthands, argv)
-  this.argv = this.opts.argv.remain.slice()
-
-  var commands = this.todo = []
-
-  // create a copy of the argv array with aliases mapped
-  argv = this.argv.map(function (arg) {
-    // is this an alias?
-    if (arg in this.aliases) {
-      arg = this.aliases[arg]
-    }
-    return arg
-  }, this)
-
-  // process the mapped args into "command" objects ("name" and "args" props)
-  argv.slice().forEach(function (arg) {
-    if (arg in this.commands) {
-      var args = argv.splice(0, argv.indexOf(arg))
-      argv.shift()
-      if (commands.length > 0) {
-        commands[commands.length - 1].args = args
-      }
-      commands.push({ name: arg, args: [] })
-    }
-  }, this)
-  if (commands.length > 0) {
-    commands[commands.length - 1].args = argv.splice(0)
-  }
-
-  // support for inheriting config env variables from npm
-  var npm_config_prefix = 'npm_config_'
-  Object.keys(process.env).forEach(function (name) {
-    if (name.indexOf(npm_config_prefix) !== 0) return
-    var val = process.env[name]
-    if (name === npm_config_prefix + 'loglevel') {
-      log.level = val
-    } else {
-      // add the user-defined options to the config
-      name = name.substring(npm_config_prefix.length)
-      this.opts[name] = val
-    }
-  }, this)
-
-  if (this.opts.loglevel) {
-    log.level = this.opts.loglevel
-  }
-  log.resume()
-}
-
-/**
- * Spawns a child process and emits a 'spawn' event.
- */
-
-proto.spawn = function spawn (command, args, opts) {
-  if (!opts) opts = {}
-  if (!opts.silent && !opts.customFds) {
-    opts.customFds = [ 0, 1, 2 ]
-  }
-  var cp = child_process.spawn(command, args, opts)
-  log.info('spawn', command)
-  log.info('spawn args', args)
-  return cp
-}
-
-/**
- * Returns the usage instructions for node-gyp.
- */
-
-proto.usage = function usage () {
-  var str = [
-      ''
-    , '  Usage: node-gyp <command> [options]'
-    , ''
-    , '  where <command> is one of:'
-    , commands.map(function (c) {
-        return '    - ' + c + ' - ' + require('./' + c).usage
-      }).join('\n')
-    , ''
-    , '  for specific command usage and options try:'
-    , '    $ node-gyp <command> --help'
-    , ''
-    , 'node-gyp@' + this.version + '  ' + path.resolve(__dirname, '..')
-    , 'node@' + process.versions.node
-  ].join('\n')
-  return str
-}
-
-/**
- * Version number getter.
- */
-
-Object.defineProperty(proto, 'version', {
-    get: function () {
-      return this.package.version
-    }
-  , enumerable: true
-})
-
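The `parseArgv` method in the deleted `node-gyp.js` above groups the remaining positional arguments into `{ name, args }` command objects, so that `node-gyp clean configure build` queues three commands and any stray arguments attach to the command they follow. A minimal standalone sketch of that grouping step — the function name `parseCommands` and the hard-coded command/alias tables are illustrative here, not node-gyp's actual API:

``` javascript
// Known commands and aliases, mirroring the tables at the top of node-gyp.js.
var KNOWN = ["build", "clean", "configure", "rebuild", "install", "list", "remove"];
var ALIASES = { ls: "list", rm: "remove" };

// Group positional args into command objects: each recognized command starts
// a new entry, and subsequent non-command args become that command's args.
// Args appearing before the first command are dropped, as in the original.
function parseCommands(argv) {
  var mapped = argv.map(function (a) { return ALIASES[a] || a; });
  var commands = [];
  mapped.forEach(function (arg) {
    if (KNOWN.indexOf(arg) !== -1) {
      commands.push({ name: arg, args: [] });
    } else if (commands.length > 0) {
      commands[commands.length - 1].args.push(arg);
    }
  });
  return commands;
}
```

In the real module the nopt-recognized flags are stripped before this step, so only unparsed tokens reach the grouping loop.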
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/rebuild.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
-
-module.exports = exports = rebuild
-
-exports.usage = 'Runs "clean", "configure" and "build" all at once'
-
-var log = require('npmlog')
-
-function rebuild (gyp, argv, callback) {
-
-  gyp.todo.push(
-      { name: 'clean', args: [] }
-    , { name: 'configure', args: [] }
-    , { name: 'build', args: [] }
-  )
-  process.nextTick(callback)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/lib/remove.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,56 +0,0 @@
-
-module.exports = exports = remove
-
-exports.usage = 'Removes the node development files for the specified version'
-
-/**
- * Module dependencies.
- */
-
-var fs = require('fs')
-  , rm = require('rimraf')
-  , path = require('path')
-  , log = require('npmlog')
-  , semver = require('semver')
-
-function remove (gyp, argv, callback) {
-
-  var devDir = gyp.devDir
-  log.verbose('remove', 'using node-gyp dir:', devDir)
-
-  // get the user-specified version to remove
-  var v = argv[0] || gyp.opts.target
-  log.verbose('remove', 'removing target version:', v)
-
-  if (!v) {
-    return callback(new Error('You must specify a version number to remove. Ex: "' + process.version + '"'))
-  }
-
-  // parse the version to normalize and make sure it's valid
-  var version = semver.parse(v)
-
-  if (!version) {
-    return callback(new Error('Invalid version number: ' + v))
-  }
-
-  // flatten the version Array into a String
-  version = version.version
-
-  var versionPath = path.resolve(gyp.devDir, version)
-  log.verbose('remove', 'removing development files for version:', version)
-
-  // first check if its even installed
-  fs.stat(versionPath, function (err, stat) {
-    if (err) {
-      if (err.code == 'ENOENT') {
-        callback(null, 'version was already uninstalled: ' + version)
-      } else {
-        callback(err)
-      }
-      return
-    }
-    // Go ahead and delete the dir
-    rm(versionPath, callback)
-  })
-
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/node-gyp/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
-{
-  "name": "node-gyp",
-  "description": "Node.js native addon build tool",
-  "keywords": [
-    "native",
-    "addon",
-    "module",
-    "c",
-    "c++",
-    "bindings",
-    "gyp"
-  ],
-  "version": "0.11.0",
-  "installVersion": 9,
-  "author": {
-    "name": "Nathan Rajlich",
-    "email": "nathan@tootallnate.net",
-    "url": "http://tootallnate.net"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/TooTallNate/node-gyp.git"
-  },
-  "preferGlobal": true,
-  "bin": {
-    "node-gyp": "./bin/node-gyp.js"
-  },
-  "main": "./lib/node-gyp.js",
-  "dependencies": {
-    "glob": "3",
-    "graceful-fs": "2",
-    "fstream": "0",
-    "minimatch": "0",
-    "mkdirp": "0",
-    "nopt": "2",
-    "npmlog": "0",
-    "osenv": "0",
-    "request": "2",
-    "rimraf": "2",
-    "semver": "~2.2.1",
-    "tar": "0",
-    "which": "1"
-  },
-  "engines": {
-    "node": ">= 0.8.0"
-  },
-  "readme": "node-gyp\n=========\n### Node.js native addon build tool\n\n`node-gyp` is a cross-platform command-line tool written in Node.js for compiling\nnative addon modules for Node.js, which takes away the pain of dealing with the\nvarious differences in build platforms. It is the replacement to the `node-waf`\nprogram which is removed for node `v0.8`. If you have a native addon for node that\nstill has a `wscript` file, then you should definitely add a `binding.gyp` file\nto support the latest versions of node.\n\nMultiple target versions of node are supported (i.e. `0.8`, `0.9`, `0.10`, ..., `1.0`,\netc.), regardless of what version of node is actually installed on your system\n(`node-gyp` downloads the necessary development files for the target version).\n\n#### Features:\n\n * Easy to use, consistent interface\n * Same commands to build your module on every platform\n * Supports multiple target versions of Node\n\n\nInstallation\n------------\n\nYou can install with `npm`:\n\n``` bash\n$ npm install -g node-gyp\n```\n\nYou will also need to install:\n\n  * On Unix:\n    * `python` (`v2.7` recommended, `v3.x.x` is __*not*__ supported)\n    * `make`\n    * A proper C/C++ compiler toolchain, like GCC\n  * On Windows:\n    * [Python][windows-python] ([`v2.7.3`][windows-python-v2.7.3] recommended, `v3.x.x` is __*not*__ supported)\n    * Windows XP/Vista/7:\n      * Microsoft Visual Studio C++ 2010 ([Express][msvc2010] version works well)\n      * For 64-bit builds of node and native modules you will _**also**_ need the [Windows 7 64-bit SDK][win7sdk]\n        * If the install fails, try uninstalling any C++ 2010 x64&x86 Redistributable that you have installed first.\n      * If you get errors that the 64-bit compilers are not installed you may also need the [compiler update for the Windows SDK 7.1]\n    * Windows 7/8:\n      * Microsoft Visual Studio C++ 2012 for Windows Desktop ([Express][msvc2012] version works well)\n\nNote that OS X is just a flavour of Unix and so needs `python`, `make`, and C/C++.\nAn easy way to obtain these is to install XCode from Apple,\nand then use it to install the command line tools (under Preferences -> Downloads).\n\nHow to Use\n----------\n\nTo compile your native addon, first go to its root directory:\n\n``` bash\n$ cd my_node_addon\n```\n\nThe next step is to generate the appropriate project build files for the current\nplatform. Use `configure` for that:\n\n``` bash\n$ node-gyp configure\n```\n\n__Note__: The `configure` step looks for the `binding.gyp` file in the current\ndirectory to processs. See below for instructions on creating the `binding.gyp` file.\n\nNow you will have either a `Makefile` (on Unix platforms) or a `vcxproj` file\n(on Windows) in the `build/` directory. Next invoke the `build` command:\n\n``` bash\n$ node-gyp build\n```\n\nNow you have your compiled `.node` bindings file! The compiled bindings end up\nin `build/Debug/` or `build/Release/`, depending on the build mode. At this point\nyou can require the `.node` file with Node and run your tests!\n\n__Note:__ To create a _Debug_ build of the bindings file, pass the `--debug` (or\n`-d`) switch when running the either `configure` or `build` command.\n\n\nThe \"binding.gyp\" file\n----------------------\n\nPreviously when node had `node-waf` you had to write a `wscript` file. The\nreplacement for that is the `binding.gyp` file, which describes the configuration\nto build your module in a JSON-like format. This file gets placed in the root of\nyour package, alongside the `package.json` file.\n\nA barebones `gyp` file appropriate for building a node addon looks like:\n\n``` python\n{\n  \"targets\": [\n    {\n      \"target_name\": \"binding\",\n      \"sources\": [ \"src/binding.cc\" ]\n    }\n  ]\n}\n```\n\nSome additional resources for writing `gyp` files:\n\n * [\"Hello World\" node addon example](https://github.com/joyent/node/tree/master/test/addons/hello-world)\n * [gyp user documentation](http://code.google.com/p/gyp/wiki/GypUserDocumentation)\n * [gyp input format reference](http://code.google.com/p/gyp/wiki/InputFormatReference)\n * [*\"binding.gyp\" files out in the wild* wiki page](https://github.com/TooTallNate/node-gyp/wiki/%22binding.gyp%22-files-out-in-the-wild)\n\n\nCommands\n--------\n\n`node-gyp` responds to the following commands:\n\n| **Command**   | **Description**\n|:--------------|:---------------------------------------------------------------\n| `build`       | Invokes `make`/`msbuild.exe` and builds the native addon\n| `clean`       | Removes any the `build` dir if it exists\n| `configure`   | Generates project build files for the current platform\n| `rebuild`     | Runs \"clean\", \"configure\" and \"build\" all in a row\n| `install`     | Installs node development header files for the given version\n| `list`        | Lists the currently installed node development file versions\n| `remove`      | Removes the node development header files for the given version\n\n\nLicense\n-------\n\n(The MIT License)\n\nCopyright (c) 2012 Nathan Rajlich &lt;nathan@tootallnate.net&gt;\n\nPermission is hereby granted, free of charge, to any person obtaining\na copy of this software and associated documentation files (the\n'Software'), to deal in the Software without restriction, including\nwithout limitation the rights to use, copy, modify, merge, publish,\ndistribute, sublicense, and/or sell copies of the Software, and to\npermit persons to whom the Software is furnished to do so, subject to\nthe following conditions:\n\nThe above copyright notice and this permission notice shall be\nincluded in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,\nEXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\nMERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.\nIN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY\nCLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,\nTORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\nSOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\n\n[windows-python]: http://www.python.org/getit/windows\n[windows-python-v2.7.3]: http://www.python.org/download/releases/2.7.3#download\n[msvc2010]: http://go.microsoft.com/?linkid=9709949\n[msvc2012]: http://go.microsoft.com/?linkid=9816758\n[win7sdk]: http://www.microsoft.com/en-us/download/details.aspx?id=8279\n[compiler update for the Windows SDK 7.1]: http://www.microsoft.com/en-us/download/details.aspx?id=4422\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/TooTallNate/node-gyp/issues"
-  },
-  "homepage": "https://github.com/TooTallNate/node-gyp",
-  "_id": "node-gyp@0.11.0",
-  "dist": {
-    "shasum": "e6745c3c68c40883c9ad42a5582295405cd3d81d"
-  },
-  "_from": "node-gyp@latest",
-  "_resolved": "https://registry.npmjs.org/node-gyp/-/node-gyp-0.11.0.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-node_modules
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-Copyright 2009, 2010, 2011 Isaac Z. Schlueter.
-All rights reserved.
-
-Permission is hereby granted, free of charge, to any person
-obtaining a copy of this software and associated documentation
-files (the "Software"), to deal in the Software without
-restriction, including without limitation the rights to use,
-copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the
-Software is furnished to do so, subject to the following
-conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
-NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
-HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
-WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
-OTHER DEALINGS IN THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,210 +0,0 @@
-If you want to write an option parser, and have it be good, there are
-two ways to do it.  The Right Way, and the Wrong Way.
-
-The Wrong Way is to sit down and write an option parser.  We've all done
-that.
-
-The Right Way is to write some complex configurable program with so many
-options that you go half-insane just trying to manage them all, and put
-it off with duct-tape solutions until you see exactly to the core of the
-problem, and finally snap and write an awesome option parser.
-
-If you want to write an option parser, don't write an option parser.
-Write a package manager, or a source control system, or a service
-restarter, or an operating system.  You probably won't end up with a
-good one of those, but if you don't give up, and you are relentless and
-diligent enough in your procrastination, you may just end up with a very
-nice option parser.
-
-## USAGE
-
-    // my-program.js
-    var nopt = require("nopt")
-      , Stream = require("stream").Stream
-      , path = require("path")
-      , knownOpts = { "foo" : [String, null]
-                    , "bar" : [Stream, Number]
-                    , "baz" : path
-                    , "bloo" : [ "big", "medium", "small" ]
-                    , "flag" : Boolean
-                    , "pick" : Boolean
-                    , "many" : [String, Array]
-                    }
-      , shortHands = { "foofoo" : ["--foo", "Mr. Foo"]
-                     , "b7" : ["--bar", "7"]
-                     , "m" : ["--bloo", "medium"]
-                     , "p" : ["--pick"]
-                     , "f" : ["--flag"]
-                     }
-                 // everything is optional.
-                 // knownOpts and shorthands default to {}
-                 // arg list defaults to process.argv
-                 // slice defaults to 2
-      , parsed = nopt(knownOpts, shortHands, process.argv, 2)
-    console.log(parsed)
-
-This would give you support for any of the following:
-
-```bash
-$ node my-program.js --foo "blerp" --no-flag
-{ "foo" : "blerp", "flag" : false }
-
-$ node my-program.js ---bar 7 --foo "Mr. Hand" --flag
-{ bar: 7, foo: "Mr. Hand", flag: true }
-
-$ node my-program.js --foo "blerp" -f -----p
-{ foo: "blerp", flag: true, pick: true }
-
-$ node my-program.js -fp --foofoo
-{ foo: "Mr. Foo", flag: true, pick: true }
-
-$ node my-program.js --foofoo -- -fp  # -- stops the flag parsing.
-{ foo: "Mr. Foo", argv: { remain: ["-fp"] } }
-
-$ node my-program.js --blatzk 1000 -fp # unknown opts are ok.
-{ blatzk: 1000, flag: true, pick: true }
-
-$ node my-program.js --blatzk true -fp # but they need a value
-{ blatzk: true, flag: true, pick: true }
-
-$ node my-program.js --no-blatzk -fp # unless they start with "no-"
-{ blatzk: false, flag: true, pick: true }
-
-$ node my-program.js --baz b/a/z # known paths are resolved.
-{ baz: "/Users/isaacs/b/a/z" }
-
-# if Array is one of the types, then it can take many
-# values, and will always be an array.  The other types provided
-# specify what types are allowed in the list.
-
-$ node my-program.js --many 1 --many null --many foo
-{ many: ["1", "null", "foo"] }
-
-$ node my-program.js --many foo
-{ many: ["foo"] }
-```
-
-Read the tests at the bottom of `lib/nopt.js` for more examples of
-what this puppy can do.
-
-## Types
-
-The following types are supported, and defined on `nopt.typeDefs`
-
-* String: A normal string.  No parsing is done.
-* path: A file system path.  Gets resolved against cwd if not absolute.
-* url: A url.  If it doesn't parse, it isn't accepted.
-* Number: Must be numeric.
-* Date: Must parse as a date. If it does, and `Date` is one of the options,
-  then it will return a Date object, not a string.
-* Boolean: Must be either `true` or `false`.  If an option is a boolean,
-  then it does not need a value, and its presence will imply `true` as
-  the value.  To negate boolean flags, do `--no-whatever` or `--whatever
-  false`
-* NaN: Means that the option is strictly not allowed.  Any value will
-  fail.
-* Stream: An object matching the "Stream" class in node.  Valuable
-  for use when validating programmatically.  (npm uses this to let you
-  supply any WriteStream on the `outfd` and `logfd` config options.)
-* Array: If `Array` is specified as one of the types, then the value
-  will be parsed as a list of options.  This means that multiple values
-  can be specified, and that the value will always be an array.
-
-If a type is an array of values not on this list, then those are
-considered valid values.  For instance, in the example above, the
-`--bloo` option can only be one of `"big"`, `"medium"`, or `"small"`,
-and any other value will be rejected.
-
-When parsing unknown fields, `"true"`, `"false"`, and `"null"` will be
-interpreted as their JavaScript equivalents, and numeric values will be
-interpreted as a number.
-
-You can also mix types and values, or multiple types, in a list.  For
-instance `{ blah: [Number, null] }` would allow a value to be set to
-either a Number or null.  When types are ordered, this implies a
-preference, and the first type that can be used to properly interpret
-the value will be used.
-
-To define a new type, add it to `nopt.typeDefs`.  Each item in that
-hash is an object with a `type` member and a `validate` method.  The
-`type` member is an object that matches what goes in the type list.  The
-`validate` method is a function that gets called with `validate(data,
-key, val)`.  Validate methods should assign `data[key]` to the valid
-value of `val` if it can be handled properly, or return boolean
-`false` if it cannot.
-
-You can also call `nopt.clean(data, types, typeDefs)` to clean up a
-config object and remove its invalid properties.
-
-## Error Handling
-
-By default, nopt outputs a warning to standard error when invalid
-options are found.  You can change this behavior by assigning a method
-to `nopt.invalidHandler`.  This method will be called with
-the offending `nopt.invalidHandler(key, val, types)`.
-
-If no `nopt.invalidHandler` is assigned, then it will console.error
-its whining.  If it is assigned to boolean `false` then the warning is
-suppressed.
-
-## Abbreviations
-
-Yes, they are supported.  If you define options like this:
-
-```javascript
-{ "foolhardyelephants" : Boolean
-, "pileofmonkeys" : Boolean }
-```
-
-Then this will work:
-
-```bash
-node program.js --foolhar --pil
-node program.js --no-f --pileofmon
-# etc.
-```
-
-## Shorthands
-
-Shorthands are a hash of shorter option names to a snippet of args that
-they expand to.
-
-If multiple one-character shorthands are all combined, and the
-combination does not unambiguously match any other option or shorthand,
-then they will be broken up into their constituent parts.  For example:
-
-```json
-{ "s" : ["--loglevel", "silent"]
-, "g" : "--global"
-, "f" : "--force"
-, "p" : "--parseable"
-, "l" : "--long"
-}
-```
-
-```bash
-npm ls -sgflp
-# just like doing this:
-npm ls --loglevel silent --global --force --long --parseable
-```
-
-## The Rest of the args
-
-The config object returned by nopt is given a special member called
-`argv`, which is an object with the following fields:
-
-* `remain`: The remaining args after all the parsing has occurred.
-* `original`: The args as they originally appeared.
-* `cooked`: The args after flags and shorthands are expanded.
-
-## Slicing
-
-Node programs are called with more or less the exact argv as it appears
-in C land, after the v8 and node-specific options have been plucked off.
-As such, `argv[0]` is always `node` and `argv[1]` is always the
-JavaScript program being run.
-
-That's usually not very useful to you.  So they're sliced off by
-default.  If you want them, then you can pass in `0` as the last
-argument, or any other number that you'd like to slice off the start of
-the list.
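Two of the behaviors the deleted nopt README documents — `--no-x` negation and `--` terminating flag parsing — can be illustrated with a small from-scratch sketch. This is not nopt's implementation (which also handles types, abbreviations, and shorthands); `miniParse` is a hypothetical name for the demonstration only:

``` javascript
// Toy parser showing "--no-x" negation and "--" as the end-of-flags marker.
function miniParse(args) {
  var data = { argv: { remain: [] } };
  for (var i = 0; i < args.length; i++) {
    var arg = args[i];
    if (arg === "--") {
      // everything after "--" is left unparsed, as the README describes
      data.argv.remain = data.argv.remain.concat(args.slice(i + 1));
      break;
    }
    if (arg.indexOf("--no-") === 0) {
      // "--no-whatever" negates the flag
      data[arg.slice(5)] = false;
    } else if (arg.indexOf("--") === 0) {
      // a flag consumes the next token as its value unless it looks like
      // another flag; otherwise its bare presence implies true
      var key = arg.slice(2);
      var next = args[i + 1];
      if (next !== undefined && next.indexOf("-") !== 0) {
        data[key] = next;
        i++;
      } else {
        data[key] = true;
      }
    } else {
      data.argv.remain.push(arg);
    }
  }
  return data;
}
```

For example, `miniParse(["--foo", "blerp", "--no-flag"])` yields `{ foo: "blerp", flag: false, ... }`, matching the first USAGE example in the README above.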
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/bin/nopt.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,51 +0,0 @@
-#!/usr/bin/env node
-var nopt = require("../lib/nopt")
-  , types = { num: Number
-            , bool: Boolean
-            , help: Boolean
-            , list: Array
-            , "num-list": [Number, Array]
-            , "str-list": [String, Array]
-            , "bool-list": [Boolean, Array]
-            , str: String
-            , clear: Boolean
-            , config: Boolean
-            , length: Number
-            }
-  , shorthands = { s: [ "--str", "astring" ]
-                 , b: [ "--bool" ]
-                 , nb: [ "--no-bool" ]
-                 , tft: [ "--bool-list", "--no-bool-list", "--bool-list", "true" ]
-                 , "?": ["--help"]
-                 , h: ["--help"]
-                 , H: ["--help"]
-                 , n: [ "--num", "125" ]
-                 , c: ["--config"]
-                 , l: ["--length"]
-                 }
-  , parsed = nopt( types
-                 , shorthands
-                 , process.argv
-                 , 2 )
-
-console.log("parsed", parsed)
-
-if (parsed.help) {
-  console.log("")
-  console.log("nopt cli tester")
-  console.log("")
-  console.log("types")
-  console.log(Object.keys(types).map(function M (t) {
-    var type = types[t]
-    if (Array.isArray(type)) {
-      return [t, type.map(function (type) { return type.name })]
-    }
-    return [t, type && type.name]
-  }).reduce(function (s, i) {
-    s[i[0]] = i[1]
-    return s
-  }, {}))
-  console.log("")
-  console.log("shorthands")
-  console.log(shorthands)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/examples/my-program.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,30 +0,0 @@
-#!/usr/bin/env node
-
-//process.env.DEBUG_NOPT = 1
-
-// my-program.js
-var nopt = require("../lib/nopt")
-  , Stream = require("stream").Stream
-  , path = require("path")
-  , knownOpts = { "foo" : [String, null]
-                , "bar" : [Stream, Number]
-                , "baz" : path
-                , "bloo" : [ "big", "medium", "small" ]
-                , "flag" : Boolean
-                , "pick" : Boolean
-                }
-  , shortHands = { "foofoo" : ["--foo", "Mr. Foo"]
-                 , "b7" : ["--bar", "7"]
-                 , "m" : ["--bloo", "medium"]
-                 , "p" : ["--pick"]
-                 , "f" : ["--flag", "true"]
-                 , "g" : ["--flag"]
-                 , "s" : "--flag"
-                 }
-             // everything is optional.
-             // knownOpts and shorthands default to {}
-             // arg list defaults to process.argv
-             // slice defaults to 2
-  , parsed = nopt(knownOpts, shortHands, process.argv, 2)
-
-console.log("parsed =\n"+ require("util").inspect(parsed))
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/lib/nopt.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,612 +0,0 @@
-// info about each config option.
-
-var debug = process.env.DEBUG_NOPT || process.env.NOPT_DEBUG
-  ? function () { console.error.apply(console, arguments) }
-  : function () {}
-
-var url = require("url")
-  , path = require("path")
-  , Stream = require("stream").Stream
-  , abbrev = require("abbrev")
-
-module.exports = exports = nopt
-exports.clean = clean
-
-exports.typeDefs =
-  { String  : { type: String,  validate: validateString  }
-  , Boolean : { type: Boolean, validate: validateBoolean }
-  , url     : { type: url,     validate: validateUrl     }
-  , Number  : { type: Number,  validate: validateNumber  }
-  , path    : { type: path,    validate: validatePath    }
-  , Stream  : { type: Stream,  validate: validateStream  }
-  , Date    : { type: Date,    validate: validateDate    }
-  }
-
-function nopt (types, shorthands, args, slice) {
-  args = args || process.argv
-  types = types || {}
-  shorthands = shorthands || {}
-  if (typeof slice !== "number") slice = 2
-
-  debug(types, shorthands, args, slice)
-
-  args = args.slice(slice)
-  var data = {}
-    , key
-    , remain = []
-    , cooked = args
-    , original = args.slice(0)
-
-  parse(args, data, remain, types, shorthands)
-  // now data is full
-  clean(data, types, exports.typeDefs)
-  data.argv = {remain:remain,cooked:cooked,original:original}
-  Object.defineProperty(data.argv, 'toString', { value: function () {
-    return this.original.map(JSON.stringify).join(" ")
-  }, enumerable: false })
-  return data
-}
-
-function clean (data, types, typeDefs) {
-  typeDefs = typeDefs || exports.typeDefs
-  var remove = {}
-    , typeDefault = [false, true, null, String, Number, Array]
-
-  Object.keys(data).forEach(function (k) {
-    if (k === "argv") return
-    var val = data[k]
-      , isArray = Array.isArray(val)
-      , type = types[k]
-    if (!isArray) val = [val]
-    if (!type) type = typeDefault
-    if (type === Array) type = typeDefault.concat(Array)
-    if (!Array.isArray(type)) type = [type]
-
-    debug("val=%j", val)
-    debug("types=", type)
-    val = val.map(function (val) {
-      // if it's an unknown value, then parse false/true/null/numbers/dates
-      if (typeof val === "string") {
-        debug("string %j", val)
-        val = val.trim()
-        if ((val === "null" && ~type.indexOf(null))
-            || (val === "true" &&
-               (~type.indexOf(true) || ~type.indexOf(Boolean)))
-            || (val === "false" &&
-               (~type.indexOf(false) || ~type.indexOf(Boolean)))) {
-          val = JSON.parse(val)
-          debug("jsonable %j", val)
-        } else if (~type.indexOf(Number) && !isNaN(val)) {
-          debug("convert to number", val)
-          val = +val
-        } else if (~type.indexOf(Date) && !isNaN(Date.parse(val))) {
-          debug("convert to date", val)
-          val = new Date(val)
-        }
-      }
-
-      if (!types.hasOwnProperty(k)) {
-        return val
-      }
-
-      // allow `--no-blah` to set 'blah' to null if null is allowed
-      if (val === false && ~type.indexOf(null) &&
-          !(~type.indexOf(false) || ~type.indexOf(Boolean))) {
-        val = null
-      }
-
-      var d = {}
-      d[k] = val
-      debug("prevalidated val", d, val, types[k])
-      if (!validate(d, k, val, types[k], typeDefs)) {
-        if (exports.invalidHandler) {
-          exports.invalidHandler(k, val, types[k], data)
-        } else if (exports.invalidHandler !== false) {
-          debug("invalid: "+k+"="+val, types[k])
-        }
-        return remove
-      }
-      debug("validated val", d, val, types[k])
-      return d[k]
-    }).filter(function (val) { return val !== remove })
-
-    if (!val.length) delete data[k]
-    else if (isArray) {
-      debug(isArray, data[k], val)
-      data[k] = val
-    } else data[k] = val[0]
-
-    debug("k=%s val=%j", k, val, data[k])
-  })
-}
-
-function validateString (data, k, val) {
-  data[k] = String(val)
-}
-
-function validatePath (data, k, val) {
-  data[k] = path.resolve(String(val))
-  return true
-}
-
-function validateNumber (data, k, val) {
-  debug("validate Number %j %j %j", k, val, isNaN(val))
-  if (isNaN(val)) return false
-  data[k] = +val
-}
-
-function validateDate (data, k, val) {
-  debug("validate Date %j %j %j", k, val, Date.parse(val))
-  var s = Date.parse(val)
-  if (isNaN(s)) return false
-  data[k] = new Date(val)
-}
-
-function validateBoolean (data, k, val) {
-  if (val instanceof Boolean) val = val.valueOf()
-  else if (typeof val === "string") {
-    if (!isNaN(val)) val = !!(+val)
-    else if (val === "null" || val === "false") val = false
-    else val = true
-  } else val = !!val
-  data[k] = val
-}
-
-function validateUrl (data, k, val) {
-  val = url.parse(String(val))
-  if (!val.host) return false
-  data[k] = val.href
-}
-
-function validateStream (data, k, val) {
-  if (!(val instanceof Stream)) return false
-  data[k] = val
-}
-
-function validate (data, k, val, type, typeDefs) {
-  // arrays are lists of types.
-  if (Array.isArray(type)) {
-    for (var i = 0, l = type.length; i < l; i ++) {
-      if (type[i] === Array) continue
-      if (validate(data, k, val, type[i], typeDefs)) return true
-    }
-    delete data[k]
-    return false
-  }
-
-  // an array of anything?
-  if (type === Array) return true
-
-  // NaN is poisonous.  Means that something is not allowed.
-  if (type !== type) {
-    debug("Poison NaN", k, val, type)
-    delete data[k]
-    return false
-  }
-
-  // explicit list of values
-  if (val === type) {
-    debug("Explicitly allowed %j", val)
-    // if (isArray) (data[k] = data[k] || []).push(val)
-    // else data[k] = val
-    data[k] = val
-    return true
-  }
-
-  // now go through the list of typeDefs, validate against each one.
-  var ok = false
-    , types = Object.keys(typeDefs)
-  for (var i = 0, l = types.length; i < l; i ++) {
-    debug("test type %j %j %j", k, val, types[i])
-    var t = typeDefs[types[i]]
-    if (t && type === t.type) {
-      var d = {}
-      ok = false !== t.validate(d, k, val)
-      val = d[k]
-      if (ok) {
-        // if (isArray) (data[k] = data[k] || []).push(val)
-        // else data[k] = val
-        data[k] = val
-        break
-      }
-    }
-  }
-  debug("OK? %j (%j %j %j)", ok, k, val, types[i])
-
-  if (!ok) delete data[k]
-  return ok
-}
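The ordering behaviour of `validate` (when a key allows several types, the first type whose validator accepts the raw value decides the final form) can be illustrated with stand-in validators; none of the names below come from nopt itself:

```javascript
// Stand-in validators: a validator returns the coerced value, or
// undefined to signal rejection, mirroring the first-match loop above.
var validators = [
  function asNumber (v) { return isNaN(v) ? undefined : +v },
  function asString (v) { return String(v) }
]

function firstMatch (val) {
  for (var i = 0; i < validators.length; i++) {
    var out = validators[i](val)
    if (out !== undefined) return out   // first accepting type wins
  }
}
```

Because `asNumber` comes first, `"100"` ends up as the number `100`, while `"abc"` falls through to the string validator.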
-
-function parse (args, data, remain, types, shorthands) {
-  debug("parse", args, data, remain)
-
-  var key = null
-    , abbrevs = abbrev(Object.keys(types))
-    , shortAbbr = abbrev(Object.keys(shorthands))
-
-  for (var i = 0; i < args.length; i ++) {
-    var arg = args[i]
-    debug("arg", arg)
-
-    if (arg.match(/^-{2,}$/)) {
-      // done with keys.
-      // the rest are args.
-      remain.push.apply(remain, args.slice(i + 1))
-      args[i] = "--"
-      break
-    }
-    var hadEq = false
-    if (arg.charAt(0) === "-" && arg.length > 1) {
-      if (arg.indexOf("=") !== -1) {
-        hadEq = true
-        var v = arg.split("=")
-        arg = v.shift()
-        v = v.join("=")
-        args.splice.apply(args, [i, 1].concat([arg, v]))
-      }
-
-      // see if it's a shorthand
-      // if so, splice and back up to re-parse it.
-      var shRes = resolveShort(arg, shorthands, shortAbbr, abbrevs)
-      debug("arg=%j shRes=%j", arg, shRes)
-      if (shRes) {
-        debug(arg, shRes)
-        args.splice.apply(args, [i, 1].concat(shRes))
-        if (arg !== shRes[0]) {
-          i --
-          continue
-        }
-      }
-      arg = arg.replace(/^-+/, "")
-      var no = null
-      while (arg.toLowerCase().indexOf("no-") === 0) {
-        no = !no
-        arg = arg.substr(3)
-      }
-
-      if (abbrevs[arg]) arg = abbrevs[arg]
-
-      var isArray = types[arg] === Array ||
-        Array.isArray(types[arg]) && types[arg].indexOf(Array) !== -1
-
-      // allow unknown things to be arrays if specified multiple times.
-      if (!types.hasOwnProperty(arg) && data.hasOwnProperty(arg)) {
-        if (!Array.isArray(data[arg]))
-          data[arg] = [data[arg]]
-        isArray = true
-      }
-
-      var val
-        , la = args[i + 1]
-
-      var isBool = typeof no === 'boolean' ||
-        types[arg] === Boolean ||
-        Array.isArray(types[arg]) && types[arg].indexOf(Boolean) !== -1 ||
-        (typeof types[arg] === 'undefined' && !hadEq) ||
-        (la === "false" &&
-         (types[arg] === null ||
-          Array.isArray(types[arg]) && ~types[arg].indexOf(null)))
-
-      if (isBool) {
-        // just set and move along
-        val = !no
-        // however, also support --bool true or --bool false
-        if (la === "true" || la === "false") {
-          val = JSON.parse(la)
-          la = null
-          if (no) val = !val
-          i ++
-        }
-
-        // also support "foo":[Boolean, "bar"] and "--foo bar"
-        if (Array.isArray(types[arg]) && la) {
-          if (~types[arg].indexOf(la)) {
-            // an explicit type
-            val = la
-            i ++
-          } else if ( la === "null" && ~types[arg].indexOf(null) ) {
-            // null allowed
-            val = null
-            i ++
-          } else if ( !la.match(/^-{2,}[^-]/) &&
-                      !isNaN(la) &&
-                      ~types[arg].indexOf(Number) ) {
-            // number
-            val = +la
-            i ++
-          } else if ( !la.match(/^-[^-]/) && ~types[arg].indexOf(String) ) {
-            // string
-            val = la
-            i ++
-          }
-        }
-
-        if (isArray) (data[arg] = data[arg] || []).push(val)
-        else data[arg] = val
-
-        continue
-      }
-
-      if (la && la.match(/^-{2,}$/)) {
-        la = undefined
-        i --
-      }
-
-      val = la === undefined ? true : la
-      if (isArray) (data[arg] = data[arg] || []).push(val)
-      else data[arg] = val
-
-      i ++
-      continue
-    }
-    remain.push(arg)
-  }
-}
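The `no-` peeling loop inside `parse` is worth isolating: each leading `no-` flips the negation flag, so `--no-no-foo` means `foo` is true. A standalone sketch (`peelNegations` is an illustrative name):

```javascript
// Sketch of the "no-" stripping loop in parse() above.
function peelNegations (arg) {
  var no = null
  while (arg.toLowerCase().indexOf("no-") === 0) {
    no = !no              // each "no-" toggles the negation
    arg = arg.substr(3)
  }
  return { key: arg, negated: no === true }
}
```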
-
-function resolveShort (arg, shorthands, shortAbbr, abbrevs) {
-  // handle single-char shorthands glommed together, like
-  // npm ls -glp, but only if there is one dash, and only if
-  // all of the chars are single-char shorthands, and it's
-  // not a match to some other abbrev.
-  arg = arg.replace(/^-+/, '')
-
-  // if it's an exact known option, then don't go any further
-  if (abbrevs[arg] === arg)
-    return null
-
-  // if it's an exact known shortopt, same deal
-  if (shorthands[arg]) {
-    // make it an array, if it's a list of words
-    if (shorthands[arg] && !Array.isArray(shorthands[arg]))
-      shorthands[arg] = shorthands[arg].split(/\s+/)
-
-    return shorthands[arg]
-  }
-
-  // first check to see if this arg is a set of single-char shorthands
-  var singles = shorthands.___singles
-  if (!singles) {
-    singles = Object.keys(shorthands).filter(function (s) {
-      return s.length === 1
-    }).reduce(function (l,r) {
-      l[r] = true
-      return l
-    }, {})
-    shorthands.___singles = singles
-    debug('shorthand singles', singles)
-  }
-
-  var chrs = arg.split("").filter(function (c) {
-    return singles[c]
-  })
-
-  if (chrs.join("") === arg) return chrs.map(function (c) {
-    return shorthands[c]
-  }).reduce(function (l, r) {
-    return l.concat(r)
-  }, [])
-
-
-  // if it's an arg abbrev, and not a literal shorthand, then prefer the arg
-  if (abbrevs[arg] && !shorthands[arg])
-    return null
-
-  // if it's an abbr for a shorthand, then use that
-  if (shortAbbr[arg])
-    arg = shortAbbr[arg]
-
-  // make it an array, if it's a list of words
-  if (shorthands[arg] && !Array.isArray(shorthands[arg]))
-    shorthands[arg] = shorthands[arg].split(/\s+/)
-
-  return shorthands[arg]
-}
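The single-char glomming rule in `resolveShort` (e.g. `npm ls -sgf`) only fires when every character is itself a one-character shorthand. A sketch of just that branch, with a made-up shorthand table:

```javascript
// Illustrative shorthand table (not npm's real one).
var singles = { s: ["--loglevel", "silent"], g: ["--global"], f: ["--force"] }

// Sketch of the glom expansion in resolveShort() above: expand only
// if every character of the arg is a known single-char shorthand.
function expandGlom (arg) {
  var chrs = arg.split("").filter(function (c) { return singles[c] })
  if (chrs.join("") !== arg) return null   // some char is not a shorthand
  return chrs.map(function (c) { return singles[c] })
             .reduce(function (l, r) { return l.concat(r) }, [])
}
```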
-
-if (module === require.main) {
-var assert = require("assert")
-  , util = require("util")
-
-  , shorthands =
-    { s : ["--loglevel", "silent"]
-    , d : ["--loglevel", "info"]
-    , dd : ["--loglevel", "verbose"]
-    , ddd : ["--loglevel", "silly"]
-    , noreg : ["--no-registry"]
-    , reg : ["--registry"]
-    , "no-reg" : ["--no-registry"]
-    , silent : ["--loglevel", "silent"]
-    , verbose : ["--loglevel", "verbose"]
-    , h : ["--usage"]
-    , H : ["--usage"]
-    , "?" : ["--usage"]
-    , help : ["--usage"]
-    , v : ["--version"]
-    , f : ["--force"]
-    , desc : ["--description"]
-    , "no-desc" : ["--no-description"]
-    , "local" : ["--no-global"]
-    , l : ["--long"]
-    , p : ["--parseable"]
-    , porcelain : ["--parseable"]
-    , g : ["--global"]
-    }
-
-  , types =
-    { aoa: Array
-    , nullstream: [null, Stream]
-    , date: Date
-    , str: String
-    , browser : String
-    , cache : path
-    , color : ["always", Boolean]
-    , depth : Number
-    , description : Boolean
-    , dev : Boolean
-    , editor : path
-    , force : Boolean
-    , global : Boolean
-    , globalconfig : path
-    , group : [String, Number]
-    , gzipbin : String
-    , logfd : [Number, Stream]
-    , loglevel : ["silent","win","error","warn","info","verbose","silly"]
-    , long : Boolean
-    , "node-version" : [false, String]
-    , npaturl : url
-    , npat : Boolean
-    , "onload-script" : [false, String]
-    , outfd : [Number, Stream]
-    , parseable : Boolean
-    , pre: Boolean
-    , prefix: path
-    , proxy : url
-    , "rebuild-bundle" : Boolean
-    , registry : url
-    , searchopts : String
-    , searchexclude: [null, String]
-    , shell : path
-    , t: [Array, String]
-    , tag : String
-    , tar : String
-    , tmp : path
-    , "unsafe-perm" : Boolean
-    , usage : Boolean
-    , user : String
-    , username : String
-    , userconfig : path
-    , version : Boolean
-    , viewer: path
-    , _exit : Boolean
-    }
-
-; [["-v", {version:true}, []]
-  ,["---v", {version:true}, []]
-  ,["ls -s --no-reg connect -d",
-    {loglevel:"info",registry:null},["ls","connect"]]
-  ,["ls ---s foo",{loglevel:"silent"},["ls","foo"]]
-  ,["ls --registry blargle", {}, ["ls"]]
-  ,["--no-registry", {registry:null}, []]
-  ,["--no-color true", {color:false}, []]
-  ,["--no-color false", {color:true}, []]
-  ,["--no-color", {color:false}, []]
-  ,["--color false", {color:false}, []]
-  ,["--color --logfd 7", {logfd:7,color:true}, []]
-  ,["--color=true", {color:true}, []]
-  ,["--logfd=10", {logfd:10}, []]
-  ,["--tmp=/tmp -tar=gtar",{tmp:"/tmp",tar:"gtar"},[]]
-  ,["--tmp=tmp -tar=gtar",
-    {tmp:path.resolve(process.cwd(), "tmp"),tar:"gtar"},[]]
-  ,["--logfd x", {}, []]
-  ,["a -true -- -no-false", {true:true},["a","-no-false"]]
-  ,["a -no-false", {false:false},["a"]]
-  ,["a -no-no-true", {true:true}, ["a"]]
-  ,["a -no-no-no-false", {false:false}, ["a"]]
-  ,["---NO-no-No-no-no-no-nO-no-no"+
-    "-No-no-no-no-no-no-no-no-no"+
-    "-no-no-no-no-NO-NO-no-no-no-no-no-no"+
-    "-no-body-can-do-the-boogaloo-like-I-do"
-   ,{"body-can-do-the-boogaloo-like-I-do":false}, []]
-  ,["we are -no-strangers-to-love "+
-    "--you-know=the-rules --and=so-do-i "+
-    "---im-thinking-of=a-full-commitment "+
-    "--no-you-would-get-this-from-any-other-guy "+
-    "--no-gonna-give-you-up "+
-    "-no-gonna-let-you-down=true "+
-    "--no-no-gonna-run-around false "+
-    "--desert-you=false "+
-    "--make-you-cry false "+
-    "--no-tell-a-lie "+
-    "--no-no-and-hurt-you false"
-   ,{"strangers-to-love":false
-    ,"you-know":"the-rules"
-    ,"and":"so-do-i"
-    ,"you-would-get-this-from-any-other-guy":false
-    ,"gonna-give-you-up":false
-    ,"gonna-let-you-down":false
-    ,"gonna-run-around":false
-    ,"desert-you":false
-    ,"make-you-cry":false
-    ,"tell-a-lie":false
-    ,"and-hurt-you":false
-    },["we", "are"]]
-  ,["-t one -t two -t three"
-   ,{t: ["one", "two", "three"]}
-   ,[]]
-  ,["-t one -t null -t three four five null"
-   ,{t: ["one", "null", "three"]}
-   ,["four", "five", "null"]]
-  ,["-t foo"
-   ,{t:["foo"]}
-   ,[]]
-  ,["--no-t"
-   ,{t:["false"]}
-   ,[]]
-  ,["-no-no-t"
-   ,{t:["true"]}
-   ,[]]
-  ,["-aoa one -aoa null -aoa 100"
-   ,{aoa:["one", null, 100]}
-   ,[]]
-  ,["-str 100"
-   ,{str:"100"}
-   ,[]]
-  ,["--color always"
-   ,{color:"always"}
-   ,[]]
-  ,["--no-nullstream"
-   ,{nullstream:null}
-   ,[]]
-  ,["--nullstream false"
-   ,{nullstream:null}
-   ,[]]
-  ,["--notadate=2011-01-25"
-   ,{notadate: "2011-01-25"}
-   ,[]]
-  ,["--date 2011-01-25"
-   ,{date: new Date("2011-01-25")}
-   ,[]]
-  ,["-cl 1"
-   ,{config: true, length: 1}
-   ,[]
-   ,{config: Boolean, length: Number, clear: Boolean}
-   ,{c: "--config", l: "--length"}]
-  ,["--acount bla"
-   ,{"acount":true}
-   ,["bla"]
-   ,{account: Boolean, credentials: Boolean, options: String}
-   ,{a:"--account", c:"--credentials",o:"--options"}]
-  ,["--clear"
-   ,{clear:true}
-   ,[]
-   ,{clear:Boolean,con:Boolean,len:Boolean,exp:Boolean,add:Boolean,rep:Boolean}
-   ,{c:"--con",l:"--len",e:"--exp",a:"--add",r:"--rep"}]
-  ,["--file -"
-   ,{"file":"-"}
-   ,[]
-   ,{file:String}
-   ,{}]
-  ,["--file -"
-   ,{"file":true}
-   ,["-"]
-   ,{file:Boolean}
-   ,{}]
-  ].forEach(function (test) {
-    var argv = test[0].split(/\s+/)
-      , opts = test[1]
-      , rem = test[2]
-      , actual = nopt(test[3] || types, test[4] || shorthands, argv, 0)
-      , parsed = actual.argv
-    delete actual.argv
-    console.log(util.inspect(actual, false, 2, true), parsed.remain)
-    for (var i in opts) {
-      var e = JSON.stringify(opts[i])
-        , a = JSON.stringify(actual[i] === undefined ? null : actual[i])
-      if (e && typeof e === "object") {
-        assert.deepEqual(e, a)
-      } else {
-        assert.equal(e, a)
-      }
-    }
-    assert.deepEqual(rem, parsed.remain)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/nopt/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-{
-  "name": "nopt",
-  "version": "2.1.2",
-  "description": "Option parsing for Node, supporting types, shorthands, etc. Used by npm.",
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "main": "lib/nopt.js",
-  "scripts": {
-    "test": "node lib/nopt.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "http://github.com/isaacs/nopt"
-  },
-  "bin": {
-    "nopt": "./bin/nopt.js"
-  },
-  "license": {
-    "type": "MIT",
-    "url": "https://github.com/isaacs/nopt/raw/master/LICENSE"
-  },
-  "dependencies": {
-    "abbrev": "1"
-  },
-  "readme": "If you want to write an option parser, and have it be good, there are\ntwo ways to do it.  The Right Way, and the Wrong Way.\n\nThe Wrong Way is to sit down and write an option parser.  We've all done\nthat.\n\nThe Right Way is to write some complex configurable program with so many\noptions that you go half-insane just trying to manage them all, and put\nit off with duct-tape solutions until you see exactly to the core of the\nproblem, and finally snap and write an awesome option parser.\n\nIf you want to write an option parser, don't write an option parser.\nWrite a package manager, or a source control system, or a service\nrestarter, or an operating system.  You probably won't end up with a\ngood one of those, but if you don't give up, and you are relentless and\ndiligent enough in your procrastination, you may just end up with a very\nnice option parser.\n\n## USAGE\n\n    // my-program.js\n    var nopt = require(\"nopt\")\n      , Stream = require(\"stream\").Stream\n      , path = require(\"path\")\n      , knownOpts = { \"foo\" : [String, null]\n                    , \"bar\" : [Stream, Number]\n                    , \"baz\" : path\n                    , \"bloo\" : [ \"big\", \"medium\", \"small\" ]\n                    , \"flag\" : Boolean\n                    , \"pick\" : Boolean\n                    , \"many\" : [String, Array]\n                    }\n      , shortHands = { \"foofoo\" : [\"--foo\", \"Mr. Foo\"]\n                     , \"b7\" : [\"--bar\", \"7\"]\n                     , \"m\" : [\"--bloo\", \"medium\"]\n                     , \"p\" : [\"--pick\"]\n                     , \"f\" : [\"--flag\"]\n                     }\n                 // everything is optional.\n                 // knownOpts and shorthands default to {}\n                 // arg list defaults to process.argv\n                 // slice defaults to 2\n      , parsed = nopt(knownOpts, shortHands, process.argv, 2)\n    console.log(parsed)\n\nThis would give you support for any of the following:\n\n```bash\n$ node my-program.js --foo \"blerp\" --no-flag\n{ \"foo\" : \"blerp\", \"flag\" : false }\n\n$ node my-program.js ---bar 7 --foo \"Mr. Hand\" --flag\n{ bar: 7, foo: \"Mr. Hand\", flag: true }\n\n$ node my-program.js --foo \"blerp\" -f -----p\n{ foo: \"blerp\", flag: true, pick: true }\n\n$ node my-program.js -fp --foofoo\n{ foo: \"Mr. Foo\", flag: true, pick: true }\n\n$ node my-program.js --foofoo -- -fp  # -- stops the flag parsing.\n{ foo: \"Mr. Foo\", argv: { remain: [\"-fp\"] } }\n\n$ node my-program.js --blatzk 1000 -fp # unknown opts are ok.\n{ blatzk: 1000, flag: true, pick: true }\n\n$ node my-program.js --blatzk true -fp # but they need a value\n{ blatzk: true, flag: true, pick: true }\n\n$ node my-program.js --no-blatzk -fp # unless they start with \"no-\"\n{ blatzk: false, flag: true, pick: true }\n\n$ node my-program.js --baz b/a/z # known paths are resolved.\n{ baz: \"/Users/isaacs/b/a/z\" }\n\n# if Array is one of the types, then it can take many\n# values, and will always be an array.  The other types provided\n# specify what types are allowed in the list.\n\n$ node my-program.js --many 1 --many null --many foo\n{ many: [\"1\", \"null\", \"foo\"] }\n\n$ node my-program.js --many foo\n{ many: [\"foo\"] }\n```\n\nRead the tests at the bottom of `lib/nopt.js` for more examples of\nwhat this puppy can do.\n\n## Types\n\nThe following types are supported, and defined on `nopt.typeDefs`\n\n* String: A normal string.  No parsing is done.\n* path: A file system path.  Gets resolved against cwd if not absolute.\n* url: A url.  If it doesn't parse, it isn't accepted.\n* Number: Must be numeric.\n* Date: Must parse as a date. If it does, and `Date` is one of the options,\n  then it will return a Date object, not a string.\n* Boolean: Must be either `true` or `false`.  If an option is a boolean,\n  then it does not need a value, and its presence will imply `true` as\n  the value.  To negate boolean flags, do `--no-whatever` or `--whatever\n  false`\n* NaN: Means that the option is strictly not allowed.  Any value will\n  fail.\n* Stream: An object matching the \"Stream\" class in node.  Valuable\n  for use when validating programmatically.  (npm uses this to let you\n  supply any WriteStream on the `outfd` and `logfd` config options.)\n* Array: If `Array` is specified as one of the types, then the value\n  will be parsed as a list of options.  This means that multiple values\n  can be specified, and that the value will always be an array.\n\nIf a type is an array of values not on this list, then those are\nconsidered valid values.  For instance, in the example above, the\n`--bloo` option can only be one of `\"big\"`, `\"medium\"`, or `\"small\"`,\nand any other value will be rejected.\n\nWhen parsing unknown fields, `\"true\"`, `\"false\"`, and `\"null\"` will be\ninterpreted as their JavaScript equivalents, and numeric values will be\ninterpreted as a number.\n\nYou can also mix types and values, or multiple types, in a list.  For\ninstance `{ blah: [Number, null] }` would allow a value to be set to\neither a Number or null.  When types are ordered, this implies a\npreference, and the first type that can be used to properly interpret\nthe value will be used.\n\nTo define a new type, add it to `nopt.typeDefs`.  Each item in that\nhash is an object with a `type` member and a `validate` method.  The\n`type` member is an object that matches what goes in the type list.  The\n`validate` method is a function that gets called with `validate(data,\nkey, val)`.  Validate methods should assign `data[key]` to the valid\nvalue of `val` if it can be handled properly, or return boolean\n`false` if it cannot.\n\nYou can also call `nopt.clean(data, types, typeDefs)` to clean up a\nconfig object and remove its invalid properties.\n\n## Error Handling\n\nBy default, nopt outputs a warning to standard error when invalid\noptions are found.  You can change this behavior by assigning a method\nto `nopt.invalidHandler`.  This method will be called with\nthe offending `nopt.invalidHandler(key, val, types)`.\n\nIf no `nopt.invalidHandler` is assigned, then it will console.error\nits whining.  If it is assigned to boolean `false` then the warning is\nsuppressed.\n\n## Abbreviations\n\nYes, they are supported.  If you define options like this:\n\n```javascript\n{ \"foolhardyelephants\" : Boolean\n, \"pileofmonkeys\" : Boolean }\n```\n\nThen this will work:\n\n```bash\nnode program.js --foolhar --pil\nnode program.js --no-f --pileofmon\n# etc.\n```\n\n## Shorthands\n\nShorthands are a hash of shorter option names to a snippet of args that\nthey expand to.\n\nIf multiple one-character shorthands are all combined, and the\ncombination does not unambiguously match any other option or shorthand,\nthen they will be broken up into their constituent parts.  For example:\n\n```json\n{ \"s\" : [\"--loglevel\", \"silent\"]\n, \"g\" : \"--global\"\n, \"f\" : \"--force\"\n, \"p\" : \"--parseable\"\n, \"l\" : \"--long\"\n}\n```\n\n```bash\nnpm ls -sgflp\n# just like doing this:\nnpm ls --loglevel silent --global --force --long --parseable\n```\n\n## The Rest of the args\n\nThe config object returned by nopt is given a special member called\n`argv`, which is an object with the following fields:\n\n* `remain`: The remaining args after all the parsing has occurred.\n* `original`: The args as they originally appeared.\n* `cooked`: The args after flags and shorthands are expanded.\n\n## Slicing\n\nNode programs are called with more or less the exact argv as it appears\nin C land, after the v8 and node-specific options have been plucked off.\nAs such, `argv[0]` is always `node` and `argv[1]` is always the\nJavaScript program being run.\n\nThat's usually not very useful to you.  So they're sliced off by\ndefault.  If you want them, then you can pass in `0` as the last\nargument, or any other number that you'd like to slice off the start of\nthe list.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/nopt/issues"
-  },
-  "_id": "nopt@2.1.2",
-  "_from": "nopt@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,3 +0,0 @@
-test/fixtures/cache
-node_modules
-npm-debug.log
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,157 +0,0 @@
-# npm-registry-client
-
-The code that npm uses to talk to the registry.
-
-It handles all the caching and HTTP calls.
-
-## Usage
-
-```javascript
-var RegClient = require('npm-registry-client')
-var client = new RegClient(config)
-
-client.get("npm", "latest", 1000, function (er, data, raw, res) {
-  // error is an error if there was a problem.
-  // data is the parsed data object
-  // raw is the json string
-  // res is the response from couch
-})
-```
-
-# Configuration
-
-This program is designed to work with
-[npmconf](https://npmjs.org/package/npmconf), but you can also pass in
-a plain-jane object with the appropriate configs, and it'll shim it
-for you.  Any configuration thingie that has get/set/del methods will
-also be accepted.
-
-* `registry` **Required** {String} URL to the registry
-* `cache` **Required** {String} Path to the cache folder
-* `always-auth` {Boolean} Auth even for GET requests.
-* `auth` {String} A base64-encoded `username:password`
-* `email` {String} User's email address
-* `tag` {String} The default tag to use when publishing new packages.
-  Default = `"latest"`
-* `ca` {String} Certificate signing authority certificates to trust.
-* `strict-ssl` {Boolean} Whether or not to be strict with SSL
-  certificates.  Default = `true`
-* `user-agent` {String} User agent header to send.  Default =
-  `"node/{process.version} {process.platform} {process.arch}"`
-* `log` {Object} The logger to use.  Defaults to `require("npmlog")` if
-  that works, otherwise logs are disabled.
-* `fetch-retries` {Number} Number of times to retry on GET failures.
-  Default=2
-* `fetch-retry-factor` {Number} `factor` setting for `node-retry`. Default=10
-* `fetch-retry-mintimeout` {Number} `minTimeout` setting for `node-retry`.
-  Default=10000 (10 seconds)
-* `fetch-retry-maxtimeout` {Number} `maxTimeout` setting for `node-retry`.
-  Default=60000 (60 seconds)
-* `proxy` {URL} The url to proxy requests through.
-* `https-proxy` {URL} The url to proxy https requests through.
-  Defaults to be the same as `proxy` if unset.
-* `_auth` {String} The base64-encoded authorization header.
-* `username` `_password` {String} Username/password to use to generate
-  `_auth` if not supplied.
-* `_token` {Object} A token for use with
-  [couch-login](https://npmjs.org/package/couch-login)
-
-# client.request(method, where, [what], [etag], [nofollow], cb)
-
-* `method` {String} HTTP method
-* `where` {String} Path to request on the server
-* `what` {Stream | Buffer | String | Object} The request body.  Objects
-  that are not Buffers or Streams are encoded as JSON.
-* `etag` {String} The cached ETag
-* `nofollow` {Boolean} Prevent following 302/301 responses
-* `cb` {Function}
-  * `error` {Error | null}
-  * `data` {Object} the parsed data object
-  * `raw` {String} the json
-  * `res` {Response Object} response from couch
-
-Make a request to the registry.  All the other methods are wrappers
-around this one.
-
-# client.adduser(username, password, email, cb)
-
-* `username` {String}
-* `password` {String}
-* `email` {String}
-* `cb` {Function}
-
-Add a user account to the registry, or verify the credentials.
-
-# client.get(url, [timeout], [nofollow], [staleOk], cb)
-
-* `url` {String} The url path to fetch
-* `timeout` {Number} Number of seconds old that a cached copy must be
-  before a new request will be made.
-* `nofollow` {Boolean} Do not follow 301/302 responses
-* `staleOk` {Boolean} If there's cached data available, then return that
-  to the callback quickly, and update the cache in the background.
-
-Fetches data from the registry via a GET request, saving it in
-the cache folder with the ETag.
-
-# client.publish(data, tarball, [readme], cb)
-
-* `data` {Object} Package data
-* `tarball` {String | Stream} Filename or stream of the package tarball
-* `readme` {String} Contents of the README markdown file
-* `cb` {Function}
-
-Publish a package to the registry.
-
-Note that this does not create the tarball from a folder.  However, it
-can accept a gzipped tar stream or the filename of a tarball.
-
-# client.star(package, starred, cb)
-
-* `package` {String} Name of the package to star
-* `starred` {Boolean} True to star the package, false to unstar it.
-* `cb` {Function}
-
-Star or unstar a package.
-
-Note that the user does not have to be the package owner to star or
-unstar a package, though other writes do require that the user be the
-package owner.
-
-# client.stars(username, cb)
-
-* `username` {String} Name of user to fetch starred packages for.
-* `cb` {Function}
-
-View your own or another user's starred packages.
-
-# client.tag(project, version, tag, cb)
-
-* `project` {String} Project name
-* `version` {String} Version to tag
-* `tag` {String} Tag name to apply
-* `cb` {Function}
-
-Mark a version in the `dist-tags` hash, so that `pkg@tag`
-will fetch the specified version.
-
-# client.unpublish(name, [ver], cb)
-
-* `name` {String} package name
-* `ver` {String} version to unpublish. Leave blank to unpublish all
-  versions.
-* `cb` {Function}
-
-Remove a version of a package (or all versions) from the registry.  When
-the last version is unpublished, the entire document is removed from the
-database.
-
-# client.upload(where, file, [etag], [nofollow], cb)
-
-* `where` {String} URL path to upload to
-* `file` {String | Stream} Either the filename or a readable stream
-* `etag` {String} Cache ETag
-* `nofollow` {Boolean} Do not follow 301/302 responses
-* `cb` {Function}
-
-Upload an attachment.  Mostly used by `client.publish()`.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,71 +0,0 @@
-
-// utilities for working with the js-registry site.
-
-module.exports = RegClient
-
-var fs = require('fs')
-, url = require('url')
-, path = require('path')
-, CouchLogin = require('couch-login')
-, npmlog
-
-try {
-  npmlog = require("npmlog")
-} catch (er) {
-  npmlog = { error: noop, warn: noop, info: noop,
-             verbose: noop, silly: noop, http: noop,
-             pause: noop, resume: noop }
-}
-
-function noop () {}
-
-function RegClient (conf) {
-  // accept either a plain-jane object, or an npmconf object
-  // with a "get" method.
-  if (typeof conf.get !== 'function') {
-    var data = conf
-    conf = { get: function (k) { return data[k] }
-           , set: function (k, v) { data[k] = v }
-           , del: function (k) { delete data[k] } }
-  }
-
-  this.conf = conf
-
-  // if provided, then the registry needs to be a url.
-  // if it's not provided, then we're just using the cache only.
-  var registry = conf.get('registry')
-  if (registry) {
-    registry = url.parse(registry)
-    if (!registry.protocol) throw new Error(
-      'Invalid registry: ' + registry.href)
-    registry = registry.href
-    if (registry.slice(-1) !== '/') {
-      registry += '/'
-    }
-    this.conf.set('registry', registry)
-  } else {
-    registry = null
-  }
-
-  if (!conf.get('cache')) throw new Error("Cache dir is required")
-
-  var auth = this.conf.get('_auth')
-  var alwaysAuth = this.conf.get('always-auth')
-  if (auth && !alwaysAuth && registry) {
-    // if we're always authing, then we just send the
-    // user/pass on every thing.  otherwise, create a
-    // session, and use that.
-    var token = this.conf.get('_token')
-    this.couchLogin = new CouchLogin(registry, token)
-    this.couchLogin.proxy = this.conf.get('proxy')
-    this.couchLogin.strictSSL = this.conf.get('strict-ssl')
-    this.couchLogin.ca = this.conf.get('ca')
-  }
-
-  this.log = conf.log || conf.get('log') || npmlog
-}
-
-require('fs').readdirSync(__dirname + "/lib").forEach(function (f) {
-  if (!f.match(/\.js$/)) return
-  RegClient.prototype[f.replace(/\.js$/, '')] = require('./lib/' + f)
-})
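The plain-object shim at the top of `RegClient` above can be isolated into a sketch, assuming only `get`/`set`/`del` are needed (`shimConf` is an illustrative name; the example registry URL is a placeholder):

```javascript
// Sketch of RegClient's config shim: any object without a get()
// method is wrapped in get/set/del closures over the raw data.
function shimConf (conf) {
  if (typeof conf.get === 'function') return conf
  var data = conf
  return { get: function (k) { return data[k] }
         , set: function (k, v) { data[k] = v }
         , del: function (k) { delete data[k] } }
}
```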
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/adduser.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,137 +0,0 @@
-module.exports = adduser
-
-var crypto = require('crypto')
-
-function sha (s) {
-  return crypto.createHash("sha1").update(s).digest("hex")
-}
-
-function adduser (username, password, email, cb) {
-
-  password = ("" + (password || "")).trim()
-  if (!password) return cb(new Error("No password supplied."))
-
-  email = ("" + (email || "")).trim()
-  if (!email) return cb(new Error("No email address supplied."))
-  if (!email.match(/^[^@]+@[^\.]+\.[^\.]+/)) {
-    return cb(new Error("Please use a real email address."))
-  }
-
-  if (password.indexOf(":") !== -1) return cb(new Error(
-    "Sorry, ':' chars are not allowed in passwords.\n"+
-    "See <https://issues.apache.org/jira/browse/COUCHDB-969> for why."))
-
-  var salt = crypto.randomBytes(30).toString('hex')
-    , userobj =
-      { name : username
-      , salt : salt
-      , password_sha : sha(password + salt)
-      , email : email
-      , _id : 'org.couchdb.user:'+username
-      , type : "user"
-      , roles : []
-      , date: new Date().toISOString()
-      }
-
-  // pluck off any other username/password/token.  it needs to be the
-  // same as the user we're becoming now.  replace them on error.
-  var pre = { username: this.conf.get('username')
-            , password: this.conf.get('_password')
-            , auth: this.conf.get('_auth')
-            , token: this.conf.get('_token') }
-
-  this.conf.del('_token')
-  this.conf.del('username')
-  this.conf.del('_auth')
-  this.conf.del('_password')
-  if (this.couchLogin) {
-    this.couchLogin.token = null
-  }
-
-  cb = done.call(this, cb, pre)
-
-  var logObj = Object.keys(userobj).map(function (k) {
-    if (k === 'salt' || k === 'password_sha') return [k, 'XXXXX']
-    return [k, userobj[k]]
-  }).reduce(function (s, kv) {
-    s[kv[0]] = kv[1]
-    return s
-  }, {})
-
-  this.log.verbose("adduser", "before first PUT", logObj)
-
-  this.request('PUT'
-    , '/-/user/org.couchdb.user:'+encodeURIComponent(username)
-    , userobj
-    , function (error, data, json, response) {
-        // if it worked, then we just created a new user, and all is well.
-        // but if we're updating a current record, then it'll 409 first
-        if (error && !this.conf.get('_auth')) {
-          // must be trying to re-auth on a new machine.
-          // use this info as auth
-          var b = new Buffer(username + ":" + password)
-          this.conf.set('_auth', b.toString("base64"))
-          this.conf.set('username', username)
-          this.conf.set('_password', password)
-        }
-
-        if (!error || !response || response.statusCode !== 409) {
-          return cb(error, data, json, response)
-        }
-
-        this.log.verbose("adduser", "update existing user")
-        return this.request('GET'
-          , '/-/user/org.couchdb.user:'+encodeURIComponent(username)
-          , function (er, data, json, response) {
-              if (er || data.error) {
-                return cb(er, data, json, response)
-              }
-              Object.keys(data).forEach(function (k) {
-                if (!userobj[k]) {
-                  userobj[k] = data[k]
-                }
-              })
-              this.log.verbose("adduser", "userobj", logObj)
-              this.request('PUT'
-                , '/-/user/org.couchdb.user:'+encodeURIComponent(username)
-                  + "/-rev/" + userobj._rev
-                , userobj
-                , cb )
-            }.bind(this))
-      }.bind(this))
-}
-
-function done (cb, pre) {
-  return function (error, data, json, response) {
-    if (!error && (!response || response.statusCode === 201)) {
-      return cb(error, data, json, response)
-    }
-
-    // there was some kind of error, re-instate previous auth/token/etc.
-    this.conf.set('_token', pre.token)
-    if (this.couchLogin) {
-      this.couchLogin.token = pre.token
-      if (this.couchLogin.tokenSet) {
-        this.couchLogin.tokenSet(pre.token)
-      }
-    }
-    this.conf.set('username', pre.username)
-    this.conf.set('_password', pre.password)
-    this.conf.set('_auth', pre.auth)
-
-    this.log.verbose("adduser", "back", [error, data, json])
-    if (!error) {
-      error = new Error( (response && response.statusCode || "") + " "+
-      "Could not create user\n"+JSON.stringify(data))
-    }
-    if (response
-        && (response.statusCode === 401 || response.statusCode === 403)) {
-      this.log.warn("adduser", "Incorrect username or password\n"
-              +"You can reset your account by visiting:\n"
-              +"\n"
-              +"    http://admin.npmjs.org/reset\n")
-    }
-
-    return cb(error)
-  }.bind(this)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/get.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,178 +0,0 @@
-
-module.exports = get
-
-var fs = require("graceful-fs")
-  , path = require("path")
-  , mkdir = require("mkdirp")
-  , chownr = require("chownr")
-
-function get (uri, timeout, nofollow, staleOk, cb) {
-  if (typeof cb !== "function") cb = staleOk, staleOk = false
-  if (typeof cb !== "function") cb = nofollow, nofollow = false
-  if (typeof cb !== "function") cb = timeout, timeout = -1
-  if (typeof cb !== "function") throw new TypeError("callback required")
-
-  timeout = Math.min(timeout, this.conf.get('cache-max') || 0)
-  timeout = Math.max(timeout, this.conf.get('cache-min') || -Infinity)
-
-  if (!this.conf.get('registry')) timeout = Infinity
-
-  if ( process.env.COMP_CWORD !== undefined
-    && process.env.COMP_LINE !== undefined
-    && process.env.COMP_POINT !== undefined
-    ) timeout = Math.max(timeout, 60000)
-
-  // /-/all is special.
-  // It uses timestamp-based caching and partial updates,
-  // because it is a monster.
-  if (uri === "/-/all") {
-    return requestAll.call(this, cb)
-  }
-
-  var cacheUri = uri
-  // on Windows ":" is not an allowed character in a folder name
-  cacheUri = cacheUri.replace(/:/g, '_')
-  var cache = path.join(this.conf.get('cache'), cacheUri, ".cache.json")
-
-  fs.stat(cache, function (er, stat) {
-    if (!er) fs.readFile(cache, function (er, data) {
-      try { data = JSON.parse(data) }
-      catch (ex) { data = null }
-      get_.call(this, uri, timeout, cache, stat, data, nofollow, staleOk, cb)
-    }.bind(this))
-    else get_.call(this, uri, timeout, cache, null, null, nofollow, staleOk, cb)
-  }.bind(this))
-}
-
-function requestAll (cb) {
-  var cache = path.join(this.conf.get('cache'), "/-/all", ".cache.json")
-
-  mkdir(path.join(this.conf.get('cache'), "-", "all"), function (er) {
-    fs.readFile(cache, function (er, data) {
-      if (er) return requestAll_.call(this, 0, {}, cb)
-      try {
-        data = JSON.parse(data)
-      } catch (ex) {
-        return fs.writeFile(cache, "{}", function (er) {
-          if (er) return cb(new Error("Broken cache."))
-          return requestAll_.call(this, 0, {}, cb)
-        }.bind(this))
-      }
-      var t = +data._updated || 0
-      requestAll_.call(this, t, data, cb)
-    }.bind(this))
-  }.bind(this))
-}
-
-function requestAll_ (c, data, cb) {
-  // use the cache and update in the background if it's not too old
-  if (Date.now() - c < 60000) {
-    cb(null, data)
-    cb = function () {}
-  }
-
-  var uri = "/-/all/since?stale=update_after&startkey=" + c
-
-  if (c === 0) {
-    this.log.warn("", "Building the local index for the first time, please be patient")
-    uri = "/-/all"
-  }
-
-  var cache = path.join(this.conf.get('cache'), "-/all", ".cache.json")
-  this.request('GET', uri, function (er, updates, _, res) {
-    if (er) return cb(er, data)
-    var headers = res.headers
-      , updated = updates._updated || Date.parse(headers.date)
-    Object.keys(updates).forEach(function (p) {
-      data[p] = updates[p]
-    })
-    data._updated = updated
-    fs.writeFile( cache, JSON.stringify(data)
-                , function (er) {
-      delete data._updated
-      return cb(er, data)
-    })
-  })
-}
-
-function get_ (uri, timeout, cache, stat, data, nofollow, staleOk, cb) {
-  var etag
-  if (data && data._etag) etag = data._etag
-  if (timeout && timeout > 0 && stat && data) {
-    if ((Date.now() - stat.mtime.getTime())/1000 < timeout) {
-      this.log.verbose("registry.get", uri, "not expired, no request")
-      delete data._etag
-      return cb(null, data, JSON.stringify(data), {statusCode:304})
-    }
-    if (staleOk) {
-      this.log.verbose("registry.get", uri, "staleOk, background update")
-      delete data._etag
-      process.nextTick(cb.bind( null, null, data, JSON.stringify(data)
-                              , {statusCode: 304} ))
-      cb = function () {}
-    }
-  }
-
-  this.request('GET', uri, null, etag, nofollow, function (er, remoteData, raw, response) {
-    // if we get an error talking to the registry, but we have it
-    // from the cache, then just pretend we got it.
-    if (er && cache && data && !data.error) {
-      er = null
-      response = {statusCode: 304}
-    }
-
-    if (response) {
-      this.log.silly("registry.get", "cb", [response.statusCode, response.headers])
-      if (response.statusCode === 304 && etag) {
-        remoteData = data
-        this.log.verbose("etag", uri+" from cache")
-      }
-    }
-
-    data = remoteData
-    if (!data) {
-      er = er || new Error("failed to fetch from registry: " + uri)
-    }
-
-    if (er) return cb(er, data, raw, response)
-
-    // just give the write the old college try.  if it fails, whatever.
-    function saved () {
-      delete data._etag
-      cb(er, data, raw, response)
-    }
-
-    saveToCache.call(this, cache, data, saved)
-  }.bind(this))
-}
-
-function saveToCache (cache, data, saved) {
-  if (this._cacheStat) {
-    var cs = this._cacheStat
-    return saveToCache_.call(this, cache, data, cs.uid, cs.gid, saved)
-  }
-  fs.stat(this.conf.get('cache'), function (er, st) {
-    if (er) {
-      return fs.stat(process.env.HOME || "", function (er, st) {
-        // if this fails, oh well.
-        if (er) return saved()
-        this._cacheStat = st
-        return saveToCache.call(this, cache, data, saved)
-      }.bind(this))
-    }
-    this._cacheStat = st || { uid: null, gid: null }
-    return saveToCache.call(this, cache, data, saved)
-  }.bind(this))
-}
-
-function saveToCache_ (cache, data, uid, gid, saved) {
-  mkdir(path.dirname(cache), function (er, made) {
-    if (er) return saved()
-    fs.writeFile(cache, JSON.stringify(data), function (er) {
-      if (er || uid === null || gid === null) {
-        return saved()
-      }
-      chownr(made || cache, uid, gid, saved)
-    })
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/publish.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,112 +0,0 @@
-
-module.exports = publish
-
-var path = require("path")
-  , url = require("url")
-
-function publish (data, tarball, cb) {
-
-  var email = this.conf.get('email')
-  var auth = this.conf.get('_auth')
-  var username = this.conf.get('username')
-
-  if (!email || !auth || !username) {
-    var er = new Error("auth and email required for publishing")
-    er.code = 'ENEEDAUTH'
-    return cb(er)
-  }
-
-  // add the dist-url to the data, pointing at the tarball.
-  // if the {name} isn't there, then create it.
-  // if the {version} is already there, then fail.
-  // then:
-  // PUT the data to {config.registry}/{data.name}/{data.version}
-  var registry = this.conf.get('registry')
-
-  var fullData =
-    { _id : data.name
-    , name : data.name
-    , description : data.description
-    , "dist-tags" : {}
-    , versions : {}
-    , readme: data.readme || ""
-    , maintainers :
-      [ { name : username
-        , email : email
-        }
-      ]
-    }
-
-  var tbName = data.name + "-" + data.version + ".tgz"
-    , tbURI = data.name + "/-/" + tbName
-
-  data._id = data.name+"@"+data.version
-  data.dist = data.dist || {}
-  data.dist.tarball = url.resolve(registry, tbURI)
-                         .replace(/^https:\/\//, "http://")
-
-
-  // first try to just PUT the whole fullData, and this will fail if it's
-  // already there, because it'll be lacking a _rev, so couch'll bounce it.
-  this.request("PUT", encodeURIComponent(data.name), fullData,
-      function (er, parsed, json, response) {
-    // get the rev and then upload the attachment
-    // a 409 is expected here, if this is a new version of an existing package.
-    if (er
-        && !(response && response.statusCode === 409)
-        && !( parsed
-            && parsed.reason ===
-              "must supply latest _rev to update existing package" )) {
-      this.log.error("publish", "Failed PUT response "
-                    +(response && response.statusCode))
-      return cb(er)
-    }
-    var dataURI = encodeURIComponent(data.name)
-                + "/" + encodeURIComponent(data.version)
-
-    var tag = data.tag || this.conf.get('tag') || "latest"
-    dataURI += "/-tag/" + tag
-
-    // let's see what versions are already published.
-    // could be that we just need to update the bin dist values.
-    this.request("GET", data.name, function (er, fullData) {
-      if (er) return cb(er)
-
-      function handle(er) {
-        if (er.message.indexOf("conflict Document update conflict.") === 0) {
-          return cb(conflictError.call(this, data._id));
-        }
-        this.log.error("publish", "Error uploading package");
-        return cb(er)
-      }
-
-      var exists = fullData.versions && fullData.versions[data.version]
-      if (exists) return cb(conflictError.call(this, data._id))
-
-      var rev = fullData._rev;
-      attach.call(this, data.name, tarball, tbName, rev, function (er) {
-        if (er) return handle.call(this, er)
-        this.log.verbose("publish", "attached", [data.name, tarball, tbName])
-        this.request("PUT", dataURI, data, function (er) {
-          if (er) return handle.call(this, er)
-          return cb(er)
-        }.bind(this))
-      }.bind(this))
-    }.bind(this))
-  }.bind(this)) // pining for fat arrows.
-}
-
-function conflictError (pkgid) {
-  var e = new Error("publish fail")
-  e.code = "EPUBLISHCONFLICT"
-  e.pkgid = pkgid
-  return e
-}
-
-function attach (doc, file, filename, rev, cb) {
-  doc = encodeURIComponent(doc)
-  var revu = "-rev/"+rev
-    , attURI = doc + "/-/" + encodeURIComponent(filename) + "/" + revu
-  this.log.verbose("uploading", [attURI, file])
-  this.upload(attURI, file, cb)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/request.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,305 +0,0 @@
-module.exports = regRequest
-
-var url = require("url")
-  , fs = require("graceful-fs")
-  , rm = require("rimraf")
-  , asyncMap = require("slide").asyncMap
-  , Stream = require("stream").Stream
-  , request = require("request")
-  , retry = require("retry")
-
-function regRequest (method, where, what, etag, nofollow, reauthed, cb_) {
-  if (typeof cb_ !== "function") cb_ = reauthed, reauthed = false
-  if (typeof cb_ !== "function") cb_ = nofollow, nofollow = false
-  if (typeof cb_ !== "function") cb_ = etag, etag = null
-  if (typeof cb_ !== "function") cb_ = what, what = null
-
-  var registry = this.conf.get('registry')
-  if (!registry) return cb(new Error(
-    "No registry url provided: " + method + " " + where))
-
-  // Since there are multiple places where an error could occur,
-  // don't let the cb be called more than once.
-  var errState = null
-  function cb (er) {
-    if (errState) return
-    if (er) errState = er
-    cb_.apply(null, arguments)
-  }
-
-  if (where.match(/^\/?favicon.ico/)) {
-    return cb(new Error("favicon.ico isn't a package, it's a picture."))
-  }
-
-  var adduserChange = /^\/?-\/user\/org\.couchdb\.user:([^\/]+)\/-rev/
-  , adduserNew = /^\/?-\/user\/org\.couchdb\.user:([^\/]+)/
-  , nu = where.match(adduserNew)
-  , uc = where.match(adduserChange)
-  , isUpload = what || this.conf.get('always-auth')
-  , isDel = method === "DELETE"
-  , authRequired = isUpload && !nu || uc || isDel
-
-  // resolve to a full url on the registry
-  if (!where.match(/^https?:\/\//)) {
-    this.log.verbose("url raw", where)
-
-    var q = where.split("?")
-    where = q.shift()
-    q = q.join("?")
-
-    if (where.charAt(0) !== "/") where = "/" + where
-    where = "." + where.split("/").map(function (p) {
-      p = p.trim()
-      if (p.match(/^org.couchdb.user/)) {
-        return p.replace(/\//g, encodeURIComponent("/"))
-      }
-      return encodeURIComponent(p)
-    }).join("/")
-    if (q) where += "?" + q
-    this.log.verbose("url resolving", [registry, where])
-    where = url.resolve(registry, where)
-    this.log.verbose("url resolved", where)
-  }
-
-  var remote = url.parse(where)
-  , auth = this.conf.get('_auth')
-
-  if (authRequired && !this.conf.get('always-auth')) {
-    var couch = this.couchLogin
-    , token = couch && (this.conf.get('_token') || couch.token)
-    , validToken = token && couch.valid(token)
-
-    if (!validToken) token = null
-    else this.conf.set('_token', token)
-
-    if (couch && !token) {
-      // login to get a valid token
-      var a = { name: this.conf.get('username'),
-                password: this.conf.get('_password') }
-      var args = arguments
-      return this.couchLogin.login(a, function (er, cr, data) {
-        if (er || !couch.valid(couch.token)) {
-          er = er || new Error('login error')
-          return cb(er, cr, data)
-        }
-        this.conf.set('_token', this.couchLogin.token)
-        return regRequest.call(this,
-                               method, where, what,
-                               etag, nofollow, reauthed, cb_)
-      }.bind(this))
-    }
-  }
-
-  // now we either have a valid token, or an auth.
-
-  if (authRequired && !auth && !token) {
-    return cb(new Error(
-      "Cannot insert data into the registry without auth"))
-  }
-
-  if (auth && !token) {
-    remote.auth = new Buffer(auth, "base64").toString("utf8")
-  }
-
-  // Tuned to spread 3 attempts over about a minute.
-  // See formula at <https://github.com/tim-kos/node-retry>.
-  var operation = retry.operation({
-    retries: this.conf.get('fetch-retries') || 2,
-    factor: this.conf.get('fetch-retry-factor'),
-    minTimeout: this.conf.get('fetch-retry-mintimeout') || 10000,
-    maxTimeout: this.conf.get('fetch-retry-maxtimeout') || 60000
-  })
-
-  var self = this
-  operation.attempt(function (currentAttempt) {
-    self.log.info("trying", "registry request attempt " + currentAttempt
-        + " at " + (new Date()).toLocaleTimeString())
-    makeRequest.call(self, method, remote, where, what, etag, nofollow, token
-                     , function (er, parsed, raw, response) {
-      if (!er || er.message.match(/^SSL Error/)) {
-        if (er)
-          er.code = 'ESSL'
-        return cb(er, parsed, raw, response)
-      }
-
-      // Only retry on 408, 5xx or no `response`.
-      var statusCode = response && response.statusCode
-
-      var reauth = !reauthed &&
-                   ( statusCode === 401 ||
-                     statusCode === 400 ||
-                     statusCode === 403 )
-      if (reauth)
-        reauthed = true
-
-      var timeout = statusCode === 408
-      var serverError = statusCode >= 500
-      var statusRetry = !statusCode || timeout || serverError
-      if (reauth && this.conf.get('_auth') && this.conf.get('_token')) {
-        this.conf.del('_token')
-        this.couchLogin.token = null
-        return regRequest.call(this, method, where, what,
-                               etag, nofollow, reauthed, cb_)
-      }
-      if (er && statusRetry && operation.retry(er)) {
-        self.log.info("retry", "will retry, error on last attempt: " + er)
-        return
-      }
-      cb.apply(null, arguments)
-    }.bind(this))
-  }.bind(this))
-}
-
-function makeRequest (method, remote, where, what, etag, nofollow, tok, cb_) {
-  var cbCalled = false
-  function cb () {
-    if (cbCalled) return
-    cbCalled = true
-    cb_.apply(null, arguments)
-  }
-
-  var strict = this.conf.get('strict-ssl')
-  if (strict === undefined) strict = true
-  var opts = { url: remote
-             , method: method
-             , ca: this.conf.get('ca')
-             , strictSSL: strict }
-    , headers = opts.headers = {}
-  if (etag) {
-    this.log.verbose("etag", etag)
-    headers[method === "GET" ? "if-none-match" : "if-match"] = etag
-  }
-
-  if (tok) {
-    headers.cookie = 'AuthSession=' + tok.AuthSession
-  }
-
-  headers.accept = "application/json"
-
-  headers["user-agent"] = this.conf.get('user-agent') ||
-                          'node/' + process.version
-
-  var p = this.conf.get('proxy')
-  var sp = this.conf.get('https-proxy') || p
-  opts.proxy = remote.protocol === "https:" ? sp : p
-
-  // figure out wth 'what' is
-  if (what) {
-    if (Buffer.isBuffer(what) || typeof what === "string") {
-      opts.body = what
-      headers["content-type"] = "application/json"
-      headers["content-length"] = Buffer.byteLength(what)
-    } else if (what instanceof Stream) {
-      headers["content-type"] = "application/octet-stream"
-      if (what.size) headers["content-length"] = what.size
-    } else {
-      delete what._etag
-      opts.json = what
-    }
-  }
-
-  if (nofollow) {
-    opts.followRedirect = false
-  }
-
-  this.log.http(method, remote.href || "/")
-
-  var done = requestDone.call(this, method, where, cb)
-  var req = request(opts, done)
-
-  req.on("error", cb)
-  req.on("socket", function (s) {
-    s.on("error", cb)
-  })
-
-  if (what && (what instanceof Stream)) {
-    what.pipe(req)
-  }
-}
-
-// cb(er, parsed, raw, response)
-function requestDone (method, where, cb) {
-  return function (er, response, data) {
-    if (er) return cb(er)
-
-    var urlObj = url.parse(where)
-    if (urlObj.auth)
-      urlObj.auth = '***'
-    this.log.http(response.statusCode, url.format(urlObj))
-
-    var parsed
-
-    if (Buffer.isBuffer(data)) {
-      data = data.toString()
-    }
-
-    if (data && typeof data === "string" && response.statusCode !== 304) {
-      try {
-        parsed = JSON.parse(data)
-      } catch (ex) {
-        ex.message += "\n" + data
-        this.log.verbose("bad json", data)
-        this.log.error("registry", "error parsing json")
-        return cb(ex, null, data, response)
-      }
-    } else if (data) {
-      parsed = data
-      data = JSON.stringify(parsed)
-    }
-
-    // expect data with any error codes
-    if (!data && response.statusCode >= 400) {
-      return cb( response.statusCode + " "
-               + require("http").STATUS_CODES[response.statusCode]
-               , null, data, response )
-    }
-
-    var er = null
-    if (parsed && response.headers.etag) {
-      parsed._etag = response.headers.etag
-    }
-
-    if (parsed && parsed.error && response.statusCode >= 400) {
-      var w = url.parse(where).pathname.substr(1)
-      var name
-      if (!w.match(/^-/) && parsed.error === "not_found") {
-        w = w.split("/")
-        name = w[w.indexOf("_rewrite") + 1]
-        er = new Error("404 Not Found: "+name)
-        er.code = "E404"
-        er.pkgid = name
-      } else {
-        er = new Error(
-          parsed.error + " " + (parsed.reason || "") + ": " + w)
-      }
-    } else if (method !== "HEAD" && method !== "GET") {
-      // invalidate cache
-      // This is irrelevant for commands that do etag caching, but
-      // ls and view also have a timed cache, so this keeps the user
-      // from thinking that it didn't work when it did.
-      // Note that failure is an acceptable option here, since the
-      // only result will be a stale cache for some helper commands.
-      var path = require("path")
-        , p = url.parse(where).pathname.split("/")
-        , _ = "/"
-        , caches = p.map(function (part) {
-            part = part.replace(/:/g, "_")
-            return _ = path.join(_, part)
-          }).map(function (cache) {
-            return path.join(this.conf.get('cache'), cache, ".cache.json")
-          }, this)
-
-      // if the method is DELETE, then also remove the thing itself.
-      // Note that the search index is probably invalid.  Whatever.
-      // That's what you get for deleting stuff.  Don't do that.
-      if (method === "DELETE") {
-        p = p.slice(0, p.indexOf("-rev"))
-        p = p.join("/").replace(/:/g, "_")
-        caches.push(path.join(this.conf.get('cache'), p))
-      }
-
-      asyncMap(caches, rm, function () {})
-    }
-    return cb(er, parsed, data, response)
-  }.bind(this)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/star.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,29 +0,0 @@
-
-module.exports = star
-
-function star (package, starred, cb) {
-  if (!this.conf.get('username')) return cb(new Error(
-    "Must be logged in to star/unstar packages"))
-
-  var users = {}
-
-  this.request("GET", package, function (er, fullData) {
-    if (er) return cb(er)
-
-    fullData = { _id: fullData._id
-               , _rev: fullData._rev
-               , users: fullData.users || {} }
-
-    if (starred) {
-      this.log.info("starring", fullData._id)
-      fullData.users[this.conf.get('username')] = true
-      this.log.verbose("starring", fullData)
-    } else {
-      delete fullData.users[this.conf.get('username')]
-      this.log.info("unstarring", fullData._id)
-      this.log.verbose("unstarring", fullData)
-    }
-
-    return this.request("PUT", package, fullData, cb)
-  }.bind(this))
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/stars.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,9 +0,0 @@
-var qs = require('querystring')
-
-module.exports = stars
-
-function stars (name, cb) {
-  name = encodeURIComponent(name)
-  var path = "/-/_view/starredByUser?key=\""+name+"\""
-  this.request("GET", path, cb)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/tag.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-
-module.exports = tag
-
-function tag (project, version, tag, cb) {
-  this.request("PUT", project+"/"+tag, JSON.stringify(version), cb)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/unpublish.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,103 +0,0 @@
-
-// fetch the data
-// modify to remove the version in question
-// If no versions remaining, then DELETE
-// else, PUT the modified data
-// delete the tarball
-
-module.exports = unpublish
-
-var semver = require("semver")
-  , url = require("url")
-  , chain = require("slide").chain
-
-function unpublish (name, ver, cb) {
-  if (typeof cb !== "function") cb = ver, ver = null
-
-  this.get(name, null, -1, true, function (er, data) {
-    if (er) {
-      this.log.info("unpublish", name+" not published")
-      return cb()
-    }
-    // remove all if no version specified
-    if (!ver) {
-      this.log.info("unpublish", "No version specified, removing all")
-      return this.request("DELETE", name+'/-rev/'+data._rev, cb)
-    }
-
-    var versions = data.versions || {}
-      , versionPublic = versions.hasOwnProperty(ver)
-
-    if (!versionPublic) {
-      this.log.info("unpublish", name+"@"+ver+" not published")
-    } else {
-      var dist = versions[ver].dist
-      this.log.verbose("unpublish", "removing attachments", dist)
-    }
-
-    delete versions[ver]
-    // if it was the only version, then delete the whole package.
-    if (!Object.keys(versions).length) {
-      this.log.info("unpublish", "No versions remain, removing entire package")
-      return this.request("DELETE", name+"/-rev/"+data._rev, cb)
-    }
-
-    if (!versionPublic) return cb()
-
-    var latestVer = data["dist-tags"].latest
-    for (var tag in data["dist-tags"]) {
-      if (data["dist-tags"][tag] === ver) delete data["dist-tags"][tag]
-    }
-
-    if (latestVer === ver) {
-      data["dist-tags"].latest =
-        Object.getOwnPropertyNames(versions).sort(semver.compareLoose).pop()
-    }
-
-    var rev = data._rev
-    delete data._revisions
-    delete data._attachments
-    var cb_ = detacher.call(this, data, dist, cb)
-    this.request("PUT", name+"/-rev/"+rev, data, function (er) {
-      if (er) {
-        this.log.error("unpublish", "Failed to update data")
-      }
-      cb_(er)
-    }.bind(this))
-  }.bind(this))
-}
-
-function detacher (data, dist, cb) {
-  return function (er) {
-    if (er) return cb(er)
-    this.get(data.name, function (er, data) {
-      if (er) return cb(er)
-
-      var tb = url.parse(dist.tarball)
-
-      detach.call(this, data, tb.pathname, data._rev, function (er) {
-        if (er || !dist.bin) return cb(er)
-        chain(Object.keys(dist.bin).map(function (bt) {
-          return function (cb) {
-            var d = dist.bin[bt]
-            detach.call(this, data, url.parse(d.tarball).pathname, null, cb)
-          }.bind(this)
-        }, this), cb)
-      }.bind(this))
-    }.bind(this))
-  }.bind(this)
-}
-
-function detach (data, path, rev, cb) {
-  if (rev) {
-    path += "/-rev/" + rev
-    this.log.info("detach", path)
-    return this.request("DELETE", path, cb)
-  }
-  this.get(data.name, function (er, data) {
-    rev = data._rev
-    if (!rev) return cb(new Error(
-      "No _rev found in "+data._id))
-    detach.call(this, data, path, rev, cb)
-  }.bind(this))
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/lib/upload.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-module.exports = upload
-
-var fs = require('fs')
-, Stream = require("stream").Stream
-
-function upload (where, file, etag, nofollow, cb) {
-  if (typeof nofollow === "function") cb = nofollow, nofollow = false
-  if (typeof etag === "function") cb = etag, etag = null
-
-  if (file instanceof Stream) {
-    return this.request("PUT", where, file, etag, nofollow, cb)
-  }
-
-  fs.stat(file, function (er, stat) {
-    if (er) return cb(er)
-    var s = fs.createReadStream(file)
-    s.size = stat.size
-    s.on("error", cb)
-
-    this.request("PUT", where, s, etag, nofollow, cb)
-  }.bind(this))
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/couch-login/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-test/fixtures/couch.log
-test/fixtures/.delete
-test/fixtures/pid
-test/fixtures/_users.couch
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/couch-login/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/couch-login/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,260 +0,0 @@
-# couch-login
-
-This module lets you log into couchdb to get a session token, then make
-requests using that session.  It is basically just a thin wrapper around
-[@mikeal's request module](https://github.com/mikeal/request).
-
-This is handy if you want to take actions in a couchdb database on
-behalf of a user, without having to store their couchdb username and
-password anywhere.  (You do need to store the AuthSession token
-somewhere, though.)
-
-## Usage
-
-```javascript
-var CouchLogin = require('couch-login')
-
-// Nothing about this module is http-server specific of course.
-// You could also use it to do authenticated requests against
-// a couchdb using sessions and storing the token somewhere else.
-
-http.createServer(function (req, res) {
-  var couch = new CouchLogin('http://my-couch.iriscouch.com:5984/')
-
-  // .. look up the token in the user's session or whatever ..
-  // Look at couch.decorate(req, res) for more on doing that
-  // automatically, below.
-
-  if (sessionToken) {
-    // this user already logged in.
-    couch.token = sessionToken
-
-    // now we can do things on their behalf, like:
-    // 1. View their session info.
-    // like doing request.get({ uri: couch + '/_session', ... })
-    // but with the cookie and whatnot
-
-    couch.get('/_session', function (er, resp, data) {
-      // er = some kind of communication error.
-      // resp = response object from the couchdb request.
-      // data = parsed JSON response body.
-      if (er || resp.statusCode !== 200) {
-        res.statusCode = resp.statusCode || 403
-        return res.end('Invalid login or something')
-      }
-
-      // now we have the session info, we know who this user is.
-      // hitting couchdb for this on every request is kinda costly,
-      // so maybe you should store the username wherever you're storing
-      // the sessionToken.  RedSess is a good util for this, if you're
-      // into redis.  And if you're not into redis, you're crazy,
-      // because it is awesome.
-
-      // now let's get the user record.
-      // note that this will 404 for anyone other than the user,
-      // unless they're a server admin.
-      couch.get('/_users/org.couchdb.user:' + data.userCtx.name, etc)
-
-      // PUTs and DELETEs will also use their session, of course, so
-      // your validate_doc_update's will see their info in userCtx
-    })
-
-  } else {
-    // don't have a sessionToken.
-    // get a username and password from the post body or something.
-    // maybe redirect to a /login page or something to ask for that.
-    var login = { name: name, password: password }
-    couch.login(login, function (er, resp, data) {
-      // again, er is an error, resp is the response obj, data is the json
-      if (er || resp.statusCode !== 200) {
-        res.statusCode = resp.statusCode || 403
-        return res.end('Invalid login or something')
-      }
-
-      // the data is something like
-      // {"ok":true,"name":"testuser","roles":[]}
-      // and couch.token is the token you'll need to save somewhere.
-
-      // at this point, you can start making authenticated requests to
-      // couchdb, or save data in their session, or do whatever it is
-      // that you need to do.
-
-      res.statusCode = 200
-      res.write("Who's got two thumbs and just logged you into couch?\n")
-      setTimeout(function () {
-        res.end("THIS GUY!")
-      }, 500)
-    })
-  }
-})
-```
-
-## Class: CouchLogin
-### new CouchLogin(couchdbUrl, token)
-
-Create a new CouchLogin object bound to the couchdb url.
-
-The `get`, `post`, `put`, and `del` methods all proxy to the
-associated method on [request](https://github.com/mikeal/request).
-
-However, as you'll note in the example above, only the pathname portion
-of the url is required.  Paths are resolved against the couchdb url
-passed into the constructor.
-
-If you have to talk to more than one couchdb, then you'll need more than
-one CouchLogin object, for somewhat obvious reasons.
-
-All callbacks get called with the following arguments, which are exactly
-identical to the arguments passed to a `request` callback.
-
-* `er` {Error | null} Set if a communication error happens.
-* `resp` {HTTP Response} The response from the request to couchdb
-* `data` {Object} The parsed JSON data from couch
-
-If the token is the string "anonymous", then it will not attempt to log
-in before making requests.  If the token is not "anonymous", then it
-must be an object with the appropriate fields.
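
For illustration, the token argument is normalized roughly like this inside couch-login.js (`"anonymous"` becomes a never-log-in marker, `"basic"` selects HTTP Basic auth); `normalizeToken` is a made-up name for this sketch:

```javascript
// Sketch of how the CouchLogin constructor treats its token argument.
// BASIC is an internal sentinel object in couch-login.js.
var BASIC = {}

function normalizeToken (tok) {
  if (tok === 'anonymous') return NaN   // never log in, send no cookie
  if (tok === 'basic') return BASIC     // use HTTP Basic auth instead
  return tok                            // assume a session token object
}
```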
-
-### couch.token
-
-* {Object}
-
-An object representing the couchdb session token.  (Basically just a
-cookie and a timeout.)
-
-If the token has already timed out, then setting it will have no effect.
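
As a sketch, a token looks roughly like the object below (the cookie value is a made-up example), and the expiry check mirrors, in simplified form, the module's internal `valid()` helper:

```javascript
// Hypothetical token object, shaped like what couch-login stores
// after a successful login: the AuthSession cookie value plus expiry.
var token = {
  AuthSession: 'dXNlcjpERADBEEF...',               // made-up cookie value
  expires: Date.now() + 1000 * 60 * 60 * 24 * 365  // one year from now
}

// A token is usable while its expiry is in the future (simplified
// mirror of couch-login's internal valid() helper).
function isValid (token) {
  if (!token) return false
  if (!token.hasOwnProperty('expires')) return true
  return token.expires > Date.now()
}
```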
-
-### couch.tokenSet
-
-If set, this method is called whenever the token is saved.
-
-For example, you could assign a function to this method to save the
-token into a redis session, a cookie, or in some other database.
-
-Takes a callback which should be called when the token is saved.
-
-### couch.tokenGet
-
-If set, this method is called to look up the token on demand.
-
-The inverse of couch.tokenSet.  Takes a callback which is called as
-`cb(er || null, token)`.
-
-### couch.tokenDel
-
-If set, this method is called to delete the token when it should be
-discarded.
-
-Related to tokenGet and tokenSet.  Takes a callback which should be
-called when the token is deleted.
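
A minimal sketch of backing these three hooks with an in-memory store (the `hookup` helper and `store` object are made up for illustration; a real app would persist to redis or a session store instead):

```javascript
// Sketch: backing tokenGet/tokenSet/tokenDel with an in-memory store.
var store = {}   // hypothetical storage, keyed by a session id

function hookup (couch, sessionId) {
  couch.tokenGet = function (cb) {
    cb(null, store[sessionId] || null)
  }
  couch.tokenSet = function (tok, cb) {
    store[sessionId] = tok
    if (cb) cb(null)
  }
  couch.tokenDel = function (cb) {
    delete store[sessionId]
    if (cb) cb(null)
  }
}
```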
-
-### couch.ca
-
-* {String | Array | null}
-
-A certificate authority string, or an array of CA strings.  Only
-relevant for HTTPS couches, of course.
-
-Leave as `null` to use the default ca settings built into node.
-
-### couch.strictSSL
-
-* {Boolean | null}
-
-Whether or not to be strict about SSL connections.  If left as null,
-then the default setting in node is used, which is true in node versions
-0.9.x and above, and false in 0.8.x and earlier.
-
-Only relevant for HTTPS couches, of course.
-
-### couch.anonymous()
-
-Return a new CouchLogin object that points at the same couchdb server,
-but doesn't try to log in before making requests.
-
-This is handy for situations where the user is not logged in at the
-moment, but a request needs to be made anyway, and does not require
-authorization.
-
-### couch.login(auth, callback)
-
-* `auth` {Object} The login details
-  * `name` {String}
-  * `password` {String}
-* `callback` {Function}
-
-When the callback is called, the `couch.token` will already have been
-set (assuming it worked!), so subsequent requests will be done as that
-user.
-
-### couch.get(path, callback)
-
-GET the supplied path from the couchdb using the credentials on the
-token.
-
-Fails if the token is invalid or expired.
-
-### couch.del(path, callback)
-
-DELETE the supplied path from the couchdb using the credentials on the
-token.
-
-Fails if the token is invalid or expired.
-
-### couch.post(path, data, callback)
-
-POST the data to the supplied path in the couchdb, using the credentials
-on the token.
-
-Fails if the token is invalid or expired.
-
-### couch.put(path, data, callback)
-
-PUT the data to the supplied path in the couchdb, using the credentials
-on the token.
-
-Fails if the token is invalid or expired.
-
-### couch.changePass(newAuth, callback)
-
-Must already be logged in.  Updates the `_users` document with new salt
-and hash, and re-logs in with the new credentials.  The callback is
-called with the same arguments as login, or with the results of the
-first step of the process that failed.
-
-### couch.signup(userData, callback)
-
-Create a new user account.  The userData must contain at least a `name`
-and `password` field.  Any additional data will be copied to the user
-record.  The `_id`, `name`, `roles`, `type`, `password_sha`, `salt`, and
-`date` fields are generated.
-
-Also signs in as the newly created user, on successful account creation.
-
-### couch.deleteAccount(name, callback)
-
-Deletes a user account.  If not logged in as the user, or a server
-admin, then the request will fail.
-
-Note that this immediately invalidates any session tokens for the
-deleted user account.  If you are deleting the user's record, then you
-ought to follow this with `couch.logout(callback)` so that it won't try
-to re-use the invalid session.
-
-### couch.logout(callback)
-
-Delete the session out of couchdb.  This makes the token permanently
-invalid, and deletes it.
-
-### couch.decorate(req, res)
-
-Set up `req.couch` and `res.couch` as references to this couch login
-instance.
-
-Additionally, if `req.session` or `res.session` is set, then it'll call
-`session.get('couch_token', cb)` as the tokenGet method,
-`session.set('couch_token', token, cb)` as the tokenSet method, and
-`session.del('couch_token', cb)` as the tokenDel method.
-
-This works really nicely with
-[RedSess](https://github.com/isaacs/redsess).
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/couch-login/couch-login.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,385 +0,0 @@
-var request = require('request')
-, url = require('url')
-, crypto = require('crypto')
-, YEAR = (1000 * 60 * 60 * 24 * 365)
-, BASIC = {}
-, assert = require('assert')
-
-module.exports = CouchLogin
-
-function CouchLogin (couch, tok) {
-  if (!(this instanceof CouchLogin)) {
-    return new CouchLogin(couch)
-  }
-
-  if (!couch) throw new Error(
-    "Need to pass a couch url to CouchLogin constructor")
-
-  if (couch instanceof CouchLogin)
-    couch = couch.couch
-
-  couch = url.parse(couch)
-  if (couch.auth) {
-    var a = couch.auth.split(':')
-    var name = a.shift()
-    var password = a.join(':')
-    this.name = name
-    if (password)
-      this.auth = new Buffer(name + ':' + password).toString('base64')
-  } else {
-    this.auth = null
-    this.name = null
-  }
-  delete couch.auth
-
-  if (tok === 'anonymous')
-    tok = NaN
-  else if (tok === 'basic')
-    tok = BASIC
-
-  this.token = tok
-  this.couch = url.format(couch)
-  this.proxy = null
-
-  this.maxAge = YEAR
-
-  // replace with a CA cert string, or an array, or leave as null
-  // to use the defaults included in node.  Only relevant for HTTPS
-  // couches, of course.
-  this.ca = null
-
-  // set to boolean true or false to specify the strictSSL behavior.
-  // if left as null, then it'll use whatever node defaults to, which
-  // is false <=0.8.x, and true >=0.9.x
-  //
-  // Again, only relevant for https couches, of course.
-  this.strictSSL = null
-}
-
-CouchLogin.prototype =
-{ get: makeReq('GET')
-, del: makeReq('DELETE')
-, put: makeReq('PUT', true)
-, post: makeReq('POST', true)
-, login: login
-, logout: logout
-, decorate: decorate
-, changePass: changePass
-, signup: signup
-, deleteAccount: deleteAccount
-, anon: anon
-, anonymous: anon
-, valid: valid
-}
-
-Object.defineProperty(CouchLogin.prototype, 'constructor',
-  { value: CouchLogin, enumerable: false })
-
-function decorate (req, res) {
-  assert(this instanceof CouchLogin)
-  req.couch = res.couch = this
-
-  // backed by some sort of set(k,v,cb), get(k,cb) session storage.
-  var session = req.session || res.session || null
-  if (session) {
-    this.tokenGet = function (cb) {
-      session.get('couch_token', cb)
-    }
-
-    // don't worry about it failing.  it'll just mean a login next time.
-    this.tokenSet = function (tok, cb) {
-      session.set('couch_token', tok, cb || function () {})
-    }
-
-    this.tokenDel = function (cb) {
-      session.del('couch_token', cb || function () {})
-    }
-  }
-
-  return this
-}
-
-function anon () {
-  assert(this instanceof CouchLogin)
-  return new CouchLogin(this.couch, NaN)
-}
-
-function makeReq (meth, body, f) { return function madeReq (p, d, cb) {
-  assert(this instanceof CouchLogin)
-  f = f || (this.token !== this.token)
-  if (!f && !valid(this.token)) {
-    // lazily get the token.
-    if (this.tokenGet) return this.tokenGet(function (er, tok) {
-      if (er || !valid(tok)) {
-        if (!body) cb = d, d = null
-        return cb(new Error('auth token expired or invalid'))
-      }
-      this.token = tok
-      return madeReq.call(this, p, d, cb)
-    }.bind(this))
-
-    // no getter, no token, no business.
-    return process.nextTick(function () {
-      if (!body) cb = d, d = null
-      cb(new Error('auth token expired or invalid'))
-    })
-  }
-
-  if (!body) cb = d, d = null
-
-  var h = {}
-  , u = url.resolve(this.couch, p)
-  , req = { uri: u, headers: h, json: true, body: d, method: meth }
-
-  if (this.token === BASIC) {
-    if (!this.auth)
-      return process.nextTick(cb.bind(this, new Error(
-        'Using basic auth and no auth provided')))
-    else
-      h.authorization = 'Basic ' + this.auth
-  } else if (this.token) {
-    h.cookie = 'AuthSession=' + this.token.AuthSession
-  }
-
-  if (this.proxy) {
-    req.proxy = this.proxy
-  }
-
-  // we're handling cookies, don't do it for us.
-  req.jar = false
-
-  if (this.ca)
-    req.ca = this.ca
-
-  if (typeof this.strictSSL === 'boolean')
-    req.strictSSL = req.rejectUnauthorized = this.strictSSL
-
-  request(req, function (er, res, data) {
-    // update cookie.
-    if (er || res.statusCode !== 200) return cb(er, res, data)
-    addToken.call(this, res)
-    return cb.call(this, er, res, data)
-  }.bind(this))
-}}
-
-function login (auth, cb) {
-  assert(this instanceof CouchLogin)
-  if (this.token === BASIC) {
-    this.auth = new Buffer(auth.name + ':' + auth.password).toString('base64')
-    this.name = auth.name
-    cb = cb.bind(this, null, { statusCode: 200 }, { ok: true })
-    return process.nextTick(cb)
-  }
-  var a = { name: auth.name, password: auth.password }
-  var req = makeReq('post', true, true)
-  req.call(this, '/_session', a, function (er, cr, data) {
-    if (er || (cr && cr.statusCode >= 400))
-      return cb(er, cr, data)
-    this.name = auth.name
-    cb(er, cr, data)
-  }.bind(this))
-}
-
-function changePass (auth, cb) {
-  assert(this instanceof CouchLogin)
-  if (!auth.name || !auth.password) return cb(new Error('invalid auth'))
-
-  var u = '/_users/org.couchdb.user:' + auth.name
-  this.get(u, function (er, res, data) {
-    if (er || res.statusCode !== 200) return cb(er, res, data)
-
-    // copy any other keys we're setting here.
-    // note that name, password_sha, salt, and date
-    // are all set explicitly below.
-    Object.keys(auth).filter(function (k) {
-      return k.charAt(0) !== '_'
-          && k !== 'password'
-          && k !== 'password_sha'
-          && k !== 'salt'
-    }).forEach(function (k) {
-      data[k] = auth[k]
-    })
-
-    var newSalt = crypto.randomBytes(30).toString('hex')
-    , newPass = auth.password
-    , newSha = sha(newPass + newSalt)
-
-    data.password_sha = newSha
-    data.salt = newSalt
-    data.date = new Date().toISOString()
-
-    this.put(u + '?rev=' + data._rev, data, function (er, res, data) {
-      if (er || res.statusCode >= 400)
-        return cb(er, res, data)
-      if (this.name && this.name !== auth.name)
-        return cb(er, res, data)
-      return this.login(auth, cb)
-    }.bind(this))
-  }.bind(this))
-}
-
-// They said that there should probably be a warning before
-// deleting the user's whole account, so here it is:
-//
-// WATCH OUT!
-function deleteAccount (name, cb) {
-  assert(this instanceof CouchLogin)
-  var u = '/_users/org.couchdb.user:' + name
-  this.get(u, thenPut.bind(this))
-
-  function thenPut (er, res, data) {
-    if (er || res.statusCode !== 200) {
-      return cb(er, res, data)
-    }
-
-    // user accts can't be just DELETE'd by non-admins
-    // so we take the existing doc and just slap a _deleted
-    // flag on it to fake it.  Works the same either way
-    // in couch.
-    data._deleted = true
-    this.put(u + '?rev=' + data._rev, data, cb)
-  }
-}
-
-
-
-function signup (auth, cb) {
-  assert(this instanceof CouchLogin)
-  if (this.token && this.token !== BASIC) {
-
-    return this.logout(function (er, res, data) {
-      if (er || res && res.statusCode !== 200) {
-        return cb(er, res, data)
-      }
-
-      if (this.token) {
-        return cb(new Error('failed to delete token'), res, data)
-      }
-
-      this.signup(auth, cb)
-    }.bind(this))
-  }
-
-  // make a new user record.
-  var newSalt = crypto.randomBytes(30).toString('hex')
-  , newSha = sha(auth.password + newSalt)
-  , user = { _id: 'org.couchdb.user:' + auth.name
-           , name: auth.name
-           , roles: []
-           , type: 'user'
-           , password_sha: newSha
-           , salt: newSalt
-           , date: new Date().toISOString() }
-
-  Object.keys(auth).forEach(function (k) {
-    if (k === 'name' || k === 'password') return
-    user[k] = auth[k]
-  })
-
-  var u = '/_users/' + user._id
-  makeReq('put', true, true).call(this, u, user, function (er, res, data) {
-    if (er || res.statusCode >= 400) {
-      return cb(er, res, data)
-    }
-
-    // it worked! log in as that user and get their record
-    this.login(auth, function (er, res, data) {
-      if (er || (res && res.statusCode >= 400) || data && data.error) {
-        return cb(er, res, data)
-      }
-      this.get(u, cb)
-    }.bind(this))
-  }.bind(this))
-}
-
-function addToken (res) {
-  assert(this instanceof CouchLogin)
-  // not doing the whole login session cookie thing.
-  if (this.token === BASIC)
-    return
-
-  // attach the token, if a new one was provided.
-  var sc = res.headers['set-cookie']
-  if (!sc) return
-  if (!Array.isArray(sc)) sc = [sc]
-
-  sc = sc.filter(function (c) {
-    return c.match(/^AuthSession=/)
-  })[0]
-
-  if (!sc.length) return
-
-  sc = sc.split(/\s*;\s*/).map(function (p) {
-    return p.split('=')
-  }).reduce(function (set, p) {
-    var k = p[0] === 'AuthSession' ? p[0] : p[0].toLowerCase()
-    , v = k === 'expires' ? Date.parse(p[1])
-        : p[1] === '' || p[1] === undefined ? true // HttpOnly
-        : p[1]
-    set[k] = v
-    return set
-  }, {})
-
-  if (sc.hasOwnProperty('max-age')) {
-    var ma = sc['max-age']
-    sc.expires = (ma <= 0) ? 0 : Date.now() + (ma * 1000)
-    delete sc['max-age']
-  }
-
-  // expire the session after 1 year, even if couch won't.
-  if (!sc.hasOwnProperty('expires')) {
-    sc.expires = Date.now() + YEAR
-  }
-
-  if (!isNaN(this.maxAge)) {
-    sc.expires = Math.min(sc.expires, Date.now() + this.maxAge)
-  }
-
-  this.token = sc
-  if (this.tokenSet) this.tokenSet(this.token)
-}
-
-
-function logout (cb) {
-  assert(this instanceof CouchLogin)
-  if (!this.token && this.tokenGet) {
-    return this.tokenGet(function (er, tok) {
-      if (er || !tok)
-        return cb(null, { statusCode: 200 }, {})
-      this.token = tok
-      this.logout(cb)
-    }.bind(this))
-  }
-
-  if (!valid(this.token)) {
-    this.token = null
-    if (this.tokenDel) this.tokenDel()
-    return process.nextTick(cb.bind(this, null, { statusCode: 200 }, {}))
-  }
-
-  var h = { cookie: 'AuthSession=' + this.token.AuthSession }
-  , u = url.resolve(this.couch, '/_session')
-  , req = { uri: u, headers: h, json: true }
-
-  request.del(req, function (er, res, data) {
-    if (er || (res.statusCode !== 200 && res.statusCode !== 404)) {
-      return cb(er, res, data)
-    }
-
-    this.token = null
-    if (this.tokenDel)
-      this.tokenDel()
-    cb(er, res, data)
-  }.bind(this))
-}
-
-function valid (token) {
-  if (token === BASIC) return true
-  if (!token) return false
-  if (!token.hasOwnProperty('expires')) return true
-  return token.expires > Date.now()
-}
-
-function sha (s) {
-  return crypto.createHash("sha1").update(s).digest("hex")
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/couch-login/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,32 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "couch-login",
-  "description": "A module for doing logged-in requests to a couchdb server",
-  "version": "0.1.18",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/couch-login.git"
-  },
-  "main": "couch-login.js",
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "dependencies": {
-    "request": "2 >=2.25.0"
-  },
-  "devDependencies": {
-    "tap": "~0.2.4"
-  },
-  "readme": "# couch-login\n\nThis module lets you log into couchdb to get a session token, then make\nrequests using that session.  It is basically just a thin wrapper around\n[@mikeal's request module](https://github.com/mikeal/request).\n\nThis is handy if you want a user to take actions in a couchdb database\non behalf of a user, without having to store their couchdb username and\npassword anywhere.  (You do need to store the AuthSession token\nsomewhere, though.)\n\n## Usage\n\n```javascript\nvar CouchLogin = require('couch-login')\n\n// Nothing about this module is http-server specific of course.\n// You could also use it to do authenticated requests against\n// a couchdb using sessions and storing the token somewhere else.\n\nhttp.createServer(function (req, res) {\n  var couch = new CouchLogin('http://my-couch.iriscouch.com:5984/')\n\n  // .. look up the token in the user's session or whatever ..\n  // Look at couch.decorate(req, res) for more on doing that\n  // automatically, below.\n\n  if (sessionToken) {\n    // this user already logged in.\n    couch.token = sessionToken\n\n    // now we can do things on their behalf, like:\n    // 1. View their session info.\n    // like doing request.get({ uri: couch + '/_session', ... })\n    // but with the cookie and whatnot\n\n    couch.get('/_session', function (er, resp, data) {\n      // er = some kind of communication error.\n      // resp = response object from the couchdb request.\n      // data = parsed JSON response body.\n      if (er || resp.statusCode !== 200) {\n        res.statusCode = resp.statusCode || 403\n        return res.end('Invalid login or something')\n      }\n\n      // now we have the session info, we know who this user is.\n      // hitting couchdb for this on every request is kinda costly,\n      // so maybe you should store the username wherever you're storing\n      // the sessionToken.  RedSess is a good util for this, if you're\n      // into redis.  
And if you're not into redis, you're crazy,\n      // because it is awesome.\n\n      // now let's get the user record.\n      // note that this will 404 for anyone other than the user,\n      // unless they're a server admin.\n      couch.get('/_users/org.couchdb.user:' + data.userCtx.name, etc)\n\n      // PUTs and DELETEs will also use their session, of course, so\n      // your validate_doc_update's will see their info in userCtx\n    })\n\n  } else {\n    // don't have a sessionToken.\n    // get a username and password from the post body or something.\n    // maybe redirect to a /login page or something to ask for that.\n    var login = { name: name, password: password }\n    couch.login(login, function (er, resp, data) {\n      // again, er is an error, resp is the response obj, data is the json\n      if (er || resp.statusCode !== 200) {\n        res.statusCode = resp.statusCode || 403\n        return res.end('Invalid login or something')\n      }\n\n      // the data is something like\n      // {\"ok\":true,\"name\":\"testuser\",\"roles\":[]}\n      // and couch.token is the token you'll need to save somewhere.\n\n      // at this point, you can start making authenticated requests to\n      // couchdb, or save data in their session, or do whatever it is\n      // that you need to do.\n\n      res.statusCode = 200\n      res.write(\"Who's got two thumbs and just logged you into couch?\\n\")\n      setTimeout(function () {\n        res.end(\"THIS GUY!\")\n      }, 500)\n    })\n  }\n})\n```\n\n## Class: CouchLogin\n### new CouchLogin(couchdbUrl, token)\n\nCreate a new CouchLogin object bound to the couchdb url.\n\nIn addition to these, the `get`, `post`, `put`, and `del` methods all\nproxy to the associated method on [request](https://github.com/mikeal/request).\n\nHowever, as you'll note in the example above, only the pathname portion\nof the url is required.  
Urls will be appended to the couchdb url passed\ninto the constructor.\n\nIf you have to talk to more than one couchdb, then you'll need more than\none CouchLogin object, for somewhat obvious reasons.\n\nAll callbacks get called with the following arguments, which are exactly\nidentical to the arguments passed to a `request` callback.\n\n* `er` {Error | null} Set if a communication error happens.\n* `resp` {HTTP Response} The response from the request to couchdb\n* `data` {Object} The parsed JSON data from couch\n\nIf the token is the string \"anonymous\", then it will not attempt to log\nin before making requests.  If the token is not \"anonymous\", then it\nmust be an object with the appropriate fields.\n\n### couch.token\n\n* {Object}\n\nAn object representing the couchdb session token.  (Basically just a\ncookie and a timeout.)\n\nIf the token has already timed out, then setting it will have no effect.\n\n### couch.tokenSet\n\nIf set, this method is called whenever the token is saved.\n\nFor example, you could assign a function to this method to save the\ntoken into a redis session, a cookie, or in some other database.\n\nTakes a callback which should be called when the token is saved.\n\n### couch.tokenGet\n\nIf set, this method is called to look up the token on demand.\n\nThe inverse of couch.tokenSet.  Takes a callback which is called with\nthe `cb(er || null, token)`.\n\n### couch.tokenDel\n\nIf set, this method is called to delete the token when it should be\ndiscarded.\n\nRelated to tokenGet and tokenSet.  Takes a callback which should be\ncalled when the token is deleted.\n\n### couch.ca\n\n* {String | Array | null}\n\nA certificate authority string, or an array of CA strings.  Only\nrelevant for HTTPS couches, of course.\n\nLeave as `null` to use the default ca settings built into node.\n\n### couch.strictSSL\n\n* {Boolean | null}\n\nWhether or not to be strict about SSL connections.  
If left as null,\nthen use the default setting in node, which is true in node versions\n0.9.x and above, and false prior to 0.8.x.\n\nOnly relevant for HTTPS couches, of course.\n\n### couch.anonymous()\n\nReturn a new CouchLogin object that points at the same couchdb server,\nbut doesn't try to log in before making requests.\n\nThis is handy for situations where the user is not logged in at the\nmoment, but a request needs to be made anyway, and does not require\nauthorization.\n\n### couch.login(auth, callback)\n\n* `auth` {Object} The login details\n  * `name` {String}\n  * `password` {String}\n* `callback` {Function}\n\nWhen the callback is called, the `couch.token` will already have been\nset (assuming it worked!), so subsequent requests will be done as that\nuser.\n\n### couch.get(path, callback)\n\nGET the supplied path from the couchdb using the credentials on the\ntoken.\n\nFails if the token is invalid or expired.\n\n### couch.del(path, callback)\n\nDELETE the supplied path from the couchdb using the credentials on the\ntoken.\n\nFails if the token is invalid or expired.\n\n### couch.post(path, data, callback)\n\nPOST the data to the supplied path in the couchdb, using the credentials\non the token.\n\nFails if the token is invalid or expired.\n\n### couch.put(path, data, callback)\n\nPUT the data to the supplied path in the couchdb, using the credentials\non the token.\n\nFails if the token is invalid or expired.\n\n### couch.changePass(newAuth, callback)\n\nMust already be logged in.  Updates the `_users` document with new salt\nand hash, and re-logs in with the new credentials.  Callback is called\nwith the same arguments as login, or the first step of the process that\nfailed.\n\n### couch.signup(userData, callback)\n\nCreate a new user account.  The userData must contain at least a `name`\nand `password` field.  Any additional data will be copied to the user\nrecord.  
The `_id`, `name`, `roles`, `type`, `password_sha`, `salt`, and\n`date` fields are generated.\n\nAlso signs in as the newly created user, on successful account creation.\n\n### couch.deleteAccount(name, callback)\n\nDeletes a user account.  If not logged in as the user, or a server\nadmin, then the request will fail.\n\nNote that this immediately invalidates any session tokens for the\ndeleted user account.  If you are deleting the user's record, then you\nought to follow this with `couch.logout(callback)` so that it won't try\nto re-use the invalid session.\n\n### couch.logout(callback)\n\nDelete the session out of couchdb.  This makes the token permanently\ninvalid, and deletes it.\n\n### couch.decorate(req, res)\n\nSet up `req.couch` and `res.couch` as references to this couch login\ninstance.\n\nAdditionall, if `req.session` or `res.session` is set, then it'll call\n`session.get('couch_token', cb)` as the tokenGet method,\n`session.set('couch_token', token, cb)` as the tokenSet method, and\n`session.del('couch_token', cb)` as the tokenDel method.\n\nThis works really nice with\n[RedSess](https://github.com/isaacs/redsess).\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/couch-login/issues"
-  },
-  "homepage": "https://github.com/isaacs/couch-login",
-  "_id": "couch-login@0.1.18",
-  "_from": "couch-login@~0.1.18"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-registry-client/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,44 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "npm-registry-client",
-  "description": "Client for the npm registry",
-  "version": "0.2.29",
-  "repository": {
-    "url": "git://github.com/isaacs/npm-registry-client"
-  },
-  "main": "index.js",
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "dependencies": {
-    "request": "2 >=2.25.0",
-    "graceful-fs": "~2.0.0",
-    "semver": "^2.2.1",
-    "slide": "~1.1.3",
-    "chownr": "0",
-    "mkdirp": "~0.3.3",
-    "rimraf": "~2",
-    "retry": "0.6.0",
-    "couch-login": "~0.1.18",
-    "npmlog": ""
-  },
-  "devDependencies": {
-    "tap": ""
-  },
-  "optionalDependencies": {
-    "npmlog": ""
-  },
-  "license": "BSD",
-  "readme": "# npm-registry-client\n\nThe code that npm uses to talk to the registry.\n\nIt handles all the caching and HTTP calls.\n\n## Usage\n\n```javascript\nvar RegClient = require('npm-registry-client')\nvar client = new RegClient(config)\n\nclient.get(\"npm\", \"latest\", 1000, function (er, data, raw, res) {\n  // error is an error if there was a problem.\n  // data is the parsed data object\n  // raw is the json string\n  // res is the response from couch\n})\n```\n\n# Configuration\n\nThis program is designed to work with\n[npmconf](https://npmjs.org/package/npmconf), but you can also pass in\na plain-jane object with the appropriate configs, and it'll shim it\nfor you.  Any configuration thingie that has get/set/del methods will\nalso be accepted.\n\n* `registry` **Required** {String} URL to the registry\n* `cache` **Required** {String} Path to the cache folder\n* `always-auth` {Boolean} Auth even for GET requests.\n* `auth` {String} A base64-encoded `username:password`\n* `email` {String} User's email address\n* `tag` {String} The default tag to use when publishing new packages.\n  Default = `\"latest\"`\n* `ca` {String} Cerficate signing authority certificates to trust.\n* `strict-ssl` {Boolean} Whether or not to be strict with SSL\n  certificates.  Default = `true`\n* `user-agent` {String} User agent header to send.  Default =\n  `\"node/{process.version} {process.platform} {process.arch}\"`\n* `log` {Object} The logger to use.  Defaults to `require(\"npmlog\")` if\n  that works, otherwise logs are disabled.\n* `fetch-retries` {Number} Number of times to retry on GET failures.\n  Default=2\n* `fetch-retry-factor` {Number} `factor` setting for `node-retry`. 
Default=10\n* `fetch-retry-mintimeout` {Number} `minTimeout` setting for `node-retry`.\n  Default=10000 (10 seconds)\n* `fetch-retry-maxtimeout` {Number} `maxTimeout` setting for `node-retry`.\n  Default=60000 (60 seconds)\n* `proxy` {URL} The url to proxy requests through.\n* `https-proxy` {URL} The url to proxy https requests through.\n  Defaults to be the same as `proxy` if unset.\n* `_auth` {String} The base64-encoded authorization header.\n* `username` `_password` {String} Username/password to use to generate\n  `_auth` if not supplied.\n* `_token` {Object} A token for use with\n  [couch-login](https://npmjs.org/package/couch-login)\n\n# client.request(method, where, [what], [etag], [nofollow], cb)\n\n* `method` {String} HTTP method\n* `where` {String} Path to request on the server\n* `what` {Stream | Buffer | String | Object} The request body.  Objects\n  that are not Buffers or Streams are encoded as JSON.\n* `etag` {String} The cached ETag\n* `nofollow` {Boolean} Prevent following 302/301 responses\n* `cb` {Function}\n  * `error` {Error | null}\n  * `data` {Object} the parsed data object\n  * `raw` {String} the json\n  * `res` {Response Object} response from couch\n\nMake a request to the registry.  All the other methods are wrappers\naround this. 
one.\n\n# client.adduser(username, password, email, cb)\n\n* `username` {String}\n* `password` {String}\n* `email` {String}\n* `cb` {Function}\n\nAdd a user account to the registry, or verify the credentials.\n\n# client.get(url, [timeout], [nofollow], [staleOk], cb)\n\n* `url` {String} The url path to fetch\n* `timeout` {Number} Number of seconds old that a cached copy must be\n  before a new request will be made.\n* `nofollow` {Boolean} Do not follow 301/302 responses\n* `staleOk` {Boolean} If there's cached data available, then return that\n  to the callback quickly, and update the cache the background.\n\nFetches data from the registry via a GET request, saving it in\nthe cache folder with the ETag.\n\n# client.publish(data, tarball, [readme], cb)\n\n* `data` {Object} Package data\n* `tarball` {String | Stream} Filename or stream of the package tarball\n* `readme` {String} Contents of the README markdown file\n* `cb` {Function}\n\nPublish a package to the registry.\n\nNote that this does not create the tarball from a folder.  
However, it\ncan accept a gzipped tar stream or a filename to a tarball.\n\n# client.star(package, starred, cb)\n\n* `package` {String} Name of the package to star\n* `starred` {Boolean} True to star the package, false to unstar it.\n* `cb` {Function}\n\nStar or unstar a package.\n\nNote that the user does not have to be the package owner to star or\nunstar a package, though other writes do require that the user be the\npackage owner.\n\n# client.stars(username, cb)\n\n* `username` {String} Name of user to fetch starred packages for.\n* `cb` {Function}\n\nView your own or another user's starred packages.\n\n# client.tag(project, version, tag, cb)\n\n* `project` {String} Project name\n* `version` {String} Version to tag\n* `tag` {String} Tag name to apply\n* `cb` {Function}\n\nMark a version in the `dist-tags` hash, so that `pkg@tag`\nwill fetch the specified version.\n\n# client.unpublish(name, [ver], cb)\n\n* `name` {String} package name\n* `ver` {String} version to unpublish. Leave blank to unpublish all\n  versions.\n* `cb` {Function}\n\nRemove a version of a package (or all versions) from the registry.  When\nthe last version us unpublished, the entire document is removed from the\ndatabase.\n\n# client.upload(where, file, [etag], [nofollow], cb)\n\n* `where` {String} URL path to upload to\n* `file` {String | Stream} Either the filename or a readable stream\n* `etag` {String} Cache ETag\n* `nofollow` {Boolean} Do not follow 301/302 responses\n* `cb` {Function}\n\nUpload an attachment.  Mostly used by `client.publish()`.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/npm-registry-client/issues"
-  },
-  "homepage": "https://github.com/isaacs/npm-registry-client",
-  "_id": "npm-registry-client@0.2.29",
-  "_from": "npm-registry-client@latest"
-}
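The npm-registry-client README deleted in the hunk above notes that `_auth` is a base64-encoded `username:password` pair, generated from the `username` and `_password` configs when not supplied directly. A minimal, self-contained sketch of that encoding (the function name and credentials here are illustrative; the real client reads these values from its config object):

```javascript
// Build the base64 "username:password" value used for the
// Authorization header, as described for the `_auth` config.
function makeAuthHeader(username, password) {
  // Buffer.from handles the base64 encoding of "user:pass".
  return Buffer.from(username + ':' + password).toString('base64')
}

console.log(makeAuthHeader('npm-user', 'hunter2'))
```

Note that Node 0.10 (the version this tree ships) spelled this `new Buffer(...)`; `Buffer.from` is the modern equivalent.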
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-user-validate/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-*.swp
-.*.swp
-
-.DS_Store
-*~
-.project
-.settings
-npm-debug.log
-coverage.html
-.idea
-lib-cov
-
-node_modules
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-user-validate/.travis.yml	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-language: node_js
-node_js:
-  - "0.8"
-  - "0.10"
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-user-validate/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Robert Kowalski
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-user-validate/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-[![Build Status](https://travis-ci.org/robertkowalski/npm-user-validate.png?branch=master)](https://travis-ci.org/robertkowalski/npm-user-validate)
-[![devDependency Status](https://david-dm.org/robertkowalski/npm-user-validate/dev-status.png)](https://david-dm.org/robertkowalski/npm-user-validate#info=devDependencies)
-
-# npm-user-validate
-
-Validation for the npm client and npm-www (and probably other npm projects)
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-user-validate/npm-user-validate.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,49 +0,0 @@
-exports.email = email
-exports.pw = pw
-exports.username = username
-
-var requirements = exports.requirements = {
-  username: {
-    lowerCase: 'Username must be lowercase',
-    urlSafe: 'Username may not contain non-url-safe chars',
-    dot: 'Username may not start with "."'
-  },
-  password: {
-    badchars: 'Passwords cannot contain these characters: \'!:@"'
-  },
-  email: {
-    valid: 'Email must be an email address'
-  }
-};
-
-function username (un) {
-  if (un !== un.toLowerCase()) {
-    return new Error(requirements.username.lowerCase)
-  }
-
-  if (un !== encodeURIComponent(un)) {
-    return new Error(requirements.username.urlSafe)
-  }
-
-  if (un.charAt(0) === '.') {
-    return new Error(requirements.username.dot)
-  }
-
-  return null
-}
-
-function email (em) {
-  if (!em.match(/^.+@.+\..+$/)) {
-    return new Error(requirements.email.valid)
-  }
-
-  return null
-}
-
-function pw (pw) {
-  if (pw.match(/['!:@"]/)) {
-    return new Error(requirements.password.badchars)
-  }
-
-  return null
-}
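The validators in the npm-user-validate module deleted above follow one convention: each check returns an `Error` describing the failed requirement, or `null` when the input is acceptable. A self-contained sketch of the username check and how a caller would use it (messages copied from the `requirements` table above):

```javascript
// Validate a username the way npm-user-validate does: return an
// Error naming the broken rule, or null if the name is acceptable.
function username(un) {
  if (un !== un.toLowerCase()) {
    return new Error('Username must be lowercase')
  }
  if (un !== encodeURIComponent(un)) {
    return new Error('Username may not contain non-url-safe chars')
  }
  if (un.charAt(0) === '.') {
    return new Error('Username may not start with "."')
  }
  return null
}

// Callers branch on the return value rather than catching throws.
var err = username('GoodUser')
if (err) console.log('rejected:', err.message)
if (username('gooduser') === null) console.log('accepted')
```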
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npm-user-validate/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,37 +0,0 @@
-{
-  "name": "npm-user-validate",
-  "version": "0.0.3",
-  "description": "User validations for npm",
-  "main": "npm-user-validate.js",
-  "devDependencies": {
-    "tap": "0.4.3"
-  },
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "https://github.com/robertkowalski/npm-user-validate"
-  },
-  "keywords": [
-    "npm",
-    "validation",
-    "registry"
-  ],
-  "author": {
-    "name": "Robert Kowalski",
-    "email": "rok@kowalski.gd"
-  },
-  "license": "BSD",
-  "readme": "[![Build Status](https://travis-ci.org/robertkowalski/npm-user-validate.png?branch=master)](https://travis-ci.org/robertkowalski/npm-user-validate)\n[![devDependency Status](https://david-dm.org/robertkowalski/npm-user-validate/dev-status.png)](https://david-dm.org/robertkowalski/npm-user-validate#info=devDependencies)\n\n# npm-user-validate\n\nValidation for the npm client and npm-www (and probably other npm projects)",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/robertkowalski/npm-user-validate/issues"
-  },
-  "_id": "npm-user-validate@0.0.3",
-  "dist": {
-    "shasum": "7b147d11038083fb0ba2d60ff851dc20322aa9f6"
-  },
-  "_from": "npm-user-validate@0.0.3",
-  "_resolved": "https://registry.npmjs.org/npm-user-validate/-/npm-user-validate-0.0.3.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-/test/fixtures/userconfig-with-gc
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,33 +0,0 @@
-# npmconf
-
-The config thing npm uses
-
-If you are interested in interacting with the config settings that npm
-uses, then use this module.
-
-However, if you are writing a new Node.js program, and want
-configuration functionality similar to what npm has, but for your
-own thing, then I'd recommend using [rc](https://github.com/dominictarr/rc),
-which is probably what you want.
-
-If I were to do it all over again, that's what I'd do for npm.  But,
-alas, there are many systems depending on many of the particulars of
-npm's configuration setup, so it's not worth the cost of changing.
-
-## USAGE
-
-```javascript
-var npmconf = require('npmconf')
-
-// pass in the cli options that you read from the cli
-// or whatever top-level configs you want npm to use for now.
-npmconf.load({some:'configs'}, function (er, conf) {
-  // do stuff with conf
-  conf.get('some', 'cli') // 'configs'
-  conf.get('username') // 'joebobwhatevers'
-  conf.set('foo', 'bar', 'user')
-  conf.save('user', function (er) {
-    // foo = bar is now saved to ~/.npmrc or wherever
-  })
-})
-```
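The npmconf README deleted above loads several configuration layers (CLI options over user config over built-in defaults) and resolves `get` calls through the whole chain. A minimal sketch of that precedence using plain prototype chains (the real module delegates to config-chain/proto-list; the keys and values here are illustrative):

```javascript
// Layered config lookup: a get falls through to the nearest
// layer that defines the key, mimicking npmconf's source chain.
var defaults = { registry: 'https://registry.npmjs.org/', loglevel: 'http' }

var userconf = Object.create(defaults)  // ~/.npmrc layer
userconf.loglevel = 'warn'

var cli = Object.create(userconf)       // command-line layer
cli.registry = 'http://localhost:4873/'

console.log(cli.registry)   // own property wins
console.log(cli.loglevel)   // falls through to the user layer
console.log(cli.registry in defaults)
```

Setting a key on one layer (as `conf.set('foo', 'bar', 'user')` does in the README example) only shadows the layers beneath it; the defaults stay untouched.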
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/config-defs.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,427 +0,0 @@
-// defaults, types, and shorthands.
-
-
-var path = require("path")
-  , url = require("url")
-  , Stream = require("stream").Stream
-  , semver = require("semver")
-  , stableFamily = semver.parse(process.version)
-  , nopt = require("nopt")
-  , osenv = require("osenv")
-
-try {
-  var log = require("npmlog")
-} catch (er) {
-  var util = require('util')
-  var log = { warn: function (m) {
-    console.warn(m + util.format.apply(util, [].slice.call(arguments, 1)))
-  } }
-}
-
-exports.Octal = Octal
-function Octal () {}
-function validateOctal (data, k, val) {
-  // must be either an integer or an octal string.
-  if (typeof val === "number") {
-    data[k] = val
-    return true
-  }
-
-  if (typeof val === "string") {
-    if (val.charAt(0) !== "0" || isNaN(val)) return false
-    data[k] = parseInt(val, 8).toString(8)
-  }
-}
-
-function validateSemver (data, k, val) {
-  if (!semver.valid(val)) return false
-  data[k] = semver.valid(val)
-}
-
-function validateStream (data, k, val) {
-  if (!(val instanceof Stream)) return false
-  data[k] = val
-}
-
-nopt.typeDefs.semver = { type: semver, validate: validateSemver }
-nopt.typeDefs.Octal = { type: Octal, validate: validateOctal }
-nopt.typeDefs.Stream = { type: Stream, validate: validateStream }
-
-nopt.invalidHandler = function (k, val, type, data) {
-  log.warn("invalid config", k + "=" + JSON.stringify(val))
-
-  if (Array.isArray(type)) {
-    if (type.indexOf(url) !== -1) type = url
-    else if (type.indexOf(path) !== -1) type = path
-  }
-
-  switch (type) {
-    case Octal:
-      log.warn("invalid config", "Must be octal number, starting with 0")
-      break
-    case url:
-      log.warn("invalid config", "Must be a full url with 'http://'")
-      break
-    case path:
-      log.warn("invalid config", "Must be a valid filesystem path")
-      break
-    case Number:
-      log.warn("invalid config", "Must be a numeric value")
-      break
-    case Stream:
-      log.warn("invalid config", "Must be an instance of the Stream class")
-      break
-  }
-}
-
-if (!stableFamily || (+stableFamily.minor % 2)) stableFamily = null
-else stableFamily = stableFamily.major + "." + stableFamily.minor
-
-var defaults
-
-var temp = osenv.tmpdir()
-var home = osenv.home()
-
-var uidOrPid = process.getuid ? process.getuid() : process.pid
-
-if (home) process.env.HOME = home
-else home = path.resolve(temp, "npm-" + uidOrPid)
-
-var cacheExtra = process.platform === "win32" ? "npm-cache" : ".npm"
-var cacheRoot = process.platform === "win32" && process.env.APPDATA || home
-var cache = path.resolve(cacheRoot, cacheExtra)
-
-
-var globalPrefix
-Object.defineProperty(exports, "defaults", {get: function () {
-  if (defaults) return defaults
-
-  if (process.env.PREFIX) {
-    globalPrefix = process.env.PREFIX
-  } else if (process.platform === "win32") {
-    // c:\node\node.exe --> prefix=c:\node\
-    globalPrefix = path.dirname(process.execPath)
-  } else {
-    // /usr/local/bin/node --> prefix=/usr/local
-    globalPrefix = path.dirname(path.dirname(process.execPath))
-
-    // destdir only is respected on Unix
-    if (process.env.DESTDIR) {
-      globalPrefix = path.join(process.env.DESTDIR, globalPrefix)
-    }
-  }
-
-  return defaults =
-    { "always-auth" : false
-    , "bin-links" : true
-    , browser : null
-
-    , ca: // the npm CA certificate.
-      [ "-----BEGIN CERTIFICATE-----\n"+
-        "MIIChzCCAfACCQDauvz/KHp8ejANBgkqhkiG9w0BAQUFADCBhzELMAkGA1UEBhMC\n"+
-        "VVMxCzAJBgNVBAgTAkNBMRAwDgYDVQQHEwdPYWtsYW5kMQwwCgYDVQQKEwNucG0x\n"+
-        "IjAgBgNVBAsTGW5wbSBDZXJ0aWZpY2F0ZSBBdXRob3JpdHkxDjAMBgNVBAMTBW5w\n"+
-        "bUNBMRcwFQYJKoZIhvcNAQkBFghpQGl6cy5tZTAeFw0xMTA5MDUwMTQ3MTdaFw0y\n"+
-        "MTA5MDIwMTQ3MTdaMIGHMQswCQYDVQQGEwJVUzELMAkGA1UECBMCQ0ExEDAOBgNV\n"+
-        "BAcTB09ha2xhbmQxDDAKBgNVBAoTA25wbTEiMCAGA1UECxMZbnBtIENlcnRpZmlj\n"+
-        "YXRlIEF1dGhvcml0eTEOMAwGA1UEAxMFbnBtQ0ExFzAVBgkqhkiG9w0BCQEWCGlA\n"+
-        "aXpzLm1lMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDLI4tIqPpRW+ACw9GE\n"+
-        "OgBlJZwK5f8nnKCLK629Pv5yJpQKs3DENExAyOgDcyaF0HD0zk8zTp+ZsLaNdKOz\n"+
-        "Gn2U181KGprGKAXP6DU6ByOJDWmTlY6+Ad1laYT0m64fERSpHw/hjD3D+iX4aMOl\n"+
-        "y0HdbT5m1ZGh6SJz3ZqxavhHLQIDAQABMA0GCSqGSIb3DQEBBQUAA4GBAC4ySDbC\n"+
-        "l7W1WpLmtLGEQ/yuMLUf6Jy/vr+CRp4h+UzL+IQpCv8FfxsYE7dhf/bmWTEupBkv\n"+
-        "yNL18lipt2jSvR3v6oAHAReotvdjqhxddpe5Holns6EQd1/xEZ7sB1YhQKJtvUrl\n"+
-        "ZNufy1Jf1r0ldEGeA+0ISck7s+xSh9rQD2Op\n"+
-        "-----END CERTIFICATE-----\n",
-
-        // "GlobalSign Root CA"
-        "-----BEGIN CERTIFICATE-----\n"+
-        "MIIDdTCCAl2gAwIBAgILBAAAAAABFUtaw5QwDQYJKoZIhvcNAQEFBQAwVzELMAkGA1UEBhMCQkUx\n"+
-        "GTAXBgNVBAoTEEdsb2JhbFNpZ24gbnYtc2ExEDAOBgNVBAsTB1Jvb3QgQ0ExGzAZBgNVBAMTEkds\n"+
-        "b2JhbFNpZ24gUm9vdCBDQTAeFw05ODA5MDExMjAwMDBaFw0yODAxMjgxMjAwMDBaMFcxCzAJBgNV\n"+
-        "BAYTAkJFMRkwFwYDVQQKExBHbG9iYWxTaWduIG52LXNhMRAwDgYDVQQLEwdSb290IENBMRswGQYD\n"+
-        "VQQDExJHbG9iYWxTaWduIFJvb3QgQ0EwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDa\n"+
-        "DuaZjc6j40+Kfvvxi4Mla+pIH/EqsLmVEQS98GPR4mdmzxzdzxtIK+6NiY6arymAZavpxy0Sy6sc\n"+
-        "THAHoT0KMM0VjU/43dSMUBUc71DuxC73/OlS8pF94G3VNTCOXkNz8kHp1Wrjsok6Vjk4bwY8iGlb\n"+
-        "Kk3Fp1S4bInMm/k8yuX9ifUSPJJ4ltbcdG6TRGHRjcdGsnUOhugZitVtbNV4FpWi6cgKOOvyJBNP\n"+
-        "c1STE4U6G7weNLWLBYy5d4ux2x8gkasJU26Qzns3dLlwR5EiUWMWea6xrkEmCMgZK9FGqkjWZCrX\n"+
-        "gzT/LCrBbBlDSgeF59N89iFo7+ryUp9/k5DPAgMBAAGjQjBAMA4GA1UdDwEB/wQEAwIBBjAPBgNV\n"+
-        "HRMBAf8EBTADAQH/MB0GA1UdDgQWBBRge2YaRQ2XyolQL30EzTSo//z9SzANBgkqhkiG9w0BAQUF\n"+
-        "AAOCAQEA1nPnfE920I2/7LqivjTFKDK1fPxsnCwrvQmeU79rXqoRSLblCKOzyj1hTdNGCbM+w6Dj\n"+
-        "Y1Ub8rrvrTnhQ7k4o+YviiY776BQVvnGCv04zcQLcFGUl5gE38NflNUVyRRBnMRddWQVDf9VMOyG\n"+
-        "j/8N7yy5Y0b2qvzfvGn9LhJIZJrglfCm7ymPAbEVtQwdpf5pLGkkeB6zpxxxYu7KyJesF12KwvhH\n"+
-        "hm4qxFYxldBniYUr+WymXUadDKqC5JlR3XC321Y9YeRq4VzW9v493kHMB65jUr9TU/Qr6cf9tveC\n"+
-        "X4XSQRjbgbMEHMUfpIBvFSDJ3gyICh3WZlXi/EjJKSZp4A==\n"+
-        "-----END CERTIFICATE-----\n",
-
-        // "GlobalSign Root CA - R2"
-        "-----BEGIN CERTIFICATE-----\n"+
-        "MIIDujCCAqKgAwIBAgILBAAAAAABD4Ym5g0wDQYJKoZIhvcNAQEFBQAwTDEgMB4GA1UECxMXR2xv\n"+
-        "YmFsU2lnbiBSb290IENBIC0gUjIxEzARBgNVBAoTCkdsb2JhbFNpZ24xEzARBgNVBAMTCkdsb2Jh\n"+
-        "bFNpZ24wHhcNMDYxMjE1MDgwMDAwWhcNMjExMjE1MDgwMDAwWjBMMSAwHgYDVQQLExdHbG9iYWxT\n"+
-        "aWduIFJvb3QgQ0EgLSBSMjETMBEGA1UEChMKR2xvYmFsU2lnbjETMBEGA1UEAxMKR2xvYmFsU2ln\n"+
-        "bjCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAKbPJA6+Lm8omUVCxKs+IVSbC9N/hHD6\n"+
-        "ErPLv4dfxn+G07IwXNb9rfF73OX4YJYJkhD10FPe+3t+c4isUoh7SqbKSaZeqKeMWhG8eoLrvozp\n"+
-        "s6yWJQeXSpkqBy+0Hne/ig+1AnwblrjFuTosvNYSuetZfeLQBoZfXklqtTleiDTsvHgMCJiEbKjN\n"+
-        "S7SgfQx5TfC4LcshytVsW33hoCmEofnTlEnLJGKRILzdC9XZzPnqJworc5HGnRusyMvo4KD0L5CL\n"+
-        "TfuwNhv2GXqF4G3yYROIXJ/gkwpRl4pazq+r1feqCapgvdzZX99yqWATXgAByUr6P6TqBwMhAo6C\n"+
-        "ygPCm48CAwEAAaOBnDCBmTAOBgNVHQ8BAf8EBAMCAQYwDwYDVR0TAQH/BAUwAwEB/zAdBgNVHQ4E\n"+
-        "FgQUm+IHV2ccHsBqBt5ZtJot39wZhi4wNgYDVR0fBC8wLTAroCmgJ4YlaHR0cDovL2NybC5nbG9i\n"+
-        "YWxzaWduLm5ldC9yb290LXIyLmNybDAfBgNVHSMEGDAWgBSb4gdXZxwewGoG3lm0mi3f3BmGLjAN\n"+
-        "BgkqhkiG9w0BAQUFAAOCAQEAmYFThxxol4aR7OBKuEQLq4GsJ0/WwbgcQ3izDJr86iw8bmEbTUsp\n"+
-        "9Z8FHSbBuOmDAGJFtqkIk7mpM0sYmsL4h4hO291xNBrBVNpGP+DTKqttVCL1OmLNIG+6KYnX3ZHu\n"+
-        "01yiPqFbQfXf5WRDLenVOavSot+3i9DAgBkcRcAtjOj4LaR0VknFBbVPFd5uRHg5h6h+u/N5GJG7\n"+
-        "9G+dwfCMNYxdAfvDbbnvRG15RjF+Cv6pgsH/76tuIMRQyV+dTZsXjAzlAcmgQWpzU/qlULRuJQ/7\n"+
-        "TBj0/VLZjmmx6BEP3ojY+x1J96relc8geMJgEtslQIxq/H5COEBkEveegeGTLg==\n"+
-        "-----END CERTIFICATE-----\n",
-
-        // GlobalSign Organization Validation CA - G2
-        "-----BEGIN CERTIFICATE-----\n"+
-        "MIIEYDCCA0igAwIBAgILBAAAAAABL07hRQwwDQYJKoZIhvcNAQEFBQAwVzELMAkG\n"+
-        "A1UEBhMCQkUxGTAXBgNVBAoTEEdsb2JhbFNpZ24gbnYtc2ExEDAOBgNVBAsTB1Jv\n"+
-        "b3QgQ0ExGzAZBgNVBAMTEkdsb2JhbFNpZ24gUm9vdCBDQTAeFw0xMTA0MTMxMDAw\n"+
-        "MDBaFw0yMjA0MTMxMDAwMDBaMF0xCzAJBgNVBAYTAkJFMRkwFwYDVQQKExBHbG9i\n"+
-        "YWxTaWduIG52LXNhMTMwMQYDVQQDEypHbG9iYWxTaWduIE9yZ2FuaXphdGlvbiBW\n"+
-        "YWxpZGF0aW9uIENBIC0gRzIwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIB\n"+
-        "AQDdNR3yIFQmGtDvpW+Bdllw3Of01AMkHyQOnSKf1Ccyeit87ovjYWI4F6+0S3qf\n"+
-        "ZyEcLZVUunm6tsTyDSF0F2d04rFkCJlgePtnwkv3J41vNnbPMYzl8QbX3FcOW6zu\n"+
-        "zi2rqqlwLwKGyLHQCAeV6irs0Z7kNlw7pja1Q4ur944+ABv/hVlrYgGNguhKujiz\n"+
-        "4MP0bRmn6gXdhGfCZsckAnNate6kGdn8AM62pI3ffr1fsjqdhDFPyGMM5NgNUqN+\n"+
-        "ARvUZ6UYKOsBp4I82Y4d5UcNuotZFKMfH0vq4idGhs6dOcRmQafiFSNrVkfB7cVT\n"+
-        "5NSAH2v6gEaYsgmmD5W+ZoiTAgMBAAGjggElMIIBITAOBgNVHQ8BAf8EBAMCAQYw\n"+
-        "EgYDVR0TAQH/BAgwBgEB/wIBADAdBgNVHQ4EFgQUXUayjcRLdBy77fVztjq3OI91\n"+
-        "nn4wRwYDVR0gBEAwPjA8BgRVHSAAMDQwMgYIKwYBBQUHAgEWJmh0dHBzOi8vd3d3\n"+
-        "Lmdsb2JhbHNpZ24uY29tL3JlcG9zaXRvcnkvMDMGA1UdHwQsMCowKKAmoCSGImh0\n"+
-        "dHA6Ly9jcmwuZ2xvYmFsc2lnbi5uZXQvcm9vdC5jcmwwPQYIKwYBBQUHAQEEMTAv\n"+
-        "MC0GCCsGAQUFBzABhiFodHRwOi8vb2NzcC5nbG9iYWxzaWduLmNvbS9yb290cjEw\n"+
-        "HwYDVR0jBBgwFoAUYHtmGkUNl8qJUC99BM00qP/8/UswDQYJKoZIhvcNAQEFBQAD\n"+
-        "ggEBABvgiADHBREc/6stSEJSzSBo53xBjcEnxSxZZ6CaNduzUKcbYumlO/q2IQen\n"+
-        "fPMOK25+Lk2TnLryhj5jiBDYW2FQEtuHrhm70t8ylgCoXtwtI7yw07VKoI5lkS/Z\n"+
-        "9oL2dLLffCbvGSuXL+Ch7rkXIkg/pfcNYNUNUUflWP63n41edTzGQfDPgVRJEcYX\n"+
-        "pOBWYdw9P91nbHZF2krqrhqkYE/Ho9aqp9nNgSvBZnWygI/1h01fwlr1kMbawb30\n"+
-        "hag8IyrhFHvBN91i0ZJsumB9iOQct+R2UTjEqUdOqCsukNK1OFHrwZyKarXMsh3o\n"+
-        "wFZUTKiL8IkyhtyTMr5NGvo1dbU=\n"+
-        "-----END CERTIFICATE-----\n"
-      ]
-
-
-    , cache : cache
-
-    , "cache-lock-stale": 60000
-    , "cache-lock-retries": 10
-    , "cache-lock-wait": 10000
-
-    , "cache-max": Infinity
-    , "cache-min": 10
-
-    , color : true
-    , coverage: false
-    , depth: Infinity
-    , description : true
-    , dev : false
-    , editor : osenv.editor()
-    , "engine-strict": false
-    , force : false
-
-    , "fetch-retries": 2
-    , "fetch-retry-factor": 10
-    , "fetch-retry-mintimeout": 10000
-    , "fetch-retry-maxtimeout": 60000
-
-    , git: "git"
-
-    , global : false
-    , globalconfig : path.resolve(globalPrefix, "etc", "npmrc")
-    , globalignorefile : path.resolve( globalPrefix, "etc", "npmignore")
-    , group : process.platform === "win32" ? 0
-            : process.env.SUDO_GID || (process.getgid && process.getgid())
-    , ignore: ""
-    , "init-module": path.resolve(home, '.npm-init.js')
-    , "init.version" : "0.0.0"
-    , "init.author.name" : ""
-    , "init.author.email" : ""
-    , "init.author.url" : ""
-    , json: false
-    , link: false
-    , loglevel : "http"
-    , logstream : process.stderr
-    , long : false
-    , message : "%s"
-    , "node-version" : process.version
-    , npaturl : "http://npat.npmjs.org/"
-    , npat : false
-    , "onload-script" : false
-    , optional: true
-    , parseable : false
-    , pre: false
-    , prefix : globalPrefix
-    , production: process.env.NODE_ENV === "production"
-    , "proprietary-attribs": true
-    , proxy : process.env.HTTP_PROXY || process.env.http_proxy || null
-    , "https-proxy" : process.env.HTTPS_PROXY || process.env.https_proxy ||
-                      process.env.HTTP_PROXY || process.env.http_proxy || null
-    , "user-agent" : "node/" + process.version
-                     + ' ' + process.platform
-                     + ' ' + process.arch
-    , "rebuild-bundle" : true
-    , registry : "https://registry.npmjs.org/"
-    , rollback : true
-    , save : false
-    , "save-bundle": false
-    , "save-dev" : false
-    , "save-optional" : false
-    , searchopts: ""
-    , searchexclude: null
-    , searchsort: "name"
-    , shell : osenv.shell()
-    , shrinkwrap: true
-    , "sign-git-tag": false
-    , "strict-ssl": true
-    , tag : "latest"
-    , tmp : temp
-    , unicode : true
-    , "unsafe-perm" : process.platform === "win32"
-                    || process.platform === "cygwin"
-                    || !( process.getuid && process.setuid
-                       && process.getgid && process.setgid )
-                    || process.getuid() !== 0
-    , usage : false
-    , user : process.platform === "win32" ? 0 : "nobody"
-    , username : ""
-    , userconfig : path.resolve(home, ".npmrc")
-    , userignorefile : path.resolve(home, ".npmignore")
-    , umask: 022
-    , version : false
-    , versions : false
-    , viewer: process.platform === "win32" ? "browser" : "man"
-    , yes: null
-
-    , _exit : true
-    }
-}})
-
-exports.types =
-  { "always-auth" : Boolean
-  , "bin-links": Boolean
-  , browser : [null, String]
-  , ca: [null, String, Array]
-  , cache : path
-  , "cache-lock-stale": Number
-  , "cache-lock-retries": Number
-  , "cache-lock-wait": Number
-  , "cache-max": Number
-  , "cache-min": Number
-  , color : ["always", Boolean]
-  , coverage: Boolean
-  , depth : Number
-  , description : Boolean
-  , dev : Boolean
-  , editor : String
-  , "engine-strict": Boolean
-  , force : Boolean
-  , "fetch-retries": Number
-  , "fetch-retry-factor": Number
-  , "fetch-retry-mintimeout": Number
-  , "fetch-retry-maxtimeout": Number
-  , git: String
-  , global : Boolean
-  , globalconfig : path
-  , globalignorefile: path
-  , group : [Number, String]
-  , "https-proxy" : [null, url]
-  , "user-agent" : String
-  , ignore : String
-  , "init-module": path
-  , "init.version" : [null, semver]
-  , "init.author.name" : String
-  , "init.author.email" : String
-  , "init.author.url" : ["", url]
-  , json: Boolean
-  , link: Boolean
-  , loglevel : ["silent","win","error","warn","http","info","verbose","silly"]
-  , logstream : Stream
-  , long : Boolean
-  , message: String
-  , "node-version" : [null, semver]
-  , npaturl : url
-  , npat : Boolean
-  , "onload-script" : [null, String]
-  , optional: Boolean
-  , parseable : Boolean
-  , pre: Boolean
-  , prefix: path
-  , production: Boolean
-  , "proprietary-attribs": Boolean
-  , proxy : [null, url]
-  , "rebuild-bundle" : Boolean
-  , registry : [null, url]
-  , rollback : Boolean
-  , save : Boolean
-  , "save-bundle": Boolean
-  , "save-dev" : Boolean
-  , "save-optional" : Boolean
-  , searchopts : String
-  , searchexclude: [null, String]
-  , searchsort: [ "name", "-name"
-                , "description", "-description"
-                , "author", "-author"
-                , "date", "-date"
-                , "keywords", "-keywords" ]
-  , shell : String
-  , shrinkwrap: Boolean
-  , "sign-git-tag": Boolean
-  , "strict-ssl": Boolean
-  , tag : String
-  , tmp : path
-  , unicode : Boolean
-  , "unsafe-perm" : Boolean
-  , usage : Boolean
-  , user : [Number, String]
-  , username : String
-  , userconfig : path
-  , userignorefile : path
-  , umask: Octal
-  , version : Boolean
-  , versions : Boolean
-  , viewer: String
-  , yes: [false, null, Boolean]
-  , _exit : Boolean
-  , _password: String
-  }
-
-exports.shorthands =
-  { s : ["--loglevel", "silent"]
-  , d : ["--loglevel", "info"]
-  , dd : ["--loglevel", "verbose"]
-  , ddd : ["--loglevel", "silly"]
-  , noreg : ["--no-registry"]
-  , N : ["--no-registry"]
-  , reg : ["--registry"]
-  , "no-reg" : ["--no-registry"]
-  , silent : ["--loglevel", "silent"]
-  , verbose : ["--loglevel", "verbose"]
-  , quiet: ["--loglevel", "warn"]
-  , q: ["--loglevel", "warn"]
-  , h : ["--usage"]
-  , H : ["--usage"]
-  , "?" : ["--usage"]
-  , help : ["--usage"]
-  , v : ["--version"]
-  , f : ["--force"]
-  , gangster : ["--force"]
-  , gangsta : ["--force"]
-  , desc : ["--description"]
-  , "no-desc" : ["--no-description"]
-  , "local" : ["--no-global"]
-  , l : ["--long"]
-  , m : ["--message"]
-  , p : ["--parseable"]
-  , porcelain : ["--parseable"]
-  , g : ["--global"]
-  , S : ["--save"]
-  , D : ["--save-dev"]
-  , O : ["--save-optional"]
-  , y : ["--yes"]
-  , n : ["--no-yes"]
-  , B : ["--save-bundle"]
-  }
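The `validateOctal` helper in the config-defs.js hunk above accepts either a number or an octal string starting with `"0"` (as used for the `umask` config). A runnable sketch of the same check, slightly simplified in that it returns `true`/`false` in every branch rather than falling through:

```javascript
// nopt-style validator: coerce an octal string into canonical
// octal form, accept numbers as-is, reject everything else.
function validateOctal(data, k, val) {
  if (typeof val === 'number') {
    data[k] = val
    return true
  }
  if (typeof val === 'string') {
    // must look like an octal literal: leading "0", numeric
    if (val.charAt(0) !== '0' || isNaN(val)) return false
    data[k] = parseInt(val, 8).toString(8)
    return true
  }
  return false
}

var conf = {}
validateOctal(conf, 'umask', '022')
console.log(conf.umask) // canonical octal string, "22"
```

Note the round-trip through `parseInt(val, 8).toString(8)` drops the leading zero, so `"022"` is stored as `"22"`.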
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,3 +0,0 @@
-node_modules
-node_modules/*
-npm_debug.log
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/LICENCE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-Copyright (c) 2011 Dominic Tarr
-
-Permission is hereby granted, free of charge, 
-to any person obtaining a copy of this software and 
-associated documentation files (the "Software"), to 
-deal in the Software without restriction, including 
-without limitation the rights to use, copy, modify, 
-merge, publish, distribute, sublicense, and/or sell 
-copies of the Software, and to permit persons to whom 
-the Software is furnished to do so, 
-subject to the following conditions:
-
-The above copyright notice and this permission notice 
-shall be included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, 
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES 
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
-IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR 
-ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, 
-TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE 
-SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,282 +0,0 @@
-var ProtoList = require('proto-list')
-  , path = require('path')
-  , fs = require('fs')
-  , ini = require('ini')
-  , EE = require('events').EventEmitter
-  , url = require('url')
-  , http = require('http')
-
-var exports = module.exports = function () {
-  var args = [].slice.call(arguments)
-    , conf = new ConfigChain()
-
-  while(args.length) {
-    var a = args.shift()
-    if(a) conf.push
-          ( 'string' === typeof a
-            ? json(a)
-            : a )
-  }
-
-  return conf
-}
-
-//recursively find a file...
-
-var find = exports.find = function () {
-  var rel = path.join.apply(null, [].slice.call(arguments))
-
-  function find(start, rel) {
-    var file = path.join(start, rel)
-    try {
-      fs.statSync(file)
-      return file
-    } catch (err) {
-      if(path.dirname(start) !== start) // root
-        return find(path.dirname(start), rel)
-    }
-  }
-  return find(__dirname, rel)
-}
-
-var parse = exports.parse = function (content, file, type) {
-  content = '' + content
-  // if we don't know what it is, try json and fall back to ini
-  // if we know what it is, then it must be that.
-  if (!type) {
-    try { return JSON.parse(content) }
-    catch (er) { return ini.parse(content) }
-  } else if (type === 'json') {
-    if (this.emit) {
-      try { return JSON.parse(content) }
-      catch (er) { this.emit('error', er) }
-    } else {
-      return JSON.parse(content)
-    }
-  } else {
-    return ini.parse(content)
-  }
-}
-
-var json = exports.json = function () {
-  var args = [].slice.call(arguments).filter(function (arg) { return arg != null })
-  var file = path.join.apply(null, args)
-  var content
-  try {
-    content = fs.readFileSync(file,'utf-8')
-  } catch (err) {
-    return
-  }
-  return parse(content, file, 'json')
-}
-
-var env = exports.env = function (prefix, env) {
-  env = env || process.env
-  var obj = {}
-  var l = prefix.length
-  for(var k in env) {
-    if(k.indexOf(prefix) === 0)
-      obj[k.substring(l)] = env[k]
-  }
-
-  return obj
-}
-
-exports.ConfigChain = ConfigChain
-function ConfigChain () {
-  EE.apply(this)
-  ProtoList.apply(this, arguments)
-  this._awaiting = 0
-  this._saving = 0
-  this.sources = {}
-}
-
-// multi-inheritance-ish
-var extras = {
-  constructor: { value: ConfigChain }
-}
-Object.keys(EE.prototype).forEach(function (k) {
-  extras[k] = Object.getOwnPropertyDescriptor(EE.prototype, k)
-})
-ConfigChain.prototype = Object.create(ProtoList.prototype, extras)
-
-ConfigChain.prototype.del = function (key, where) {
-  // if not specified where, then delete from the whole chain, scorched
-  // earth style
-  if (where) {
-    var target = this.sources[where]
-    target = target && target.data
-    if (!target) {
-      return this.emit('error', new Error('not found '+where))
-    }
-    delete target[key]
-  } else {
-    for (var i = 0, l = this.list.length; i < l; i ++) {
-      delete this.list[i][key]
-    }
-  }
-  return this
-}
-
-ConfigChain.prototype.set = function (key, value, where) {
-  var target
-
-  if (where) {
-    target = this.sources[where]
-    target = target && target.data
-    if (!target) {
-      return this.emit('error', new Error('not found '+where))
-    }
-  } else {
-    target = this.list[0]
-    if (!target) {
-      return this.emit('error', new Error('cannot set, no confs!'))
-    }
-  }
-  target[key] = value
-  return this
-}
-
-ConfigChain.prototype.get = function (key, where) {
-  if (where) {
-    where = this.sources[where]
-    if (where) where = where.data
-    if (where && Object.hasOwnProperty.call(where, key)) return where[key]
-    return undefined
-  }
-  return this.list[0][key]
-}
-
-ConfigChain.prototype.save = function (where, type, cb) {
-  if (typeof type === 'function') cb = type, type = null
-  var target = this.sources[where]
-  if (!target || !(target.path || target.source) || !target.data) {
-    // TODO: maybe save() to a url target could be a PUT or something?
-    // would be easy to swap out with a reddis type thing, too
-    return this.emit('error', new Error('bad save target: '+where))
-  }
-
-  if (target.source) {
-    var pref = target.prefix || ''
-    Object.keys(target.data).forEach(function (k) {
-      target.source[pref + k] = target.data[k]
-    })
-    return this
-  }
-
-  var type = type || target.type
-  var data = target.data
-  if (target.type === 'json') {
-    data = JSON.stringify(data)
-  } else {
-    data = ini.stringify(data)
-  }
-
-  this._saving ++
-  fs.writeFile(target.path, data, 'utf8', function (er) {
-    this._saving --
-    if (er) {
-      if (cb) return cb(er)
-      else return this.emit('error', er)
-    }
-    if (this._saving === 0) {
-      if (cb) cb()
-      this.emit('save')
-    }
-  }.bind(this))
-  return this
-}
-
-ConfigChain.prototype.addFile = function (file, type, name) {
-  name = name || file
-  var marker = {__source__:name}
-  this.sources[name] = { path: file, type: type }
-  this.push(marker)
-  this._await()
-  fs.readFile(file, 'utf8', function (er, data) {
-    if (er) this.emit('error', er)
-    this.addString(data, file, type, marker)
-  }.bind(this))
-  return this
-}
-
-ConfigChain.prototype.addEnv = function (prefix, env, name) {
-  name = name || 'env'
-  var data = exports.env(prefix, env)
-  this.sources[name] = { data: data, source: env, prefix: prefix }
-  return this.add(data, name)
-}
-
-ConfigChain.prototype.addUrl = function (req, type, name) {
-  this._await()
-  var href = url.format(req)
-  name = name || href
-  var marker = {__source__:name}
-  this.sources[name] = { href: href, type: type }
-  this.push(marker)
-  http.request(req, function (res) {
-    var c = []
-    var ct = res.headers['content-type']
-    if (!type) {
-      type = ct.indexOf('json') !== -1 ? 'json'
-           : ct.indexOf('ini') !== -1 ? 'ini'
-           : href.match(/\.json$/) ? 'json'
-           : href.match(/\.ini$/) ? 'ini'
-           : null
-      marker.type = type
-    }
-
-    res.on('data', c.push.bind(c))
-    .on('end', function () {
-      this.addString(Buffer.concat(c), href, type, marker)
-    }.bind(this))
-    .on('error', this.emit.bind(this, 'error'))
-
-  }.bind(this))
-  .on('error', this.emit.bind(this, 'error'))
-  .end()
-
-  return this
-}
-
-ConfigChain.prototype.addString = function (data, file, type, marker) {
-  data = this.parse(data, file, type)
-  this.add(data, marker)
-  return this
-}
-
-ConfigChain.prototype.add = function (data, marker) {
-  if (marker && typeof marker === 'object') {
-    var i = this.list.indexOf(marker)
-    if (i === -1) {
-      return this.emit('error', new Error('bad marker'))
-    }
-    this.splice(i, 1, data)
-    marker = marker.__source__
-    this.sources[marker] = this.sources[marker] || {}
-    this.sources[marker].data = data
-    // we were waiting for this.  maybe emit 'load'
-    this._resolve()
-  } else {
-    if (typeof marker === 'string') {
-      this.sources[marker] = this.sources[marker] || {}
-      this.sources[marker].data = data
-    }
-    // trigger the load event if nothing was already going to do so.
-    this._await()
-    this.push(data)
-    process.nextTick(this._resolve.bind(this))
-  }
-  return this
-}
-
-ConfigChain.prototype.parse = exports.parse
-
-ConfigChain.prototype._await = function () {
-  this._awaiting++
-}
-
-ConfigChain.prototype._resolve = function () {
-  this._awaiting--
-  if (this._awaiting === 0) this.emit('load', this)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/node_modules/proto-list/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-Copyright 2009, 2010, 2011 Isaac Z. Schlueter.
-All rights reserved.
-
-Permission is hereby granted, free of charge, to any person
-obtaining a copy of this software and associated documentation
-files (the "Software"), to deal in the Software without
-restriction, including without limitation the rights to use,
-copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the
-Software is furnished to do so, subject to the following
-conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
-NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
-HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
-WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
-OTHER DEALINGS IN THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/node_modules/proto-list/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,3 +0,0 @@
-A list of objects, bound by their prototype chain.
-
-Used in npm's config stuff.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/node_modules/proto-list/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,33 +0,0 @@
-{
-  "name": "proto-list",
-  "version": "1.2.2",
-  "description": "A utility for managing a prototype chain",
-  "main": "./proto-list.js",
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "https://github.com/isaacs/proto-list"
-  },
-  "license": {
-    "type": "MIT",
-    "url": "https://github.com/isaacs/proto-list/blob/master/LICENSE"
-  },
-  "devDependencies": {
-    "tap": "0"
-  },
-  "readme": "A list of objects, bound by their prototype chain.\n\nUsed in npm's config stuff.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/proto-list/issues"
-  },
-  "homepage": "https://github.com/isaacs/proto-list",
-  "_id": "proto-list@1.2.2",
-  "_from": "proto-list@~1.2.1"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/node_modules/proto-list/proto-list.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,81 +0,0 @@
-
-module.exports = ProtoList
-
-function ProtoList () {
-  this.list = []
-  var root = null
-  Object.defineProperty(this, 'root', {
-    get: function () { return root },
-    set: function (r) {
-      root = r
-      if (this.list.length) {
-        this.list[this.list.length - 1].__proto__ = r
-      }
-    },
-    enumerable: true,
-    configurable: true
-  })
-}
-
-ProtoList.prototype =
-  { get length () { return this.list.length }
-  , get keys () {
-      var k = []
-      for (var i in this.list[0]) k.push(i)
-      return k
-    }
-  , get snapshot () {
-      var o = {}
-      this.keys.forEach(function (k) { o[k] = this.get(k) }, this)
-      return o
-    }
-  , get store () {
-      return this.list[0]
-    }
-  , push : function (obj) {
-      if (typeof obj !== "object") obj = {valueOf:obj}
-      if (this.list.length >= 1) {
-        this.list[this.list.length - 1].__proto__ = obj
-      }
-      obj.__proto__ = this.root
-      return this.list.push(obj)
-    }
-  , pop : function () {
-      if (this.list.length >= 2) {
-        this.list[this.list.length - 2].__proto__ = this.root
-      }
-      return this.list.pop()
-    }
-  , unshift : function (obj) {
-      obj.__proto__ = this.list[0] || this.root
-      return this.list.unshift(obj)
-    }
-  , shift : function () {
-      if (this.list.length === 1) {
-        this.list[0].__proto__ = this.root
-      }
-      return this.list.shift()
-    }
-  , get : function (key) {
-      return this.list[0][key]
-    }
-  , set : function (key, val, save) {
-      if (!this.length) this.push({})
-      if (save && this.list[0].hasOwnProperty(key)) this.push({})
-      return this.list[0][key] = val
-    }
-  , forEach : function (fn, thisp) {
-      for (var key in this.list[0]) fn.call(thisp, key, this.list[0][key])
-    }
-  , slice : function () {
-      return this.list.slice.apply(this.list, arguments)
-    }
-  , splice : function () {
-      // handle injections
-      var ret = this.list.splice.apply(this.list, arguments)
-      for (var i = 0, l = this.list.length; i < l; i++) {
-        this.list[i].__proto__ = this.list[i + 1] || this.root
-      }
-      return ret
-    }
-  }
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,32 +0,0 @@
-{
-  "name": "config-chain",
-  "version": "1.1.8",
-  "description": "HANDLE CONFIGURATION ONCE AND FOR ALL",
-  "homepage": "http://github.com/dominictarr/config-chain",
-  "repository": {
-    "type": "git",
-    "url": "https://github.com/dominictarr/config-chain.git"
-  },
-  "dependencies": {
-    "proto-list": "~1.2.1",
-    "ini": "1"
-  },
-  "devDependencies": {
-    "tap": "0.3.0"
-  },
-  "author": {
-    "name": "Dominic Tarr",
-    "email": "dominic.tarr@gmail.com",
-    "url": "http://dominictarr.com"
-  },
-  "scripts": {
-    "test": "tap test/"
-  },
-  "readme": "#config-chain\n\nUSE THIS MODULE TO LOAD ALL YOUR CONFIGURATIONS\n\n``` js\n\n  //npm install config-chain\n\n  var cc = require('config-chain')\n    , opts = require('optimist').argv //ALWAYS USE OPTIMIST FOR COMMAND LINE OPTIONS.\n    , env = opts.env || process.env.YOUR_APP_ENV || 'dev' //SET YOUR ENV LIKE THIS.\n\n  // EACH ARG TO CONFIGURATOR IS LOADED INTO CONFIGURATION CHAIN\n  // EARLIER ITEMS OVERIDE LATER ITEMS\n  // PUTS COMMAND LINE OPTS FIRST, AND DEFAULTS LAST!\n\n  //strings are interpereted as filenames.\n  //will be loaded synchronously\n\n  var conf =\n  cc(\n    //OVERRIDE SETTINGS WITH COMMAND LINE OPTS\n    opts,\n\n    //ENV VARS IF PREFIXED WITH 'myApp_'\n\n    cc.env('myApp_'), //myApp_foo = 'like this'\n\n    //FILE NAMED BY ENV\n    path.join(__dirname,  'config.' + env + '.json'),\n\n    //IF `env` is PRODUCTION\n    env === 'prod'\n      ? path.join(__dirname, 'special.json') //load a special file\n      : null //NULL IS IGNORED!\n\n    //SUBDIR FOR ENV CONFIG\n    path.join(__dirname,  'config', env, 'config.json'),\n\n    //SEARCH PARENT DIRECTORIES FROM CURRENT DIR FOR FILE\n    cc.find('config.json'),\n\n    //PUT DEFAULTS LAST\n    {\n      host: 'localhost'\n      port: 8000\n    })\n\n  var host = conf.get('host')\n\n  // or\n\n  var host = conf.store.host\n\n```\n\nFINALLY, EASY FLEXIBLE CONFIGURATIONS!\n\n##see also: [proto-list](https://github.com/isaacs/proto-list/)\n\nWHATS THAT YOU SAY?\n\nYOU WANT A \"CLASS\" SO THAT YOU CAN DO CRAYCRAY JQUERY CRAPS?\n\nEXTEND WITH YOUR OWN FUNCTIONALTY!?\n\n## CONFIGCHAIN LIVES TO SERVE ONLY YOU!\n\n```javascript\nvar cc = require('config-chain')\n\n// all the stuff you did before\nvar config = cc({\n      some: 'object'\n    },\n    cc.find('config.json'),\n    cc.env('myApp_')\n  )\n  // CONFIGS AS A SERVICE, aka \"CaaS\", aka EVERY DEVOPS DREAM OMG!\n  .addUrl('http://configurator:1234/my-configs')\n  // ASYNC FTW!\n  .addFile('/path/to/file.json')\n\n  // OBJECTS ARE OK TOO, they're SYNC but they still ORDER RIGHT\n  // BECAUSE PROMISES ARE USED BUT NO, NOT *THOSE* PROMISES, JUST\n  // ACTUAL PROMISES LIKE YOU MAKE TO YOUR MOM, KEPT OUT OF LOVE\n  .add({ another: 'object' })\n\n  // DIE A THOUSAND DEATHS IF THIS EVER HAPPENS!!\n  .on('error', function (er) {\n    // IF ONLY THERE WAS SOMETHIGN HARDER THAN THROW\n    // MY SORROW COULD BE ADEQUATELY EXPRESSED.  /o\\\n    throw er\n  })\n\n  // THROW A PARTY IN YOUR FACE WHEN ITS ALL LOADED!!\n  .on('load', function (config) {\n    console.awesome('HOLY SHIT!')\n  })\n```\n\n# BORING API DOCS\n\n## cc(...args)\n\nMAKE A CHAIN AND ADD ALL THE ARGS.\n\nIf the arg is a STRING, then it shall be a JSON FILENAME.\n\nSYNC I/O!\n\nRETURN THE CHAIN!\n\n## cc.json(...args)\n\nJoin the args INTO A JSON FILENAME!\n\nSYNC I/O!\n\n## cc.find(relativePath)\n\nSEEK the RELATIVE PATH by climbing the TREE OF DIRECTORIES.\n\nRETURN THE FOUND PATH!\n\nSYNC I/O!\n\n## cc.parse(content, file, type)\n\nParse the content string, and guess the type from either the\nspecified type or the filename.\n\nRETURN THE RESULTING OBJECT!\n\nNO I/O!\n\n## cc.env(prefix, env=process.env)\n\nGet all the keys on the provided env object (or process.env) which are\nprefixed by the specified prefix, and put the values on a new object.\n\nRETURN THE RESULTING OBJECT!\n\nNO I/O!\n\n## cc.ConfigChain()\n\nThe ConfigChain class for CRAY CRAY JQUERY STYLE METHOD CHAINING!\n\nOne of these is returned by the main exported function, as well.\n\nIt inherits (prototypically) from\n[ProtoList](https://github.com/isaacs/proto-list/), and also inherits\n(parasitically) from\n[EventEmitter](http://nodejs.org/api/events.html#events_class_events_eventemitter)\n\nIt has all the methods from both, and except where noted, they are\nunchanged.\n\n### LET IT BE KNOWN THAT chain IS AN INSTANCE OF ConfigChain.\n\n## chain.sources\n\nA list of all the places where it got stuff.  The keys are the names\npassed to addFile or addUrl etc, and the value is an object with some\ninfo about the data source.\n\n## chain.addFile(filename, type, [name=filename])\n\nFilename is the name of the file.  Name is an arbitrary string to be\nused later if you desire.  Type is either 'ini' or 'json', and will\ntry to guess intelligently if omitted.\n\nLoaded files can be saved later.\n\n## chain.addUrl(url, type, [name=url])\n\nSame as the filename thing, but with a url.\n\nCan't be saved later.\n\n## chain.addEnv(prefix, env, [name='env'])\n\nAdd all the keys from the env object that start with the prefix.\n\n## chain.addString(data, file, type, [name])\n\nParse the string and add it to the set.  (Mainly used internally.)\n\n## chain.add(object, [name])\n\nAdd the object to the set.\n\n## chain.root {Object}\n\nThe root from which all the other config objects in the set descend\nprototypically.\n\nPut your defaults here.\n\n## chain.set(key, value, name)\n\nSet the key to the value on the named config object.  If name is\nunset, then set it on the first config object in the set.  (That is,\nthe one with the highest priority, which was added first.)\n\n## chain.get(key, [name])\n\nGet the key from the named config object explicitly, or from the\nresolved configs if not specified.\n\n## chain.save(name, type)\n\nWrite the named config object back to its origin.\n\nCurrently only supported for env and file config types.\n\nFor files, encode the data according to the type.\n\n## chain.on('save', function () {})\n\nWhen one or more files are saved, emits `save` event when they're all\nsaved.\n\n## chain.on('load', function (chain) {})\n\nWhen the config chain has loaded all the specified files and urls and\nsuch, the 'load' event fires.\n",
-  "readmeFilename": "readme.markdown",
-  "bugs": {
-    "url": "https://github.com/dominictarr/config-chain/issues"
-  },
-  "_id": "config-chain@1.1.8",
-  "_from": "config-chain@~1.1.8"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/node_modules/config-chain/readme.markdown	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,228 +0,0 @@
-#config-chain
-
-USE THIS MODULE TO LOAD ALL YOUR CONFIGURATIONS
-
-``` js
-
-  //npm install config-chain
-
-  var cc = require('config-chain')
-    , opts = require('optimist').argv //ALWAYS USE OPTIMIST FOR COMMAND LINE OPTIONS.
-    , env = opts.env || process.env.YOUR_APP_ENV || 'dev' //SET YOUR ENV LIKE THIS.
-
-  // EACH ARG TO CONFIGURATOR IS LOADED INTO CONFIGURATION CHAIN
-  // EARLIER ITEMS OVERIDE LATER ITEMS
-  // PUTS COMMAND LINE OPTS FIRST, AND DEFAULTS LAST!
-
-  //strings are interpereted as filenames.
-  //will be loaded synchronously
-
-  var conf =
-  cc(
-    //OVERRIDE SETTINGS WITH COMMAND LINE OPTS
-    opts,
-
-    //ENV VARS IF PREFIXED WITH 'myApp_'
-
-    cc.env('myApp_'), //myApp_foo = 'like this'
-
-    //FILE NAMED BY ENV
-    path.join(__dirname,  'config.' + env + '.json'),
-
-    //IF `env` is PRODUCTION
-    env === 'prod'
-      ? path.join(__dirname, 'special.json') //load a special file
-      : null //NULL IS IGNORED!
-
-    //SUBDIR FOR ENV CONFIG
-    path.join(__dirname,  'config', env, 'config.json'),
-
-    //SEARCH PARENT DIRECTORIES FROM CURRENT DIR FOR FILE
-    cc.find('config.json'),
-
-    //PUT DEFAULTS LAST
-    {
-      host: 'localhost'
-      port: 8000
-    })
-
-  var host = conf.get('host')
-
-  // or
-
-  var host = conf.store.host
-
-```
-
-FINALLY, EASY FLEXIBLE CONFIGURATIONS!
-
-##see also: [proto-list](https://github.com/isaacs/proto-list/)
-
-WHATS THAT YOU SAY?
-
-YOU WANT A "CLASS" SO THAT YOU CAN DO CRAYCRAY JQUERY CRAPS?
-
-EXTEND WITH YOUR OWN FUNCTIONALTY!?
-
-## CONFIGCHAIN LIVES TO SERVE ONLY YOU!
-
-```javascript
-var cc = require('config-chain')
-
-// all the stuff you did before
-var config = cc({
-      some: 'object'
-    },
-    cc.find('config.json'),
-    cc.env('myApp_')
-  )
-  // CONFIGS AS A SERVICE, aka "CaaS", aka EVERY DEVOPS DREAM OMG!
-  .addUrl('http://configurator:1234/my-configs')
-  // ASYNC FTW!
-  .addFile('/path/to/file.json')
-
-  // OBJECTS ARE OK TOO, they're SYNC but they still ORDER RIGHT
-  // BECAUSE PROMISES ARE USED BUT NO, NOT *THOSE* PROMISES, JUST
-  // ACTUAL PROMISES LIKE YOU MAKE TO YOUR MOM, KEPT OUT OF LOVE
-  .add({ another: 'object' })
-
-  // DIE A THOUSAND DEATHS IF THIS EVER HAPPENS!!
-  .on('error', function (er) {
-    // IF ONLY THERE WAS SOMETHIGN HARDER THAN THROW
-    // MY SORROW COULD BE ADEQUATELY EXPRESSED.  /o\
-    throw er
-  })
-
-  // THROW A PARTY IN YOUR FACE WHEN ITS ALL LOADED!!
-  .on('load', function (config) {
-    console.awesome('HOLY SHIT!')
-  })
-```
-
-# BORING API DOCS
-
-## cc(...args)
-
-MAKE A CHAIN AND ADD ALL THE ARGS.
-
-If the arg is a STRING, then it shall be a JSON FILENAME.
-
-SYNC I/O!
-
-RETURN THE CHAIN!
-
-## cc.json(...args)
-
-Join the args INTO A JSON FILENAME!
-
-SYNC I/O!
-
-## cc.find(relativePath)
-
-SEEK the RELATIVE PATH by climbing the TREE OF DIRECTORIES.
-
-RETURN THE FOUND PATH!
-
-SYNC I/O!
-
-## cc.parse(content, file, type)
-
-Parse the content string, and guess the type from either the
-specified type or the filename.
-
-RETURN THE RESULTING OBJECT!
-
-NO I/O!
-
-## cc.env(prefix, env=process.env)
-
-Get all the keys on the provided env object (or process.env) which are
-prefixed by the specified prefix, and put the values on a new object.
-
-RETURN THE RESULTING OBJECT!
-
-NO I/O!
-
-## cc.ConfigChain()
-
-The ConfigChain class for CRAY CRAY JQUERY STYLE METHOD CHAINING!
-
-One of these is returned by the main exported function, as well.
-
-It inherits (prototypically) from
-[ProtoList](https://github.com/isaacs/proto-list/), and also inherits
-(parasitically) from
-[EventEmitter](http://nodejs.org/api/events.html#events_class_events_eventemitter)
-
-It has all the methods from both, and except where noted, they are
-unchanged.
-
-### LET IT BE KNOWN THAT chain IS AN INSTANCE OF ConfigChain.
-
-## chain.sources
-
-A list of all the places where it got stuff.  The keys are the names
-passed to addFile or addUrl etc, and the value is an object with some
-info about the data source.
-
-## chain.addFile(filename, type, [name=filename])
-
-Filename is the name of the file.  Name is an arbitrary string to be
-used later if you desire.  Type is either 'ini' or 'json', and will
-try to guess intelligently if omitted.
-
-Loaded files can be saved later.
-
-## chain.addUrl(url, type, [name=url])
-
-Same as the filename thing, but with a url.
-
-Can't be saved later.
-
-## chain.addEnv(prefix, env, [name='env'])
-
-Add all the keys from the env object that start with the prefix.
-
-## chain.addString(data, file, type, [name])
-
-Parse the string and add it to the set.  (Mainly used internally.)
-
-## chain.add(object, [name])
-
-Add the object to the set.
-
-## chain.root {Object}
-
-The root from which all the other config objects in the set descend
-prototypically.
-
-Put your defaults here.
-
-## chain.set(key, value, name)
-
-Set the key to the value on the named config object.  If name is
-unset, then set it on the first config object in the set.  (That is,
-the one with the highest priority, which was added first.)
-
-## chain.get(key, [name])
-
-Get the key from the named config object explicitly, or from the
-resolved configs if not specified.
-
-## chain.save(name, type)
-
-Write the named config object back to its origin.
-
-Currently only supported for env and file config types.
-
-For files, encode the data according to the type.
-
-## chain.on('save', function () {})
-
-When one or more files are saved, emits `save` event when they're all
-saved.
-
-## chain.on('load', function (chain) {})
-
-When the config chain has loaded all the specified files and urls and
-such, the 'load' event fires.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/npmconf.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,338 +0,0 @@
-
-var CC = require('config-chain').ConfigChain
-var inherits = require('inherits')
-var configDefs = require('./config-defs.js')
-var types = configDefs.types
-var once = require('once')
-var fs = require('fs')
-var path = require('path')
-var nopt = require('nopt')
-var ini = require('ini')
-var Octal = configDefs.Octal
-var mkdirp = require('mkdirp')
-
-exports.load = load
-exports.Conf = Conf
-exports.loaded = false
-exports.rootConf = null
-exports.usingBuiltin = false
-exports.defs = configDefs
-Object.defineProperty(exports, 'defaults', { get: function () {
-  return configDefs.defaults
-}, enumerable: true })
-Object.defineProperty(exports, 'types', { get: function () {
-  return configDefs.types
-}, enumerable: true })
-
-exports.validate = validate
-
-var myUid = process.env.SUDO_UID !== undefined
-          ? process.env.SUDO_UID : (process.getuid && process.getuid())
-var myGid = process.env.SUDO_GID !== undefined
-          ? process.env.SUDO_GID : (process.getgid && process.getgid())
-
-
-var loading = false
-var loadCbs = []
-function load (cli_, builtin_, cb_) {
-  var cli, builtin, cb
-  for (var i = 0; i < arguments.length; i++)
-    switch (typeof arguments[i]) {
-      case 'string': builtin = arguments[i]; break
-      case 'object': cli = arguments[i]; break
-      case 'function': cb = arguments[i]; break
-    }
-
-  if (!cb)
-    cb = function () {}
-
-  if (exports.loaded) {
-    var ret = exports.loaded
-    if (cli) {
-      ret = new Conf(ret)
-      ret.unshift(cli)
-    }
-    return process.nextTick(cb.bind(null, null, ret))
-  }
-
-  // either a fresh object, or a clone of the passed in obj
-  if (!cli)
-    cli = {}
-  else
-    cli = Object.keys(cli).reduce(function (c, k) {
-      c[k] = cli[k]
-      return c
-    }, {})
-
-  loadCbs.push(cb)
-  if (loading)
-    return
-  loading = true
-
-  cb = once(function (er, conf) {
-    if (!er)
-      exports.loaded = conf
-    loadCbs.forEach(function (fn) {
-      fn(er, conf)
-    })
-    loadCbs.length = 0
-  })
-
-  // check for a builtin if provided.
-  exports.usingBuiltin = !!builtin
-  var rc = exports.rootConf = new Conf()
-  var defaults = configDefs.defaults
-  if (builtin)
-    rc.addFile(builtin, 'builtin')
-  else
-    rc.add({}, 'builtin')
-
-  rc.on('load', function () {
-    var conf = new Conf(rc)
-    conf.usingBuiltin = !!builtin
-    conf.add(cli, 'cli')
-    conf.addEnv()
-    conf.addFile(conf.get('userconfig'), 'user')
-    conf.once('error', cb)
-    conf.once('load', function () {
-      // globalconfig and globalignorefile defaults
-      // need to respond to the "prefix" setting up to this point.
-      // Eg, `npm config get globalconfig --prefix ~/local` should
-      // return `~/local/etc/npmrc`
-      // annoying humans and their expectations!
-      if (conf.get('prefix')) {
-        var etc = path.resolve(conf.get("prefix"), "etc")
-        defaults.globalconfig = path.resolve(etc, "npmrc")
-        defaults.globalignorefile = path.resolve(etc, "npmignore")
-      }
-      conf.addFile(conf.get('globalconfig'), 'global')
-
-      // move the builtin into the conf stack now.
-      conf.root = defaults
-      conf.add(rc.shift(), 'builtin')
-      conf.once('load', function () {
-        // warn about invalid bits.
-        validate(conf)
-        exports.loaded = conf
-        cb(null, conf)
-      })
-    })
-  })
-}
-
-
-// Basically the same as CC, but:
-// 1. Always ini
-// 2. Parses environment variable names in field values
-// 3. Field values that start with ~/ are replaced with process.env.HOME
-// 4. Can inherit from another Conf object, using it as the base.
-inherits(Conf, CC)
-function Conf (base) {
-  if (!(this instanceof Conf))
-    return new Conf(base)
-
-  CC.apply(this)
-
-  if (base)
-    if (base instanceof Conf)
-      this.root = base.list[0] || base.root
-    else
-      this.root = base
-  else
-    this.root = configDefs.defaults
-}
-
-Conf.prototype.save = function (where, cb) {
-  var target = this.sources[where]
-  if (!target || !(target.path || target.source) || !target.data) {
-    if (where !== 'builtin')
-      var er = new Error('bad save target: '+where)
-    if (cb) {
-      process.nextTick(cb.bind(null, er))
-      return this
-    }
-    return this.emit('error', er)
-  }
-
-  if (target.source) {
-    var pref = target.prefix || ''
-    Object.keys(target.data).forEach(function (k) {
-      target.source[pref + k] = target.data[k]
-    })
-    if (cb) process.nextTick(cb)
-    return this
-  }
-
-  var data = target.data
-
-  if (typeof data._password === 'string' &&
-      typeof data.username === 'string') {
-    var auth = data.username + ':' + data._password
-    data = Object.keys(data).reduce(function (c, k) {
-      if (k === 'username' || k === '_password')
-        return c
-      c[k] = data[k]
-      return c
-    }, { _auth: new Buffer(auth, 'utf8').toString('base64') })
-    delete data.username
-    delete data._password
-  }
-
-  data = ini.stringify(data)
-
-  then = then.bind(this)
-  done = done.bind(this)
-  this._saving ++
-
-  var mode = where === 'user' ? 0600 : 0666
-  if (!data.trim())
-    fs.unlink(target.path, done)
-  else {
-    mkdirp(path.dirname(target.path), function (er) {
-      if (er)
-        return then(er)
-      fs.writeFile(target.path, data, 'utf8', function (er) {
-        if (er)
-          return then(er)
-        if (where === 'user' && myUid && myGid)
-          fs.chown(target.path, +myUid, +myGid, then)
-        else
-          then()
-      })
-    })
-  }
-
-  function then (er) {
-    if (er)
-      return done(er)
-    fs.chmod(target.path, mode, done)
-  }
-
-  function done (er) {
-    if (er) {
-      if (cb) return cb(er)
-      else return this.emit('error', er)
-    }
-    this._saving --
-    if (this._saving === 0) {
-      if (cb) cb()
-      this.emit('save')
-    }
-  }
-
-  return this
-}
-
-Conf.prototype.addFile = function (file, name) {
-  name = name || file
-  var marker = {__source__:name}
-  this.sources[name] = { path: file, type: 'ini' }
-  this.push(marker)
-  this._await()
-  fs.readFile(file, 'utf8', function (er, data) {
-    if (er) // just ignore missing files.
-      return this.add({}, marker)
-    this.addString(data, file, 'ini', marker)
-  }.bind(this))
-  return this
-}
-
-// always ini files.
-Conf.prototype.parse = function (content, file) {
-  return CC.prototype.parse.call(this, content, file, 'ini')
-}
-
-Conf.prototype.add = function (data, marker) {
-  Object.keys(data).forEach(function (k) {
-    data[k] = parseField(data[k], k)
-  })
-  if (Object.prototype.hasOwnProperty.call(data, '_auth')) {
-    var auth = new Buffer(data._auth, 'base64').toString('utf8').split(':')
-    var username = auth.shift()
-    var password = auth.join(':')
-    data.username = username
-    data._password = password
-  }
-  return CC.prototype.add.call(this, data, marker)
-}
-
-Conf.prototype.addEnv = function (env) {
-  env = env || process.env
-  var conf = {}
-  Object.keys(env)
-    .filter(function (k) { return k.match(/^npm_config_[^_]/i) })
-    .forEach(function (k) {
-      if (!env[k])
-        return
-
-      conf[k.replace(/^npm_config_/i, '')
-            .toLowerCase()
-            .replace(/_/g, '-')] = env[k]
-    })
-  return CC.prototype.addEnv.call(this, '', conf, 'env')
-}
-
-function parseField (f, k, emptyIsFalse) {
-  if (typeof f !== 'string' && !(f instanceof String))
-    return f
-
-  // type can be an array or single thing.
-  var typeList = [].concat(types[k])
-  var isPath = -1 !== typeList.indexOf(path)
-  var isBool = -1 !== typeList.indexOf(Boolean)
-  var isString = -1 !== typeList.indexOf(String)
-  var isOctal = -1 !== typeList.indexOf(Octal)
-  var isNumber = isOctal || (-1 !== typeList.indexOf(Number))
-
-  f = (''+f).trim()
-
-  if (f.match(/^".*"$/))
-    f = JSON.parse(f)
-
-  if (isBool && !isString && f === '')
-    return true
-
-  switch (f) {
-    case 'true': return true
-    case 'false': return false
-    case 'null': return null
-    case 'undefined': return undefined
-  }
-
-  f = envReplace(f)
-
-  if (isPath) {
-    var homePattern = process.platform === 'win32' ? /^~(\/|\\)/ : /^~\//
-    if (f.match(homePattern) && process.env.HOME) {
-      f = path.resolve(process.env.HOME, f.substr(2))
-    }
-    f = path.resolve(f)
-  }
-
-  if (isNumber && !isNaN(f))
-    f = isOctal ? parseInt(f, 8) : +f
-
-  return f
-}
-
-function envReplace (f) {
-  if (typeof f !== "string" || !f) return f
-
-  // replace any ${ENV} values with the appropriate environ.
-  var envExpr = /(\\*)\$\{([^}]+)\}/g
-  return f.replace(envExpr, function (orig, esc, name, i, s) {
-    esc = esc.length && esc.length % 2
-    if (esc)
-      return orig
-    if (undefined === process.env[name])
-      throw new Error("Failed to replace env in config: "+orig)
-    return process.env[name]
-  })
-}
-
-function validate (cl) {
-  // warn about invalid configs at every level.
-  cl.list.forEach(function (conf, level) {
-    nopt.clean(conf, configDefs.types)
-  })
-}
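Taken together, `parseField` and `envReplace` above implement a small string-coercion pipeline for ini values: trim, keyword coercion, then `${NAME}` environment substitution. A simplified, self-contained sketch of that pipeline (the `coerce` helper name is hypothetical, not part of npmconf's API; path resolution and type lists are omitted):

```javascript
// Simplified sketch of npmconf's field coercion (hypothetical helper):
// trim, coerce keyword strings, then expand ${NAME} from a given env.
function coerce(value, env) {
  if (typeof value !== 'string') return value
  value = value.trim()
  switch (value) {
    case 'true': return true
    case 'false': return false
    case 'null': return null
    case 'undefined': return undefined
  }
  // Replace ${NAME} with env[NAME]; an odd number of leading backslashes
  // escapes the expansion, mirroring the (\\*)\$\{...\} regex above.
  return value.replace(/(\\*)\$\{([^}]+)\}/g, function (orig, esc, name) {
    if (esc.length % 2) return orig
    if (env[name] === undefined) {
      throw new Error('Failed to replace env in config: ' + orig)
    }
    return env[name]
  })
}
```

So `coerce('${HOME}/cache', { HOME: '/home/me' })` yields `'/home/me/cache'`, while `coerce('true', {})` yields the boolean `true`.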
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmconf/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,50 +0,0 @@
-{
-  "name": "npmconf",
-  "version": "0.1.5",
-  "description": "The config thing npm uses",
-  "main": "npmconf.js",
-  "directories": {
-    "test": "test"
-  },
-  "dependencies": {
-    "config-chain": "~1.1.8",
-    "inherits": "~2.0.0",
-    "once": "~1.3.0",
-    "mkdirp": "~0.3.3",
-    "osenv": "0.0.3",
-    "nopt": "2",
-    "semver": "2",
-    "ini": "~1.1.0"
-  },
-  "devDependencies": {
-    "tap": "~0.4.0"
-  },
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/npmconf"
-  },
-  "keywords": [
-    "npm",
-    "config",
-    "config-chain",
-    "conf",
-    "ini"
-  ],
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me"
-  },
-  "license": "BSD",
-  "readme": "# npmconf\n\nThe config thing npm uses\n\nIf you are interested in interacting with the config settings that npm\nuses, then use this module.\n\nHowever, if you are writing a new Node.js program, and want\nconfiguration functionality similar to what npm has, but for your\nown thing, then I'd recommend using [rc](https://github.com/dominictarr/rc),\nwhich is probably what you want.\n\nIf I were to do it all over again, that's what I'd do for npm.  But,\nalas, there are many systems depending on many of the particulars of\nnpm's configuration setup, so it's not worth the cost of changing.\n\n## USAGE\n\n```javascript\nvar npmconf = require('npmconf')\n\n// pass in the cli options that you read from the cli\n// or whatever top-level configs you want npm to use for now.\nnpmconf.load({some:'configs'}, function (er, conf) {\n  // do stuff with conf\n  conf.get('some', 'cli') // 'configs'\n  conf.get('username') // 'joebobwhatevers'\n  conf.set('foo', 'bar', 'user')\n  conf.save('user', function (er) {\n    // foo = bar is now saved to ~/.npmrc or wherever\n  })\n})\n```\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/npmconf/issues"
-  },
-  "homepage": "https://github.com/isaacs/npmconf",
-  "_id": "npmconf@0.1.5",
-  "_from": "npmconf@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmlog/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmlog/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,153 +0,0 @@
-# npmlog
-
-The logger util that npm uses.
-
-This logger is very basic.  It does the logging for npm.  It supports
-custom levels and colored output.
-
-By default, logs are written to stderr.  If you want to send log messages
-to outputs other than streams, then you can change the `log.stream`
-member, or you can just listen to the events that it emits, and do
-whatever you want with them.
-
-# Basic Usage
-
-```
-var log = require('npmlog')
-
-// additional stuff ---------------------------+
-// message ----------+                         |
-// prefix ----+      |                         |
-// level -+   |      |                         |
-//        v   v      v                         v
-    log.info('fyi', 'I have a kitty cat: %j', myKittyCat)
-```
-
-## log.level
-
-* {String}
-
-The level to display logs at.  Any logs at or above this level will be
-displayed.  The special level `silent` will prevent anything from being
-displayed ever.
-
-## log.record
-
-* {Array}
-
-An array of all the log messages that have been entered.
-
-## log.maxRecordSize
-
-* {Number}
-
-The maximum number of records to keep.  If log.record gets bigger than
-10% over this value, then it is sliced down to 90% of this value.
-
-The reason for the 10% window is so that it doesn't have to resize a
-large array on every log entry.
-
-## log.prefixStyle
-
-* {Object}
-
-A style object that specifies how prefixes are styled.  (See below)
-
-## log.headingStyle
-
-* {Object}
-
-A style object that specifies how the heading is styled.  (See below)
-
-## log.heading
-
-* {String} Default: ""
-
-If set, a heading that is printed at the start of every line.
-
-## log.stream
-
-* {Stream} Default: `process.stderr`
-
-The stream where output is written.
-
-## log.enableColor()
-
-Force colors to be used on all messages, regardless of the output
-stream.
-
-## log.disableColor()
-
-Disable colors on all messages.
-
-## log.pause()
-
-Stop emitting messages to the stream, but do not drop them.
-
-## log.resume()
-
-Emit all buffered messages that were written while paused.
-
-## log.log(level, prefix, message, ...)
-
-* `level` {String} The level to emit the message at
-* `prefix` {String} A string prefix.  Set to "" to skip.
-* `message...` Arguments to `util.format`
-
-Emit a log message at the specified level.
-
-## log\[level](prefix, message, ...)
-
-For example,
-
-* log.silly(prefix, message, ...)
-* log.verbose(prefix, message, ...)
-* log.info(prefix, message, ...)
-* log.http(prefix, message, ...)
-* log.warn(prefix, message, ...)
-* log.error(prefix, message, ...)
-
-Like `log.log(level, prefix, message, ...)`.  In this way, each level is
-given a shorthand, so you can do `log.info(prefix, message)`.
-
-## log.addLevel(level, n, style, disp)
-
-* `level` {String} Level indicator
-* `n` {Number} The numeric level
-* `style` {Object} Object with fg, bg, inverse, etc.
-* `disp` {String} Optional replacement for `level` in the output.
-
-Sets up a new level with a shorthand function and so forth.
-
-Note that if the number is `Infinity`, then setting the level to that
-will cause all log messages to be suppressed.  If the number is
-`-Infinity`, then the only way to show it is to enable all log messages.
-
-# Events
-
-Events are all emitted with the message object.
-
-* `log` Emitted for all messages
-* `log.<level>` Emitted for all messages with the `<level>` level.
-* `<prefix>` Messages with prefixes also emit their prefix as an event.
-
-# Style Objects
-
-Style objects can have the following fields:
-
-* `fg` {String} Color for the foreground text
-* `bg` {String} Color for the background
-* `bold`, `inverse`, `underline` {Boolean} Set the associated property
-* `bell` {Boolean} Make a noise (This is pretty annoying, probably.)
-
-# Message Objects
-
-Every log event is emitted with a message object, and the `log.record`
-list contains all of them that have been created.  They have the
-following fields:
-
-* `id` {Number}
-* `level` {String}
-* `prefix` {String}
-* `message` {String} Result of `util.format()`
-* `messageRaw` {Array} Arguments to `util.format()`
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmlog/example.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,39 +0,0 @@
-var log = require('./log.js')
-
-log.heading = 'npm'
-
-console.error('log.level=silly')
-log.level = 'silly'
-log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}})
-log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}})
-log.info('info prefix', 'x = %j', {foo:{bar:'baz'}})
-log.http('http prefix', 'x = %j', {foo:{bar:'baz'}})
-log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}})
-log.error('error prefix', 'x = %j', {foo:{bar:'baz'}})
-log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}})
-
-console.error('log.level=silent')
-log.level = 'silent'
-log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}})
-log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}})
-log.info('info prefix', 'x = %j', {foo:{bar:'baz'}})
-log.http('http prefix', 'x = %j', {foo:{bar:'baz'}})
-log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}})
-log.error('error prefix', 'x = %j', {foo:{bar:'baz'}})
-log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}})
-
-console.error('log.level=info')
-log.level = 'info'
-log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}})
-log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}})
-log.info('info prefix', 'x = %j', {foo:{bar:'baz'}})
-log.http('http prefix', 'x = %j', {foo:{bar:'baz'}})
-log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}})
-log.error('error prefix', 'x = %j', {foo:{bar:'baz'}})
-log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}})
-log.error('404', 'This is a longer\n'+
-                 'message, with some details\n'+
-                 'and maybe a stack.\n'+
-                 new Error('a 404 error').stack)
-log.addLevel('noise', 10000, {beep: true})
-log.noise(false, 'LOUD NOISES')
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmlog/log.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,154 +0,0 @@
-var EE = require('events').EventEmitter
-var log = exports = module.exports = new EE
-var util = require('util')
-
-var ansi = require('ansi')
-log.cursor = ansi(process.stderr)
-log.stream = process.stderr
-
-// by default, let ansi decide based on tty-ness.
-var colorEnabled = undefined
-log.enableColor = function () {
-  colorEnabled = true
-  this.cursor.enabled = true
-}
-log.disableColor = function () {
-  colorEnabled = false
-  this.cursor.enabled = false
-}
-
-// default level
-log.level = 'info'
-
-// temporarily stop emitting, but don't drop
-log.pause = function () {
-  this._paused = true
-}
-
-log.resume = function () {
-  if (!this._paused) return
-  this._paused = false
-
-  var b = this._buffer
-  this._buffer = []
-  b.forEach(function (m) {
-    this.emitLog(m)
-  }, this)
-}
-
-log._buffer = []
-
-var id = 0
-log.record = []
-log.maxRecordSize = 10000
-log.log = function (lvl, prefix, message) {
-  var l = this.levels[lvl]
-  if (l === undefined) {
-    return this.emit('error', new Error(util.format(
-      'Undefined log level: %j', lvl)))
-  }
-
-  var a = new Array(arguments.length - 2)
-  var stack = null
-  for (var i = 2; i < arguments.length; i ++) {
-    var arg = a[i-2] = arguments[i]
-
-    // resolve stack traces to a plain string.
-    if (typeof arg === 'object' && arg &&
-        (arg instanceof Error) && arg.stack) {
-      arg.stack = stack = arg.stack + ''
-    }
-  }
-  if (stack) a.unshift(stack + '\n')
-  message = util.format.apply(util, a)
-
-  var m = { id: id++,
-            level: lvl,
-            prefix: String(prefix || ''),
-            message: message,
-            messageRaw: a }
-
-  this.emit('log', m)
-  this.emit('log.' + lvl, m)
-  if (m.prefix) this.emit(m.prefix, m)
-
-  this.record.push(m)
-  var mrs = this.maxRecordSize
-  var n = this.record.length - mrs
-  if (n > mrs / 10) {
-    var newSize = Math.floor(mrs * 0.9)
-    this.record = this.record.slice(-1 * newSize)
-  }
-
-  this.emitLog(m)
-}.bind(log)
-
-log.emitLog = function (m) {
-  if (this._paused) {
-    this._buffer.push(m)
-    return
-  }
-  var l = this.levels[m.level]
-  if (l === undefined) return
-  if (l < this.levels[this.level]) return
-  if (l > 0 && !isFinite(l)) return
-
-  var style = log.style[m.level]
-  var disp = log.disp[m.level] || m.level
-  m.message.split(/\r?\n/).forEach(function (line) {
-    if (this.heading) {
-      this.write(this.heading, this.headingStyle)
-      this.write(' ')
-    }
-    this.write(disp, log.style[m.level])
-    var p = m.prefix || ''
-    if (p) this.write(' ')
-    this.write(p, this.prefixStyle)
-    this.write(' ' + line + '\n')
-  }, this)
-}
-
-log.write = function (msg, style) {
-  if (!this.cursor) return
-  if (this.stream !== this.cursor.stream) {
-    this.cursor = ansi(this.stream, { enabled: colorEnabled })
-  }
-
-  style = style || {}
-  if (style.fg) this.cursor.fg[style.fg]()
-  if (style.bg) this.cursor.bg[style.bg]()
-  if (style.bold) this.cursor.bold()
-  if (style.underline) this.cursor.underline()
-  if (style.inverse) this.cursor.inverse()
-  if (style.beep) this.cursor.beep()
-  this.cursor.write(msg).reset()
-}
-
-log.addLevel = function (lvl, n, style, disp) {
-  if (!disp) disp = lvl
-  this.levels[lvl] = n
-  this.style[lvl] = style
-  if (!this[lvl]) this[lvl] = function () {
-    var a = new Array(arguments.length + 1)
-    a[0] = lvl
-    for (var i = 0; i < arguments.length; i ++) {
-      a[i + 1] = arguments[i]
-    }
-    return this.log.apply(this, a)
-  }.bind(this)
-  this.disp[lvl] = disp
-}
-
-log.prefixStyle = { fg: 'magenta' }
-log.headingStyle = { fg: 'white', bg: 'black' }
-
-log.style = {}
-log.levels = {}
-log.disp = {}
-log.addLevel('silly', -Infinity, { inverse: true }, 'sill')
-log.addLevel('verbose', 1000, { fg: 'blue', bg: 'black' }, 'verb')
-log.addLevel('info', 2000, { fg: 'green' })
-log.addLevel('http', 3000, { fg: 'green', bg: 'black' })
-log.addLevel('warn', 4000, { fg: 'black', bg: 'yellow' }, 'WARN')
-log.addLevel('error', 5000, { fg: 'red', bg: 'black' }, 'ERR!')
-log.addLevel('silent', Infinity)
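The level machinery in `log.js` boils down to a numeric threshold: `addLevel` maps each name to a number, and `emitLog` drops messages whose number is below `levels[log.level]` (with `Infinity` reserved for `silent`). A minimal sketch of just that filtering (the `makeLogger` factory is hypothetical; real npmlog also styles, records, and buffers output):

```javascript
// Minimal sketch of npmlog-style numeric level filtering
// (hypothetical makeLogger, not the npmlog API).
function makeLogger(write) {
  var levels = {}
  var logger = { level: 'info' }
  logger.addLevel = function (name, n) {
    levels[name] = n
    logger[name] = function (prefix, message) {
      if (levels[name] < levels[logger.level]) return // below threshold
      if (levels[name] > 0 && !isFinite(levels[name])) return // 'silent'
      write(name + ' ' + prefix + ' ' + message)
    }
  }
  return logger
}

var lines = []
var log = makeLogger(function (l) { lines.push(l) })
log.addLevel('verbose', 1000)
log.addLevel('info', 2000)
log.addLevel('error', 5000)
log.verbose('v', 'dropped')  // 1000 < 2000, not written
log.error('err', 'written')  // 5000 >= 2000, written
```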
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/npmlog/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,32 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "npmlog",
-  "description": "logger for npm",
-  "version": "0.0.6",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/npmlog.git"
-  },
-  "main": "log.js",
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "dependencies": {
-    "ansi": "~0.2.1"
-  },
-  "devDependencies": {
-    "tap": ""
-  },
-  "license": "BSD",
-  "readme": "# npmlog\n\nThe logger util that npm uses.\n\nThis logger is very basic.  It does the logging for npm.  It supports\ncustom levels and colored output.\n\nBy default, logs are written to stderr.  If you want to send log messages\nto outputs other than streams, then you can change the `log.stream`\nmember, or you can just listen to the events that it emits, and do\nwhatever you want with them.\n\n# Basic Usage\n\n```\nvar log = require('npmlog')\n\n// additional stuff ---------------------------+\n// message ----------+                         |\n// prefix ----+      |                         |\n// level -+   |      |                         |\n//        v   v      v                         v\n    log.info('fyi', 'I have a kitty cat: %j', myKittyCat)\n```\n\n## log.level\n\n* {String}\n\nThe level to display logs at.  Any logs at or above this level will be\ndisplayed.  The special level `silent` will prevent anything from being\ndisplayed ever.\n\n## log.record\n\n* {Array}\n\nAn array of all the log messages that have been entered.\n\n## log.maxRecordSize\n\n* {Number}\n\nThe maximum number of records to keep.  If log.record gets bigger than\n10% over this value, then it is sliced down to 90% of this value.\n\nThe reason for the 10% window is so that it doesn't have to resize a\nlarge array on every log entry.\n\n## log.prefixStyle\n\n* {Object}\n\nA style object that specifies how prefixes are styled.  (See below)\n\n## log.headingStyle\n\n* {Object}\n\nA style object that specifies how the heading is styled.  (See below)\n\n## log.heading\n\n* {String} Default: \"\"\n\nIf set, a heading that is printed at the start of every line.\n\n## log.stream\n\n* {Stream} Default: `process.stderr`\n\nThe stream where output is written.\n\n## log.enableColor()\n\nForce colors to be used on all messages, regardless of the output\nstream.\n\n## log.disableColor()\n\nDisable colors on all messages.\n\n## log.pause()\n\nStop emitting messages to the stream, but do not drop them.\n\n## log.resume()\n\nEmit all buffered messages that were written while paused.\n\n## log.log(level, prefix, message, ...)\n\n* `level` {String} The level to emit the message at\n* `prefix` {String} A string prefix.  Set to \"\" to skip.\n* `message...` Arguments to `util.format`\n\nEmit a log message at the specified level.\n\n## log\\[level](prefix, message, ...)\n\nFor example,\n\n* log.silly(prefix, message, ...)\n* log.verbose(prefix, message, ...)\n* log.info(prefix, message, ...)\n* log.http(prefix, message, ...)\n* log.warn(prefix, message, ...)\n* log.error(prefix, message, ...)\n\nLike `log.log(level, prefix, message, ...)`.  In this way, each level is\ngiven a shorthand, so you can do `log.info(prefix, message)`.\n\n## log.addLevel(level, n, style, disp)\n\n* `level` {String} Level indicator\n* `n` {Number} The numeric level\n* `style` {Object} Object with fg, bg, inverse, etc.\n* `disp` {String} Optional replacement for `level` in the output.\n\nSets up a new level with a shorthand function and so forth.\n\nNote that if the number is `Infinity`, then setting the level to that\nwill cause all log messages to be suppressed.  If the number is\n`-Infinity`, then the only way to show it is to enable all log messages.\n\n# Events\n\nEvents are all emitted with the message object.\n\n* `log` Emitted for all messages\n* `log.<level>` Emitted for all messages with the `<level>` level.\n* `<prefix>` Messages with prefixes also emit their prefix as an event.\n\n# Style Objects\n\nStyle objects can have the following fields:\n\n* `fg` {String} Color for the foreground text\n* `bg` {String} Color for the background\n* `bold`, `inverse`, `underline` {Boolean} Set the associated property\n* `bell` {Boolean} Make a noise (This is pretty annoying, probably.)\n\n# Message Objects\n\nEvery log event is emitted with a message object, and the `log.record`\nlist contains all of them that have been created.  They have the\nfollowing fields:\n\n* `id` {Number}\n* `level` {String}\n* `prefix` {String}\n* `message` {String} Result of `util.format()`\n* `messageRaw` {Array} Arguments to `util.format()`\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/npmlog/issues"
-  },
-  "_id": "npmlog@0.0.6",
-  "_from": "npmlog@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/once/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/once/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,51 +0,0 @@
-# once
-
-Only call a function once.
-
-## usage
-
-```javascript
-var once = require('once')
-
-function load (file, cb) {
-  cb = once(cb)
-  loader.load(file)
-  loader.once('load', cb)
-  loader.once('error', cb)
-}
-```
-
-Or add to the Function.prototype in a responsible way:
-
-```javascript
-// only has to be done once
-require('once').proto()
-
-function load (file, cb) {
-  cb = cb.once()
-  loader.load(file)
-  loader.once('load', cb)
-  loader.once('error', cb)
-}
-```
-
-Ironically, the prototype feature makes this module twice as
-complicated as necessary.
-
-To check whether your function has been called, use `fn.called`. Once the
-function is called for the first time the return value of the original
-function is saved in `fn.value` and subsequent calls will continue to
-return this value.
-
-```javascript
-var once = require('once')
-
-function load (cb) {
-  cb = once(cb)
-  var stream = createStream()
-  stream.once('data', cb)
-  stream.once('end', function () {
-    if (!cb.called) cb(new Error('not found'))
-  })
-}
-```
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/once/once.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,20 +0,0 @@
-module.exports = once
-
-once.proto = once(function () {
-  Object.defineProperty(Function.prototype, 'once', {
-    value: function () {
-      return once(this)
-    },
-    configurable: true
-  })
-})
-
-function once (fn) {
-  var f = function () {
-    if (f.called) return f.value
-    f.called = true
-    return f.value = fn.apply(this, arguments)
-  }
-  f.called = false
-  return f
-}
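The wrapper above caches the first call's return value on `f.value` and short-circuits every later call. A usage sketch against a fresh copy of the same pattern (inlined here so the example is self-contained):

```javascript
// The once() pattern from once.js: the first call runs fn and caches
// its result; every later call returns the cached value unchanged.
function once(fn) {
  var f = function () {
    if (f.called) return f.value
    f.called = true
    return f.value = fn.apply(this, arguments)
  }
  f.called = false
  return f
}

var calls = 0
var init = once(function (x) { calls++; return x * 2 })
var a = init(21) // runs fn: 42
var b = init(99) // cached: still 42, fn not run again
```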
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/once/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,39 +0,0 @@
-{
-  "name": "once",
-  "version": "1.3.0",
-  "description": "Run a function exactly one time",
-  "main": "once.js",
-  "directories": {
-    "test": "test"
-  },
-  "dependencies": {},
-  "devDependencies": {
-    "tap": "~0.3.0"
-  },
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/once"
-  },
-  "keywords": [
-    "once",
-    "function",
-    "one",
-    "single"
-  ],
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "license": "BSD",
-  "readme": "# once\n\nOnly call a function once.\n\n## usage\n\n```javascript\nvar once = require('once')\n\nfunction load (file, cb) {\n  cb = once(cb)\n  loader.load(file)\n  loader.once('load', cb)\n  loader.once('error', cb)\n}\n```\n\nOr add to the Function.prototype in a responsible way:\n\n```javascript\n// only has to be done once\nrequire('once').proto()\n\nfunction load (file, cb) {\n  cb = cb.once()\n  loader.load(file)\n  loader.once('load', cb)\n  loader.once('error', cb)\n}\n```\n\nIronically, the prototype feature makes this module twice as\ncomplicated as necessary.\n\nTo check whether your function has been called, use `fn.called`. Once the\nfunction is called for the first time the return value of the original\nfunction is saved in `fn.value` and subsequent calls will continue to\nreturn this value.\n\n```javascript\nvar once = require('once')\n\nfunction load (cb) {\n  cb = once(cb)\n  var stream = createStream()\n  stream.once('data', cb)\n  stream.once('end', function () {\n    if (!cb.called) cb(new Error('not found'))\n  })\n}\n```\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/once/issues"
-  },
-  "_id": "once@1.3.0",
-  "_from": "once@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/opener/LICENSE.txt	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,14 +0,0 @@
-            DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE
-                    Version 2, December 2004
-
- Copyright (C) 2012 Domenic Denicola <domenic@domenicdenicola.com>
-
- Everyone is permitted to copy and distribute verbatim or modified
- copies of this license document, and changing it is allowed as long
- as the name is changed.
-
-            DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE
-   TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
-
-  0. You just DO WHAT THE FUCK YOU WANT TO.
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/opener/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,44 +0,0 @@
-# It Opens Stuff
-
-That is, in your desktop environment. This will make *actual windows pop up*, with stuff in them:
-
-```bash
-npm install opener -g
-
-opener http://google.com
-opener ./my-file.txt
-opener firefox
-opener npm run lint
-```
-
-Also if you want to use it programmatically you can do that too:
-
-```js
-var opener = require("opener");
-
-opener("http://google.com");
-opener("./my-file.txt");
-opener("firefox");
-opener("npm run lint");
-```
-
-## Use It for Good
-
-Like opening the user's browser with a test harness in your package's test script:
-
-```json
-{
-    "scripts": {
-        "test": "opener ./test/runner.html"
-    },
-    "devDependencies": {
-        "opener": "*"
-    }
-}
-```
-
-## Why
-
-Because Windows has `start`, Macs have `open`, and *nix has `xdg-open`. At least
-[according to some guy on StackOverflow](http://stackoverflow.com/q/1480971/3191). And I like things that work on all
-three. Like Node.js. And Opener.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/opener/opener.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,55 +0,0 @@
-#!/usr/bin/env node
-
-"use strict";
-
-var childProcess = require("child_process");
-
-function opener(args, options, callback) {
-    // http://stackoverflow.com/q/1480971/3191, but see below for Windows.
-    var command = process.platform === "win32" ? "cmd" :
-                  process.platform === "darwin" ? "open" :
-                  "xdg-open";
-
-    if (typeof args === "string") {
-        args = [args];
-    }
-
-    if (typeof options === "function") {
-        callback = options;
-        options = {};
-    }
-
-    if (options && typeof options === "object" && options.command) {
-        if (process.platform === "win32") {
-            // *always* use cmd on windows
-            args = [options.command].concat(args);
-        } else {
-            command = options.command;
-        }
-    }
-
-    if (process.platform === "win32") {
-        // On Windows, we really want to use the "start" command. But, the rules regarding arguments with spaces, and
-        // escaping them with quotes, can get really arcane. So the easiest way to deal with this is to pass off the
-        // responsibility to "cmd /c", which has that logic built in.
-        //
-        // Furthermore, if "cmd /c" double-quoted the first parameter, then "start" will interpret it as a window title,
-        // so we need to add a dummy empty-string window title: http://stackoverflow.com/a/154090/3191
-        args = ["/c", "start", '""'].concat(args);
-    }
-
-    childProcess.execFile(command, args, options, callback);
-}
-
-// Export `opener` for programmatic access.
-// You might use this to e.g. open a website: `opener("http://google.com")`
-module.exports = opener;
-
-// If we're being called from the command line, just execute, using the command-line arguments.
-if (require.main && require.main.id === module.id) {
-    opener(process.argv.slice(2), function (error) {
-        if (error) {
-            throw error;
-        }
-    });
-}
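The platform dispatch at the top of `opener` can be isolated into a pure function, which makes the `cmd`/`open`/`xdg-open` choice easy to test in isolation (the `openerCommand` helper is hypothetical, extracted for illustration):

```javascript
// Pure sketch of opener's platform dispatch: which launcher handles a
// given process.platform string. On win32 opener shells out through
// "cmd /c start" so cmd's quoting rules handle arguments with spaces.
function openerCommand(platform) {
  if (platform === 'win32') return 'cmd'
  if (platform === 'darwin') return 'open'
  return 'xdg-open' // freedesktop launcher on other *nix
}
```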
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/opener/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,34 +0,0 @@
-{
-  "name": "opener",
-  "description": "Opens stuff, like webpages and files and executables, cross-platform",
-  "version": "1.3.0",
-  "author": {
-    "name": "Domenic Denicola",
-    "email": "domenic@domenicdenicola.com",
-    "url": "http://domenicdenicola.com"
-  },
-  "license": "WTFPL",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/domenic/opener.git"
-  },
-  "bugs": {
-    "url": "http://github.com/domenic/opener/issues"
-  },
-  "main": "opener.js",
-  "bin": {
-    "opener": "opener.js"
-  },
-  "scripts": {
-    "lint": "jshint opener.js"
-  },
-  "devDependencies": {
-    "jshint": ">= 0.9.0"
-  },
-  "readme": "# It Opens Stuff\r\n\r\nThat is, in your desktop environment. This will make *actual windows pop up*, with stuff in them:\r\n\r\n```bash\r\nnpm install opener -g\r\n\r\nopener http://google.com\r\nopener ./my-file.txt\r\nopener firefox\r\nopener npm run lint\r\n```\r\n\r\nAlso if you want to use it programmatically you can do that too:\r\n\r\n```js\r\nvar opener = require(\"opener\");\r\n\r\nopener(\"http://google.com\");\r\nopener(\"./my-file.txt\");\r\nopener(\"firefox\");\r\nopener(\"npm run lint\");\r\n```\r\n\r\n## Use It for Good\r\n\r\nLike opening the user's browser with a test harness in your package's test script:\r\n\r\n```json\r\n{\r\n    \"scripts\": {\r\n        \"test\": \"opener ./test/runner.html\"\r\n    },\r\n    \"devDependencies\": {\r\n        \"opener\": \"*\"\r\n    }\r\n}\r\n```\r\n\r\n## Why\r\n\r\nBecause Windows has `start`, Macs have `open`, and *nix has `xdg-open`. At least\r\n[according to some guy on StackOverflow](http://stackoverflow.com/q/1480971/3191). And I like things that work on all\r\nthree. Like Node.js. And Opener.\r\n",
-  "_id": "opener@1.3.0",
-  "dist": {
-    "shasum": "d72b4b2e61b0a4ca7822a7554070620002fb90d9"
-  },
-  "_from": "opener@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/osenv/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-Copyright (c) Isaac Z. Schlueter
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE NETBSD FOUNDATION, INC. AND CONTRIBUTORS
-``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED
-TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE FOUNDATION OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
-INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
-CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
-ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
-POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/osenv/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,63 +0,0 @@
-# osenv
-
-Look up environment settings specific to different operating systems.
-
-## Usage
-
-```javascript
-var osenv = require('osenv')
-var path = osenv.path()
-var user = osenv.user()
-// etc.
-
-// Some things are not reliably in the env, and have a fallback command:
-var h = osenv.hostname(function (er, hostname) {
-  h = hostname
-})
-// This will still cause it to be memoized, so calling osenv.hostname()
-// is now an immediate operation.
-
-// You can always send a cb, which will get called in the nextTick
-// if it's been memoized, or wait for the fallback data if it wasn't
-// found in the environment.
-osenv.hostname(function (er, hostname) {
-  if (er) console.error('error looking up hostname')
-  else console.log('this machine calls itself %s', hostname)
-})
-```
-
-## osenv.hostname()
-
-The machine name.  Calls `hostname` if not found.
-
-## osenv.user()
-
-The currently logged-in user.  Calls `whoami` if not found.
-
-## osenv.prompt()
-
-Either PS1 on unix, or PROMPT on Windows.
-
-## osenv.tmpdir()
-
-The place where temporary files should be created.
-
-## osenv.home()
-
-No place like it.
-
-## osenv.path()
-
-An array of the places that the operating system will search for
-executables.
-
-## osenv.editor() 
-
-Return the executable name of the editor program.  This uses the EDITOR
-and VISUAL environment variables, and falls back to `vi` on Unix, or
-`notepad.exe` on Windows.
-
-## osenv.shell()
-
-The SHELL on Unix, which Windows calls the ComSpec.  Defaults to 'bash'
-or 'cmd'.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/osenv/osenv.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,80 +0,0 @@
-var isWindows = process.platform === 'win32'
-var windir = isWindows ? process.env.windir || 'C:\\Windows' : null
-var path = require('path')
-var exec = require('child_process').exec
-
-// looking up envs is a bit costly.
-// Also, sometimes we want to have a fallback
-// Pass in a callback to wait for the fallback on failures
-// After the first lookup, always returns the same thing.
-function memo (key, lookup, fallback) {
-  var fell = false
-  var falling = false
-  exports[key] = function (cb) {
-    var val = lookup()
-    if (!val && !fell && !falling && fallback) {
-      fell = true
-      falling = true
-      exec(fallback, function (er, output, stderr) {
-        falling = false
-        if (er) return // oh well, we tried
-        val = output.trim()
-      })
-    }
-    exports[key] = function (cb) {
-      if (cb) process.nextTick(cb.bind(null, null, val))
-      return val
-    }
-    if (cb && !falling) process.nextTick(cb.bind(null, null, val))
-    return val
-  }
-}
-
-memo('user', function () {
-  return ( isWindows
-         ? process.env.USERDOMAIN + '\\' + process.env.USERNAME
-         : process.env.USER
-         )
-}, 'whoami')
-
-memo('prompt', function () {
-  return isWindows ? process.env.PROMPT : process.env.PS1
-})
-
-memo('hostname', function () {
-  return isWindows ? process.env.COMPUTERNAME : process.env.HOSTNAME
-}, 'hostname')
-
-memo('tmpdir', function () {
-  var t = isWindows ? 'temp' : 'tmp'
-  return process.env.TMPDIR ||
-         process.env.TMP ||
-         process.env.TEMP ||
-         ( exports.home() ? path.resolve(exports.home(), t)
-         : isWindows ? path.resolve(windir, t)
-         : '/tmp'
-         )
-})
-
-memo('home', function () {
-  return ( isWindows ? process.env.USERPROFILE
-         : process.env.HOME
-         )
-})
-
-memo('path', function () {
-  return (process.env.PATH ||
-          process.env.Path ||
-          process.env.path).split(isWindows ? ';' : ':')
-})
-
-memo('editor', function () {
-  return process.env.EDITOR ||
-         process.env.VISUAL ||
-         (isWindows ? 'notepad.exe' : 'vi')
-})
-
-memo('shell', function () {
-  return isWindows ? process.env.ComSpec || 'cmd'
-         : process.env.SHELL || 'bash'
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/osenv/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,38 +0,0 @@
-{
-  "name": "osenv",
-  "version": "0.0.3",
-  "main": "osenv.js",
-  "directories": {
-    "test": "test"
-  },
-  "dependencies": {},
-  "devDependencies": {
-    "tap": "~0.2.5"
-  },
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/osenv"
-  },
-  "keywords": [
-    "environment",
-    "variable",
-    "home",
-    "tmpdir",
-    "path",
-    "prompt",
-    "ps1"
-  ],
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "license": "BSD",
-  "description": "Look up environment settings specific to different operating systems",
-  "readme": "# osenv\n\nLook up environment settings specific to different operating systems.\n\n## Usage\n\n```javascript\nvar osenv = require('osenv')\nvar path = osenv.path()\nvar user = osenv.user()\n// etc.\n\n// Some things are not reliably in the env, and have a fallback command:\nvar h = osenv.hostname(function (er, hostname) {\n  h = hostname\n})\n// This will still cause it to be memoized, so calling osenv.hostname()\n// is now an immediate operation.\n\n// You can always send a cb, which will get called in the nextTick\n// if it's been memoized, or wait for the fallback data if it wasn't\n// found in the environment.\nosenv.hostname(function (er, hostname) {\n  if (er) console.error('error looking up hostname')\n  else console.log('this machine calls itself %s', hostname)\n})\n```\n\n## osenv.hostname()\n\nThe machine name.  Calls `hostname` if not found.\n\n## osenv.user()\n\nThe currently logged-in user.  Calls `whoami` if not found.\n\n## osenv.prompt()\n\nEither PS1 on unix, or PROMPT on Windows.\n\n## osenv.tmpdir()\n\nThe place where temporary files should be created.\n\n## osenv.home()\n\nNo place like it.\n\n## osenv.path()\n\nAn array of the places that the operating system will search for\nexecutables.\n\n## osenv.editor() \n\nReturn the executable name of the editor program.  This uses the EDITOR\nand VISUAL environment variables, and falls back to `vi` on Unix, or\n`notepad.exe` on Windows.\n\n## osenv.shell()\n\nThe SHELL on Unix, which Windows calls the ComSpec.  Defaults to 'bash'\nor 'cmd'.\n",
-  "_id": "osenv@0.0.3",
-  "_from": "osenv@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-installed/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
-The ISC License
-
-Copyright (c) Isaac Z. Schlueter
-
-Permission to use, copy, modify, and/or distribute this software for any
-purpose with or without fee is hereby granted, provided that the above
-copyright notice and this permission notice appear in all copies.
-
-THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
-REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
-FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
-INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
-LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
-OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
-PERFORMANCE OF THIS SOFTWARE.
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-installed/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
-# read-installed
-
-Read all the installed packages in a folder, and return a tree
-structure with all the data.
-
-npm uses this.
-
-## Usage
-
-```javascript
-var readInstalled = require("read-installed")
-// depth is optional, defaults to Infinity
-readInstalled(folder, depth, logFunction, function (er, data) {
-  ...
-})
-```
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-installed/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-{
-  "name": "read-installed",
-  "description": "Read all the installed packages in a folder, and return a tree structure with all the data.",
-  "version": "0.2.4",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/read-installed"
-  },
-  "main": "read-installed.js",
-  "scripts": {
-    "test": "node test/basic.js"
-  },
-  "dependencies": {
-    "semver": "2",
-    "slide": "~1.1.3",
-    "read-package-json": "1",
-    "graceful-fs": "~2"
-  },
-  "optionalDependencies": {
-    "graceful-fs": "~2"
-  },
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "license": "ISC",
-  "readme": "# read-installed\n\nRead all the installed packages in a folder, and return a tree\nstructure with all the data.\n\nnpm uses this.\n\n## Usage\n\n```javascript\nvar readInstalled = require(\"read-installed\")\n// depth is optional, defaults to Infinity\nreadInstalled(folder, depth, logFunction, function (er, data) {\n  ...\n})\n```\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/read-installed/issues"
-  },
-  "_id": "read-installed@0.2.4",
-  "_from": "read-installed@~0.2.2"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-installed/read-installed.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,333 +0,0 @@
-
-// Walk through the file-system "database" of installed
-// packages, and create a data object related to the
-// installed versions of each package.
-
-/*
-This will traverse through all node_modules folders,
-resolving the dependencies object to the object corresponding to
-the package that meets that dep, or just the version/range if
-unmet.
-
-Assuming that you had this folder structure:
-
-/path/to
-+-- package.json { name = "root" }
-`-- node_modules
-    +-- foo {bar, baz, asdf}
-    | +-- node_modules
-    |   +-- bar { baz }
-    |   `-- baz
-    `-- asdf
-
-where "foo" depends on bar, baz, and asdf, bar depends on baz,
-and bar and baz are bundled with foo, whereas "asdf" is at
-the higher level (sibling to foo), you'd get this object structure:
-
-{ <package.json data>
-, path: "/path/to"
-, parent: null
-, dependencies:
-  { foo :
-    { version: "1.2.3"
-    , path: "/path/to/node_modules/foo"
-    , parent: <Circular: root>
-    , dependencies:
-      { bar:
-        { parent: <Circular: foo>
-        , path: "/path/to/node_modules/foo/node_modules/bar"
-        , version: "2.3.4"
-        , dependencies: { baz: <Circular: foo.dependencies.baz> }
-        }
-      , baz: { ... }
-      , asdf: <Circular: asdf>
-      }
-    }
-  , asdf: { ... }
-  }
-}
-
-Unmet deps are left as strings.
-Extraneous deps are marked with extraneous:true
-deps that don't meet a requirement are marked with invalid:true
-deps that don't meet a peer requirement are marked with peerInvalid:true
-
-to READ(packagefolder, parentobj, name, reqver)
-obj = read package.json
-installed = ./node_modules/*
-if parentobj is null, and no package.json
-  obj = {dependencies:{<installed>:"*"}}
-deps = Object.keys(obj.dependencies)
-obj.path = packagefolder
-obj.parent = parentobj
-if name, && obj.name !== name, obj.invalid = true
-if reqver, && obj.version !satisfies reqver, obj.invalid = true
-if !reqver && parentobj, obj.extraneous = true
-for each folder in installed
-  obj.dependencies[folder] = READ(packagefolder+node_modules+folder,
-                                  obj, folder, obj.dependencies[folder])
-# walk tree to find unmet deps
-for each dep in obj.dependencies not in installed
-  r = obj.parent
-  while r
-    if r.dependencies[dep]
-      if r.dependencies[dep].version !satisfies obj.dependencies[dep]
-        WARN
-        r.dependencies[dep].invalid = true
-      obj.dependencies[dep] = r.dependencies[dep]
-      r = null
-    else r = r.parent
-return obj
-
-
-TODO:
-1. Find unmet deps in parent directories, searching as node does up
-as far as the left-most node_modules folder.
-2. Ignore anything in node_modules that isn't a package folder.
-
-*/
-
-try {
-  var fs = require("graceful-fs")
-} catch (er) {
-  var fs = require("fs")
-}
-
-var path = require("path")
-var asyncMap = require("slide").asyncMap
-var semver = require("semver")
-var readJson = require("read-package-json")
-var url = require("url")
-
-module.exports = readInstalled
-
-function readInstalled (folder, depth_, log_, cb_) {
-  var depth = Infinity, log = function () {}, cb
-  for (var i = 1; i < arguments.length - 1; i++) {
-    if (typeof arguments[i] === 'number')
-      depth = arguments[i]
-    else if (typeof arguments[i] === 'function')
-      log = arguments[i]
-  }
-  cb = arguments[i]
-
-  readInstalled_(folder, null, null, null, 0, depth, function (er, obj) {
-    if (er) return cb(er)
-    // now obj has all the installed things, where they're installed
-    // figure out the inheritance links, now that the object is built.
-    resolveInheritance(obj, log)
-    cb(null, obj)
-  })
-}
-
-var rpSeen = {}
-function readInstalled_ (folder, parent, name, reqver, depth, maxDepth, cb) {
-  var installed
-    , obj
-    , real
-    , link
-
-  fs.readdir(path.resolve(folder, "node_modules"), function (er, i) {
-    // error indicates that nothing is installed here
-    if (er) i = []
-    installed = i.filter(function (f) { return f.charAt(0) !== "." })
-    next()
-  })
-
-  readJson(path.resolve(folder, "package.json"), function (er, data) {
-    obj = copy(data)
-
-    if (!parent) {
-      obj = obj || true
-      er = null
-    }
-    return next(er)
-  })
-
-  fs.lstat(folder, function (er, st) {
-    if (er) {
-      if (!parent) real = true
-      return next(er)
-    }
-    fs.realpath(folder, function (er, rp) {
-      //console.error("realpath(%j) = %j", folder, rp)
-      real = rp
-      if (st.isSymbolicLink()) link = rp
-      next(er)
-    })
-  })
-
-  var errState = null
-    , called = false
-  function next (er) {
-    if (errState) return
-    if (er) {
-      errState = er
-      return cb(null, [])
-    }
-    //console.error('next', installed, obj && typeof obj, name, real)
-    if (!installed || !obj || !real || called) return
-    called = true
-    if (rpSeen[real]) return cb(null, rpSeen[real])
-    if (obj === true) {
-      obj = {dependencies:{}, path:folder}
-      installed.forEach(function (i) { obj.dependencies[i] = "*" })
-    }
-    if (name && obj.name !== name) obj.invalid = true
-    obj.realName = name || obj.name
-    obj.dependencies = obj.dependencies || {}
-
-    // "foo":"http://blah" is always presumed valid
-    if (reqver
-        && semver.validRange(reqver, true)
-        && !semver.satisfies(obj.version, reqver, true)) {
-      obj.invalid = true
-    }
-
-    if (parent
-        && !(name in parent.dependencies)
-        && !(name in (parent.devDependencies || {}))) {
-      obj.extraneous = true
-    }
-    obj.path = obj.path || folder
-    obj.realPath = real
-    obj.link = link
-    if (parent && !obj.link) obj.parent = parent
-    rpSeen[real] = obj
-    obj.depth = depth
-    //if (depth >= maxDepth) return cb(null, obj)
-    asyncMap(installed, function (pkg, cb) {
-      var rv = obj.dependencies[pkg]
-      if (!rv && obj.devDependencies) rv = obj.devDependencies[pkg]
-      if (depth >= maxDepth) {
-        // just try to get the version number
-        var pkgfolder = path.resolve(folder, "node_modules", pkg)
-          , jsonFile = path.resolve(pkgfolder, "package.json")
-        return readJson(jsonFile, function (er, depData) {
-          // already out of our depth, ignore errors
-          if (er || !depData || !depData.version) return cb(null, obj)
-          obj.dependencies[pkg] = depData.version
-          cb(null, obj)
-        })
-      }
-
-      readInstalled_( path.resolve(folder, "node_modules/"+pkg)
-                    , obj, pkg, obj.dependencies[pkg], depth + 1, maxDepth
-                    , cb )
-
-    }, function (er, installedData) {
-      if (er) return cb(er)
-      installedData.forEach(function (dep) {
-        obj.dependencies[dep.realName] = dep
-      })
-
-      // any strings here are unmet things.  however, if it's
-      // optional, then that's fine, so just delete it.
-      if (obj.optionalDependencies) {
-        Object.keys(obj.optionalDependencies).forEach(function (dep) {
-          if (typeof obj.dependencies[dep] === "string") {
-            delete obj.dependencies[dep]
-          }
-        })
-      }
-      return cb(null, obj)
-    })
-  }
-}
-
-// starting from a root object, call findUnmet on each layer of children
-var riSeen = []
-function resolveInheritance (obj, log) {
-  if (typeof obj !== "object") return
-  if (riSeen.indexOf(obj) !== -1) return
-  riSeen.push(obj)
-  if (typeof obj.dependencies !== "object") {
-    obj.dependencies = {}
-  }
-  Object.keys(obj.dependencies).forEach(function (dep) {
-    findUnmet(obj.dependencies[dep], log)
-  })
-  Object.keys(obj.dependencies).forEach(function (dep) {
-    resolveInheritance(obj.dependencies[dep], log)
-  })
-  findUnmet(obj, log)
-}
-
-// find unmet deps by walking up the tree object.
-// No I/O
-var fuSeen = []
-function findUnmet (obj, log) {
-  if (fuSeen.indexOf(obj) !== -1) return
-  fuSeen.push(obj)
-  //console.error("find unmet", obj.name, obj.parent && obj.parent.name)
-  var deps = obj.dependencies = obj.dependencies || {}
-
-  //console.error(deps)
-  Object.keys(deps)
-    .filter(function (d) { return typeof deps[d] === "string" })
-    .forEach(function (d) {
-      //console.error("find unmet", obj.name, d, deps[d])
-      var r = obj.parent
-        , found = null
-      while (r && !found && typeof deps[d] === "string") {
-        // if r is a valid choice, then use that.
-        found = r.dependencies[d]
-        if (!found && r.realName === d) found = r
-
-        if (!found) {
-          r = r.link ? null : r.parent
-          continue
-        }
-        if ( typeof deps[d] === "string"
-            // url deps presumed innocent.
-            && !url.parse(deps[d]).protocol
-            && !semver.satisfies(found.version, deps[d], true)) {
-          // the bad thing will happen
-          log("unmet dependency", obj.path + " requires "+d+"@'"+deps[d]
-             +"' but will load\n"
-             +found.path+",\nwhich is version "+found.version
-             )
-          found.invalid = true
-        } else {
-          found.extraneous = false
-        }
-        deps[d] = found
-      }
-
-    })
-
-  var peerDeps = obj.peerDependencies = obj.peerDependencies || {}
-  Object.keys(peerDeps).forEach(function (d) {
-    var dependency
-
-    if (!obj.parent) {
-      dependency = obj.dependencies[d]
-
-      // read it as a missing dep
-      if (!dependency) {
-        obj.dependencies[d] = peerDeps[d]
-      }
-    } else {
-      dependency = obj.parent.dependencies && obj.parent.dependencies[d]
-    }
-
-    if (!dependency) return
-
-    dependency.extraneous = false
-
-    if (!semver.satisfies(dependency.version, peerDeps[d], true)) {
-      dependency.peerInvalid = true
-    }
-  })
-
-  return obj
-}
-
-function copy (obj) {
-  if (!obj || typeof obj !== 'object') return obj
-  if (Array.isArray(obj)) return obj.map(copy)
-
-  var o = {}
-  for (var i in obj) o[i] = copy(obj[i])
-  return o
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,15 +0,0 @@
-The ISC License
-
-Copyright (c) Isaac Z. Schlueter
-
-Permission to use, copy, modify, and/or distribute this software for any
-purpose with or without fee is hereby granted, provided that the above
-copyright notice and this permission notice appear in all copies.
-
-THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
-REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
-FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
-INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
-LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
-OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
-PERFORMANCE OF THIS SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,166 +0,0 @@
-# read-package-json
-
-This is the thing that npm uses to read package.json files.  It
-validates some stuff, and loads some default things.
-
-It keeps a cache of the files you've read, so that you don't end
-up reading the same package.json file multiple times.
-
-Note that if you just want to see what's literally in the package.json
-file, you can usually do `var data = require('some-module/package.json')`.
-
-This module is basically only needed by npm, but it's handy to see what
-npm will see when it looks at your package.
-
-## Usage
-
-```javascript
-var readJson = require('read-package-json')
-
-// readJson(filename, [logFunction=noop], [strict=false], cb)
-readJson('/path/to/package.json', console.error, false, function (er, data) {
-  if (er) {
-    console.error("There was an error reading the file")
-    return
-  }
-
-  console.error('the package data is', data)
-});
-```
-
-## readJson(file, [logFn = noop], [strict = false], cb)
-
-* `file` {String} The path to the package.json file
-* `logFn` {Function} Function to handle logging.  Defaults to a noop.
-* `strict` {Boolean} True to enforce SemVer 2.0 version strings, and
-  other strict requirements.
-* `cb` {Function} Gets called with `(er, data)`, as is The Node Way.
-
-Reads the JSON file and does the things.
-
-## `package.json` Fields
-
-See `man 5 package.json` or `npm help json`.
-
-## readJson.log
-
-By default this is a reference to the `npmlog` module.  But if that
-module can't be found, then it'll be set to just a dummy thing that does
-nothing.
-
-Replace with your own `{log,warn,error}` object for fun loggy time.
-
-## readJson.extras(file, data, cb)
-
-Run all the extra stuff relative to the file, with the parsed data.
-
-Modifies the data as it does stuff.  Calls the cb when it's done.
-
-## readJson.extraSet = [fn, fn, ...]
-
-Array of functions that are called by `extras`.  Each one receives the
-arguments `fn(file, data, cb)` and is expected to call `cb(er, data)`
-when done or when an error occurs.
-
-Order is indeterminate, so each function should be completely
-independent.
-
-Mix and match!
-
-## readJson.cache
-
-The `lru-cache` object that readJson uses to not read the same file over
-and over again.  See
-[lru-cache](https://github.com/isaacs/node-lru-cache) for details.
-
-## Other Relevant Files Besides `package.json`
-
-Some other files have an effect on the resulting data object, in the
-following ways:
-
-### `README?(.*)`
-
-If there is a `README` or `README.*` file present, then npm will attach
-a `readme` field to the data with the contents of this file.
-
-Owing to the fact that roughly 100% of existing node modules have
-Markdown README files, it will generally be assumed to be Markdown,
-regardless of the extension.  Please plan accordingly.
-
-### `server.js`
-
-If there is a `server.js` file, and there is not already a
-`scripts.start` field, then `scripts.start` will be set to `node
-server.js`.
-
-### `AUTHORS`
-
-If there is not already a `contributors` field, then the `contributors`
-field will be set to the contents of the `AUTHORS` file, split by lines,
-and parsed.
-
-### `bindings.gyp`
-
-If a bindings.gyp file exists, and there is not already a
-`scripts.install` field, then the `scripts.install` field will be set to
-`node-gyp rebuild`.
-
-### `wscript`
-
-If a wscript file exists, and there is not already a `scripts.install`
-field, then the `scripts.install` field will be set to `node-waf clean ;
-node-waf configure build`.
-
-Note that the `bindings.gyp` file supersedes this, since node-waf has
-been deprecated in favor of node-gyp.
-
-### `index.js`
-
-If the json file does not exist, but there is an `index.js` file
-present instead, and that file has a package comment, then it will try
-to parse the package comment, and use that as the data instead.
-
-A package comment looks like this:
-
-```javascript
-/**package
- * { "name": "my-bare-module"
- * , "version": "1.2.3"
- * , "description": "etc...." }
- **/
-
-// or...
-
-/**package
-{ "name": "my-bare-module"
-, "version": "1.2.3"
-, "description": "etc...." }
-**/
-```
-
-The important thing is that it starts with `/**package`, and ends with
-`**/`.  If the package.json file exists, then the index.js is not
-parsed.
-
-### `{directories.man}/*.[0-9]`
-
-If there is not already a `man` field defined as an array of files or a
-single file, and
-there is a `directories.man` field defined, then that directory will
-be searched for manpages.
-
-Any valid manpages found in that directory will be assigned to the `man`
-array, and installed in the appropriate man directory at package install
-time, when installed globally on a Unix system.
-
-### `{directories.bin}/*`
-
-If there is not already a `bin` field defined as a string filename or a
-hash of `<name> : <filename>` pairs, then the `directories.bin`
-directory will be searched and all the files within it will be linked as
-executables at install time.
-
-When installing locally, npm links bins into `node_modules/.bin`, which
-is in the `PATH` environ when npm runs scripts.  When
-installing globally, they are linked into `{prefix}/bin`, which is
-presumably in the `PATH` environment variable.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-/node_modules/
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/.travis.yml	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-language: node_js
-node_js:
-  - "0.10"
-  - "0.8"
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/AUTHORS	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-# Names sorted by how much code was originally theirs.
-Isaac Z. Schlueter <i@izs.me>
-Meryn Stol <merynstol@gmail.com>
-Robert Kowalski <rok@kowalski.gd>
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,30 +0,0 @@
-This package contains code originally written by Isaac Z. Schlueter. 
-Used with permission.
-
-Copyright (c) Meryn Stol ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,101 +0,0 @@
-# normalize-package-data [![Build Status](https://travis-ci.org/meryn/normalize-package-data.png?branch=master)](https://travis-ci.org/meryn/normalize-package-data)
-
-normalize-package data exports a function that normalizes package metadata. This data is typically found in a package.json file, but in principle could come from any source - for example the npm registry.
-
-normalize-package-data is used by [read-package-json](https://npmjs.org/package/read-package-json) to normalize the data it reads from a package.json file. In turn, read-package-json is used by [npm](https://npmjs.org/package/npm) and various npm-related tools.
-
-## Installation
-
-```
-npm install normalize-package-data
-```
-
-## Usage
-
-Basic usage is really simple. You call the function that normalize-package-data exports. Let's call it `normalizeData`.
-
-```javascript
-normalizeData = require('normalize-package-data')
-packageData = fs.readFileSync("package.json")
-normalizeData(packageData)
-// packageData is now normalized
-```
-
-#### Strict mode
-
-You may activate strict validation by passing true as the second argument.
-
-```javascript
-normalizeData = require('normalize-package-data')
-packageData = fs.readFileSync("package.json")
-warnFn = function(msg) { console.error(msg) }
-normalizeData(packageData, true)
-// packageData is now normalized
-```
-
-If strict mode is activated, only Semver 2.0 version strings are accepted. Otherwise, Semver 1.0 strings are accepted as well. Packages must have a name, and the name field must not contain leading or trailing whitespace.
-
-#### Warnings
-
-Optionally, you may pass a "warning" function. It gets called whenever the `normalizeData` function encounters something that doesn't look right. It indicates less than perfect input data.
-
-```javascript
-normalizeData = require('normalize-package-data')
-packageData = JSON.parse(fs.readFileSync("package.json"))
-warnFn = function(msg) { console.error(msg) }
-normalizeData(packageData, warnFn)
-// packageData is now normalized. Any number of warnings may have been logged.
-```
-
-You may combine strict validation with warnings by passing `true` as the second argument, and `warnFn` as third.
-
-When `private` field is set to `true`, warnings will be suppressed.
-
-### Potential exceptions
-
-If the supplied data has an invalid name or version field, `normalizeData` will throw an error. Depending on where you call `normalizeData`, you may want to catch these errors so you can pass them to a callback.
-
-## What normalization (currently) entails
-
-* The value of `name` field gets trimmed (unless in strict mode).
-* The value of the `version` field gets cleaned by `semver.clean`. See [documentation for the semver module](https://github.com/isaacs/node-semver).
-* If `name` and/or `version` fields are missing, they are set to empty strings.
-* If `files` field is not an array, it will be removed.
-* If `bin` field is a string, then `bin` field will become an object with `name` set to the value of the `name` field, and `bin` set to the original string value.
-* If `man` field is a string, it will become an array with the original string as its sole member.
-* If `keywords` field is a string, it is considered to be a list of keywords separated by one or more white-space characters. It gets converted to an array by splitting on `\s+`.
-* All people fields (`author`, `maintainers`, `contributors`) get converted into objects with name, email and url properties.
-* If `bundledDependencies` field (a typo) exists and `bundleDependencies` field does not, `bundledDependencies` will get renamed to `bundleDependencies`.
-* If the value of any of the dependencies fields  (`dependencies`, `devDependencies`, `optionalDependencies`) is a string, it gets converted into an object with familiar `name=>value` pairs.
-* The values in `optionalDependencies` get added to `dependencies`. The `optionalDependencies` array is left untouched.
-* If `description` field does not exist, but `readme` field does, then (more or less) the first paragraph of text that's found in the readme is taken as the value for `description`.
-* If `repository` field is a string, it will become an object with `url` set to the original string value, and `type` set to `"git"`.
-* If `repository.url` is not a valid URL, but is in the style of "[owner-name]/[repo-name]", `repository.url` will be set to git://github.com/[owner-name]/[repo-name].
-* If `bugs` field is a string, the value of `bugs` field is changed into an object with `url` set to the original string value.
-* If `bugs` field does not exist, but `repository` field points to a repository hosted on GitHub, the value of the `bugs` field gets set to a URL in the form of https://github.com/[owner-name]/[repo-name]/issues. If the repository field points to a GitHub Gist repo URL, the associated HTTP URL is chosen.
-* If `bugs` field is an object, the resulting value only has email and url properties. If email and url properties are not strings, they are ignored. If no valid values for either email or url is found, bugs field will be removed.
-* If `homepage` field is not a string, it will be removed.
-* If the url in the `homepage` field does not specify a protocol, then http is assumed. For example, `myproject.org` will be changed to `http://myproject.org`.
-* If `homepage` field does not exist, but `repository` field points to a repository hosted on GitHub, the value of the `homepage` field gets set to a URL in the form of https://github.com/[owner-name]/[repo-name]/. If the repository field points to a GitHub Gist repo URL, the associated HTTP URL is chosen.
-
-### Rules for name field
-
-If `name` field is given, the value of the name field must be a string. The string may not:
-
-* start with a period.
-* contain the following characters: `/@\s+%`
-* contain any characters that would need to be encoded for use in URLs.
-* resemble the word `node_modules` or `favicon.ico` (case doesn't matter).
-
-### Rules for version field
-
-If `version` field is given, the value of the version field must be a valid *semver* string, as determined by the `semver.valid` method. See [documentation for the semver module](https://github.com/isaacs/node-semver).
-
-## Credits
-
-This package contains code based on read-package-json written by Isaac Z. Schlueter. Used with permission.
-
-## License
-
-normalize-package-data is released under the [BSD 2-Clause License](http://opensource.org/licenses/BSD-2-Clause).  
-Copyright (c) 2013 Meryn Stol  
\ No newline at end of file
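The name rules listed in the README above are enforced by this package's `lib/fixer.js` (see `ensureValidName` later in this changeset). A minimal standalone sketch, with the body adapted from that function (non-strict mode, lowercase check omitted):

```javascript
// Sketch of the README's "Rules for name field", adapted from
// ensureValidName in lib/fixer.js of the deleted package.
function ensureValidName(name) {
  if (name.charAt(0) === "." ||            // may not start with a period
      name.match(/[\/@\s\+%:]/) ||         // no /, @, whitespace, +, % or :
      name !== encodeURIComponent(name) || // no characters needing URL-encoding
      name.toLowerCase() === "node_modules" ||
      name.toLowerCase() === "favicon.ico") {
    throw new Error("Invalid name: " + JSON.stringify(name))
  }
}

ensureValidName("normalize-package-data") // passes silently
try { ensureValidName(".hidden") } catch (e) { console.error(e.message) }
```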
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/lib/core_module_names.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,29 +0,0 @@
-[
-"http",
-"events",
-"util",
-"domain",
-"cluster",
-"buffer",
-"stream",
-"crypto",
-"tls",
-"fs",
-"string_decoder",
-"path",
-"net",
-"dgram",
-"dns",
-"https",
-"url",
-"punycode",
-"readline",
-"repl",
-"vm",
-"child_process",
-"assert",
-"zlib",
-"tty",
-"os",
-"querystring"
-]
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/lib/extract_description.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,14 +0,0 @@
-module.exports = extractDescription
-
-// Extracts description from contents of a readme file in markdown format
-function extractDescription (d) {
-  if (!d) return;
-  if (d === "ERROR: No README data found!") return;
-  // the first block of text before the first heading
-  // that isn't the first line heading
-  d = d.trim().split('\n')
-  for (var s = 0; d[s] && d[s].trim().match(/^(#|$)/); s ++);
-  var l = d.length
-  for (var e = s + 1; e < l && d[e].trim(); e ++);
-  return d.slice(s, e).join(' ').trim()
-}
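For reference, the extraction rule above (skip leading headings and blank lines, then take the first block of non-empty lines) behaves like this; the function body is copied from the deleted file, and the sample readme string is made up for illustration:

```javascript
// Copied from the deleted extract_description.js: pulls a short
// description out of markdown README contents.
function extractDescription(d) {
  if (!d) return;
  if (d === "ERROR: No README data found!") return;
  // the first block of text before the first heading
  // that isn't the first line heading
  d = d.trim().split('\n')
  for (var s = 0; d[s] && d[s].trim().match(/^(#|$)/); s++);
  var l = d.length
  for (var e = s + 1; e < l && d[e].trim(); e++);
  return d.slice(s, e).join(' ').trim()
}

var readme = "# my-module\n\nDoes one thing well.\nNothing more.\n\n## Usage\n..."
console.log(extractDescription(readme)) // "Does one thing well. Nothing more."
```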
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/lib/fixer.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,387 +0,0 @@
-var semver = require("semver")
-var parseGitHubURL = require("github-url-from-git")
-var depTypes = ["dependencies","devDependencies","optionalDependencies"]
-var extractDescription = require("./extract_description")
-var url = require("url")
-var typos = require("./typos")
-var coreModuleNames = require("./core_module_names")
-var githubUserRepo = require("github-url-from-username-repo")
-
-var fixer = module.exports = {
-  // default warning function
-  warn: function() {},
-
-  fixRepositoryField: function(data) {
-    if (data.repositories) {
-      this.warn("'repositories' (plural) Not supported.\n" +
-           "Please pick one as the 'repository' field");
-      data.repository = data.repositories[0]
-    }
-    if (!data.repository) return this.warn('No repository field.')
-    if (typeof data.repository === "string") {
-      data.repository = {
-        type: "git",
-        url: data.repository
-      }
-    }
-    var r = data.repository.url || ""
-    if (r) {
-      var ghurl = parseGitHubURL(r)
-      if (ghurl) {
-        r = ghurl.replace(/^https?:\/\//, 'git://')
-      } else if (githubUserRepo(r)) {
-        // repo has 'user/reponame' filled in as repo
-        data.repository.url = githubUserRepo(r)
-      }
-    }
-
-    if (r.match(/github.com\/[^\/]+\/[^\/]+\.git\.git$/)) {
-      this.warn("Probably broken git url: " + r)
-    }
-  }
-
-, fixTypos: function(data) {
-    Object.keys(typos.topLevel).forEach(function (d) {
-      if (data.hasOwnProperty(d)) {
-        this.warn(makeTypoWarning(d, typos.topLevel[d]))
-      }
-    }, this)
-  }
-
-, fixScriptsField: function(data) {
-    if (!data.scripts) return
-    if (typeof data.scripts !== "object") {
-      this.warn("scripts must be an object")
-      delete data.scripts
-    }
-    Object.keys(data.scripts).forEach(function (k) {
-      if (typeof data.scripts[k] !== "string") {
-        this.warn("script values must be string commands")
-        delete data.scripts[k]
-      } else if (typos.script[k]) {
-        this.warn(makeTypoWarning(k, typos.script[k], "scripts"))
-      }
-    }, this)
-  }
-
-, fixFilesField: function(data) {
-    var files = data.files
-    if (files && !Array.isArray(files)) {
-      this.warn("Invalid 'files' member")
-      delete data.files
-    } else if (data.files) {
-      data.files = data.files.filter(function(file) {
-        if (!file || typeof file !== "string") {
-          this.warn("Invalid filename in 'files' list: " + file)
-          return false
-        } else {
-          return true
-        }
-      }, this)
-    }
-  }
-
-, fixBinField: function(data) {
-    if (!data.bin) return;
-    if (typeof data.bin === "string") {
-      var b = {}
-      b[data.name] = data.bin
-      data.bin = b
-    }
-  }
-
-, fixManField: function(data) {
-    if (!data.man) return;
-    if (typeof data.man === "string") {
-      data.man = [ data.man ]
-    }
-  }
-, fixBundleDependenciesField: function(data) {
-    var bdd = "bundledDependencies"
-    var bd = "bundleDependencies"
-    if (data[bdd] && !data[bd]) {
-      data[bd] = data[bdd]
-      delete data[bdd]
-    }
-    if (data[bd] && !Array.isArray(data[bd])) {
-      this.warn("Invalid 'bundleDependencies' list. " +
-                "Must be array of package names")
-      delete data[bd]
-    } else if (data[bd]) {
-      data[bd] = data[bd].filter(function(bd) {
-        if (!bd || typeof bd !== 'string') {
-          this.warn("Invalid bundleDependencies member: " + bd)
-          return false
-        } else {
-          return true
-        }
-      }, this)
-    }
-  }
-
-, fixDependencies: function(data, strict) {
-    var loose = !strict
-    objectifyDeps(data, this.warn)
-    addOptionalDepsToDeps(data, this.warn)
-    this.fixBundleDependenciesField(data)
-
-    ;['dependencies','devDependencies'].forEach(function(deps) {
-      if (!(deps in data)) return
-      if (!data[deps] || typeof data[deps] !== "object") {
-        this.warn(deps + " field must be an object")
-        delete data[deps]
-        return
-      }
-      Object.keys(data[deps]).forEach(function (d) {
-        var r = data[deps][d]
-        if (typeof r !== 'string') {
-          this.warn('Invalid dependency: ' + d + ' ' + JSON.stringify(r))
-          delete data[deps][d]
-        }
-      }, this)
-    }, this)
-  }
-
-, fixModulesField: function (data) {
-    if (data.modules) {
-      this.warn("modules field is deprecated")
-      delete data.modules
-    }
-  }
-
-, fixKeywordsField: function (data) {
-    if (typeof data.keywords === "string") {
-      data.keywords = data.keywords.split(/,\s+/)
-    }
-    if (data.keywords && !Array.isArray(data.keywords)) {
-      delete data.keywords
-      this.warn("keywords should be an array of strings")
-    } else if (data.keywords) {
-      data.keywords = data.keywords.filter(function(kw) {
-        if (typeof kw !== "string" || !kw) {
-          this.warn("keywords should be an array of strings");
-          return false
-        } else {
-          return true
-        }
-      }, this)
-    }
-  }
-
-, fixVersionField: function(data, strict) {
-    // allow "loose" semver 1.0 versions in non-strict mode
-    // enforce strict semver 2.0 compliance in strict mode
-    var loose = !strict
-    if (!data.version) {
-      data.version = ""
-      return true
-    }
-    if (!semver.valid(data.version, loose)) {
-      throw new Error('Invalid version: "'+ data.version + '"')
-    }
-    data.version = semver.clean(data.version, loose)
-    return true
-  }
-
-, fixPeople: function(data) {
-    modifyPeople(data, unParsePerson)
-    modifyPeople(data, parsePerson)
-  }
-
-, fixNameField: function(data, strict) {
-    if (!data.name && !strict) {
-      data.name = ""
-      return
-    }
-    if (typeof data.name !== "string") {
-      throw new Error("name field must be a string.")
-    }
-    if (!strict)
-      data.name = data.name.trim()
-    ensureValidName(data.name, strict)
-    if (coreModuleNames.indexOf(data.name) !== -1)
-      this.warn(data.name + " is also the name of a node core module.")
-  }
-
-
-, fixDescriptionField: function (data) {
-    if (data.description && typeof data.description !== 'string') {
-      this.warn("'description' field should be a string")
-      delete data.description
-    }
-    if (data.readme && !data.description)
-      data.description = extractDescription(data.readme)
-    if (data.description === undefined) delete data.description;
-    if (!data.description) this.warn('No description')
-  }
-
-, fixReadmeField: function (data) {
-    if (!data.readme) {
-      this.warn("No README data")
-      data.readme = "ERROR: No README data found!"
-    }
-  }
-
-, fixBugsField: function(data) {
-    if (!data.bugs && data.repository && data.repository.url) {
-      var gh = parseGitHubURL(data.repository.url)
-      if(gh) {
-        if(gh.match(/^https:\/\/github.com\//))
-          data.bugs = {url: gh + "/issues"}
-        else // gist url
-          data.bugs = {url: gh}
-      }
-    }
-    else if(data.bugs) {
-      var emailRe = /^.+@.*\..+$/
-      if(typeof data.bugs == "string") {
-        if(emailRe.test(data.bugs))
-          data.bugs = {email:data.bugs}
-        else if(url.parse(data.bugs).protocol)
-          data.bugs = {url: data.bugs}
-        else
-          this.warn("Bug string field must be url, email, or {email,url}")
-      }
-      else {
-        bugsTypos(data.bugs, this.warn)
-        var oldBugs = data.bugs
-        data.bugs = {}
-        if(oldBugs.url) {
-          if(typeof(oldBugs.url) == "string" && url.parse(oldBugs.url).protocol)
-            data.bugs.url = oldBugs.url
-          else
-            this.warn("bugs.url field must be a string url. Deleted.")
-        }
-        if(oldBugs.email) {
-          if(typeof(oldBugs.email) == "string" && emailRe.test(oldBugs.email))
-            data.bugs.email = oldBugs.email
-          else
-            this.warn("bugs.email field must be a string email. Deleted.")
-        }
-      }
-      if(!data.bugs.email && !data.bugs.url) {
-        delete data.bugs
-        this.warn("Normalized value of bugs field is an empty object. Deleted.")
-      }
-    }
-  }
-
-, fixHomepageField: function(data) {
-    if (!data.homepage && data.repository && data.repository.url) {
-      var gh = parseGitHubURL(data.repository.url)
-      if (gh)
-          data.homepage = gh
-      else
-        return true
-    } else if (!data.homepage)
-      return true
-
-    if(typeof data.homepage !== "string") {
-      this.warn("homepage field must be a string url. Deleted.")
-      return delete data.homepage
-    }
-    if(!url.parse(data.homepage).protocol) {
-      this.warn("homepage field must start with a protocol.")
-      data.homepage = "http://" + data.homepage
-    }
-  }
-}
-
-function ensureValidName (name, strict) {
-  if (name.charAt(0) === "." ||
-      name.match(/[\/@\s\+%:]/) ||
-      name !== encodeURIComponent(name) ||
-      (strict && name !== name.toLowerCase()) ||
-      name.toLowerCase() === "node_modules" ||
-      name.toLowerCase() === "favicon.ico") {
-        throw new Error("Invalid name: " + JSON.stringify(name))
-  }
-}
-
-function modifyPeople (data, fn) {
-  if (data.author) data.author = fn(data.author)
-  ;["maintainers", "contributors"].forEach(function (set) {
-    if (!Array.isArray(data[set])) return;
-    data[set] = data[set].map(fn)
-  })
-  return data
-}
-
-function unParsePerson (person) {
-  if (typeof person === "string") return person
-  var name = person.name || ""
-  var u = person.url || person.web
-  var url = u ? (" ("+u+")") : ""
-  var e = person.email || person.mail
-  var email = e ? (" <"+e+">") : ""
-  return name+email+url
-}
-
-function parsePerson (person) {
-  if (typeof person !== "string") return person
-  var name = person.match(/^([^\(<]+)/)
-  var url = person.match(/\(([^\)]+)\)/)
-  var email = person.match(/<([^>]+)>/)
-  var obj = {}
-  if (name && name[0].trim()) obj.name = name[0].trim()
-  if (email) obj.email = email[1];
-  if (url) obj.url = url[1];
-  return obj
-}
-
-function addOptionalDepsToDeps (data, warn) {
-  var o = data.optionalDependencies
-  if (!o) return;
-  var d = data.dependencies || {}
-  Object.keys(o).forEach(function (k) {
-    d[k] = o[k]
-  })
-  data.dependencies = d
-}
-
-function depObjectify (deps, type, warn) {
-  if (!deps) return {}
-  if (typeof deps === "string") {
-    deps = deps.trim().split(/[\n\r\s\t ,]+/)
-  }
-  if (!Array.isArray(deps)) return deps
-  warn("specifying " + type + " as array is deprecated")
-  var o = {}
-  deps.filter(function (d) {
-    return typeof d === "string"
-  }).forEach(function(d) {
-    d = d.trim().split(/(:?[@\s><=])/)
-    var dn = d.shift()
-    var dv = d.join("")
-    dv = dv.trim()
-    dv = dv.replace(/^@/, "")
-    o[dn] = dv
-  })
-  return o
-}
-
-function objectifyDeps (data, warn) {
-  depTypes.forEach(function (type) {
-    if (!data[type]) return;
-    data[type] = depObjectify(data[type], type, warn)
-  })
-}
-
-function bugsTypos(bugs, warn) {
-  if (!bugs) return
-  Object.keys(bugs).forEach(function (k) {
-    if (typos.bugs[k]) {
-      warn(makeTypoWarning(k, typos.bugs[k], "bugs"))
-      bugs[typos.bugs[k]] = bugs[k]
-      delete bugs[k]
-    }
-  })
-}
-
-function makeTypoWarning (providedName, probableName, field) {
-  if (field) {
-    providedName = field + "['" + providedName + "']"
-    probableName = field + "['" + probableName + "']"
-  }
-  return providedName + " should probably be " + probableName + "."
-}
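The people-field normalization in `fixPeople` above round-trips every entry through `unParsePerson` and then `parsePerson`. A self-contained sketch of the parsing half (body copied from the deleted fixer.js; the sample person string is illustrative):

```javascript
// Copied from the deleted fixer.js: parses "Name <email> (url)"
// strings into {name, email, url} objects; non-strings pass through.
function parsePerson(person) {
  if (typeof person !== "string") return person
  var name = person.match(/^([^\(<]+)/)
  var url = person.match(/\(([^\)]+)\)/)
  var email = person.match(/<([^>]+)>/)
  var obj = {}
  if (name && name[0].trim()) obj.name = name[0].trim()
  if (email) obj.email = email[1];
  if (url) obj.url = url[1];
  return obj
}

var p = parsePerson("Meryn Stol <merynstol@gmail.com> (https://github.com/meryn)")
// p.name is "Meryn Stol", p.email is "merynstol@gmail.com"
```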
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/lib/normalize.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,36 +0,0 @@
-module.exports = normalize
-
-var fixer = require("./fixer")
-
-var fieldsToFix = ['name','version','description','repository','modules','scripts'
-                  ,'files','bin','man','bugs','keywords','readme','homepage']
-var otherThingsToFix = ['dependencies','people', 'typos']
-
-var thingsToFix = fieldsToFix.map(function(fieldName) { 
-  return ucFirst(fieldName) + "Field"
-})
-// two ways to do this in CoffeeScript on only one line, sub-70 chars:
-// thingsToFix = fieldsToFix.map (name) -> ucFirst(name) + "Field"
-// thingsToFix = (ucFirst(name) + "Field" for name in fieldsToFix)
-thingsToFix = thingsToFix.concat(otherThingsToFix)
-
-function normalize (data, warn, strict) {
-  if(warn === true) warn = null, strict = true
-  if(!strict) strict = false
-  if(!warn || data.private) warn = function(msg) { /* noop */ }
-
-  if (data.scripts && 
-      data.scripts.install === "node-gyp rebuild" && 
-      !data.scripts.preinstall) {
-    data.gypfile = true
-  }
-  fixer.warn = warn
-  thingsToFix.forEach(function(thingName) {
-    fixer["fix" + ucFirst(thingName)](data, strict)
-  })
-  data._id = data.name + "@" + data.version
-}
-
-function ucFirst (string) {
-  return string.charAt(0).toUpperCase() + string.slice(1);
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/lib/typos.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-{
-  "topLevel": { 
-    "dependancies": "dependencies"
-   ,"dependecies": "dependencies"
-   ,"depdenencies": "dependencies"
-   ,"devEependencies": "devDependencies"
-   ,"depends": "dependencies"
-   ,"dev-dependencies": "devDependencies"
-   ,"devDependences": "devDependencies"
-   ,"devDepenencies": "devDependencies"
-   ,"devdependencies": "devDependencies"
-   ,"repostitory": "repository"
-   ,"prefereGlobal": "preferGlobal"
-   ,"hompage": "homepage"
-   ,"hampage": "homepage"
-   ,"autohr": "author"
-   ,"autor": "author"
-   ,"contributers": "contributors"
-   ,"publicationConfig": "publishConfig"
-   ,"script": "scripts"
-  },
-  "bugs": { "web": "url", "name": "url" },
-  "script": { "server": "start", "tests": "test" }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/node_modules/normalize-package-data/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,49 +0,0 @@
-{
-  "name": "normalize-package-data",
-  "version": "0.2.7",
-  "author": {
-    "name": "Meryn Stol",
-    "email": "merynstol@gmail.com"
-  },
-  "description": "Normalizes data that can be found in package.json files.",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/meryn/normalize-package-data.git"
-  },
-  "main": "lib/normalize.js",
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "dependencies": {
-    "semver": "2",
-    "github-url-from-git": "~1.1.1",
-    "github-url-from-username-repo": "0.0.2"
-  },
-  "devDependencies": {
-    "tap": "~0.2.5",
-    "underscore": "~1.4.4",
-    "async": "~0.2.7"
-  },
-  "contributors": [
-    {
-      "name": "Isaac Z. Schlueter",
-      "email": "i@izs.me"
-    },
-    {
-      "name": "Meryn Stol",
-      "email": "merynstol@gmail.com"
-    },
-    {
-      "name": "Robert Kowalski",
-      "email": "rok@kowalski.gd"
-    }
-  ],
-  "readme": "# normalize-package-data [![Build Status](https://travis-ci.org/meryn/normalize-package-data.png?branch=master)](https://travis-ci.org/meryn/normalize-package-data)\n\nnormalize-package data exports a function that normalizes package metadata. This data is typically found in a package.json file, but in principle could come from any source - for example the npm registry.\n\nnormalize-package-data is used by [read-package-json](https://npmjs.org/package/read-package-json) to normalize the data it reads from a package.json file. In turn, read-package-json is used by [npm](https://npmjs.org/package/npm) and various npm-related tools.\n\n## Installation\n\n```\nnpm install normalize-package-data\n```\n\n## Usage\n\nBasic usage is really simple. You call the function that normalize-package-data exports. Let's call it `normalizeData`.\n\n```javascript\nnormalizeData = require('normalize-package-data')\npackageData = fs.readfileSync(\"package.json\")\nnormalizeData(packageData)\n// packageData is now normalized\n```\n\n#### Strict mode\n\nYou may activate strict validation by passing true as the second argument.\n\n```javascript\nnormalizeData = require('normalize-package-data')\npackageData = fs.readfileSync(\"package.json\")\nwarnFn = function(msg) { console.error(msg) }\nnormalizeData(packageData, true)\n// packageData is now normalized\n```\n\nIf strict mode is activated, only Semver 2.0 version strings are accepted. Otherwise, Semver 1.0 strings are accepted as well. Packages must have a name, and the name field must not have contain leading or trailing whitespace.\n\n#### Warnings\n\nOptionally, you may pass a \"warning\" function. It gets called whenever the `normalizeData` function encounters something that doesn't look right. It indicates less than perfect input data.\n\n```javascript\nnormalizeData = require('normalize-package-data')\npackageData = fs.readfileSync(\"package.json\")\nwarnFn = function(msg) { console.error(msg) }\nnormalizeData(packageData, warnFn)\n// packageData is now normalized. Any number of warnings may have been logged.\n```\n\nYou may combine strict validation with warnings by passing `true` as the second argument, and `warnFn` as third.\n\nWhen `private` field is set to `true`, warnings will be suppressed.\n\n### Potential exceptions\n\nIf the supplied data has an invalid name or version vield, `normalizeData` will throw an error. Depending on where you call `normalizeData`, you may want to catch these errors so can pass them to a callback.\n\n## What normalization (currently) entails\n\n* The value of `name` field gets trimmed (unless in strict mode).\n* The value of the `version` field gets cleaned by `semver.clean`. See [documentation for the semver module](https://github.com/isaacs/node-semver).\n* If `name` and/or `version` fields are missing, they are set to empty strings.\n* If `files` field is not an array, it will be removed.\n* If `bin` field is a string, then `bin` field will become an object with `name` set to the value of the `name` field, and `bin` set to the original string value.\n* If `man` field is a string, it will become an array with the original string as its sole member.\n* If `keywords` field is string, it is considered to be a list of keywords separated by one or more white-space characters. It gets converted to an array by splitting on `\\s+`.\n* All people fields (`author`, `maintainers`, `contributors`) get converted into objects with name, email and url properties.\n* If `bundledDependencies` field (a typo) exists and `bundleDependencies` field does not, `bundledDependencies` will get renamed to `bundleDependencies`.\n* If the value of any of the dependencies fields  (`dependencies`, `devDependencies`, `optionalDependencies`) is a string, it gets converted into an object with familiar `name=>value` pairs.\n* The values in `optionalDependencies` get added to `dependencies`. The `optionalDependencies` array is left untouched.\n* If `description` field does not exists, but `readme` field does, then (more or less) the first paragraph of text that's found in the readme is taken as value for `description`.\n* If `repository` field is a string, it will become an object with `url` set to the original string value, and `type` set to `\"git\"`.\n* If `repository.url` is not a valid url, but in the style of \"[owner-name]/[repo-name]\", `repository.url` will be set to git://github.com/[owner-name]/[repo-name]\n* If `bugs` field is a string, the value of `bugs` field is changed into an object with `url` set to the original string value.\n* If `bugs` field does not exist, but `repository` field points to a repository hosted on GitHub, the value of the `bugs` field gets set to an url in the form of https://github.com/[owner-name]/[repo-name]/issues . If the repository field points to a GitHub Gist repo url, the associated http url is chosen.\n* If `bugs` field is an object, the resulting value only has email and url properties. If email and url properties are not strings, they are ignored. If no valid values for either email or url is found, bugs field will be removed.\n* If `homepage` field is not a string, it will be removed.\n* If the url in the `homepage` field does not specify a protocol, then http is assumed. For example, `myproject.org` will be changed to `http://myproject.org`.\n* If `homepage` field does not exist, but `repository` field points to a repository hosted on GitHub, the value of the `homepage` field gets set to an url in the form of https://github.com/[owner-name]/[repo-name]/ . If the repository field points to a GitHub Gist repo url, the associated http url is chosen.\n\n### Rules for name field\n\nIf `name` field is given, the value of the name field must be a string. The string may not:\n\n* start with a period.\n* contain the following characters: `/@\\s+%`\n* contain and characters that would need to be encoded for use in urls.\n* resemble the word `node_modules` or `favicon.ico` (case doesn't matter).\n\n### Rules for version field\n\nIf `version` field is given, the value of the version field must be a valid *semver* string, as determined by the `semver.valid` method. See [documentation for the semver module](https://github.com/isaacs/node-semver).\n\n## Credits\n\nThis package contains code based on read-package-json written by Isaac Z. Schlueter. Used with permisson.\n\n## License\n\nnormalize-package-data is released under the [BSD 2-Clause License](http://opensource.org/licenses/MIT).  \nCopyright (c) 2013 Meryn Stol  ",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/meryn/normalize-package-data/issues"
-  },
-  "homepage": "https://github.com/meryn/normalize-package-data",
-  "_id": "normalize-package-data@0.2.7",
-  "_from": "normalize-package-data@~0.2.7"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,39 +0,0 @@
-{
-  "name": "read-package-json",
-  "version": "1.1.4",
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "description": "The thing npm uses to read package.json files with semantics and defaults and validation",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/read-package-json.git"
-  },
-  "main": "read-json.js",
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "dependencies": {
-    "glob": "~3.2.1",
-    "lru-cache": "2",
-    "normalize-package-data": "~0.2.7",
-    "graceful-fs": "2"
-  },
-  "devDependencies": {
-    "tap": "~0.2.5"
-  },
-  "optionalDependencies": {
-    "graceful-fs": "2"
-  },
-  "license": "ISC",
-  "readme": "# read-package-json\n\nThis is the thing that npm uses to read package.json files.  It\nvalidates some stuff, and loads some default things.\n\nIt keeps a cache of the files you've read, so that you don't end\nup reading the same package.json file multiple times.\n\nNote that if you just want to see what's literally in the package.json\nfile, you can usually do `var data = require('some-module/package.json')`.\n\nThis module is basically only needed by npm, but it's handy to see what\nnpm will see when it looks at your package.\n\n## Usage\n\n```javascript\nvar readJson = require('read-package-json')\n\n// readJson(filename, [logFunction=noop], [strict=false], cb)\nreadJson('/path/to/package.json', console.error, false, function (er, data) {\n  if (er) {\n    console.error(\"There was an error reading the file\")\n    return\n  }\n\n  console.error('the package data is', data)\n});\n```\n\n## readJson(file, [logFn = noop], [strict = false], cb)\n\n* `file` {String} The path to the package.json file\n* `logFn` {Function} Function to handle logging.  Defaults to a noop.\n* `strict` {Boolean} True to enforce SemVer 2.0 version strings, and\n  other strict requirements.\n* `cb` {Function} Gets called with `(er, data)`, as is The Node Way.\n\nReads the JSON file and does the things.\n\n## `package.json` Fields\n\nSee `man 5 package.json` or `npm help json`.\n\n## readJson.log\n\nBy default this is a reference to the `npmlog` module.  But if that\nmodule can't be found, then it'll be set to just a dummy thing that does\nnothing.\n\nReplace with your own `{log,warn,error}` object for fun loggy time.\n\n## readJson.extras(file, data, cb)\n\nRun all the extra stuff relative to the file, with the parsed data.\n\nModifies the data as it does stuff.  Calls the cb when it's done.\n\n## readJson.extraSet = [fn, fn, ...]\n\nArray of functions that are called by `extras`.  Each one receives the\narguments `fn(file, data, cb)` and is expected to call `cb(er, data)`\nwhen done or when an error occurs.\n\nOrder is indeterminate, so each function should be completely\nindependent.\n\nMix and match!\n\n## readJson.cache\n\nThe `lru-cache` object that readJson uses to not read the same file over\nand over again.  See\n[lru-cache](https://github.com/isaacs/node-lru-cache) for details.\n\n## Other Relevant Files Besides `package.json`\n\nSome other files have an effect on the resulting data object, in the\nfollowing ways:\n\n### `README?(.*)`\n\nIf there is a `README` or `README.*` file present, then npm will attach\na `readme` field to the data with the contents of this file.\n\nOwing to the fact that roughly 100% of existing node modules have\nMarkdown README files, it will generally be assumed to be Markdown,\nregardless of the extension.  Please plan accordingly.\n\n### `server.js`\n\nIf there is a `server.js` file, and there is not already a\n`scripts.start` field, then `scripts.start` will be set to `node\nserver.js`.\n\n### `AUTHORS`\n\nIf there is not already a `contributors` field, then the `contributors`\nfield will be set to the contents of the `AUTHORS` file, split by lines,\nand parsed.\n\n### `bindings.gyp`\n\nIf a bindings.gyp file exists, and there is not already a\n`scripts.install` field, then the `scripts.install` field will be set to\n`node-gyp rebuild`.\n\n### `wscript`\n\nIf a wscript file exists, and there is not already a `scripts.install`\nfield, then the `scripts.install` field will be set to `node-waf clean ;\nnode-waf configure build`.\n\nNote that the `bindings.gyp` file supercedes this, since node-waf has\nbeen deprecated in favor of node-gyp.\n\n### `index.js`\n\nIf the json file does not exist, but there is a `index.js` file\npresent instead, and that file has a package comment, then it will try\nto parse the package comment, and use that as the data instead.\n\nA package comment looks like this:\n\n```javascript\n/**package\n * { \"name\": \"my-bare-module\"\n * , \"version\": \"1.2.3\"\n * , \"description\": \"etc....\" }\n **/\n\n// or...\n\n/**package\n{ \"name\": \"my-bare-module\"\n, \"version\": \"1.2.3\"\n, \"description\": \"etc....\" }\n**/\n```\n\nThe important thing is that it starts with `/**package`, and ends with\n`**/`.  If the package.json file exists, then the index.js is not\nparsed.\n\n### `{directories.man}/*.[0-9]`\n\nIf there is not already a `man` field defined as an array of files or a\nsingle file, and\nthere is a `directories.man` field defined, then that directory will\nbe searched for manpages.\n\nAny valid manpages found in that directory will be assigned to the `man`\narray, and installed in the appropriate man directory at package install\ntime, when installed globally on a Unix system.\n\n### `{directories.bin}/*`\n\nIf there is not already a `bin` field defined as a string filename or a\nhash of `<name> : <filename>` pairs, then the `directories.bin`\ndirectory will be searched and all the files within it will be linked as\nexecutables at install time.\n\nWhen installing locally, npm links bins into `node_modules/.bin`, which\nis in the `PATH` environ when npm runs scripts.  When\ninstalling globally, they are linked into `{prefix}/bin`, which is\npresumably in the `PATH` environment variable.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/read-package-json/issues"
-  },
-  "homepage": "https://github.com/isaacs/read-package-json",
-  "_id": "read-package-json@1.1.4",
-  "_from": "read-package-json@latest"
-}
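
The readme embedded above documents the flexible `readJson(file, [logFn], [strict], cb)` signature, where the two middle arguments are optional in either order. A minimal standalone sketch of how they are told apart (the `pickArgs` name is invented for illustration; the classification logic mirrors the loop in `read-json.js` below): a boolean is `strict`, a function is the logger, and the last argument is always the callback.

```javascript
// Sketch of readJson's optional-argument handling: everything
// between `file` and the final callback is classified by type.
function pickArgs () {
  var log, strict
  for (var i = 1; i < arguments.length - 1; i++) {
    if (typeof arguments[i] === 'boolean') strict = arguments[i]
    else if (typeof arguments[i] === 'function') log = arguments[i]
  }
  return {
    log: log || function () {},          // default: noop logger
    strict: !!strict,                    // default: false
    cb: arguments[arguments.length - 1]  // callback is always last
  }
}

var noop = function () {}
var parsed = pickArgs('/p/package.json', console.error, true, noop)
console.log(parsed.strict)  // true
```

Note the sole-function case: in `pickArgs('/x', fn)` the loop never runs, so `fn` is taken as the callback, not the logger.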
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read-package-json/read-json.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,349 +0,0 @@
-// vim: set softtabstop=16 shiftwidth=16:
-
-try {
-                var fs = require("graceful-fs")
-} catch (er) {
-                var fs = require("fs")
-}
-
-
-module.exports = readJson
-
-var LRU = require("lru-cache")
-readJson.cache = new LRU({max: 1000})
-var path = require("path")
-var glob = require("glob")
-var normalizeData = require("normalize-package-data")
-
-// put more stuff on here to customize.
-readJson.extraSet = [
-                gypfile,
-                serverjs,
-                authors,
-                readme,
-                mans,
-                bins,
-                githead
-]
-
-var typoWarned = {}
-
-
-function readJson (file, log_, strict_, cb_) {
-                var log, strict, cb
-                for (var i = 1; i < arguments.length - 1; i++) {
-                                if (typeof arguments[i] === 'boolean')
-                                                strict = arguments[i]
-                                else if (typeof arguments[i] === 'function')
-                                                log = arguments[i]
-                }
-                if (!log) log = function () {};
-                cb = arguments[ arguments.length - 1 ]
-
-                var c = readJson.cache.get(file)
-                if (c) {
-                                cb = cb.bind(null, null, c)
-                                return process.nextTick(cb);
-                }
-                cb = (function (orig) { return function (er, data) {
-                                if (data) readJson.cache.set(file, data);
-                                return orig(er, data)
-                } })(cb)
-                readJson_(file, log, strict, cb)
-}
-
-
-function readJson_ (file, log, strict, cb) {
-                fs.readFile(file, "utf8", function (er, d) {
-                                parseJson(file, er, d, log, strict, cb)
-                })
-}
-
-
-function stripBOM(content) {
-                // Remove byte order marker. This catches EF BB BF (the UTF-8 BOM)
-                // because the buffer-to-string conversion in `fs.readFileSync()`
-                // translates it to FEFF, the UTF-16 BOM.
-                if (content.charCodeAt(0) === 0xFEFF) {
-                                content = content.slice(1);
-                }
-                return content;
-}
-
-
-function parseJson (file, er, d, log, strict, cb) {
-                if (er && er.code === "ENOENT") {
-                                indexjs(file, er, log, strict, cb)
-                                return
-                }
-                if (er) return cb(er);
-                try {
-                                d = JSON.parse(stripBOM(d))
-                } catch (er) {
-                                d = parseIndex(d)
-                                if (!d) return cb(parseError(er, file));
-                }
-                extras(file, d, log, strict, cb)
-}
-
-
-function indexjs (file, er, log, strict, cb) {
-                if (path.basename(file) === "index.js") {
-                                return cb(er);
-                }
-                var index = path.resolve(path.dirname(file), "index.js")
-                fs.readFile(index, "utf8", function (er2, d) {
-                                if (er2) return cb(er);
-                                d = parseIndex(d)
-                                if (!d) return cb(er);
-                                extras(file, d, log, strict, cb)
-                })
-}
-
-
-readJson.extras = extras
-function extras (file, data, log_, strict_, cb_) {
-                var log, strict, cb
-                for (var i = 2; i < arguments.length - 1; i++) {
-                                if (typeof arguments[i] === 'boolean')
-                                                strict = arguments[i]
-                                else if (typeof arguments[i] === 'function')
-                                                log = arguments[i]
-                }
-                cb = arguments[i]
-                var set = readJson.extraSet
-                var n = set.length
-                var errState = null
-                set.forEach(function (fn) {
-                                fn(file, data, then)
-                })
-                function then(er) {
-                                if (errState) return;
-                                if (er) return cb(errState = er);
-                                if (--n > 0) return;
-                                final(file, data, log, strict, cb);
-                }
-}
-
-function gypfile (file, data, cb) {
-                var dir = path.dirname(file)
-                var s = data.scripts || {}
-                if (s.install || s.preinstall)
-                                return cb(null, data);
-                glob("*.gyp", { cwd: dir }, function (er, files) {
-                                if (er) return cb(er);
-                                gypfile_(file, data, files, cb)
-                })
-}
-
-function gypfile_ (file, data, files, cb) {
-                if (!files.length) return cb(null, data);
-                var s = data.scripts || {}
-                s.install = "node-gyp rebuild"
-                data.scripts = s
-                data.gypfile = true
-                return cb(null, data);
-}
-
-function serverjs (file, data, cb) {
-                var dir = path.dirname(file)
-                var s = data.scripts || {}
-                if (s.start) return cb(null, data)
-                glob("server.js", { cwd: dir }, function (er, files) {
-                                if (er) return cb(er);
-                                serverjs_(file, data, files, cb)
-                })
-}
-function serverjs_ (file, data, files, cb) {
-                if (!files.length) return cb(null, data);
-                var s = data.scripts || {}
-                s.start = "node server.js"
-                data.scripts = s
-                return cb(null, data)
-}
-
-function authors (file, data, cb) {
-                if (data.contributors) return cb(null, data);
-                var af = path.resolve(path.dirname(file), "AUTHORS")
-                fs.readFile(af, "utf8", function (er, ad) {
-                                // ignore error.  just checking it.
-                                if (er) return cb(null, data);
-                                authors_(file, data, ad, cb)
-                })
-}
-function authors_ (file, data, ad, cb) {
-                ad = ad.split(/\r?\n/g).map(function (line) {
-                                return line.replace(/^\s*#.*$/, '').trim()
-                }).filter(function (line) {
-                                return line
-                })
-                data.contributors = ad
-                return cb(null, data)
-}
-
-var defDesc = "Unnamed repository; edit this file " +
-              "'description' to name the repository."
-function gitDescription (file, data, cb) {
-                if (data.description) return cb(null, data);
-                var dir = path.dirname(file)
-                // just cuz it'd be nice if this file mattered...
-                var gitDesc = path.resolve(dir, '.git/description')
-                fs.readFile(gitDesc, 'utf8', function (er, desc) {
-                                if (desc) desc = desc.trim()
-                                if (!er && desc !== defDesc)
-                                                data.description = desc
-                                return cb(null, data)
-                })
-}
-
-function readmeDescription (file, data) {
-                if (data.description) return;
-                var d = data.readme
-                if (!d) return;
-                // the first block of text before the first heading
-                // that isn't the first line heading
-                d = d.trim().split('\n')
-                for (var s = 0; d[s] && d[s].trim().match(/^(#|$)/); s ++);
-                var l = d.length
-                for (var e = s + 1; e < l && d[e].trim(); e ++);
-                data.description = d.slice(s, e).join(' ').trim()
-}
-
-function readme (file, data, cb) {
-                if (data.readme) return cb(null, data);
-                var dir = path.dirname(file)
-                var globOpts = { cwd: dir, nocase: true, mark: true }
-                glob("README?(.*)", globOpts, function (er, files) {
-                                if (er) return cb(er);
-                                // don't accept directories.
-                                files = files.filter(function (file) {
-                                                return !file.match(/\/$/)
-                                })
-                                if (!files.length) return cb();
-                                var rm = path.resolve(dir, files[0])
-                                readme_(file, data, rm, cb)
-                })
-}
-function readme_(file, data, rm, cb) {
-                var rmfn = path.basename(rm);
-                fs.readFile(rm, "utf8", function (er, rm) {
-                                // maybe not readable, or something.
-                                if (er) return cb()
-                                data.readme = rm
-                                data.readmeFilename = rmfn
-                                return cb(er, data)
-                })
-}
-
-function mans (file, data, cb) {
-                var m = data.directories && data.directories.man
-                if (data.man || !m) return cb(null, data);
-                m = path.resolve(path.dirname(file), m)
-                glob("**/*.[0-9]", { cwd: m }, function (er, mans) {
-                                if (er) return cb(er);
-                                mans_(file, data, mans, cb)
-                })
-}
-function mans_ (file, data, mans, cb) {
-                var m = data.directories && data.directories.man
-                data.man = mans.map(function (mf) {
-                                return path.resolve(path.dirname(file), m, mf)
-                })
-                return cb(null, data)
-}
-
-function bins (file, data, cb) {
-                if (Array.isArray(data.bin)) {
-                                return bins_(file, data, data.bin, cb)
-                }
-                var m = data.directories && data.directories.bin
-                if (data.bin || !m) return cb(null, data);
-                m = path.resolve(path.dirname(file), m)
-                glob("**", { cwd: m }, function (er, bins) {
-                                if (er) return cb(er);
-                                bins_(file, data, bins, cb)
-                })
-}
-function bins_ (file, data, bins, cb) {
-                var m = data.directories && data.directories.bin || '.'
-                data.bin = bins.reduce(function (acc, mf) {
-                                if (mf && mf.charAt(0) !== '.') {
-                                                var f = path.basename(mf)
-                                                acc[f] = path.join(m, mf)
-                                }
-                                return acc
-                }, {})
-                return cb(null, data)
-}
-
-function githead (file, data, cb) {
-                if (data.gitHead) return cb(null, data);
-                var dir = path.dirname(file)
-                var head = path.resolve(dir, '.git/HEAD')
-                fs.readFile(head, 'utf8', function (er, head) {
-                                if (er) return cb(null, data);
-                                githead_(file, data, dir, head, cb)
-                })
-}
-function githead_ (file, data, dir, head, cb) {
-                if (!head.match(/^ref: /)) {
-                                data.gitHead = head.trim()
-                                return cb(null, data)
-                }
-                var headFile = head.replace(/^ref: /, '').trim()
-                headFile = path.resolve(dir, '.git', headFile)
-                fs.readFile(headFile, 'utf8', function (er, head) {
-                                if (er || !head) return cb(null, data)
-                                head = head.replace(/^ref: /, '').trim()
-                                data.gitHead = head
-                                return cb(null, data)
-                })
-}
-
-function final (file, data, log, strict, cb) {
-                var pId = makePackageId(data)
-                function warn(msg) {
-                                if (typoWarned[pId]) return;
-                                if (log) log("package.json", pId, msg);
-                }
-                try {
-                                normalizeData(data, warn, strict)
-                }
-                catch (error) {
-                                return cb(error)
-                }
-                typoWarned[pId] = true
-                readJson.cache.set(file, data)
-                cb(null, data)
-}
-
-function makePackageId (data) {
-                return cleanString(data.name) + "@" + cleanString(data.version)
-}
-
-function cleanString(str) {
-                return (!str || typeof(str) !== "string") ? "" : str.trim()
-}
-
-// /**package { "name": "foo", "version": "1.2.3", ... } **/
-function parseIndex (data) {
-                data = data.split(/^\/\*\*package(?:\s|$)/m)
-                if (data.length < 2) return null
-                data = data[1]
-                data = data.split(/\*\*\/$/m)
-                if (data.length < 2) return null
-                data = data[0]
-                data = data.replace(/^\s*\*/mg, "")
-                try {
-                                return JSON.parse(data)
-                } catch (er) {
-                                return null
-                }
-}
-
-function parseError (ex, file) {
-                var e = new Error("Failed to parse json\n"+ex.message)
-                e.code = "EJSONPARSE"
-                e.file = file
-                return e
-}
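
The `parseIndex` function above is self-contained enough to demo directly. This sketch duplicates its logic and feeds it a bare-module source string carrying a `/**package ... **/` comment of the form shown in the readme:

```javascript
// Duplicate of the parseIndex logic above: pull the /**package ... **/
// comment out of an index.js source string and parse it as JSON.
function parseIndex (src) {
  var parts = src.split(/^\/\*\*package(?:\s|$)/m)
  if (parts.length < 2) return null
  parts = parts[1].split(/\*\*\/$/m)
  if (parts.length < 2) return null
  // strip the leading "*" decoration each comment line may carry
  var json = parts[0].replace(/^\s*\*/mg, '')
  try {
    return JSON.parse(json)
  } catch (er) {
    return null
  }
}

var src = [
  '/**package',
  ' * { "name": "my-bare-module"',
  ' * , "version": "1.2.3" }',
  ' **/',
  'module.exports = function () {}'
].join('\n')

console.log(parseIndex(src))  // { name: 'my-bare-module', version: '1.2.3' }
```

Any source without a well-formed comment (missing opener, missing `**/`, or invalid JSON between them) yields `null`, which is how `parseJson` above decides whether to fall back to a parse error.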
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2 +0,0 @@
-npm-debug.log
-node_modules
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/LICENCE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-Copyright (c) Isaac Z. Schlueter
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE NETBSD FOUNDATION, INC. AND CONTRIBUTORS
-``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED
-TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE FOUNDATION OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
-INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
-CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
-ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
-POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-## read
-
-For reading user input from stdin.
-
-Similar to the `readline` builtin's `question()` method, but with a
-few more features.
-
-## USAGE
-
-```javascript
-var read = require("read")
-read(options, callback)
-```
-
-The callback gets called with either the user input, the default
-specified, or an error, as `callback(error, result, isDefault)`,
-node style.
-
-## OPTIONS
-
-Every option is optional.
-
-* `prompt` What to write to stdout before reading input.
-* `silent` Don't echo the output as the user types it.
-* `replace` Replace silenced characters with the supplied character value.
-* `timeout` Number of ms to wait for user input before giving up.
-* `default` The default value if the user enters nothing.
-* `edit` Allow the user to edit the default value.
-* `terminal` Treat the output as a TTY, whether it is or not.
-* `input` Readable stream to get input data from. (default: `process.stdin`)
-* `output` Writable stream to write prompts to. (default: `process.stdout`)
-
-If silent is true, and the input is a TTY, then read will set raw
-mode, and read character by character.
-
-## COMPATIBILITY
-
-This module works sort of with node 0.6.  It does not work with node
-versions less than 0.6.  It is best on node 0.8.
-
-On node version 0.6, it will remove all listeners on the input
-stream's `data` and `keypress` events, because the readline module did
-not fully clean up after itself in that version of node, and did not
-make it possible to clean up after it in a way that has no potential
-for side effects.
-
-Additionally, some of the readline options (like `terminal`) will not
-function in versions of node before 0.8, because they were not
-implemented in the builtin readline module.
-
-## CONTRIBUTING
-
-Patches welcome.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/example/example.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-var read = require("../lib/read.js")
-
-read({prompt: "Username: ", default: "test-user" }, function (er, user) {
-  read({prompt: "Password: ", default: "test-pass", silent: true }, function (er, pass) {
-    read({prompt: "Password again: ", default: "test-pass", silent: true }, function (er, pass2) {
-      console.error({user: user,
-                     pass: pass,
-                     verify: pass2,
-                     passMatch: (pass === pass2)})
-      console.error("the program should exit now")
-    })
-  })
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/lib/read.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,113 +0,0 @@
-
-module.exports = read
-
-var readline = require('readline')
-var Mute = require('mute-stream')
-
-function read (opts, cb) {
-  if (opts.num) {
-    throw new Error('read() no longer accepts a char number limit')
-  }
-
-  if (typeof opts.default !== 'undefined' &&
-      typeof opts.default !== 'string' &&
-      typeof opts.default !== 'number') {
-    throw new Error('default value must be string or number')
-  }
-
-  var input = opts.input || process.stdin
-  var output = opts.output || process.stdout
-  var prompt = (opts.prompt || '').trim() + ' '
-  var silent = opts.silent
-  var editDef = false
-  var timeout = opts.timeout
-
-  var def = opts.default || ''
-  if (def) {
-    if (silent) {
-      prompt += '(<default hidden>) '
-    } else if (opts.edit) {
-      editDef = true
-    } else {
-      prompt += '(' + def + ') '
-    }
-  }
-  var terminal = !!(opts.terminal || output.isTTY)
-
-  var m = new Mute({ replace: opts.replace, prompt: prompt })
-  m.pipe(output, {end: false})
-  output = m
-  var rlOpts = { input: input, output: output, terminal: terminal }
-
-  if (process.version.match(/^v0\.6/)) {
-    var rl = readline.createInterface(rlOpts.input, rlOpts.output)
-  } else {
-    var rl = readline.createInterface(rlOpts)
-  }
-
-
-  output.unmute()
-  rl.setPrompt(prompt)
-  rl.prompt()
-  if (silent) {
-    output.mute()
-  } else if (editDef) {
-    rl.line = def
-    rl.cursor = def.length
-    rl._refreshLine()
-  }
-
-  var called = false
-  rl.on('line', onLine)
-  rl.on('error', onError)
-
-  rl.on('SIGINT', function () {
-    rl.close()
-    onError(new Error('canceled'))
-  })
-
-  var timer
-  if (timeout) {
-    timer = setTimeout(function () {
-      onError(new Error('timed out'))
-    }, timeout)
-  }
-
-  function done () {
-    called = true
-    rl.close()
-
-    if (process.version.match(/^v0\.6/)) {
-      rl.input.removeAllListeners('data')
-      rl.input.removeAllListeners('keypress')
-      rl.input.pause()
-    }
-
-    clearTimeout(timer)
-    output.mute()
-    output.end()
-  }
-
-  function onError (er) {
-    if (called) return
-    done()
-    return cb(er)
-  }
-
-  function onLine (line) {
-    if (called) return
-    if (silent && terminal) {
-      output.unmute()
-      output.write('\r\n')
-    }
-    done()
-    // truncate the \n at the end.
-    line = line.replace(/\r?\n$/, '')
-    var isDefault = !!(editDef && line === def)
-    if (def && !line) {
-      isDefault = true
-      line = def
-    }
-    cb(null, line, isDefault)
-  }
-}
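
The `onLine` handler above decides when the default value applies. A standalone sketch of just that decision (the `resolveLine` name is invented for illustration; the logic mirrors the tail of `onLine`):

```javascript
// Sketch of the default-value logic in read's onLine handler:
// an empty answer falls back to the default, and isDefault is
// also true when editing left the default unchanged.
function resolveLine (line, def, editDef) {
  line = line.replace(/\r?\n$/, '')   // truncate the trailing newline
  var isDefault = !!(editDef && line === def)
  if (def && !line) {
    isDefault = true
    line = def
  }
  return { line: line, isDefault: isDefault }
}

console.log(resolveLine('', 'test-user', false))
// { line: 'test-user', isDefault: true }
console.log(resolveLine('alice\n', 'test-user', false))
// { line: 'alice', isDefault: false }
```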
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/node_modules/mute-stream/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/node_modules/mute-stream/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,68 +0,0 @@
-# mute-stream
-
-Bytes go in, but they don't come out (when muted).
-
-This is a basic pass-through stream, but when muted, the bytes are
-silently dropped, rather than being passed through.
-
-## Usage
-
-```javascript
-var MuteStream = require('mute-stream')
-
-var ms = new MuteStream(options)
-
-ms.pipe(process.stdout)
-ms.write('foo') // writes 'foo' to stdout
-ms.mute()
-ms.write('bar') // does not write 'bar'
-ms.unmute()
-ms.write('baz') // writes 'baz' to stdout
-
-// can also be used to mute incoming data
-var ms = new MuteStream
-input.pipe(ms)
-
-ms.on('data', function (c) {
-  console.log('data: ' + c)
-})
-
-input.emit('data', 'foo') // logs 'foo'
-ms.mute()
-input.emit('data', 'bar') // does not log 'bar'
-ms.unmute()
-input.emit('data', 'baz') // logs 'baz'
-```
-
-## Options
-
-All options are optional.
-
-* `replace` Set to a string to replace each character with the
-  specified string when muted.  (So you can show `****` instead of the
-  password, for example.)
-
-* `prompt` If you are using a replacement char, and also using a
-  prompt with a readline stream (as for a `Password: *****` input),
-  then specify what the prompt is so that backspace will work
-  properly.  Otherwise, pressing backspace will overwrite the prompt
-  with the replacement character, which is weird.
-
-## ms.mute()
-
-Set `muted` to `true`.  Turns `.write()` into a no-op.
-
-## ms.unmute()
-
-Set `muted` to `false`.
-
-## ms.isTTY
-
-True if the pipe destination is a TTY, or if the incoming pipe source is
-a TTY.
-
-## Other stream methods...
-
-The other standard readable and writable stream methods are all
-available.  The MuteStream object acts as a facade to its pipe source
-and destination.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/node_modules/mute-stream/mute.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,140 +0,0 @@
-var Stream = require('stream')
-
-module.exports = MuteStream
-
-// var out = new MuteStream(process.stdout)
-// argument auto-pipes
-function MuteStream (opts) {
-  Stream.apply(this)
-  opts = opts || {}
-  this.writable = this.readable = true
-  this.muted = false
-  this.on('pipe', this._onpipe)
-  this.replace = opts.replace
-
-  // For readline-type situations
-  // The prompt prefix at the start of a line redrawn after a ctrl
-  // char (such as backspace) won't be redrawn as the replacement
-  this._prompt = opts.prompt || null
-  this._hadControl = false
-}
-
-MuteStream.prototype = Object.create(Stream.prototype)
-
-Object.defineProperty(MuteStream.prototype, 'constructor', {
-  value: MuteStream,
-  enumerable: false
-})
-
-MuteStream.prototype.mute = function () {
-  this.muted = true
-}
-
-MuteStream.prototype.unmute = function () {
-  this.muted = false
-}
-
-Object.defineProperty(MuteStream.prototype, '_onpipe', {
-  value: onPipe,
-  enumerable: false,
-  writable: true,
-  configurable: true
-})
-
-function onPipe (src) {
-  this._src = src
-}
-
-Object.defineProperty(MuteStream.prototype, 'isTTY', {
-  get: getIsTTY,
-  set: setIsTTY,
-  enumerable: true,
-  configurable: true
-})
-
-function getIsTTY () {
-  return( (this._dest) ? this._dest.isTTY
-        : (this._src) ? this._src.isTTY
-        : false
-        )
-}
-
-// basically just replace the getter/setter with a regular value
-function setIsTTY (isTTY) {
-  Object.defineProperty(this, 'isTTY', {
-    value: isTTY,
-    enumerable: true,
-    writable: true,
-    configurable: true
-  })
-}
-
-Object.defineProperty(MuteStream.prototype, 'rows', {
-  get: function () {
-    return( this._dest ? this._dest.rows
-          : this._src ? this._src.rows
-          : undefined )
-  }, enumerable: true, configurable: true })
-
-Object.defineProperty(MuteStream.prototype, 'columns', {
-  get: function () {
-    return( this._dest ? this._dest.columns
-          : this._src ? this._src.columns
-          : undefined )
-  }, enumerable: true, configurable: true })
-
-
-MuteStream.prototype.pipe = function (dest) {
-  this._dest = dest
-  return Stream.prototype.pipe.call(this, dest)
-}
-
-MuteStream.prototype.pause = function () {
-  if (this._src) return this._src.pause()
-}
-
-MuteStream.prototype.resume = function () {
-  if (this._src) return this._src.resume()
-}
-
-MuteStream.prototype.write = function (c) {
-  if (this.muted) {
-    if (!this.replace) return true
-    if (c.match(/^\u001b/)) {
-      this._hadControl = true
-      return this.emit('data', c)
-    } else {
-      if (this._prompt && this._hadControl &&
-          c.indexOf(this._prompt) === 0) {
-        this._hadControl = false
-        this.emit('data', this._prompt)
-        c = c.substr(this._prompt.length)
-      }
-      c = c.toString().replace(/./g, this.replace)
-    }
-  }
-  this.emit('data', c)
-}
-
-MuteStream.prototype.end = function (c) {
-  if (this.muted) {
-    if (c && this.replace) {
-      c = c.toString().replace(/./g, this.replace)
-    } else {
-      c = null
-    }
-  }
-  if (c) this.emit('data', c)
-  this.emit('end')
-}
-
-function proxy (fn) { return function () {
-  var d = this._dest
-  var s = this._src
-  if (d && d[fn]) d[fn].apply(d, arguments)
-  if (s && s[fn]) s[fn].apply(s, arguments)
-}}
-
-MuteStream.prototype.destroy = proxy('destroy')
-MuteStream.prototype.destroySoon = proxy('destroySoon')
-MuteStream.prototype.close = proxy('close')
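
The write-path masking above is the subtle part of MuteStream. This sketch (with an invented `maskChunk` helper) isolates how a muted chunk is rewritten when a `replace` character is set, including the prompt pass-through after a control sequence so backspace redraws keep the prompt intact:

```javascript
// Sketch of the masking step in MuteStream.prototype.write: while
// muted with a replace char, every character becomes the replacement,
// but a prompt prefix redrawn after a control char passes through.
function maskChunk (c, prompt, hadControl, replace) {
  var out = ''
  if (prompt && hadControl && c.indexOf(prompt) === 0) {
    out = prompt                 // emit the prompt itself unmasked
    c = c.substr(prompt.length)  // mask only what follows it
  }
  return out + c.replace(/./g, replace)
}

console.log(maskChunk('hunter2', null, false, '*'))
// *******
console.log(maskChunk('Password: hunter2', 'Password: ', true, '*'))
// Password: *******
```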
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/node_modules/mute-stream/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,37 +0,0 @@
-{
-  "name": "mute-stream",
-  "version": "0.0.4",
-  "main": "mute.js",
-  "directories": {
-    "test": "test"
-  },
-  "devDependencies": {
-    "tap": "~0.2.5"
-  },
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/mute-stream"
-  },
-  "keywords": [
-    "mute",
-    "stream",
-    "pipe"
-  ],
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "license": "BSD",
-  "description": "Bytes go in, but they don't come out (when muted).",
-  "readme": "# mute-stream\n\nBytes go in, but they don't come out (when muted).\n\nThis is a basic pass-through stream, but when muted, the bytes are\nsilently dropped, rather than being passed through.\n\n## Usage\n\n```javascript\nvar MuteStream = require('mute-stream')\n\nvar ms = new MuteStream(options)\n\nms.pipe(process.stdout)\nms.write('foo') // writes 'foo' to stdout\nms.mute()\nms.write('bar') // does not write 'bar'\nms.unmute()\nms.write('baz') // writes 'baz' to stdout\n\n// can also be used to mute incoming data\nvar ms = new MuteStream\ninput.pipe(ms)\n\nms.on('data', function (c) {\n  console.log('data: ' + c)\n})\n\ninput.emit('data', 'foo') // logs 'foo'\nms.mute()\ninput.emit('data', 'bar') // does not log 'bar'\nms.unmute()\ninput.emit('data', 'baz') // logs 'baz'\n```\n\n## Options\n\nAll options are optional.\n\n* `replace` Set to a string to replace each character with the\n  specified string when muted.  (So you can show `****` instead of the\n  password, for example.)\n\n* `prompt` If you are using a replacement char, and also using a\n  prompt with a readline stream (as for a `Password: *****` input),\n  then specify what the prompt is so that backspace will work\n  properly.  Otherwise, pressing backspace will overwrite the prompt\n  with the replacement character, which is weird.\n\n## ms.mute()\n\nSet `muted` to `true`.  Turns `.write()` into a no-op.\n\n## ms.unmute()\n\nSet `muted` to `false`\n\n## ms.isTTY\n\nTrue if the pipe destination is a TTY, or if the incoming pipe source is\na TTY.\n\n## Other stream methods...\n\nThe other standard readable and writable stream methods are all\navailable.  The MuteStream object acts as a facade to its pipe source\nand destination.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/mute-stream/issues"
-  },
-  "_id": "mute-stream@0.0.4",
-  "_from": "mute-stream@~0.0.4"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-{
-  "name": "read",
-  "version": "1.0.5",
-  "main": "lib/read.js",
-  "dependencies": {
-    "mute-stream": "~0.0.4"
-  },
-  "devDependencies": {
-    "tap": "*"
-  },
-  "engines": {
-    "node": ">=0.8"
-  },
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "description": "read(1) for node programs",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/read.git"
-  },
-  "license": "BSD",
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "readme": "## read\n\nFor reading user input from stdin.\n\nSimilar to the `readline` builtin's `question()` method, but with a\nfew more features.\n\n## USAGE\n\n```javascript\nvar read = require(\"read\")\nread(options, callback)\n```\n\nThe callback gets called with either the user input, or the default\nspecified, or an error, as `callback(error, result, isDefault)`\nnode style.\n\n## OPTIONS\n\nEvery option is optional.\n\n* `prompt` What to write to stdout before reading input.\n* `silent` Don't echo the output as the user types it.\n* `replace` Replace silenced characters with the supplied character value.\n* `timeout` Number of ms to wait for user input before giving up.\n* `default` The default value if the user enters nothing.\n* `edit` Allow the user to edit the default value.\n* `terminal` Treat the output as a TTY, whether it is or not.\n* `input` Readable stream to get input data from. (default `process.stdin`)\n* `output` Writeable stream to write prompts to. (default: `process.stdout`)\n\nIf silent is true, and the input is a TTY, then read will set raw\nmode, and read character by character.\n\n## COMPATIBILITY\n\nThis module works sort of with node 0.6.  It does not work with node\nversions less than 0.6.  It is best on node 0.8.\n\nOn node version 0.6, it will remove all listeners on the input\nstream's `data` and `keypress` events, because the readline module did\nnot fully clean up after itself in that version of node, and did not\nmake it possible to clean up after it in a way that has no potential\nfor side effects.\n\nAdditionally, some of the readline options (like `terminal`) will not\nfunction in versions of node before 0.8, because they were not\nimplemented in the builtin readline module.\n\n## CONTRIBUTING\n\nPatches welcome.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/read/issues"
-  },
-  "_id": "read@1.0.5",
-  "_from": "read@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/read/rs.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-var read = require('read');
-read({ silent: true, prompt: 'stars: ' }, function(er, data) {
-  console.log(er, data)
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-node_modules
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,55 +0,0 @@
-Apache License
-
-Version 2.0, January 2004
-
-http://www.apache.org/licenses/
-
-TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-1. Definitions.
-
-"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
-
-"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
-
-"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
-
-"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.
-
-"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
-
-"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
-
-"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
-
-"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
-
-"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."
-
-"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
-
-2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
-
-3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
-
-4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
-
-You must give any other recipients of the Work or Derivative Works a copy of this License; and
-
-You must cause any modified files to carry prominent notices stating that You changed the files; and
-
-You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
-
-If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
-
-5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
-
-6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
-
-7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
-
-8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
-
-9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
-
-END OF TERMS AND CONDITIONS
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,332 +0,0 @@
-# Request -- Simplified HTTP client
-
-[![NPM](https://nodei.co/npm/request.png)](https://nodei.co/npm/request/)
-
-## Super simple to use
-
-Request is designed to be the simplest way possible to make HTTP calls. It supports HTTPS and follows redirects by default.
-
-```javascript
-var request = require('request');
-request('http://www.google.com', function (error, response, body) {
-  if (!error && response.statusCode == 200) {
-    console.log(body) // Print the google web page.
-  }
-})
-```
-
-## Streaming
-
-You can stream any response to a file stream.
-
-```javascript
-request('http://google.com/doodle.png').pipe(fs.createWriteStream('doodle.png'))
-```
-
-You can also stream a file to a PUT or POST request. This method will also check the file extension against a mapping of file extensions to content-types, in this case `application/json`, and use the proper content-type in the PUT request if one is not already provided in the headers.
-
-```javascript
-fs.createReadStream('file.json').pipe(request.put('http://mysite.com/obj.json'))
-```
-
-Request can also pipe to itself. When doing so the content-type and content-length will be preserved in the PUT headers.
-
-```javascript
-request.get('http://google.com/img.png').pipe(request.put('http://mysite.com/img.png'))
-```
-
-Now let's get fancy.
-
-```javascript
-http.createServer(function (req, resp) {
-  if (req.url === '/doodle.png') {
-    if (req.method === 'PUT') {
-      req.pipe(request.put('http://mysite.com/doodle.png'))
-    } else if (req.method === 'GET' || req.method === 'HEAD') {
-      request.get('http://mysite.com/doodle.png').pipe(resp)
-    }
-  }
-})
-```
-
-You can also pipe() from an http.ServerRequest instance and to an http.ServerResponse instance. The HTTP method and headers will be sent, as well as the entity-body data. This means that, if you don't really care about security, you can do:
-
-```javascript
-http.createServer(function (req, resp) {
-  if (req.url === '/doodle.png') {
-    var x = request('http://mysite.com/doodle.png')
-    req.pipe(x)
-    x.pipe(resp)
-  }
-})
-```
-
-And since pipe() returns the destination stream in node 0.5.x, you can do one-line proxying :)
-
-```javascript
-req.pipe(request('http://mysite.com/doodle.png')).pipe(resp)
-```
-
-Also, none of this new functionality conflicts with request's previous features; it just expands them.
-
-```javascript
-var r = request.defaults({'proxy':'http://localproxy.com'})
-
-http.createServer(function (req, resp) {
-  if (req.url === '/doodle.png') {
-    r.get('http://google.com/doodle.png').pipe(resp)
-  }
-})
-```
-You can still use intermediate proxies; the requests will still follow HTTP redirects, etc.
-
-## Forms
-
-`request` supports `application/x-www-form-urlencoded` and `multipart/form-data` form uploads. For `multipart/related` refer to the `multipart` API.
-
-URL-encoded forms are simple:
-
-```javascript
-request.post('http://service.com/upload', {form:{key:'value'}})
-// or
-request.post('http://service.com/upload').form({key:'value'})
-```
-
-For `multipart/form-data` we use the [form-data](https://github.com/felixge/node-form-data) library by [@felixge](https://github.com/felixge). You don't need to worry about piping the form object or setting the headers, `request` will handle that for you.
-
-```javascript
-var r = request.post('http://service.com/upload')
-var form = r.form()
-form.append('my_field', 'my_value')
-form.append('my_buffer', new Buffer([1, 2, 3]))
-form.append('my_file', fs.createReadStream(path.join(__dirname, 'doodle.png')))
-form.append('remote_file', request('http://google.com/doodle.png'))
-```
-
-## HTTP Authentication
-
-```javascript
-request.get('http://some.server.com/').auth('username', 'password', false);
-// or
-request.get('http://some.server.com/', {
-  'auth': {
-    'user': 'username',
-    'pass': 'password',
-    'sendImmediately': false
-  }
-});
-```
-
-If passed as an option, `auth` should be a hash containing values `user` || `username`, `password` || `pass`, and `sendImmediately` (optional).  The method form takes parameters `auth(username, password, sendImmediately)`.
-
-`sendImmediately` defaults to true, which will cause a basic authentication header to be sent.  If `sendImmediately` is `false`, then `request` will retry with a proper authentication header after receiving a 401 response from the server (which must contain a `WWW-Authenticate` header indicating the required authentication method).
-
-Digest authentication is supported, but it only works with `sendImmediately` set to `false` (otherwise `request` will send basic authentication on the initial request, which will probably cause the request to fail).
-
-## OAuth Signing
-
-```javascript
-// Twitter OAuth
-var qs = require('querystring')
-  , oauth =
-    { callback: 'http://mysite.com/callback/'
-    , consumer_key: CONSUMER_KEY
-    , consumer_secret: CONSUMER_SECRET
-    }
-  , url = 'https://api.twitter.com/oauth/request_token'
-  ;
-request.post({url:url, oauth:oauth}, function (e, r, body) {
-  // Ideally, you would take the body in the response
-  // and construct a URL that a user clicks on (like a sign in button).
-  // The verifier is only available in the response after a user has
-  // verified with twitter that they are authorizing your app.
-  var access_token = qs.parse(body)
-    , oauth =
-      { consumer_key: CONSUMER_KEY
-      , consumer_secret: CONSUMER_SECRET
-      , token: access_token.oauth_token
-      , verifier: access_token.oauth_verifier
-      }
-    , url = 'https://api.twitter.com/oauth/access_token'
-    ;
-  request.post({url:url, oauth:oauth}, function (e, r, body) {
-    var perm_token = qs.parse(body)
-      , oauth =
-        { consumer_key: CONSUMER_KEY
-        , consumer_secret: CONSUMER_SECRET
-        , token: perm_token.oauth_token
-        , token_secret: perm_token.oauth_token_secret
-        }
-      , url = 'https://api.twitter.com/1/users/show.json?'
-      , params =
-        { screen_name: perm_token.screen_name
-        , user_id: perm_token.user_id
-        }
-      ;
-    url += qs.stringify(params)
-    request.get({url:url, oauth:oauth, json:true}, function (e, r, user) {
-      console.log(user)
-    })
-  })
-})
-```
-
-
-
-### request(options, callback)
-
-The first argument can be either a url or an options object. The only required option is `uri`; all others are optional.
-
-* `uri` || `url` - fully qualified uri or a parsed url object from url.parse()
-* `qs` - object containing querystring values to be appended to the uri
-* `method` - http method, defaults to GET
-* `headers` - http headers, defaults to {}
-* `body` - entity body for PATCH, POST and PUT requests. Must be buffer or string.
-* `form` - when passed an object, this sets `body` to a querystring representation of the value and adds a `Content-type: application/x-www-form-urlencoded; charset=utf-8` header. When called with no options, a FormData instance is returned that will be piped to the request.
-* `auth` - A hash containing values `user` || `username`, `password` || `pass`, and `sendImmediately` (optional).  See documentation above.
-* `json` - sets `body` to a JSON representation of the value and adds a `Content-type: application/json` header.  Additionally, parses the response body as JSON.
-* `multipart` - (experimental) array of objects which contains their own headers and `body` attribute. Sends `multipart/related` request. See example below.
-* `followRedirect` - follow HTTP 3xx responses as redirects. defaults to true.
-* `followAllRedirects` - follow non-GET HTTP 3xx responses as redirects. defaults to false.
-* `maxRedirects` - the maximum number of redirects to follow, defaults to 10.
-* `encoding` - Encoding to be used on `setEncoding` of response data. If set to `null`, the body is returned as a Buffer.
-* `pool` - A hash object containing the agents for these requests. If omitted this request will use the global pool which is set to node's default maxSockets.
-* `pool.maxSockets` - Integer containing the maximum amount of sockets in the pool.
-* `timeout` - Integer containing the number of milliseconds to wait for a request to respond before aborting the request
-* `proxy` - An HTTP proxy to be used. Support proxy Auth with Basic Auth the same way it's supported with the `url` parameter by embedding the auth info in the uri.
-* `oauth` - Options for OAuth HMAC-SHA1 signing, see documentation above.
-* `hawk` - Options for [Hawk signing](https://github.com/hueniverse/hawk). The `credentials` key must contain the necessary signing info, [see hawk docs for details](https://github.com/hueniverse/hawk#usage-example).
-* `strictSSL` - Set to `true` to require that SSL certificates be valid. Note: to use your own certificate authority, you need to specify an agent that was created with that ca as an option.
-* `jar` - Set to `true` if you want cookies to be remembered for future use, or define your custom cookie jar (see examples section)
-* `aws` - object containing aws signing information, should have the properties `key` and `secret` as well as `bucket` unless you're specifying your bucket as part of the path, or you are making a request that doesn't use a bucket (i.e. GET Services)
-* `httpSignature` - Options for the [HTTP Signature Scheme](https://github.com/joyent/node-http-signature/blob/master/http_signing.md) using [Joyent's library](https://github.com/joyent/node-http-signature). The `keyId` and `key` properties must be specified. See the docs for other options.
-* `localAddress` - Local interface to bind for network connections.
-
-
-The callback gets three arguments. The first is an error when applicable (usually from the http.Client, not the http.ClientRequest object). The second is an http.ClientResponse object. The third is the response body, as a String or Buffer.
-
-## Convenience methods
-
-There are also shorthand methods for the different HTTP methods, and some other conveniences.
-
-### request.defaults(options)
-
-This method returns a wrapper around the normal request API that defaults to whatever options you pass in to it.
-
-### request.put
-
-Same as request() but defaults to `method: "PUT"`.
-
-```javascript
-request.put(url)
-```
-
-### request.patch
-
-Same as request() but defaults to `method: "PATCH"`.
-
-```javascript
-request.patch(url)
-```
-
-### request.post
-
-Same as request() but defaults to `method: "POST"`.
-
-```javascript
-request.post(url)
-```
-
-### request.head
-
-Same as request() but defaults to `method: "HEAD"`.
-
-```javascript
-request.head(url)
-```
-
-### request.del
-
-Same as request() but defaults to `method: "DELETE"`.
-
-```javascript
-request.del(url)
-```
-
-### request.get
-
-Alias to normal request method for uniformity.
-
-```javascript
-request.get(url)
-```
-### request.cookie
-
-Function that creates a new cookie.
-
-```javascript
-request.cookie('cookie_string_here')
-```
-### request.jar
-
-Function that creates a new cookie jar.
-
-```javascript
-request.jar()
-```
-
-
-## Examples:
-
-```javascript
-  var request = require('request')
-    , rand = Math.floor(Math.random()*100000000).toString()
-    ;
-  request(
-    { method: 'PUT'
-    , uri: 'http://mikeal.iriscouch.com/testjs/' + rand
-    , multipart:
-      [ { 'content-type': 'application/json'
-        ,  body: JSON.stringify({foo: 'bar', _attachments: {'message.txt': {follows: true, length: 18, 'content_type': 'text/plain' }}})
-        }
-      , { body: 'I am an attachment' }
-      ]
-    }
-  , function (error, response, body) {
-      if(response.statusCode == 201){
-        console.log('document saved as: http://mikeal.iriscouch.com/testjs/'+ rand)
-      } else {
-        console.log('error: '+ response.statusCode)
-        console.log(body)
-      }
-    }
-  )
-```
-Cookies are disabled by default (otherwise, they would be used in subsequent requests). To enable cookies, set `jar` to `true` (either in `defaults` or in the options sent).
-
-```javascript
-var request = request.defaults({jar: true})
-request('http://www.google.com', function () {
-  request('http://images.google.com')
-})
-```
-
-If you want to use a custom cookie jar (instead of letting request use its own global cookie jar), you do so by setting the `jar` default or by specifying it as an option:
-
-```javascript
-var j = request.jar()
-var request = request.defaults({jar:j})
-request('http://www.google.com', function () {
-  request('http://images.google.com')
-})
-```
-OR
-
-```javascript
-var j = request.jar()
-var cookie = request.cookie('your_cookie_here')
-j.add(cookie)
-request({url: 'http://www.google.com', jar: j}, function () {
-  request('http://images.google.com')
-})
-```
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,149 +0,0 @@
-// Copyright 2010-2012 Mikeal Rogers
-//
-//    Licensed under the Apache License, Version 2.0 (the "License");
-//    you may not use this file except in compliance with the License.
-//    You may obtain a copy of the License at
-//
-//        http://www.apache.org/licenses/LICENSE-2.0
-//
-//    Unless required by applicable law or agreed to in writing, software
-//    distributed under the License is distributed on an "AS IS" BASIS,
-//    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-//    See the License for the specific language governing permissions and
-//    limitations under the License.
-
-var Cookie = require('cookie-jar')
-  , CookieJar = Cookie.Jar
-  , cookieJar = new CookieJar
-
-  , copy = require('./lib/copy')
-  , Request = require('./request')
-  ;
-
-
-
-// organize params for patch, post, put, head, del
-function initParams(uri, options, callback) {
-  if ((typeof options === 'function') && !callback) callback = options
-  if (options && typeof options === 'object') {
-    options.uri = uri
-  } else if (typeof uri === 'string') {
-    options = {uri:uri}
-  } else {
-    options = uri
-    uri = options.uri
-  }
-  return { uri: uri, options: options, callback: callback }
-}
-
-function request (uri, options, callback) {
-  if (typeof uri === 'undefined') throw new Error('undefined is not a valid uri or options object.')
-  if ((typeof options === 'function') && !callback) callback = options
-  if (options && typeof options === 'object') {
-    options.uri = uri
-  } else if (typeof uri === 'string') {
-    options = {uri:uri}
-  } else {
-    options = uri
-  }
-
-  options = copy(options)
-
-  if (callback) options.callback = callback
-  var r = new Request(options)
-  return r
-}
-
-module.exports = request
-
-request.Request = Request;
-
-request.debug = process.env.NODE_DEBUG && /request/.test(process.env.NODE_DEBUG)
-
-request.initParams = initParams
-
-request.defaults = function (options, requester) {
-  var def = function (method) {
-    var d = function (uri, opts, callback) {
-      var params = initParams(uri, opts, callback)
-      for (var i in options) {
-        if (params.options[i] === undefined) params.options[i] = options[i]
-      }
-      if(typeof requester === 'function') {
-        if(method === request) {
-          method = requester
-        } else {
-          params.options._requester = requester
-        }
-      }
-      return method(params.options, params.callback)
-    }
-    return d
-  }
-  var de = def(request)
-  de.get = def(request.get)
-  de.patch = def(request.patch)
-  de.post = def(request.post)
-  de.put = def(request.put)
-  de.head = def(request.head)
-  de.del = def(request.del)
-  de.cookie = def(request.cookie)
-  de.jar = request.jar
-  return de
-}
-
-request.forever = function (agentOptions, optionsArg) {
-  var options = {}
-  if (optionsArg) {
-    for (var option in optionsArg) {
-      options[option] = optionsArg[option]
-    }
-  }
-  if (agentOptions) options.agentOptions = agentOptions
-  options.forever = true
-  return request.defaults(options)
-}
-
-request.get = request
-request.post = function (uri, options, callback) {
-  var params = initParams(uri, options, callback)
-  params.options.method = 'POST'
-  return request(params.uri || null, params.options, params.callback)
-}
-request.put = function (uri, options, callback) {
-  var params = initParams(uri, options, callback)
-  params.options.method = 'PUT'
-  return request(params.uri || null, params.options, params.callback)
-}
-request.patch = function (uri, options, callback) {
-  var params = initParams(uri, options, callback)
-  params.options.method = 'PATCH'
-  return request(params.uri || null, params.options, params.callback)
-}
-request.head = function (uri, options, callback) {
-  var params = initParams(uri, options, callback)
-  params.options.method = 'HEAD'
-  if (params.options.body ||
-      params.options.requestBodyStream ||
-      (params.options.json && typeof params.options.json !== 'boolean') ||
-      params.options.multipart) {
-    throw new Error("HTTP HEAD requests MUST NOT include a request body.")
-  }
-  return request(params.uri || null, params.options, params.callback)
-}
-request.del = function (uri, options, callback) {
-  var params = initParams(uri, options, callback)
-  params.options.method = 'DELETE'
-  if(typeof params.options._requester === 'function') {
-    request = params.options._requester
-  }
-  return request(params.uri || null, params.options, params.callback)
-}
-request.jar = function () {
-  return new CookieJar
-}
-request.cookie = function (str) {
-  if (str && str.uri) str = str.uri
-  if (typeof str !== 'string') throw new Error("The cookie function only accepts STRING as param")
-  return new Cookie(str)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/lib/copy.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,8 +0,0 @@
-module.exports =
-function copy (obj) {
-  var o = {}
-  Object.keys(obj).forEach(function (i) {
-    o[i] = obj[i]
-  })
-  return o
-}
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/lib/debug.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-module.exports =
-function debug () {
-  if (/\brequest\b/.test(process.env.NODE_DEBUG))
-    console.error('REQUEST %s', require('util').format.apply(null, arguments))
-}
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/lib/getSafe.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,34 +0,0 @@
-// Safe toJSON
-module.exports =
-function getSafe (self, uuid) {
-  if (typeof self === 'object' || typeof self === 'function') var safe = {}
-  if (Array.isArray(self)) var safe = []
-
-  var recurse = []
-
-  Object.defineProperty(self, uuid, {})
-
-  var attrs = Object.keys(self).filter(function (i) {
-    if (i === uuid) return false
-    if ( (typeof self[i] !== 'object' && typeof self[i] !== 'function') || self[i] === null) return true
-    return !(Object.getOwnPropertyDescriptor(self[i], uuid))
-  })
-
-
-  for (var i=0;i<attrs.length;i++) {
-    if ( (typeof self[attrs[i]] !== 'object' && typeof self[attrs[i]] !== 'function') ||
-          self[attrs[i]] === null
-        ) {
-      safe[attrs[i]] = self[attrs[i]]
-    } else {
-      recurse.push(attrs[i])
-      Object.defineProperty(self[attrs[i]], uuid, {})
-    }
-  }
-
-  for (var i=0;i<recurse.length;i++) {
-    safe[recurse[i]] = getSafe(self[recurse[i]], uuid)
-  }
-
-  return safe
-}
\ No newline at end of file
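Editorial note on the removed `lib/getSafe.js`: it produces a JSON-safe copy by stamping every visited object with a hidden `uuid` property via `Object.defineProperty`, which breaks reference cycles but mutates the input. A sketch of the same idea using a `WeakSet` as the visited set instead (`WeakSet` postdates node 0.10, so this is an alternative technique, not a drop-in for this tree):

```javascript
// Cycle-safe copy for JSON serialization: already-visited object
// references are dropped rather than recursed into forever.
// A WeakSet replaces the removed code's defineProperty marker,
// so the input object is never mutated.
function getSafe(value, seen) {
  seen = seen || new WeakSet()
  if (value === null ||
      (typeof value !== 'object' && typeof value !== 'function'))
    return value
  if (seen.has(value)) return undefined  // break the cycle
  seen.add(value)
  var out = Array.isArray(value) ? [] : {}
  Object.keys(value).forEach(function (key) {
    var safe = getSafe(value[key], seen)
    if (safe !== undefined) out[key] = safe
  })
  return out
}
```

Unlike the original, repeated visits are detected per call rather than per process, and dropped references come back as missing keys instead of empty objects.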
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/aws-sign/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,55 +0,0 @@
-Apache License
-
-Version 2.0, January 2004
-
-http://www.apache.org/licenses/
-
-TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-1. Definitions.
-
-"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
-
-"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
-
-"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
-
-"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.
-
-"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
-
-"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
-
-"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
-
-"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
-
-"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."
-
-"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
-
-2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
-
-3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
-
-4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
-
-You must give any other recipients of the Work or Derivative Works a copy of this License; and
-
-You must cause any modified files to carry prominent notices stating that You changed the files; and
-
-You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
-
-If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
-
-5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
-
-6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
-
-7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
-
-8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
-
-9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
-
-END OF TERMS AND CONDITIONS
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/aws-sign/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-aws-sign
-========
-
-AWS signing. Originally pulled from LearnBoost/knox, maintained as vendor in request, now a standalone module.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/aws-sign/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,202 +0,0 @@
-
-/*!
- * knox - auth
- * Copyright(c) 2010 LearnBoost <dev@learnboost.com>
- * MIT Licensed
- */
-
-/**
- * Module dependencies.
- */
-
-var crypto = require('crypto')
-  , parse = require('url').parse
-  ;
-
-/**
- * Valid keys.
- */
-
-var keys = 
-  [ 'acl'
-  , 'location'
-  , 'logging'
-  , 'notification'
-  , 'partNumber'
-  , 'policy'
-  , 'requestPayment'
-  , 'torrent'
-  , 'uploadId'
-  , 'uploads'
-  , 'versionId'
-  , 'versioning'
-  , 'versions'
-  , 'website'
-  ]
-
-/**
- * Return an "Authorization" header value with the given `options`
- * in the form of "AWS <key>:<signature>"
- *
- * @param {Object} options
- * @return {String}
- * @api private
- */
-
-function authorization (options) {
-  return 'AWS ' + options.key + ':' + sign(options)
-}
-
-module.exports = authorization
-module.exports.authorization = authorization
-
-/**
- * Simple HMAC-SHA1 Wrapper
- *
- * @param {Object} options
- * @return {String}
- * @api private
- */ 
-
-function hmacSha1 (options) {
-  return crypto.createHmac('sha1', options.secret).update(options.message).digest('base64')
-}
-
-module.exports.hmacSha1 = hmacSha1
-
-/**
- * Create a base64 sha1 HMAC for `options`. 
- * 
- * @param {Object} options
- * @return {String}
- * @api private
- */
-
-function sign (options) {
-  options.message = stringToSign(options)
-  return hmacSha1(options)
-}
-module.exports.sign = sign
-
-/**
- * Create a base64 sha1 HMAC for `options`. 
- *
- * Specifically to be used with S3 presigned URLs
- * 
- * @param {Object} options
- * @return {String}
- * @api private
- */
-
-function signQuery (options) {
-  options.message = queryStringToSign(options)
-  return hmacSha1(options)
-}
-module.exports.signQuery= signQuery
-
-/**
- * Return a string for sign() with the given `options`.
- *
- * Spec:
- * 
- *    <verb>\n
- *    <md5>\n
- *    <content-type>\n
- *    <date>\n
- *    [headers\n]
- *    <resource>
- *
- * @param {Object} options
- * @return {String}
- * @api private
- */
-
-function stringToSign (options) {
-  var headers = options.amazonHeaders || ''
-  if (headers) headers += '\n'
-  var r = 
-    [ options.verb
-    , options.md5
-    , options.contentType
-    , options.date.toUTCString()
-    , headers + options.resource
-    ]
-  return r.join('\n')
-}
-module.exports.queryStringToSign = stringToSign
-
-/**
- * Return a string for sign() with the given `options`, but is meant exclusively
- * for S3 presigned URLs
- *
- * Spec:
- * 
- *    <date>\n
- *    <resource>
- *
- * @param {Object} options
- * @return {String}
- * @api private
- */
-
-function queryStringToSign (options){
-  return 'GET\n\n\n' + options.date + '\n' + options.resource
-}
-module.exports.queryStringToSign = queryStringToSign
-
-/**
- * Perform the following:
- *
- *  - ignore non-amazon headers
- *  - lowercase fields
- *  - sort lexicographically
- *  - trim whitespace between ":"
- *  - join with newline
- *
- * @param {Object} headers
- * @return {String}
- * @api private
- */
-
-function canonicalizeHeaders (headers) {
-  var buf = []
-    , fields = Object.keys(headers)
-    ;
-  for (var i = 0, len = fields.length; i < len; ++i) {
-    var field = fields[i]
-      , val = headers[field]
-      , field = field.toLowerCase()
-      ;
-    if (0 !== field.indexOf('x-amz')) continue
-    buf.push(field + ':' + val)
-  }
-  return buf.sort().join('\n')
-}
-module.exports.canonicalizeHeaders = canonicalizeHeaders
-
-/**
- * Perform the following:
- *
- *  - ignore non sub-resources
- *  - sort lexicographically
- *
- * @param {String} resource
- * @return {String}
- * @api private
- */
-
-function canonicalizeResource (resource) {
-  var url = parse(resource, true)
-    , path = url.pathname
-    , buf = []
-    ;
-
-  Object.keys(url.query).forEach(function(key){
-    if (!~keys.indexOf(key)) return
-    var val = '' == url.query[key] ? '' : '=' + encodeURIComponent(url.query[key])
-    buf.push(key + val)
-  })
-
-  return path + (buf.length ? '?' + buf.sort().join('&') : '')
-}
-module.exports.canonicalizeResource = canonicalizeResource
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/aws-sign/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-{
-  "author": {
-    "name": "Mikeal Rogers",
-    "email": "mikeal.rogers@gmail.com",
-    "url": "http://www.futurealoof.com"
-  },
-  "name": "aws-sign",
-  "description": "AWS signing. Originally pulled from LearnBoost/knox, maintained as vendor in request, now a standalone module.",
-  "version": "0.3.0",
-  "repository": {
-    "url": "https://github.com/mikeal/aws-sign"
-  },
-  "main": "index.js",
-  "dependencies": {},
-  "devDependencies": {},
-  "optionalDependencies": {},
-  "engines": {
-    "node": "*"
-  },
-  "readme": "aws-sign\n========\n\nAWS signing. Originally pulled from LearnBoost/knox, maintained as vendor in request, now a standalone module.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/mikeal/aws-sign/issues"
-  },
-  "homepage": "https://github.com/mikeal/aws-sign",
-  "_id": "aws-sign@0.3.0",
-  "_from": "aws-sign@~0.3.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,55 +0,0 @@
-Apache License
-
-Version 2.0, January 2004
-
-http://www.apache.org/licenses/
-
-TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-1. Definitions.
-
-"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
-
-"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
-
-"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
-
-"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.
-
-"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
-
-"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
-
-"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
-
-"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
-
-"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."
-
-"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
-
-2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
-
-3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
-
-4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
-
-You must give any other recipients of the Work or Derivative Works a copy of this License; and
-
-You must cause any modified files to carry prominent notices stating that You changed the files; and
-
-You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
-
-If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
-
-5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
-
-6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
-
-7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
-
-8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
-
-9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
-
-END OF TERMS AND CONDITIONS
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-cookie-jar
-==========
-
-Cookie Jar. Originally pulled from LearnBoost/tobi, maintained as vendor in request, now a standalone module.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,67 +0,0 @@
-/*!
- * Tobi - Cookie
- * Copyright(c) 2010 LearnBoost <dev@learnboost.com>
- * MIT Licensed
- */
-
-/**
- * Module dependencies.
- */
-
-var url = require('url');
-
-/**
- * Initialize a new `Cookie` with the given cookie `str` and `req`.
- *
- * @param {String} str
- * @param {IncomingRequest} req
- * @api private
- */
-
-var Cookie = exports = module.exports = function Cookie(str, req) {
-  this.str = str;
-
-  // Map the key/val pairs
-  str.split(/ *; */).reduce(function(obj, pair){
-   var p = pair.indexOf('=');
-   var key = p > 0 ? pair.substring(0, p).trim() : pair.trim();
-   var lowerCasedKey = key.toLowerCase();
-   var value = p > 0 ? pair.substring(p + 1).trim() : true;
-
-   if (!obj.name) {
-    // First key is the name
-    obj.name = key;
-    obj.value = value;
-   }
-   else if (lowerCasedKey === 'httponly') {
-    obj.httpOnly = value;
-   }
-   else {
-    obj[lowerCasedKey] = value;
-   }
-   return obj;
-  }, this);
-
-  // Expires
-  this.expires = this.expires
-    ? new Date(this.expires)
-    : Infinity;
-
-  // Default or trim path
-  this.path = this.path
-    ? this.path.trim(): req 
-    ? url.parse(req.url).pathname: '/';
-};
-
-/**
- * Return the original cookie string.
- *
- * @return {String}
- * @api public
- */
-
-Cookie.prototype.toString = function(){
-  return this.str;
-};
-
-module.exports.Jar = require('./jar')
\ No newline at end of file
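Editorial note on the removed `cookie-jar/index.js`: the constructor folds the `; `-separated pairs into the instance, taking the first pair as name/value, special-casing `HttpOnly`, and lowercasing the remaining attribute keys. A standalone parser sketch of that rule:

```javascript
// Parse a Set-Cookie string the way the removed Cookie
// constructor does: the first pair names the cookie, bare
// attributes become `true`, other attribute keys are lowercased.
function parseCookie(str) {
  return str.split(/ *; */).reduce(function (obj, pair) {
    var p = pair.indexOf('=')
    var key = p > 0 ? pair.substring(0, p).trim() : pair.trim()
    var value = p > 0 ? pair.substring(p + 1).trim() : true
    if (!obj.name) {            // first pair is name/value
      obj.name = key
      obj.value = value
    } else if (key.toLowerCase() === 'httponly') {
      obj.httpOnly = value
    } else {
      obj[key.toLowerCase()] = value
    }
    return obj
  }, { str: str })
}
```

The original goes on to coerce `expires` to a `Date` (defaulting to `Infinity`) and to default `path` from the request URL; this sketch keeps the attributes as raw strings.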
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/jar.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,72 +0,0 @@
-/*!
-* Tobi - CookieJar
-* Copyright(c) 2010 LearnBoost <dev@learnboost.com>
-* MIT Licensed
-*/
-
-/**
-* Module dependencies.
-*/
-
-var url = require('url');
-
-/**
-* Initialize a new `CookieJar`.
-*
-* @api private
-*/
-
-var CookieJar = exports = module.exports = function CookieJar() {
-  this.cookies = [];
-};
-
-/**
-* Add the given `cookie` to the jar.
-*
-* @param {Cookie} cookie
-* @api private
-*/
-
-CookieJar.prototype.add = function(cookie){
-  this.cookies = this.cookies.filter(function(c){
-    // Avoid duplication (same path, same name)
-    return !(c.name == cookie.name && c.path == cookie.path);
-  });
-  this.cookies.push(cookie);
-};
-
-/**
-* Get cookies for the given `req`.
-*
-* @param {IncomingRequest} req
-* @return {Array}
-* @api private
-*/
-
-CookieJar.prototype.get = function(req){
-  var path = url.parse(req.url).pathname
-    , now = new Date
-    , specificity = {};
-  return this.cookies.filter(function(cookie){
-    if (0 == path.indexOf(cookie.path) && now < cookie.expires
-      && cookie.path.length > (specificity[cookie.name] || 0))
-      return specificity[cookie.name] = cookie.path.length;
-  });
-};
-
-/**
-* Return Cookie string for the given `req`.
-*
-* @param {IncomingRequest} req
-* @return {String}
-* @api private
-*/
-
-CookieJar.prototype.cookieString = function(req){
-  var cookies = this.get(req);
-  if (cookies.length) {
-    return cookies.map(function(cookie){
-      return cookie.name + '=' + cookie.value;
-    }).join('; ');
-  }
-};
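Editorial note on the removed `cookie-jar/jar.js`: `get()` doubles its filter's return value as a per-name specificity record, which satisfies the shipped tests but can return more than one cookie for the same name when a less specific path happens to precede a more specific one in the jar. A sketch that always keeps exactly one cookie per name — a behavioral cleanup, not the original algorithm:

```javascript
// Select cookies for a request path: unexpired, path-prefix
// match, and for duplicate names keep the cookie with the
// longest (most specific) path, later additions winning ties.
function getCookies(cookies, path, now) {
  var best = {}
  cookies.forEach(function (cookie) {
    if (path.indexOf(cookie.path) !== 0) return  // path prefix test
    if (now >= cookie.expires) return            // expired
    var prev = best[cookie.name]
    if (!prev || cookie.path.length >= prev.path.length)
      best[cookie.name] = cookie
  })
  return Object.keys(best).map(function (name) { return best[name] })
}
```

The prefix test mirrors the original's `0 == path.indexOf(cookie.path)`; it treats `/foo` as a prefix of `/foobar`, a known looseness of this style of matching.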
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,31 +0,0 @@
-{
-  "author": {
-    "name": "Mikeal Rogers",
-    "email": "mikeal.rogers@gmail.com",
-    "url": "http://www.futurealoof.com"
-  },
-  "name": "cookie-jar",
-  "description": "Cookie Jar. Originally pulled form tobi, maintained as vendor in request, now a standalone module.",
-  "version": "0.3.0",
-  "repository": {
-    "url": "https://github.com/mikeal/cookie-jar"
-  },
-  "main": "index.js",
-  "scripts": {
-    "test": "node tests/run.js"
-  },
-  "dependencies": {},
-  "devDependencies": {},
-  "optionalDependencies": {},
-  "engines": {
-    "node": "*"
-  },
-  "readme": "cookie-jar\n==========\n\nCookie Jar. Originally pulled from LearnBoost/tobi, maintained as vendor in request, now a standalone module.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/mikeal/cookie-jar/issues"
-  },
-  "homepage": "https://github.com/mikeal/cookie-jar",
-  "_id": "cookie-jar@0.3.0",
-  "_from": "cookie-jar@~0.3.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/tests/run.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,40 +0,0 @@
-var spawn = require('child_process').spawn
-  , exitCode = 0
-  , timeout = 10000
-  , fs = require('fs')
-  ;
-
-fs.readdir(__dirname, function (e, files) {
-  if (e) throw e
-
-  var tests = files.filter(function (f) {return f.slice(0, 'test-'.length) === 'test-'})
-
-  var next = function () {
-    if (tests.length === 0) process.exit(exitCode);
-
-    var file = tests.shift()
-    console.log(file)
-    var proc = spawn('node', [ 'tests/' + file ])
-
-    var killed = false
-    var t = setTimeout(function () {
-      proc.kill()
-      exitCode += 1
-      console.error(file + ' timeout')
-      killed = true
-    }, timeout)
-
-    proc.stdout.pipe(process.stdout)
-    proc.stderr.pipe(process.stderr)
-    proc.on('exit', function (code) {
-      if (code && !killed) console.error(file + ' failed')
-      exitCode += code || 0
-      clearTimeout(t)
-      next()
-    })
-  }
-  next()
-    
-})
-
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/tests/test-cookie.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,29 +0,0 @@
-var Cookie = require('../index')
-  , assert = require('assert');
-
-var str = 'Sid="s543qactge.wKE61E01Bs%2BKhzmxrwrnug="; Path=/; httpOnly; Expires=Sat, 04 Dec 2010 23:27:28 GMT';
-var cookie = new Cookie(str);
-
-// test .toString()
-assert.equal(cookie.toString(), str);
-
-// test .path
-assert.equal(cookie.path, '/');
-
-// test .httpOnly
-assert.equal(cookie.httpOnly, true);
-
-// test .name
-assert.equal(cookie.name, 'Sid');
-
-// test .value
-assert.equal(cookie.value, '"s543qactge.wKE61E01Bs%2BKhzmxrwrnug="');
-
-// test .expires
-assert.equal(cookie.expires instanceof Date, true);
-
-// test .path default
-var cookie = new Cookie('foo=bar', { url: 'http://foo.com/bar' });
-assert.equal(cookie.path, '/bar');
-
-console.log('All tests passed');
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/cookie-jar/tests/test-cookiejar.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,90 +0,0 @@
-var Cookie = require('../index')
-  , Jar = Cookie.Jar
-  , assert = require('assert');
-
-function expires(ms) {
-  return new Date(Date.now() + ms).toUTCString();
-}
-
-// test .get() expiration
-(function() {
-  var jar = new Jar;
-  var cookie = new Cookie('sid=1234; path=/; expires=' + expires(1000));
-  jar.add(cookie);
-  setTimeout(function(){
-    var cookies = jar.get({ url: 'http://foo.com/foo' });
-    assert.equal(cookies.length, 1);
-    assert.equal(cookies[0], cookie);
-    setTimeout(function(){
-      var cookies = jar.get({ url: 'http://foo.com/foo' });
-      assert.equal(cookies.length, 0);
-    }, 1000);
-  }, 5);
-})();
-
-// test .get() path support
-(function() {
-  var jar = new Jar;
-  var a = new Cookie('sid=1234; path=/');
-  var b = new Cookie('sid=1111; path=/foo/bar');
-  var c = new Cookie('sid=2222; path=/');
-  jar.add(a);
-  jar.add(b);
-  jar.add(c);
-
-  // should remove the duplicates
-  assert.equal(jar.cookies.length, 2);
-
-  // same name, same path, latter prevails
-  var cookies = jar.get({ url: 'http://foo.com/' });
-  assert.equal(cookies.length, 1);
-  assert.equal(cookies[0], c);
-
-  // same name, diff path, path specificity prevails, latter prevails
-  var cookies = jar.get({ url: 'http://foo.com/foo/bar' });
-  assert.equal(cookies.length, 1);
-  assert.equal(cookies[0], b);
-
-  var jar = new Jar;
-  var a = new Cookie('sid=1111; path=/foo/bar');
-  var b = new Cookie('sid=1234; path=/');
-  jar.add(a);
-  jar.add(b);
-
-  var cookies = jar.get({ url: 'http://foo.com/foo/bar' });
-  assert.equal(cookies.length, 1);
-  assert.equal(cookies[0], a);
-
-  var cookies = jar.get({ url: 'http://foo.com/' });
-  assert.equal(cookies.length, 1);
-  assert.equal(cookies[0], b);
-
-  var jar = new Jar;
-  var a = new Cookie('sid=1111; path=/foo/bar');
-  var b = new Cookie('sid=3333; path=/foo/bar');
-  var c = new Cookie('pid=3333; path=/foo/bar');
-  var d = new Cookie('sid=2222; path=/foo/');
-  var e = new Cookie('sid=1234; path=/');
-  jar.add(a);
-  jar.add(b);
-  jar.add(c);
-  jar.add(d);
-  jar.add(e);
-
-  var cookies = jar.get({ url: 'http://foo.com/foo/bar' });
-  assert.equal(cookies.length, 2);
-  assert.equal(cookies[0], b);
-  assert.equal(cookies[1], c);
-
-  var cookies = jar.get({ url: 'http://foo.com/foo/' });
-  assert.equal(cookies.length, 1);
-  assert.equal(cookies[0], d);
-
-  var cookies = jar.get({ url: 'http://foo.com/' });
-  assert.equal(cookies.length, 1);
-  assert.equal(cookies[0], e);
-})();
-
-setTimeout(function() {
-  console.log('All tests passed');
-}, 1200);
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/forever-agent/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,55 +0,0 @@
-Apache License
-
-Version 2.0, January 2004
-
-http://www.apache.org/licenses/
-
-TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-1. Definitions.
-
-"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
-
-"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
-
-"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
-
-"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.
-
-"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
-
-"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
-
-"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
-
-"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
-
-"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."
-
-"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
-
-2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
-
-3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
-
-4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
-
-You must give any other recipients of the Work or Derivative Works a copy of this License; and
-
-You must cause any modified files to carry prominent notices stating that You changed the files; and
-
-You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
-
-If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
-
-5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
-
-6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
-
-7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
-
-8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
-
-9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
-
-END OF TERMS AND CONDITIONS
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/forever-agent/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-forever-agent
-=============
-
-HTTP Agent that keeps socket connections alive between keep-alive requests. Formerly part of mikeal/request, now a standalone module.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/forever-agent/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,119 +0,0 @@
-module.exports = ForeverAgent
-ForeverAgent.SSL = ForeverAgentSSL
-
-var util = require('util')
-  , Agent = require('http').Agent
-  , net = require('net')
-  , tls = require('tls')
-  , AgentSSL = require('https').Agent
-
-function ForeverAgent(options) {
-  var self = this
-  self.options = options || {}
-  self.requests = {}
-  self.sockets = {}
-  self.freeSockets = {}
-  self.maxSockets = self.options.maxSockets || Agent.defaultMaxSockets
-  self.minSockets = self.options.minSockets || ForeverAgent.defaultMinSockets
-  self.on('free', function(socket, host, port) {
-    var name = host + ':' + port
-    if (self.requests[name] && self.requests[name].length) {
-      self.requests[name].shift().onSocket(socket)
-    } else if (self.sockets[name].length < self.minSockets) {
-      if (!self.freeSockets[name]) self.freeSockets[name] = []
-      self.freeSockets[name].push(socket)
-      
-      // if an error happens while the socket is idle and unused, just destroy it
-      function onIdleError() {
-        socket.destroy()
-      }
-      socket._onIdleError = onIdleError
-      socket.on('error', onIdleError)
-    } else {
-      // If there are no pending requests just destroy the
-      // socket and it will get removed from the pool. This
-      // gets us out of timeout issues and allows us to
-      // default to Connection:keep-alive.
-      socket.destroy()
-    }
-  })
-
-}
-util.inherits(ForeverAgent, Agent)
-
-ForeverAgent.defaultMinSockets = 5
-
-
-ForeverAgent.prototype.createConnection = net.createConnection
-ForeverAgent.prototype.addRequestNoreuse = Agent.prototype.addRequest
-ForeverAgent.prototype.addRequest = function(req, host, port) {
-  var name = host + ':' + port
-  if (this.freeSockets[name] && this.freeSockets[name].length > 0 && !req.useChunkedEncodingByDefault) {
-    var idleSocket = this.freeSockets[name].pop()
-    idleSocket.removeListener('error', idleSocket._onIdleError)
-    delete idleSocket._onIdleError
-    req._reusedSocket = true
-    req.onSocket(idleSocket)
-  } else {
-    this.addRequestNoreuse(req, host, port)
-  }
-}
-
-ForeverAgent.prototype.removeSocket = function(s, name, host, port) {
-  if (this.sockets[name]) {
-    var index = this.sockets[name].indexOf(s)
-    if (index !== -1) {
-      this.sockets[name].splice(index, 1)
-    }
-    if (this.sockets[name].length === 0) {
-      delete this.sockets[name] // don't leak
-      delete this.requests[name]
-    }
-  }
-  
-  if (this.freeSockets[name]) {
-    var index = this.freeSockets[name].indexOf(s)
-    if (index !== -1) {
-      this.freeSockets[name].splice(index, 1)
-      if (this.freeSockets[name].length === 0) {
-        delete this.freeSockets[name]
-      }
-    }
-  }
-
-  if (this.requests[name] && this.requests[name].length) {
-    // If we have pending requests and a socket gets closed a new one
-    // needs to be created to take over in the pool for the one that closed.
-    this.createSocket(name, host, port).emit('free')
-  }
-}
-
-function ForeverAgentSSL (options) {
-  ForeverAgent.call(this, options)
-}
-util.inherits(ForeverAgentSSL, ForeverAgent)
-
-ForeverAgentSSL.prototype.createConnection = createConnectionSSL
-ForeverAgentSSL.prototype.addRequestNoreuse = AgentSSL.prototype.addRequest
-
-function createConnectionSSL (port, host, options) {
-  if (typeof port === 'object') {
-    options = port;
-  } else if (typeof host === 'object') {
-    options = host;
-  } else if (typeof options === 'object') {
-    // options is already an object; use it as-is
-  } else {
-    options = {};
-  }
-
-  if (typeof port === 'number') {
-    options.port = port;
-  }
-
-  if (typeof host === 'string') {
-    options.host = host;
-  }
-
-  return tls.connect(options);
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/forever-agent/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-{
-  "author": {
-    "name": "Mikeal Rogers",
-    "email": "mikeal.rogers@gmail.com",
-    "url": "http://www.futurealoof.com"
-  },
-  "name": "forever-agent",
-  "description": "HTTP Agent that keeps socket connections alive between keep-alive requests. Formerly part of mikeal/request, now a standalone module.",
-  "version": "0.5.0",
-  "repository": {
-    "url": "https://github.com/mikeal/forever-agent"
-  },
-  "main": "index.js",
-  "dependencies": {},
-  "devDependencies": {},
-  "optionalDependencies": {},
-  "engines": {
-    "node": "*"
-  },
-  "readme": "forever-agent\n=============\n\nHTTP Agent that keeps socket connections alive between keep-alive requests. Formerly part of mikeal/request, now a standalone module.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/mikeal/forever-agent/issues"
-  },
-  "homepage": "https://github.com/mikeal/forever-agent",
-  "_id": "forever-agent@0.5.0",
-  "_from": "forever-agent@~0.5.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/License	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-Copyright (c) 2012 Felix Geisendörfer (felix@debuggable.com) and contributors
-
- Permission is hereby granted, free of charge, to any person obtaining a copy
- of this software and associated documentation files (the "Software"), to deal
- in the Software without restriction, including without limitation the rights
- to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
- copies of the Software, and to permit persons to whom the Software is
- furnished to do so, subject to the following conditions:
-
- The above copyright notice and this permission notice shall be included in
- all copies or substantial portions of the Software.
-
- THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
- IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
- FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
- AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
- LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
- OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
- THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/Readme.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,163 +0,0 @@
-# Form-Data [![Build Status](https://travis-ci.org/felixge/node-form-data.png?branch=master)](https://travis-ci.org/felixge/node-form-data) [![Dependency Status](https://gemnasium.com/felixge/node-form-data.png)](https://gemnasium.com/felixge/node-form-data)
-
-A module to create readable ```"multipart/form-data"``` streams. Can be used to submit forms and file uploads to other web applications.
-
-The API of this module is inspired by the [XMLHttpRequest-2 FormData Interface][xhr2-fd].
-
-[xhr2-fd]: http://dev.w3.org/2006/webapi/XMLHttpRequest-2/Overview.html#the-formdata-interface
-[streams2-thing]: http://nodejs.org/api/stream.html#stream_compatibility_with_older_node_versions
-
-## Install
-
-```
-npm install form-data
-```
-
-## Usage
-
-In this example we are constructing a form with three fields that contain a string,
-a buffer, and a file stream.
-
-``` javascript
-var FormData = require('form-data');
-var fs = require('fs');
-
-var form = new FormData();
-form.append('my_field', 'my value');
-form.append('my_buffer', new Buffer(10));
-form.append('my_file', fs.createReadStream('/foo/bar.jpg'));
-```
-
-You can also append an http response stream:
-
-``` javascript
-var FormData = require('form-data');
-var http = require('http');
-
-var form = new FormData();
-
-http.request('http://nodejs.org/images/logo.png', function(response) {
-  form.append('my_field', 'my value');
-  form.append('my_buffer', new Buffer(10));
-  form.append('my_logo', response);
-});
-```
-
-Or @mikeal's request stream:
-
-``` javascript
-var FormData = require('form-data');
-var request = require('request');
-
-var form = new FormData();
-
-form.append('my_field', 'my value');
-form.append('my_buffer', new Buffer(10));
-form.append('my_logo', request('http://nodejs.org/images/logo.png'));
-```
-
-In order to submit this form to a web application, call the ```submit(url, [callback])``` method:
-
-``` javascript
-form.submit('http://example.org/', function(err, res) {
-  // res – response object (http.IncomingMessage)
-  res.resume(); // for node-0.10.x
-});
-
-```
-
-For more advanced request manipulation, the ```submit()``` method returns an ```http.ClientRequest``` object, or you can choose one of the alternative submission methods.
-
-### Alternative submission methods
-
-You can use node's http client interface:
-
-``` javascript
-var http = require('http');
-
-var request = http.request({
-  method: 'post',
-  host: 'example.org',
-  path: '/upload',
-  headers: form.getHeaders()
-});
-
-form.pipe(request);
-
-request.on('response', function(res) {
-  console.log(res.statusCode);
-});
-```
-
-Or if you would prefer the `'Content-Length'` header to be set for you:
-
-``` javascript
-form.submit('example.org/upload', function(err, res) {
-  console.log(res.statusCode);
-});
-```
-
-To use custom headers and a pre-known length for individual parts:
-
-``` javascript
-var CRLF = '\r\n';
-var form = new FormData();
-
-var options = {
-  header: CRLF + '--' + form.getBoundary() + CRLF + 'X-Custom-Header: 123' + CRLF + CRLF,
-  knownLength: 1
-};
-
-form.append('my_buffer', buffer, options);
-
-form.submit('http://example.com/', function(err, res) {
-  if (err) throw err;
-  console.log('Done');
-});
-```
-
-Form-Data can recognize and fetch all the required information from common types of streams (```fs.readStream```, ```http.response``` and ```mikeal's request```); for other types of streams you'd need to provide "file"-related information manually:
-
-``` javascript
-someModule.stream(function(err, stdout, stderr) {
-  if (err) throw err;
-
-  var form = new FormData();
-
-  form.append('file', stdout, {
-    filename: 'unicycle.jpg',
-    contentType: 'image/jpg',
-    knownLength: 19806
-  });
-
-  form.submit('http://example.com/', function(err, res) {
-    if (err) throw err;
-    console.log('Done');
-  });
-});
-```
-
-For edge cases, such as a POST request to a URL with a query string or passing HTTP auth credentials, an object can be passed to `form.submit()` as the first parameter:
-
-``` javascript
-form.submit({
-  host: 'example.com',
-  path: '/probably.php?extra=params',
-  auth: 'username:password'
-}, function(err, res) {
-  console.log(res.statusCode);
-});
-```
-
-## Notes
-
-- The ```getLengthSync()``` method DOES NOT calculate the length of streams; use the ```knownLength``` option as a workaround.
-- If FormData seems to hang after submit and you're on ```node-0.10```, please check [Compatibility with Older Node Versions][streams2-thing]
-
-## TODO
-
-- Add new streams (0.10) support and try really hard not to break it for 0.8.x.
-
-## License
-
-Form-Data is licensed under the MIT license.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/lib/form_data.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,325 +0,0 @@
-var CombinedStream = require('combined-stream');
-var util = require('util');
-var path = require('path');
-var http = require('http');
-var https = require('https');
-var parseUrl = require('url').parse;
-var fs = require('fs');
-var mime = require('mime');
-var async = require('async');
-
-module.exports = FormData;
-function FormData() {
-  this._overheadLength = 0;
-  this._valueLength = 0;
-  this._lengthRetrievers = [];
-
-  CombinedStream.call(this);
-}
-util.inherits(FormData, CombinedStream);
-
-FormData.LINE_BREAK = '\r\n';
-
-FormData.prototype.append = function(field, value, options) {
-  options = options || {};
-
-  var append = CombinedStream.prototype.append.bind(this);
-
-  // all that streamy business can't handle numbers
-  if (typeof value == 'number') value = ''+value;
-
-  // https://github.com/felixge/node-form-data/issues/38
-  if (util.isArray(value)) {
-    // Please convert your array into string
-    // the way web server expects it
-    this._error(new Error('Arrays are not supported.'));
-    return;
-  }
-
-  var header = this._multiPartHeader(field, value, options);
-  var footer = this._multiPartFooter(field, value, options);
-
-  append(header);
-  append(value);
-  append(footer);
-
-  // pass along options.knownLength
-  this._trackLength(header, value, options);
-};
-
-FormData.prototype._trackLength = function(header, value, options) {
-  var valueLength = 0;
-
-  // used w/ getLengthSync(), when length is known.
-  // e.g. for streaming directly from a remote server,
-  // w/ a known file size, and not wanting to wait for
-  // incoming file to finish to get its size.
-  if (options.knownLength != null) {
-    valueLength += +options.knownLength;
-  } else if (Buffer.isBuffer(value)) {
-    valueLength = value.length;
-  } else if (typeof value === 'string') {
-    valueLength = Buffer.byteLength(value);
-  }
-
-  this._valueLength += valueLength;
-
-  // @check why add CRLF? does this account for custom/multiple CRLFs?
-  this._overheadLength +=
-    Buffer.byteLength(header) +
-    FormData.LINE_BREAK.length;
-
-  // empty or either doesn't have path or not an http response
-  if (!value || ( !value.path && !(value.readable && value.hasOwnProperty('httpVersion')) )) {
-    return;
-  }
-
-  // no need to bother with the length
-  if (!options.knownLength)
-  this._lengthRetrievers.push(function(next) {
-
-    if (value.hasOwnProperty('fd')) {
-      fs.stat(value.path, function(err, stat) {
-        if (err) {
-          next(err);
-          return;
-        }
-
-        next(null, stat.size);
-      });
-
-    // or http response
-    } else if (value.hasOwnProperty('httpVersion')) {
-      next(null, +value.headers['content-length']);
-
-    // or request stream http://github.com/mikeal/request
-    } else if (value.hasOwnProperty('httpModule')) {
-      // wait till response come back
-      value.on('response', function(response) {
-        value.pause();
-        next(null, +response.headers['content-length']);
-      });
-      value.resume();
-
-    // something else
-    } else {
-      next('Unknown stream');
-    }
-  });
-};
-
-FormData.prototype._multiPartHeader = function(field, value, options) {
-  var boundary = this.getBoundary();
-  var header = '';
-
-  // custom header specified (as string)?
-  // it becomes responsible for boundary
-  // (e.g. to handle extra CRLFs on .NET servers)
-  if (options.header != null) {
-    header = options.header;
-  } else {
-    header += '--' + boundary + FormData.LINE_BREAK +
-      'Content-Disposition: form-data; name="' + field + '"';
-
-    // fs- and request- streams have path property
-    // or use custom filename and/or contentType
-    // TODO: Use request's response mime-type
-    if (options.filename || value.path) {
-      header +=
-        '; filename="' + path.basename(options.filename || value.path) + '"' + FormData.LINE_BREAK +
-        'Content-Type: ' +  (options.contentType || mime.lookup(options.filename || value.path));
-
-    // an http response has no path property
-    } else if (value.readable && value.hasOwnProperty('httpVersion')) {
-      header +=
-        '; filename="' + path.basename(value.client._httpMessage.path) + '"' + FormData.LINE_BREAK +
-        'Content-Type: ' + value.headers['content-type'];
-    }
-
-    header += FormData.LINE_BREAK + FormData.LINE_BREAK;
-  }
-
-  return header;
-};
-
-FormData.prototype._multiPartFooter = function(field, value, options) {
-  return function(next) {
-    var footer = FormData.LINE_BREAK;
-
-    var lastPart = (this._streams.length === 0);
-    if (lastPart) {
-      footer += this._lastBoundary();
-    }
-
-    next(footer);
-  }.bind(this);
-};
-
-FormData.prototype._lastBoundary = function() {
-  return '--' + this.getBoundary() + '--';
-};
-
-FormData.prototype.getHeaders = function(userHeaders) {
-  var formHeaders = {
-    'content-type': 'multipart/form-data; boundary=' + this.getBoundary()
-  };
-
-  for (var header in userHeaders) {
-    formHeaders[header.toLowerCase()] = userHeaders[header];
-  }
-
-  return formHeaders;
-}
-
-FormData.prototype.getCustomHeaders = function(contentType) {
-    contentType = contentType ? contentType : 'multipart/form-data';
-
-    var formHeaders = {
-        'content-type': contentType + '; boundary=' + this.getBoundary(),
-        'content-length': this.getLengthSync()
-    };
-
-    return formHeaders;
-}
-
-FormData.prototype.getBoundary = function() {
-  if (!this._boundary) {
-    this._generateBoundary();
-  }
-
-  return this._boundary;
-};
-
-FormData.prototype._generateBoundary = function() {
-  // This generates a 50 character boundary similar to those used by Firefox.
-  // They are optimized for Boyer-Moore parsing.
-  var boundary = '--------------------------';
-  for (var i = 0; i < 24; i++) {
-    boundary += Math.floor(Math.random() * 10).toString(16);
-  }
-
-  this._boundary = boundary;
-};
-
-// Note: getLengthSync DOESN'T calculate streams length
-// As workaround one can calculate file size manually
-// and add it as knownLength option
-FormData.prototype.getLengthSync = function(debug) {
-  var knownLength = this._overheadLength + this._valueLength;
-
-  // Don't get confused, there are 3 "internal" streams for each keyval pair
-  // so it basically checks if there is any value added to the form
-  if (this._streams.length) {
-    knownLength += this._lastBoundary().length;
-  }
-
-  // https://github.com/felixge/node-form-data/issues/40
-  if (this._lengthRetrievers.length) {
-    // Some async length retrievers are present,
-    // so the length cannot be calculated synchronously.
-    // Please use getLength(callback) to get the proper length
-    this._error(new Error('Cannot calculate proper length in synchronous way.'));
-  }
-
-  return knownLength;
-};
-
-FormData.prototype.getLength = function(cb) {
-  var knownLength = this._overheadLength + this._valueLength;
-
-  if (this._streams.length) {
-    knownLength += this._lastBoundary().length;
-  }
-
-  if (!this._lengthRetrievers.length) {
-    process.nextTick(cb.bind(this, null, knownLength));
-    return;
-  }
-
-  async.parallel(this._lengthRetrievers, function(err, values) {
-    if (err) {
-      cb(err);
-      return;
-    }
-
-    values.forEach(function(length) {
-      knownLength += length;
-    });
-
-    cb(null, knownLength);
-  });
-};
-
-FormData.prototype.submit = function(params, cb) {
-
-  var request
-    , options
-    , defaults = {
-        method : 'post',
-        headers: this.getHeaders()
-    };
-
-  // parse provided url if it's string
-  // or treat it as options object
-  if (typeof params == 'string') {
-    params = parseUrl(params);
-
-    options = populate({
-      port: params.port,
-      path: params.pathname,
-      host: params.hostname
-    }, defaults);
-  }
-  else // use custom params
-  {
-    options = populate(params, defaults);
-    // if no port provided use default one
-    if (!options.port) {
-      options.port = options.protocol == 'https:' ? 443 : 80;
-    }
-  }
-
-  // https if specified, fallback to http in any other case
-  if (params.protocol == 'https:') {
-    request = https.request(options);
-  } else {
-    request = http.request(options);
-  }
-
-  // get content length and fire away
-  this.getLength(function(err, length) {
-
-    // TODO: Add chunked encoding when no length (if err)
-
-    // add content length
-    request.setHeader('Content-Length', length);
-
-    this.pipe(request);
-    if (cb) {
-      request.on('error', cb);
-      request.on('response', cb.bind(this, null));
-    }
-  }.bind(this));
-
-  return request;
-};
-
-FormData.prototype._error = function(err) {
-  if (this.error) return;
-
-  this.error = err;
-  this.pause();
-  this.emit('error', err);
-};
-
-/*
- * Santa's little helpers
- */
-
-// populates missing values
-function populate(dst, src) {
-  for (var prop in src) {
-    if (!dst[prop]) dst[prop] = src[prop];
-  }
-  return dst;
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-Copyright (c) 2010 Caolan McMahon
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1414 +0,0 @@
-# Async.js
-
-Async is a utility module which provides straightforward, powerful functions
-for working with asynchronous JavaScript. Although originally designed for
-use with [node.js](http://nodejs.org), it can also be used directly in the
-browser. It also supports [component](https://github.com/component/component).
-
-Async provides around 20 functions that include the usual 'functional'
-suspects (map, reduce, filter, each…) as well as some common patterns
-for asynchronous control flow (parallel, series, waterfall…). All these
-functions assume you follow the node.js convention of providing a single
-callback as the last argument of your async function.
-
-
-## Quick Examples
-
-```javascript
-async.map(['file1','file2','file3'], fs.stat, function(err, results){
-    // results is now an array of stats for each file
-});
-
-async.filter(['file1','file2','file3'], fs.exists, function(results){
-    // results now equals an array of the existing files
-});
-
-async.parallel([
-    function(){ ... },
-    function(){ ... }
-], callback);
-
-async.series([
-    function(){ ... },
-    function(){ ... }
-]);
-```
-
-There are many more functions available so take a look at the docs below for a
-full list. This module aims to be comprehensive, so if you feel anything is
-missing please create a GitHub issue for it.
-
-## Common Pitfalls
-
-### Binding a context to an iterator
-
-This section is really about bind, not about async. If you are wondering how to
-make async execute your iterators in a given context, or are confused as to why
-a method of another library isn't working as an iterator, study this example:
-
-```js
-// Here is a simple object with an (unnecessarily roundabout) squaring method
-var AsyncSquaringLibrary = {
-  squareExponent: 2,
-  square: function(number, callback){ 
-    var result = Math.pow(number, this.squareExponent);
-    setTimeout(function(){
-      callback(null, result);
-    }, 200);
-  }
-};
-
-async.map([1, 2, 3], AsyncSquaringLibrary.square, function(err, result){
-  // result is [NaN, NaN, NaN]
-  // This fails because the `this.squareExponent` expression in the square
-  // function is not evaluated in the context of AsyncSquaringLibrary, and is
-  // therefore undefined.
-});
-
-async.map([1, 2, 3], AsyncSquaringLibrary.square.bind(AsyncSquaringLibrary), function(err, result){
-  // result is [1, 4, 9]
-  // With the help of bind we can attach a context to the iterator before
-  // passing it to async. Now the square function will be executed in its 
-  // 'home' AsyncSquaringLibrary context and the value of `this.squareExponent`
-  // will be as expected.
-});
-```
-
-## Download
-
-The source is available for download from
-[GitHub](http://github.com/caolan/async).
-Alternatively, you can install using Node Package Manager (npm):
-
-    npm install async
-
-__Development:__ [async.js](https://github.com/caolan/async/raw/master/lib/async.js) - 29.6kb Uncompressed
-
-## In the Browser
-
-So far it's been tested in IE6, IE7, IE8, FF3.6 and Chrome 5. Usage:
-
-```html
-<script type="text/javascript" src="async.js"></script>
-<script type="text/javascript">
-
-    async.map(data, asyncProcess, function(err, results){
-        alert(results);
-    });
-
-</script>
-```
-
-## Documentation
-
-### Collections
-
-* [each](#each)
-* [map](#map)
-* [filter](#filter)
-* [reject](#reject)
-* [reduce](#reduce)
-* [detect](#detect)
-* [sortBy](#sortBy)
-* [some](#some)
-* [every](#every)
-* [concat](#concat)
-
-### Control Flow
-
-* [series](#series)
-* [parallel](#parallel)
-* [whilst](#whilst)
-* [doWhilst](#doWhilst)
-* [until](#until)
-* [doUntil](#doUntil)
-* [forever](#forever)
-* [waterfall](#waterfall)
-* [compose](#compose)
-* [applyEach](#applyEach)
-* [queue](#queue)
-* [cargo](#cargo)
-* [auto](#auto)
-* [iterator](#iterator)
-* [apply](#apply)
-* [nextTick](#nextTick)
-* [times](#times)
-* [timesSeries](#timesSeries)
-
-### Utils
-
-* [memoize](#memoize)
-* [unmemoize](#unmemoize)
-* [log](#log)
-* [dir](#dir)
-* [noConflict](#noConflict)
-
-
-## Collections
-
-<a name="forEach" />
-<a name="each" />
-### each(arr, iterator, callback)
-
-Applies an iterator function to each item in an array, in parallel.
-The iterator is called with an item from the list and a callback for when it
-has finished. If the iterator passes an error to this callback, the main
-callback for the each function is immediately called with the error.
-
-Note that since this function applies the iterator to each item in parallel,
-there is no guarantee that the iterator functions will complete in order.
-
-__Arguments__
-
-* arr - An array to iterate over.
-* iterator(item, callback) - A function to apply to each item in the array.
-  The iterator is passed a callback(err) which must be called once it has 
-  completed. If no error has occurred, the callback should be run without 
-  arguments or with an explicit null argument.
-* callback(err) - A callback which is called after all the iterator functions
-  have finished, or an error has occurred.
-
-__Example__
-
-```js
-// assuming openFiles is an array of file names and saveFile is a function
-// to save the modified contents of that file:
-
-async.each(openFiles, saveFile, function(err){
-    // if any of the saves produced an error, err would equal that error
-});
-```
-
----------------------------------------
-
-<a name="forEachSeries" />
-<a name="eachSeries" />
-### eachSeries(arr, iterator, callback)
-
-The same as each only the iterator is applied to each item in the array in
-series. The next iterator is only called once the current one has completed
-processing. This means the iterator functions will complete in order.
-
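The in-order guarantee described above can be sketched with plain Node-style callbacks. `eachSeriesSketch` is a hypothetical stand-in for illustration, not the library's actual implementation:

```javascript
// Minimal sketch of eachSeries semantics: process one item at a time,
// stopping early if the iterator reports an error.
function eachSeriesSketch(arr, iterator, callback) {
    var i = 0;
    (function next(err) {
        if (err || i === arr.length) return callback(err || null);
        iterator(arr[i++], next);
    })();
}

var order = [];
eachSeriesSketch([1, 2, 3], function (item, cb) {
    // shorter delays for later items, yet completion order stays 1, 2, 3
    setTimeout(function () { order.push(item); cb(null); }, 30 - item * 10);
}, function (err) {
    console.log(order); // → [ 1, 2, 3 ]
});
```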
-
----------------------------------------
-
-<a name="forEachLimit" />
-<a name="eachLimit" />
-### eachLimit(arr, limit, iterator, callback)
-
-The same as each only no more than "limit" iterators will be simultaneously 
-running at any time.
-
-Note that the items are not processed in batches, so there is no guarantee that
- the first "limit" iterator functions will complete before any others are 
-started.
-
-__Arguments__
-
-* arr - An array to iterate over.
-* limit - The maximum number of iterators to run at any time.
-* iterator(item, callback) - A function to apply to each item in the array.
-  The iterator is passed a callback(err) which must be called once it has 
-  completed. If no error has occurred, the callback should be run without 
-  arguments or with an explicit null argument.
-* callback(err) - A callback which is called after all the iterator functions
-  have finished, or an error has occurred.
-
-__Example__
-
-```js
-// Assume documents is an array of JSON objects and requestApi is a
-// function that interacts with a rate-limited REST api.
-
-async.eachLimit(documents, 20, requestApi, function(err){
-    // if any of the API requests produced an error, err would equal that error
-});
-```
-
----------------------------------------
-
-<a name="map" />
-### map(arr, iterator, callback)
-
-Produces a new array of values by mapping each value in the given array through
-the iterator function. The iterator is called with an item from the array and a
-callback for when it has finished processing. The callback takes 2 arguments, 
-an error and the transformed item from the array. If the iterator passes an
-error to this callback, the main callback for the map function is immediately
-called with the error.
-
-Note that since this function applies the iterator to each item in parallel,
-there is no guarantee that the iterator functions will complete in order;
-however, the results array will be in the same order as the original array.
-
-__Arguments__
-
-* arr - An array to iterate over.
-* iterator(item, callback) - A function to apply to each item in the array.
-  The iterator is passed a callback(err, transformed) which must be called once 
-  it has completed with an error (which can be null) and a transformed item.
-* callback(err, results) - A callback which is called after all the iterator
-  functions have finished, or an error has occurred. Results is an array of the
-  transformed items from the original array.
-
-__Example__
-
-```js
-async.map(['file1','file2','file3'], fs.stat, function(err, results){
-    // results is now an array of stats for each file
-});
-```
-
----------------------------------------
-
-<a name="mapSeries" />
-### mapSeries(arr, iterator, callback)
-
-The same as map only the iterator is applied to each item in the array in
-series. The next iterator is only called once the current one has completed
-processing. The results array will be in the same order as the original.
-
-
----------------------------------------
-
-<a name="mapLimit" />
-### mapLimit(arr, limit, iterator, callback)
-
-The same as map only no more than "limit" iterators will be simultaneously 
-running at any time.
-
-Note that the items are not processed in batches, so there is no guarantee that
- the first "limit" iterator functions will complete before any others are 
-started.
-
-__Arguments__
-
-* arr - An array to iterate over.
-* limit - The maximum number of iterators to run at any time.
-* iterator(item, callback) - A function to apply to each item in the array.
-  The iterator is passed a callback(err, transformed) which must be called once 
-  it has completed with an error (which can be null) and a transformed item.
-* callback(err, results) - A callback which is called after all the iterator
-  functions have finished, or an error has occurred. Results is an array of the
-  transformed items from the original array.
-
-__Example__
-
-```js
-async.mapLimit(['file1','file2','file3'], 1, fs.stat, function(err, results){
-    // results is now an array of stats for each file
-});
-```
-
----------------------------------------
-
-<a name="filter" />
-### filter(arr, iterator, callback)
-
-__Alias:__ select
-
-Returns a new array of all the values which pass an async truth test.
-_The callback for each iterator call only accepts a single argument of true or
-false; it does not accept an error argument first!_ This is in line with the
-way node libraries work with truth tests like fs.exists. This operation is
-performed in parallel, but the results array will be in the same order as the
-original.
-
-__Arguments__
-
-* arr - An array to iterate over.
-* iterator(item, callback) - A truth test to apply to each item in the array.
-  The iterator is passed a callback(truthValue) which must be called with a 
-  boolean argument once it has completed.
-* callback(results) - A callback which is called after all the iterator
-  functions have finished.
-
-__Example__
-
-```js
-async.filter(['file1','file2','file3'], fs.exists, function(results){
-    // results now equals an array of the existing files
-});
-```
-
----------------------------------------
-
-<a name="filterSeries" />
-### filterSeries(arr, iterator, callback)
-
-__alias:__ selectSeries
-
-The same as filter only the iterator is applied to each item in the array in
-series. The next iterator is only called once the current one has completed
-processing. The results array will be in the same order as the original.
-
----------------------------------------
-
-<a name="reject" />
-### reject(arr, iterator, callback)
-
-The opposite of filter. Removes values that pass an async truth test.
-
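A minimal sketch of this behaviour, using the same no-error truth-test convention as filter above (`rejectSketch` is a hypothetical helper, not the library's code):

```javascript
// Keep only the values whose async truth test comes back false,
// preserving the original array order.
function rejectSketch(arr, iterator, callback) {
    var results = [];
    var remaining = arr.length;
    if (remaining === 0) return callback([]);
    arr.forEach(function (item, index) {
        iterator(item, function (truthValue) {
            if (!truthValue) results[index] = item;
            if (--remaining === 0) {
                // drop the holes left by rejected items
                callback(results.filter(function () { return true; }));
            }
        });
    });
}

rejectSketch([1, 2, 3, 4], function (n, cb) {
    process.nextTick(function () { cb(n % 2 === 0); });
}, function (odds) {
    console.log(odds); // → [ 1, 3 ]
});
```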
----------------------------------------
-
-<a name="rejectSeries" />
-### rejectSeries(arr, iterator, callback)
-
-The same as reject, only the iterator is applied to each item in the array
-in series.
-
-
----------------------------------------
-
-<a name="reduce" />
-### reduce(arr, memo, iterator, callback)
-
-__aliases:__ inject, foldl
-
-Reduces a list of values into a single value using an async iterator to return
-each successive step. Memo is the initial state of the reduction. This
-function only operates in series. For performance reasons, it may make sense to
-split a call to this function into a parallel map, then use the normal
-Array.prototype.reduce on the results. This function is for situations where
-each step in the reduction needs to be async; if you can get the data before
-reducing it, then it's probably a good idea to do so.
-
-__Arguments__
-
-* arr - An array to iterate over.
-* memo - The initial state of the reduction.
-* iterator(memo, item, callback) - A function applied to each item in the
-  array to produce the next step in the reduction. The iterator is passed a
-  callback(err, reduction) which accepts an optional error as its first 
-  argument, and the state of the reduction as the second. If an error is 
-  passed to the callback, the reduction is stopped and the main callback is 
-  immediately called with the error.
-* callback(err, result) - A callback which is called after all the iterator
-  functions have finished. Result is the reduced value.
-
-__Example__
-
-```js
-async.reduce([1,2,3], 0, function(memo, item, callback){
-    // pointless async:
-    process.nextTick(function(){
-        callback(null, memo + item)
-    });
-}, function(err, result){
-    // result is now equal to the last value of memo, which is 6
-});
-```
-
----------------------------------------
-
-<a name="reduceRight" />
-### reduceRight(arr, memo, iterator, callback)
-
-__Alias:__ foldr
-
-Same as reduce, only operates on the items in the array in reverse order.
-
-
----------------------------------------
-
-<a name="detect" />
-### detect(arr, iterator, callback)
-
-Returns the first value in a list that passes an async truth test. The
-iterator is applied in parallel, meaning the first iterator to return true will
-fire the detect callback with that result. That means the result might not be
-the first item in the original array (in terms of order) that passes the test.
-
-If order within the original array is important then look at detectSeries.
-
-__Arguments__
-
-* arr - An array to iterate over.
-* iterator(item, callback) - A truth test to apply to each item in the array.
-  The iterator is passed a callback(truthValue) which must be called with a 
-  boolean argument once it has completed.
-* callback(result) - A callback which is called as soon as any iterator returns
-  true, or after all the iterator functions have finished. Result will be
-  the first item in the array that passes the truth test (iterator) or the
-  value undefined if none passed.
-
-__Example__
-
-```js
-async.detect(['file1','file2','file3'], fs.exists, function(result){
-    // result now equals the first file in the list that exists
-});
-```
-
----------------------------------------
-
-<a name="detectSeries" />
-### detectSeries(arr, iterator, callback)
-
-The same as detect, only the iterator is applied to each item in the array
-in series. This means the result is always the first in the original array (in
-terms of array order) that passes the truth test.
-
-
----------------------------------------
-
-<a name="sortBy" />
-### sortBy(arr, iterator, callback)
-
-Sorts a list by the results of running each value through an async iterator.
-
-__Arguments__
-
-* arr - An array to iterate over.
-* iterator(item, callback) - A function to apply to each item in the array.
-  The iterator is passed a callback(err, sortValue) which must be called once it
-  has completed with an error (which can be null) and a value to use as the sort
-  criteria.
-* callback(err, results) - A callback which is called after all the iterator
-  functions have finished, or an error has occurred. Results is the items from
-  the original array sorted by the values returned by the iterator calls.
-
-__Example__
-
-```js
-async.sortBy(['file1','file2','file3'], function(file, callback){
-    fs.stat(file, function(err, stats){
-        callback(err, stats.mtime);
-    });
-}, function(err, results){
-    // results is now the original array of files sorted by
-    // modified date
-});
-```
-
----------------------------------------
-
-<a name="some" />
-### some(arr, iterator, callback)
-
-__Alias:__ any
-
-Returns true if at least one element in the array satisfies an async test.
-_The callback for each iterator call only accepts a single argument of true or
-false; it does not accept an error argument first!_ This is in line with the
-way node libraries work with truth tests like fs.exists. Once any iterator
-call returns true, the main callback is immediately called.
-
-__Arguments__
-
-* arr - An array to iterate over.
-* iterator(item, callback) - A truth test to apply to each item in the array.
-  The iterator is passed a callback(truthValue) which must be called with a 
-  boolean argument once it has completed.
-* callback(result) - A callback which is called as soon as any iterator returns
-  true, or after all the iterator functions have finished. Result will be
-  either true or false depending on the values of the async tests.
-
-__Example__
-
-```js
-async.some(['file1','file2','file3'], fs.exists, function(result){
-    // if result is true then at least one of the files exists
-});
-```
-
----------------------------------------
-
-<a name="every" />
-### every(arr, iterator, callback)
-
-__Alias:__ all
-
-Returns true if every element in the array satisfies an async test.
-_The callback for each iterator call only accepts a single argument of true or
-false; it does not accept an error argument first!_ This is in line with the
-way node libraries work with truth tests like fs.exists.
-
-__Arguments__
-
-* arr - An array to iterate over.
-* iterator(item, callback) - A truth test to apply to each item in the array.
-  The iterator is passed a callback(truthValue) which must be called with a 
-  boolean argument once it has completed.
-* callback(result) - A callback which is called after all the iterator
-  functions have finished. Result will be either true or false depending on
-  the values of the async tests.
-
-__Example__
-
-```js
-async.every(['file1','file2','file3'], fs.exists, function(result){
-    // if result is true then every file exists
-});
-```
-
----------------------------------------
-
-<a name="concat" />
-### concat(arr, iterator, callback)
-
-Applies an iterator to each item in a list, concatenating the results. Returns
-the concatenated list. The iterators are called in parallel, and the results
-are concatenated as they return. There is no guarantee that the results array
-will be returned in the original order of the items in the input array.
-
-__Arguments__
-
-* arr - An array to iterate over
-* iterator(item, callback) - A function to apply to each item in the array.
-  The iterator is passed a callback(err, results) which must be called once it 
-  has completed with an error (which can be null) and an array of results.
-* callback(err, results) - A callback which is called after all the iterator
-  functions have finished, or an error has occurred. Results is an array containing
-  the concatenated results of the iterator function.
-
-__Example__
-
-```js
-async.concat(['dir1','dir2','dir3'], fs.readdir, function(err, files){
-    // files is now a list of filenames that exist in the 3 directories
-});
-```
-
----------------------------------------
-
-<a name="concatSeries" />
-### concatSeries(arr, iterator, callback)
-
-Same as async.concat, but executes in series instead of parallel.
-
-
-## Control Flow
-
-<a name="series" />
-### series(tasks, [callback])
-
-Run an array of functions in series, each one running once the previous
-function has completed. If any function in the series passes an error to its
-callback, no more functions are run and the callback for the series is
-immediately called with the value of the error. Once the tasks have completed,
-the results are passed to the final callback as an array.
-
-It is also possible to use an object instead of an array. Each property will be
-run as a function and the results will be passed to the final callback as an object
-instead of an array. This can be a more readable way of handling results from
-async.series.
-
-
-__Arguments__
-
-* tasks - An array or object containing functions to run, each function is passed
-  a callback(err, result) it must call on completion with an error (which can
-  be null) and an optional result value.
-* callback(err, results) - An optional callback to run once all the functions
-  have completed. This function gets a results array (or object) containing all 
-  the result arguments passed to the task callbacks.
-
-__Example__
-
-```js
-async.series([
-    function(callback){
-        // do some stuff ...
-        callback(null, 'one');
-    },
-    function(callback){
-        // do some more stuff ...
-        callback(null, 'two');
-    }
-],
-// optional callback
-function(err, results){
-    // results is now equal to ['one', 'two']
-});
-
-
-// an example using an object instead of an array
-async.series({
-    one: function(callback){
-        setTimeout(function(){
-            callback(null, 1);
-        }, 200);
-    },
-    two: function(callback){
-        setTimeout(function(){
-            callback(null, 2);
-        }, 100);
-    }
-},
-function(err, results) {
-    // results is now equal to: {one: 1, two: 2}
-});
-```
-
----------------------------------------
-
-<a name="parallel" />
-### parallel(tasks, [callback])
-
-Run an array of functions in parallel, without waiting until the previous
-function has completed. If any of the functions passes an error to its
-callback, the main callback is immediately called with the value of the error.
-Once the tasks have completed, the results are passed to the final callback as an
-array.
-
-It is also possible to use an object instead of an array. Each property will be
-run as a function and the results will be passed to the final callback as an object
-instead of an array. This can be a more readable way of handling results from
-async.parallel.
-
-
-__Arguments__
-
-* tasks - An array or object containing functions to run, each function is passed 
-  a callback(err, result) it must call on completion with an error (which can
-  be null) and an optional result value.
-* callback(err, results) - An optional callback to run once all the functions
-  have completed. This function gets a results array (or object) containing all 
-  the result arguments passed to the task callbacks.
-
-__Example__
-
-```js
-async.parallel([
-    function(callback){
-        setTimeout(function(){
-            callback(null, 'one');
-        }, 200);
-    },
-    function(callback){
-        setTimeout(function(){
-            callback(null, 'two');
-        }, 100);
-    }
-],
-// optional callback
-function(err, results){
-    // the results array will equal ['one','two'] even though
-    // the second function had a shorter timeout.
-});
-
-
-// an example using an object instead of an array
-async.parallel({
-    one: function(callback){
-        setTimeout(function(){
-            callback(null, 1);
-        }, 200);
-    },
-    two: function(callback){
-        setTimeout(function(){
-            callback(null, 2);
-        }, 100);
-    }
-},
-function(err, results) {
-    // results is now equal to: {one: 1, two: 2}
-});
-```
-
----------------------------------------
-
-<a name="parallelLimit" />
-### parallelLimit(tasks, limit, [callback])
-
-The same as parallel only the tasks are executed in parallel with a maximum of "limit" 
-tasks executing at any time.
-
-Note that the tasks are not executed in batches, so there is no guarantee that 
-the first "limit" tasks will complete before any others are started.
-
-__Arguments__
-
-* tasks - An array or object containing functions to run, each function is passed 
-  a callback(err, result) it must call on completion with an error (which can
-  be null) and an optional result value.
-* limit - The maximum number of tasks to run at any time.
-* callback(err, results) - An optional callback to run once all the functions
-  have completed. This function gets a results array (or object) containing all 
-  the result arguments passed to the task callbacks.
-
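The scheduling described above can be sketched as follows; `parallelLimitSketch` is a hypothetical illustration, not the library's implementation. At most `limit` tasks are in flight, and each completion launches the next pending task:

```javascript
// Minimal sketch of parallelLimit semantics: results keep task order.
function parallelLimitSketch(tasks, limit, callback) {
    var results = [];
    var started = 0, finished = 0, failed = false;
    function launch() {
        while (started < tasks.length && started - finished < limit) {
            (function (index) {
                tasks[index](function (err, result) {
                    if (failed) return;
                    if (err) { failed = true; return callback(err); }
                    results[index] = result;
                    finished++;
                    if (finished === tasks.length) return callback(null, results);
                    launch(); // a slot freed up, start the next task
                });
            })(started++);
        }
    }
    launch();
}

parallelLimitSketch([
    function (cb) { setTimeout(function () { cb(null, 'one'); }, 20); },
    function (cb) { setTimeout(function () { cb(null, 'two'); }, 10); },
    function (cb) { setTimeout(function () { cb(null, 'three'); }, 5); }
], 2, function (err, results) {
    console.log(results); // → [ 'one', 'two', 'three' ]
});
```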
----------------------------------------
-
-<a name="whilst" />
-### whilst(test, fn, callback)
-
-Repeatedly call fn while test returns true. Calls the callback when stopped
-or when an error occurs.
-
-__Arguments__
-
-* test() - synchronous truth test to perform before each execution of fn.
-* fn(callback) - A function to call each time the test passes. The function is
-  passed a callback(err) which must be called once it has completed with an 
-  optional error argument.
-* callback(err) - A callback which is called after the test fails and repeated
-  execution of fn has stopped.
-
-__Example__
-
-```js
-var count = 0;
-
-async.whilst(
-    function () { return count < 5; },
-    function (callback) {
-        count++;
-        setTimeout(callback, 1000);
-    },
-    function (err) {
-        // 5 seconds have passed
-    }
-);
-```
-
----------------------------------------
-
-<a name="doWhilst" />
-### doWhilst(fn, test, callback)
-
-The post check version of whilst. To reflect the difference in the order of operations `test` and `fn` arguments are switched. `doWhilst` is to `whilst` as `do while` is to `while` in plain JavaScript.
-
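Under the calling convention above, the post-check loop can be sketched like this (`doWhilstSketch` is a hypothetical helper, not the library's implementation):

```javascript
// Run fn once, then keep repeating while test() stays true,
// mirroring JavaScript's do { ... } while (...) loop.
function doWhilstSketch(fn, test, callback) {
    fn(function (err) {
        if (err) return callback(err);
        if (test()) return doWhilstSketch(fn, test, callback);
        callback(null);
    });
}

var count = 0;
doWhilstSketch(function (cb) {
    count++;
    setTimeout(cb, 10);
}, function () { return count < 3; }, function (err) {
    console.log(count); // → 3 (fn always runs at least once)
});
```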
----------------------------------------
-
-<a name="until" />
-### until(test, fn, callback)
-
-Repeatedly call fn until test returns true. Calls the callback when stopped
-or when an error occurs.
-
-The inverse of async.whilst.
-
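A minimal sketch of the inverted test, to compare with the whilst example above (`untilSketch` is a hypothetical helper, not the library's code):

```javascript
// Repeat fn until test() returns true; the pre-check mirror of whilst.
function untilSketch(test, fn, callback) {
    if (test()) return callback(null);
    fn(function (err) {
        if (err) return callback(err);
        untilSketch(test, fn, callback);
    });
}

var count = 0;
untilSketch(
    function () { return count >= 5; },
    function (cb) { count++; setTimeout(cb, 10); },
    function (err) {
        console.log(count); // → 5
    }
);
```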
----------------------------------------
-
-<a name="doUntil" />
-### doUntil(fn, test, callback)
-
-Like doWhilst except the test is inverted. Note the argument ordering differs from `until`.
-
----------------------------------------
-
-<a name="forever" />
-### forever(fn, callback)
-
-Calls the asynchronous function 'fn' repeatedly, in series, indefinitely.
-If an error is passed to fn's callback then 'callback' is called with the
-error; otherwise it will never be called.
-
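The only way such a loop terminates is through an error, which a minimal sketch makes clear (`foreverSketch` is a hypothetical illustration, not the library's implementation):

```javascript
// Repeat fn indefinitely; the callback fires only when fn reports an error.
function foreverSketch(fn, callback) {
    fn(function (err) {
        if (err) return callback(err);
        foreverSketch(fn, callback);
    });
}

var attempts = 0;
foreverSketch(function (next) {
    attempts++;
    // error out after five runs so the example actually terminates
    next(attempts === 5 ? new Error('done') : null);
}, function (err) {
    console.log(attempts, err.message); // → 5 done
});
```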
----------------------------------------
-
-<a name="waterfall" />
-### waterfall(tasks, [callback])
-
-Runs an array of functions in series, each passing their results to the next in
-the array. However, if any of the functions pass an error to the callback, the
-next function is not executed and the main callback is immediately called with
-the error.
-
-__Arguments__
-
-* tasks - An array of functions to run, each function is passed a 
-  callback(err, result1, result2, ...) it must call on completion. The first
-  argument is an error (which can be null) and any further arguments will be 
-  passed as arguments in order to the next task.
-* callback(err, [results]) - An optional callback to run once all the functions
-  have completed. This will be passed the results of the last task's callback.
-
-
-
-__Example__
-
-```js
-async.waterfall([
-    function(callback){
-        callback(null, 'one', 'two');
-    },
-    function(arg1, arg2, callback){
-        callback(null, 'three');
-    },
-    function(arg1, callback){
-        // arg1 now equals 'three'
-        callback(null, 'done');
-    }
-], function (err, result) {
-   // result now equals 'done'    
-});
-```
-
----------------------------------------
-<a name="compose" />
-### compose(fn1, fn2...)
-
-Creates a function which is a composition of the passed asynchronous
-functions. Each function consumes the return value of the function that
-follows. Composing functions f(), g() and h() would produce the result of
-f(g(h())), only this version uses callbacks to obtain the return values.
-
-Each function is executed with the `this` binding of the composed function.
-
-__Arguments__
-
-* functions... - the asynchronous functions to compose
-
-
-__Example__
-
-```js
-function add1(n, callback) {
-    setTimeout(function () {
-        callback(null, n + 1);
-    }, 10);
-}
-
-function mul3(n, callback) {
-    setTimeout(function () {
-        callback(null, n * 3);
-    }, 10);
-}
-
-var add1mul3 = async.compose(mul3, add1);
-
-add1mul3(4, function (err, result) {
-   // result now equals 15
-});
-```
-
----------------------------------------
-<a name="applyEach" />
-### applyEach(fns, args..., callback)
-
-Applies the provided arguments to each function in the array, calling the
-callback after all functions have completed. If you only provide the first
-argument then it will return a function which lets you pass in the
-arguments as if it were a single function call.
-
-__Arguments__
-
-* fns - the asynchronous functions to all call with the same arguments
-* args... - any number of separate arguments to pass to the function
-* callback - the final argument should be the callback, called when all
-  functions have completed processing
-
-
-__Example__
-
-```js
-async.applyEach([enableSearch, updateSchema], 'bucket', callback);
-
-// partial application example:
-async.each(
-    buckets,
-    async.applyEach([enableSearch, updateSchema]),
-    callback
-);
-```
-
----------------------------------------
-
-<a name="applyEachSeries" />
-### applyEachSeries(arr, iterator, callback)
-
-The same as applyEach only the functions are applied in series.
-
----------------------------------------
-
-<a name="queue" />
-### queue(worker, concurrency)
-
-Creates a queue object with the specified concurrency. Tasks added to the
-queue will be processed in parallel (up to the concurrency limit). If all
-workers are in progress, the task is queued until one is available. Once
-a worker has completed a task, the task's callback is called.
-
-__Arguments__
-
-* worker(task, callback) - An asynchronous function for processing a queued
-  task, which must call its callback(err) when finished, passing an optional
-  error as its argument.
-* concurrency - An integer for determining how many worker functions should be
-  run in parallel.
-
-__Queue objects__
-
-The queue object returned by this function has the following properties and
-methods:
-
-* length() - a function returning the number of items waiting to be processed.
-* concurrency - an integer for determining how many worker functions should be
-  run in parallel. This property can be changed after a queue is created to
-  alter the concurrency on-the-fly.
-* push(task, [callback]) - add a new task to the queue; the callback is called
-  once the worker has finished processing the task.
-  Instead of a single task, an array of tasks can be submitted; the callback is
-  used for every task in the list.
-* unshift(task, [callback]) - add a new task to the front of the queue.
-* saturated - a callback that is called when the queue length hits the concurrency and further tasks will be queued
-* empty - a callback that is called when the last item from the queue is given to a worker
-* drain - a callback that is called when the last item from the queue has returned from the worker
-
-__Example__
-
-```js
-// create a queue object with concurrency 2
-
-var q = async.queue(function (task, callback) {
-    console.log('hello ' + task.name);
-    callback();
-}, 2);
-
-
-// assign a callback
-q.drain = function() {
-    console.log('all items have been processed');
-};
-
-// add some items to the queue
-
-q.push({name: 'foo'}, function (err) {
-    console.log('finished processing foo');
-});
-q.push({name: 'bar'}, function (err) {
-    console.log('finished processing bar');
-});
-
-// add some items to the queue (batch-wise)
-
-q.push([{name: 'baz'},{name: 'bay'},{name: 'bax'}], function (err) {
-    console.log('finished processing item');
-});
-
-// add some items to the front of the queue
-
-q.unshift({name: 'bar'}, function (err) {
-    console.log('finished processing bar');
-});
-```
-
----------------------------------------
-
-<a name="cargo" />
-### cargo(worker, [payload])
-
-Creates a cargo object with the specified payload. Tasks added to the
-cargo will be processed altogether (up to the payload limit). If the
-worker is in progress, the task is queued until it is available. Once
-the worker has completed some tasks, each callback of those tasks is called.
-
-__Arguments__
-
-* worker(tasks, callback) - An asynchronous function for processing an array of
-  queued tasks, which must call its callback(err) when finished, passing an
-  optional error as its argument.
-* payload - An optional integer for determining how many tasks should be
-  processed per round; if omitted, the default is unlimited.
-
-__Cargo objects__
-
-The cargo object returned by this function has the following properties and
-methods:
-
-* length() - a function returning the number of items waiting to be processed.
-* payload - an integer for determining how many tasks should be
-  processed per round. This property can be changed after a cargo is created to
-  alter the payload on-the-fly.
-* push(task, [callback]) - add a new task to the queue; the callback is called
-  once the worker has finished processing the task.
-  Instead of a single task, an array of tasks can be submitted; the callback is
-  used for every task in the list.
-* saturated - a callback that is called when the queue length hits the payload limit and further tasks will be queued
-* empty - a callback that is called when the last item from the queue is given to the worker
-* drain - a callback that is called when the last item from the queue has returned from the worker
-
-__Example__
-
-```js
-// create a cargo object with payload 2
-
-var cargo = async.cargo(function (tasks, callback) {
-    for(var i=0; i<tasks.length; i++){
-      console.log('hello ' + tasks[i].name);
-    }
-    callback();
-}, 2);
-
-
-// add some items
-
-cargo.push({name: 'foo'}, function (err) {
-    console.log('finished processing foo');
-});
-cargo.push({name: 'bar'}, function (err) {
-    console.log('finished processing bar');
-});
-cargo.push({name: 'baz'}, function (err) {
-    console.log('finished processing baz');
-});
-```
-
----------------------------------------
-
-<a name="auto" />
-### auto(tasks, [callback])
-
-Determines the best order for running functions based on their requirements.
-Each function can optionally depend on other functions being completed first,
-and each function is run as soon as its requirements are satisfied. If any of
-the functions pass an error to their callback, that function will not complete
-(so any other functions depending on it will not run) and the main callback
-will be called immediately with the error. Functions also receive an object
-containing the results of functions which have completed so far.
-
-Note, all functions are called with a results object as a second argument, 
-so it is unsafe to pass functions in the tasks object which cannot handle the
-extra argument. For example, this snippet of code:
-
-```js
-async.auto({
-  readData: async.apply(fs.readFile, 'data.txt', 'utf-8')
-}, callback);
-```
-
-will have the effect of calling readFile with the results object as the last
-argument, which will fail:
-
-```js
-fs.readFile('data.txt', 'utf-8', cb, {});
-```
-
-Instead, wrap the call to readFile in a function which does not forward the 
-results object:
-
-```js
-async.auto({
-  readData: function(cb, results){
-    fs.readFile('data.txt', 'utf-8', cb);
-  }
-}, callback);
-```
-
-__Arguments__
-
-* tasks - An object literal containing named functions or an array of
-  requirements, with the function itself the last item in the array. The key
-  used for each function or array is used when specifying requirements. The 
-  function receives two arguments: (1) a callback(err, result) which must be 
-  called when finished, passing an error (which can be null) and the result of 
-  the function's execution, and (2) a results object, containing the results of
-  the previously executed functions.
-* callback(err, results) - An optional callback which is called when all the
-  tasks have been completed. The callback will receive an error as an argument
-  if any tasks pass an error to their callback. Results will always be passed
-  but if an error occurred, no other tasks will be performed, and the results
-  object will only contain partial results.
-  
-
-__Example__
-
-```js
-async.auto({
-    get_data: function(callback){
-        // async code to get some data
-    },
-    make_folder: function(callback){
-        // async code to create a directory to store a file in
-        // this is run at the same time as getting the data
-    },
-    write_file: ['get_data', 'make_folder', function(callback){
-        // once there is some data and the directory exists,
-        // write the data to a file in the directory
-        callback(null, filename);
-    }],
-    email_link: ['write_file', function(callback, results){
-        // once the file is written let's email a link to it...
-        // results.write_file contains the filename returned by write_file.
-    }]
-});
-```
-
-This is a fairly trivial example, but to do this using the basic parallel and
-series functions would look like this:
-
-```js
-async.parallel([
-    function(callback){
-        // async code to get some data
-    },
-    function(callback){
-        // async code to create a directory to store a file in
-        // this is run at the same time as getting the data
-    }
-],
-function(err, results){
-    async.series([
-        function(callback){
-            // once there is some data and the directory exists,
-            // write the data to a file in the directory
-        },
-        function(callback){
-            // once the file is written let's email a link to it...
-        }
-    ]);
-});
-```
-
-For a complicated series of async tasks using the auto function makes adding
-new tasks much easier and makes the code more readable.
-
-
----------------------------------------
-
-<a name="iterator" />
-### iterator(tasks)
-
-Creates an iterator function which calls the next function in the array,
-returning a continuation to call the next one after that. It's also possible to
-'peek' the next iterator by doing iterator.next().
-
-This function is used internally by the async module but can be useful when
-you want to manually control the flow of functions in series.
-
-__Arguments__
-
-* tasks - An array of functions to run.
-
-__Example__
-
-```js
-var iterator = async.iterator([
-    function(){ sys.p('one'); },
-    function(){ sys.p('two'); },
-    function(){ sys.p('three'); }
-]);
-
-node> var iterator2 = iterator();
-'one'
-node> var iterator3 = iterator2();
-'two'
-node> iterator3();
-'three'
-node> var nextfn = iterator2.next();
-node> nextfn();
-'three'
-```
-
----------------------------------------
-
-<a name="apply" />
-### apply(function, arguments...)
-
-Creates a continuation function with some arguments already applied, a useful
-shorthand when combined with other control flow functions. Any arguments
-passed to the returned function are added to the arguments originally passed
-to apply.
-
-__Arguments__
-
-* function - The function you want to eventually apply all arguments to.
-* arguments... - Any number of arguments to automatically apply when the
-  continuation is called.
-
-__Example__
-
-```js
-// using apply
-
-async.parallel([
-    async.apply(fs.writeFile, 'testfile1', 'test1'),
-    async.apply(fs.writeFile, 'testfile2', 'test2'),
-]);
-
-
-// the same process without using apply
-
-async.parallel([
-    function(callback){
-        fs.writeFile('testfile1', 'test1', callback);
-    },
-    function(callback){
-        fs.writeFile('testfile2', 'test2', callback);
-    }
-]);
-```
-
-It's possible to pass any number of additional arguments when calling the
-continuation:
-
-```js
-node> var fn = async.apply(sys.puts, 'one');
-node> fn('two', 'three');
-one
-two
-three
-```
-
----------------------------------------
-
-<a name="nextTick" />
-### nextTick(callback)
-
-Calls the callback on a later loop around the event loop. In node.js this just
-calls process.nextTick, in the browser it falls back to setImmediate(callback)
-if available, otherwise setTimeout(callback, 0), which means other higher priority
-events may precede the execution of the callback.
-
-This is used internally for browser-compatibility purposes.
-
-__Arguments__
-
-* callback - The function to call on a later loop around the event loop.
-
-__Example__
-
-```js
-var call_order = [];
-async.nextTick(function(){
-    call_order.push('two');
-    // call_order now equals ['one','two']
-});
-call_order.push('one');
-```
-
-<a name="times" />
-### times(n, iterator, callback)
-
-Calls the iterator function n times, and accumulates results in the same
-manner you would use with async.map.
-
-__Arguments__
-
-* n - The number of times to run the function.
-* iterator - The function to call n times.
-* callback - A callback(err, results) which is called after all the iterator
-  calls have completed, as in async.map.
-
-__Example__
-
-```js
-// Pretend this is some complicated async factory
-var createUser = function(id, callback) {
-  callback(null, {
-    id: 'user' + id
-  })
-}
-// generate 5 users
-async.times(5, function(n, next){
-    createUser(n, function(err, user) {
-      next(err, user)
-    })
-}, function(err, users) {
-  // we should now have 5 users
-});
-```
-
-<a name="timesSeries" />
-### timesSeries(n, iterator, callback)
-
-The same as times, only the iterator is applied in series. The next iterator
-call is only made once the current one has completed processing. The results
-array will be in the same order as the calls were made.
-
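-A simplified, self-contained sketch of this serial behaviour (not the
-library's implementation) might look like:
-
-```js
-// Run the iterator n times, one at a time, collecting results in
-// index order. Simplified sketch, not the library code.
-function timesSeriesSketch(n, iterator, callback) {
-    var results = [];
-    (function next(i) {
-        if (i === n) return callback(null, results);
-        iterator(i, function (err, result) {
-            if (err) return callback(err);
-            results[i] = result;
-            next(i + 1); // only start the next call once this one is done
-        });
-    })(0);
-}
-
-var ids;
-timesSeriesSketch(3, function (n, cb) {
-    cb(null, 'user' + n); // pretend this is an async factory
-}, function (err, users) {
-    ids = users; // ids now equals ['user0', 'user1', 'user2']
-});
-```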
-
-## Utils
-
-<a name="memoize" />
-### memoize(fn, [hasher])
-
-Caches the results of an async function. When creating a hash to store function
-results against, the callback is omitted from the hash and an optional hash
-function can be used.
-
-The cache of results is exposed as the `memo` property of the function returned
-by `memoize`.
-
-__Arguments__
-
-* fn - the function you want to proxy and cache results from.
-* hasher - an optional function for generating a custom hash for storing
-  results, it has all the arguments applied to it apart from the callback, and
-  must be synchronous.
-
-__Example__
-
-```js
-var slow_fn = function (name, callback) {
-    // do something
-    callback(null, result);
-};
-var fn = async.memoize(slow_fn);
-
-// fn can now be used as if it were slow_fn
-fn('some name', function () {
-    // callback
-});
-```
-
-<a name="unmemoize" />
-### unmemoize(fn)
-
-Undoes a memoized function, reverting it to the original, unmemoized
-form. Comes in handy in tests.
-
-__Arguments__
-
-* fn - the memoized function
-
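-A simplified sketch of the memoize/unmemoize pairing (not the library's
-implementation; the cache key handling and names here are illustrative)
-might look like:
-
-```js
-// memoize keeps a reference to the original function so that
-// unmemoize can hand it back. Simplified sketch, not the library code.
-function memoizeSketch(fn) {
-    var memo = {};
-    var memoized = function (key, callback) {
-        if (key in memo) return callback(null, memo[key]); // cache hit
-        fn(key, function (err, result) {
-            memo[key] = result;
-            callback(err, result);
-        });
-    };
-    memoized.unmemoized = fn; // remember the original
-    return memoized;
-}
-
-function unmemoizeSketch(fn) {
-    return fn.unmemoized || fn;
-}
-
-var calls = 0;
-function lookup(name, callback) {
-    calls += 1; // count real invocations
-    callback(null, 'hello ' + name);
-}
-
-var cached = memoizeSketch(lookup);
-cached('world', function () {});
-cached('world', function () {}); // served from cache; calls is still 1
-
-var plain = unmemoizeSketch(cached);
-plain('world', function () {}); // calls lookup directly; calls is now 2
-```
-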
-<a name="log" />
-### log(function, arguments)
-
-Logs the result of an async function to the console. Only works in node.js or
-in browsers that support console.log and console.error (such as Firefox and Chrome).
-If multiple arguments are returned from the async function, console.log is
-called on each argument in order.
-
-__Arguments__
-
-* function - The function you want to eventually apply all arguments to.
-* arguments... - Any number of arguments to apply to the function.
-
-__Example__
-
-```js
-var hello = function(name, callback){
-    setTimeout(function(){
-        callback(null, 'hello ' + name);
-    }, 1000);
-};
-```
-```js
-node> async.log(hello, 'world');
-'hello world'
-```
-
----------------------------------------
-
-<a name="dir" />
-### dir(function, arguments)
-
-Logs the result of an async function to the console using console.dir to
-display the properties of the resulting object. Only works in node.js or
-in browsers that support console.dir and console.error (such as Firefox and Chrome).
-If multiple arguments are returned from the async function, console.dir is
-called on each argument in order.
-
-__Arguments__
-
-* function - The function you want to eventually apply all arguments to.
-* arguments... - Any number of arguments to apply to the function.
-
-__Example__
-
-```js
-var hello = function(name, callback){
-    setTimeout(function(){
-        callback(null, {hello: name});
-    }, 1000);
-};
-```
-```js
-node> async.dir(hello, 'world');
-{hello: 'world'}
-```
-
----------------------------------------
-
-<a name="noConflict" />
-### noConflict()
-
-Restores the global async variable to the value it had before this module
-loaded, returning a reference to the async object.
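-
-For example, in a browser where another script had already defined a global
-async variable before this module loaded, the following sketch (using a plain
-object in place of window, purely for illustration) shows the effect:
-
-```js
-var root = {};                    // stand-in for the browser's window object
-root.async = 'someOtherLib';      // a pre-existing value of the global
-
-var previous_async = root.async;  // what the module remembers when it loads
-root.async = {                    // the module then takes over the global
-    noConflict: function () {
-        root.async = previous_async; // restore the original value
-        return this;                 // and return this module's object
-    }
-};
-
-var asyncLib = root.async.noConflict();
-// root.async is 'someOtherLib' again; asyncLib still refers to the module
-```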
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/component.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,11 +0,0 @@
-{
-  "name": "async",
-  "repo": "caolan/async",
-  "description": "Higher-order functions and common patterns for asynchronous code",
-  "version": "0.1.23",
-  "keywords": [],
-  "dependencies": {},
-  "development": {},
-  "main": "lib/async.js",
-  "scripts": [ "lib/async.js" ]
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/lib/async.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,955 +0,0 @@
-/*global setImmediate: false, setTimeout: false, console: false */
-(function () {
-
-    var async = {};
-
-    // global on the server, window in the browser
-    var root, previous_async;
-
-    root = this;
-    if (root != null) {
-      previous_async = root.async;
-    }
-
-    async.noConflict = function () {
-        root.async = previous_async;
-        return async;
-    };
-
-    function only_once(fn) {
-        var called = false;
-        return function() {
-            if (called) throw new Error("Callback was already called.");
-            called = true;
-            fn.apply(root, arguments);
-        }
-    }
-
-    //// cross-browser compatiblity functions ////
-
-    var _each = function (arr, iterator) {
-        if (arr.forEach) {
-            return arr.forEach(iterator);
-        }
-        for (var i = 0; i < arr.length; i += 1) {
-            iterator(arr[i], i, arr);
-        }
-    };
-
-    var _map = function (arr, iterator) {
-        if (arr.map) {
-            return arr.map(iterator);
-        }
-        var results = [];
-        _each(arr, function (x, i, a) {
-            results.push(iterator(x, i, a));
-        });
-        return results;
-    };
-
-    var _reduce = function (arr, iterator, memo) {
-        if (arr.reduce) {
-            return arr.reduce(iterator, memo);
-        }
-        _each(arr, function (x, i, a) {
-            memo = iterator(memo, x, i, a);
-        });
-        return memo;
-    };
-
-    var _keys = function (obj) {
-        if (Object.keys) {
-            return Object.keys(obj);
-        }
-        var keys = [];
-        for (var k in obj) {
-            if (obj.hasOwnProperty(k)) {
-                keys.push(k);
-            }
-        }
-        return keys;
-    };
-
-    //// exported async module functions ////
-
-    //// nextTick implementation with browser-compatible fallback ////
-    if (typeof process === 'undefined' || !(process.nextTick)) {
-        if (typeof setImmediate === 'function') {
-            async.nextTick = function (fn) {
-                // not a direct alias for IE10 compatibility
-                setImmediate(fn);
-            };
-            async.setImmediate = async.nextTick;
-        }
-        else {
-            async.nextTick = function (fn) {
-                setTimeout(fn, 0);
-            };
-            async.setImmediate = async.nextTick;
-        }
-    }
-    else {
-        async.nextTick = process.nextTick;
-        if (typeof setImmediate !== 'undefined') {
-            async.setImmediate = setImmediate;
-        }
-        else {
-            async.setImmediate = async.nextTick;
-        }
-    }
-
-    async.each = function (arr, iterator, callback) {
-        callback = callback || function () {};
-        if (!arr.length) {
-            return callback();
-        }
-        var completed = 0;
-        _each(arr, function (x) {
-            iterator(x, only_once(function (err) {
-                if (err) {
-                    callback(err);
-                    callback = function () {};
-                }
-                else {
-                    completed += 1;
-                    if (completed >= arr.length) {
-                        callback(null);
-                    }
-                }
-            }));
-        });
-    };
-    async.forEach = async.each;
-
-    async.eachSeries = function (arr, iterator, callback) {
-        callback = callback || function () {};
-        if (!arr.length) {
-            return callback();
-        }
-        var completed = 0;
-        var iterate = function () {
-            iterator(arr[completed], function (err) {
-                if (err) {
-                    callback(err);
-                    callback = function () {};
-                }
-                else {
-                    completed += 1;
-                    if (completed >= arr.length) {
-                        callback(null);
-                    }
-                    else {
-                        iterate();
-                    }
-                }
-            });
-        };
-        iterate();
-    };
-    async.forEachSeries = async.eachSeries;
-
-    async.eachLimit = function (arr, limit, iterator, callback) {
-        var fn = _eachLimit(limit);
-        fn.apply(null, [arr, iterator, callback]);
-    };
-    async.forEachLimit = async.eachLimit;
-
-    var _eachLimit = function (limit) {
-
-        return function (arr, iterator, callback) {
-            callback = callback || function () {};
-            if (!arr.length || limit <= 0) {
-                return callback();
-            }
-            var completed = 0;
-            var started = 0;
-            var running = 0;
-
-            (function replenish () {
-                if (completed >= arr.length) {
-                    return callback();
-                }
-
-                while (running < limit && started < arr.length) {
-                    started += 1;
-                    running += 1;
-                    iterator(arr[started - 1], function (err) {
-                        if (err) {
-                            callback(err);
-                            callback = function () {};
-                        }
-                        else {
-                            completed += 1;
-                            running -= 1;
-                            if (completed >= arr.length) {
-                                callback();
-                            }
-                            else {
-                                replenish();
-                            }
-                        }
-                    });
-                }
-            })();
-        };
-    };
-
-
-    var doParallel = function (fn) {
-        return function () {
-            var args = Array.prototype.slice.call(arguments);
-            return fn.apply(null, [async.each].concat(args));
-        };
-    };
-    var doParallelLimit = function(limit, fn) {
-        return function () {
-            var args = Array.prototype.slice.call(arguments);
-            return fn.apply(null, [_eachLimit(limit)].concat(args));
-        };
-    };
-    var doSeries = function (fn) {
-        return function () {
-            var args = Array.prototype.slice.call(arguments);
-            return fn.apply(null, [async.eachSeries].concat(args));
-        };
-    };
-
-
-    var _asyncMap = function (eachfn, arr, iterator, callback) {
-        var results = [];
-        arr = _map(arr, function (x, i) {
-            return {index: i, value: x};
-        });
-        eachfn(arr, function (x, callback) {
-            iterator(x.value, function (err, v) {
-                results[x.index] = v;
-                callback(err);
-            });
-        }, function (err) {
-            callback(err, results);
-        });
-    };
-    async.map = doParallel(_asyncMap);
-    async.mapSeries = doSeries(_asyncMap);
-    async.mapLimit = function (arr, limit, iterator, callback) {
-        return _mapLimit(limit)(arr, iterator, callback);
-    };
-
-    var _mapLimit = function(limit) {
-        return doParallelLimit(limit, _asyncMap);
-    };
-
-    // reduce only has a series version, as doing reduce in parallel won't
-    // work in many situations.
-    async.reduce = function (arr, memo, iterator, callback) {
-        async.eachSeries(arr, function (x, callback) {
-            iterator(memo, x, function (err, v) {
-                memo = v;
-                callback(err);
-            });
-        }, function (err) {
-            callback(err, memo);
-        });
-    };
-    // inject alias
-    async.inject = async.reduce;
-    // foldl alias
-    async.foldl = async.reduce;
-
-    async.reduceRight = function (arr, memo, iterator, callback) {
-        var reversed = _map(arr, function (x) {
-            return x;
-        }).reverse();
-        async.reduce(reversed, memo, iterator, callback);
-    };
-    // foldr alias
-    async.foldr = async.reduceRight;
-
-    var _filter = function (eachfn, arr, iterator, callback) {
-        var results = [];
-        arr = _map(arr, function (x, i) {
-            return {index: i, value: x};
-        });
-        eachfn(arr, function (x, callback) {
-            iterator(x.value, function (v) {
-                if (v) {
-                    results.push(x);
-                }
-                callback();
-            });
-        }, function (err) {
-            callback(_map(results.sort(function (a, b) {
-                return a.index - b.index;
-            }), function (x) {
-                return x.value;
-            }));
-        });
-    };
-    async.filter = doParallel(_filter);
-    async.filterSeries = doSeries(_filter);
-    // select alias
-    async.select = async.filter;
-    async.selectSeries = async.filterSeries;
-
-    var _reject = function (eachfn, arr, iterator, callback) {
-        var results = [];
-        arr = _map(arr, function (x, i) {
-            return {index: i, value: x};
-        });
-        eachfn(arr, function (x, callback) {
-            iterator(x.value, function (v) {
-                if (!v) {
-                    results.push(x);
-                }
-                callback();
-            });
-        }, function (err) {
-            callback(_map(results.sort(function (a, b) {
-                return a.index - b.index;
-            }), function (x) {
-                return x.value;
-            }));
-        });
-    };
-    async.reject = doParallel(_reject);
-    async.rejectSeries = doSeries(_reject);
-
-    var _detect = function (eachfn, arr, iterator, main_callback) {
-        eachfn(arr, function (x, callback) {
-            iterator(x, function (result) {
-                if (result) {
-                    main_callback(x);
-                    main_callback = function () {};
-                }
-                else {
-                    callback();
-                }
-            });
-        }, function (err) {
-            main_callback();
-        });
-    };
-    async.detect = doParallel(_detect);
-    async.detectSeries = doSeries(_detect);
-
-    async.some = function (arr, iterator, main_callback) {
-        async.each(arr, function (x, callback) {
-            iterator(x, function (v) {
-                if (v) {
-                    main_callback(true);
-                    main_callback = function () {};
-                }
-                callback();
-            });
-        }, function (err) {
-            main_callback(false);
-        });
-    };
-    // any alias
-    async.any = async.some;
-
-    async.every = function (arr, iterator, main_callback) {
-        async.each(arr, function (x, callback) {
-            iterator(x, function (v) {
-                if (!v) {
-                    main_callback(false);
-                    main_callback = function () {};
-                }
-                callback();
-            });
-        }, function (err) {
-            main_callback(true);
-        });
-    };
-    // all alias
-    async.all = async.every;
-
-    async.sortBy = function (arr, iterator, callback) {
-        async.map(arr, function (x, callback) {
-            iterator(x, function (err, criteria) {
-                if (err) {
-                    callback(err);
-                }
-                else {
-                    callback(null, {value: x, criteria: criteria});
-                }
-            });
-        }, function (err, results) {
-            if (err) {
-                return callback(err);
-            }
-            else {
-                var fn = function (left, right) {
-                    var a = left.criteria, b = right.criteria;
-                    return a < b ? -1 : a > b ? 1 : 0;
-                };
-                callback(null, _map(results.sort(fn), function (x) {
-                    return x.value;
-                }));
-            }
-        });
-    };
-
-    async.auto = function (tasks, callback) {
-        callback = callback || function () {};
-        var keys = _keys(tasks);
-        if (!keys.length) {
-            return callback(null);
-        }
-
-        var results = {};
-
-        var listeners = [];
-        var addListener = function (fn) {
-            listeners.unshift(fn);
-        };
-        var removeListener = function (fn) {
-            for (var i = 0; i < listeners.length; i += 1) {
-                if (listeners[i] === fn) {
-                    listeners.splice(i, 1);
-                    return;
-                }
-            }
-        };
-        var taskComplete = function () {
-            _each(listeners.slice(0), function (fn) {
-                fn();
-            });
-        };
-
-        addListener(function () {
-            if (_keys(results).length === keys.length) {
-                callback(null, results);
-                callback = function () {};
-            }
-        });
-
-        _each(keys, function (k) {
-            var task = (tasks[k] instanceof Function) ? [tasks[k]]: tasks[k];
-            var taskCallback = function (err) {
-                var args = Array.prototype.slice.call(arguments, 1);
-                if (args.length <= 1) {
-                    args = args[0];
-                }
-                if (err) {
-                    var safeResults = {};
-                    _each(_keys(results), function(rkey) {
-                        safeResults[rkey] = results[rkey];
-                    });
-                    safeResults[k] = args;
-                    callback(err, safeResults);
-                    // stop subsequent errors hitting callback multiple times
-                    callback = function () {};
-                }
-                else {
-                    results[k] = args;
-                    async.setImmediate(taskComplete);
-                }
-            };
-            var requires = task.slice(0, Math.abs(task.length - 1)) || [];
-            var ready = function () {
-                return _reduce(requires, function (a, x) {
-                    return (a && results.hasOwnProperty(x));
-                }, true) && !results.hasOwnProperty(k);
-            };
-            if (ready()) {
-                task[task.length - 1](taskCallback, results);
-            }
-            else {
-                var listener = function () {
-                    if (ready()) {
-                        removeListener(listener);
-                        task[task.length - 1](taskCallback, results);
-                    }
-                };
-                addListener(listener);
-            }
-        });
-    };
-
-    async.waterfall = function (tasks, callback) {
-        callback = callback || function () {};
-        if (tasks.constructor !== Array) {
-          var err = new Error('First argument to waterfall must be an array of functions');
-          return callback(err);
-        }
-        if (!tasks.length) {
-            return callback();
-        }
-        var wrapIterator = function (iterator) {
-            return function (err) {
-                if (err) {
-                    callback.apply(null, arguments);
-                    callback = function () {};
-                }
-                else {
-                    var args = Array.prototype.slice.call(arguments, 1);
-                    var next = iterator.next();
-                    if (next) {
-                        args.push(wrapIterator(next));
-                    }
-                    else {
-                        args.push(callback);
-                    }
-                    async.setImmediate(function () {
-                        iterator.apply(null, args);
-                    });
-                }
-            };
-        };
-        wrapIterator(async.iterator(tasks))();
-    };
-
-    var _parallel = function(eachfn, tasks, callback) {
-        callback = callback || function () {};
-        if (tasks.constructor === Array) {
-            eachfn.map(tasks, function (fn, callback) {
-                if (fn) {
-                    fn(function (err) {
-                        var args = Array.prototype.slice.call(arguments, 1);
-                        if (args.length <= 1) {
-                            args = args[0];
-                        }
-                        callback.call(null, err, args);
-                    });
-                }
-            }, callback);
-        }
-        else {
-            var results = {};
-            eachfn.each(_keys(tasks), function (k, callback) {
-                tasks[k](function (err) {
-                    var args = Array.prototype.slice.call(arguments, 1);
-                    if (args.length <= 1) {
-                        args = args[0];
-                    }
-                    results[k] = args;
-                    callback(err);
-                });
-            }, function (err) {
-                callback(err, results);
-            });
-        }
-    };
-
-    async.parallel = function (tasks, callback) {
-        _parallel({ map: async.map, each: async.each }, tasks, callback);
-    };
-
-    async.parallelLimit = function(tasks, limit, callback) {
-        _parallel({ map: _mapLimit(limit), each: _eachLimit(limit) }, tasks, callback);
-    };
-
-    async.series = function (tasks, callback) {
-        callback = callback || function () {};
-        if (tasks.constructor === Array) {
-            async.mapSeries(tasks, function (fn, callback) {
-                if (fn) {
-                    fn(function (err) {
-                        var args = Array.prototype.slice.call(arguments, 1);
-                        if (args.length <= 1) {
-                            args = args[0];
-                        }
-                        callback.call(null, err, args);
-                    });
-                }
-            }, callback);
-        }
-        else {
-            var results = {};
-            async.eachSeries(_keys(tasks), function (k, callback) {
-                tasks[k](function (err) {
-                    var args = Array.prototype.slice.call(arguments, 1);
-                    if (args.length <= 1) {
-                        args = args[0];
-                    }
-                    results[k] = args;
-                    callback(err);
-                });
-            }, function (err) {
-                callback(err, results);
-            });
-        }
-    };
-
-    async.iterator = function (tasks) {
-        var makeCallback = function (index) {
-            var fn = function () {
-                if (tasks.length) {
-                    tasks[index].apply(null, arguments);
-                }
-                return fn.next();
-            };
-            fn.next = function () {
-                return (index < tasks.length - 1) ? makeCallback(index + 1): null;
-            };
-            return fn;
-        };
-        return makeCallback(0);
-    };
-
-    async.apply = function (fn) {
-        var args = Array.prototype.slice.call(arguments, 1);
-        return function () {
-            return fn.apply(
-                null, args.concat(Array.prototype.slice.call(arguments))
-            );
-        };
-    };
-
-    var _concat = function (eachfn, arr, fn, callback) {
-        var r = [];
-        eachfn(arr, function (x, cb) {
-            fn(x, function (err, y) {
-                r = r.concat(y || []);
-                cb(err);
-            });
-        }, function (err) {
-            callback(err, r);
-        });
-    };
-    async.concat = doParallel(_concat);
-    async.concatSeries = doSeries(_concat);
-
-    async.whilst = function (test, iterator, callback) {
-        if (test()) {
-            iterator(function (err) {
-                if (err) {
-                    return callback(err);
-                }
-                async.whilst(test, iterator, callback);
-            });
-        }
-        else {
-            callback();
-        }
-    };
-
-    async.doWhilst = function (iterator, test, callback) {
-        iterator(function (err) {
-            if (err) {
-                return callback(err);
-            }
-            if (test()) {
-                async.doWhilst(iterator, test, callback);
-            }
-            else {
-                callback();
-            }
-        });
-    };
-
-    async.until = function (test, iterator, callback) {
-        if (!test()) {
-            iterator(function (err) {
-                if (err) {
-                    return callback(err);
-                }
-                async.until(test, iterator, callback);
-            });
-        }
-        else {
-            callback();
-        }
-    };
-
-    async.doUntil = function (iterator, test, callback) {
-        iterator(function (err) {
-            if (err) {
-                return callback(err);
-            }
-            if (!test()) {
-                async.doUntil(iterator, test, callback);
-            }
-            else {
-                callback();
-            }
-        });
-    };
-
-    async.queue = function (worker, concurrency) {
-        if (concurrency === undefined) {
-            concurrency = 1;
-        }
-        function _insert(q, data, pos, callback) {
-          if(data.constructor !== Array) {
-              data = [data];
-          }
-          _each(data, function(task) {
-              var item = {
-                  data: task,
-                  callback: typeof callback === 'function' ? callback : null
-              };
-
-              if (pos) {
-                q.tasks.unshift(item);
-              } else {
-                q.tasks.push(item);
-              }
-
-              if (q.saturated && q.tasks.length === concurrency) {
-                  q.saturated();
-              }
-              async.setImmediate(q.process);
-          });
-        }
-
-        var workers = 0;
-        var q = {
-            tasks: [],
-            concurrency: concurrency,
-            saturated: null,
-            empty: null,
-            drain: null,
-            push: function (data, callback) {
-              _insert(q, data, false, callback);
-            },
-            unshift: function (data, callback) {
-              _insert(q, data, true, callback);
-            },
-            process: function () {
-                if (workers < q.concurrency && q.tasks.length) {
-                    var task = q.tasks.shift();
-                    if (q.empty && q.tasks.length === 0) {
-                        q.empty();
-                    }
-                    workers += 1;
-                    var next = function () {
-                        workers -= 1;
-                        if (task.callback) {
-                            task.callback.apply(task, arguments);
-                        }
-                        if (q.drain && q.tasks.length + workers === 0) {
-                            q.drain();
-                        }
-                        q.process();
-                    };
-                    var cb = only_once(next);
-                    worker(task.data, cb);
-                }
-            },
-            length: function () {
-                return q.tasks.length;
-            },
-            running: function () {
-                return workers;
-            }
-        };
-        return q;
-    };
-
-    async.cargo = function (worker, payload) {
-        var working     = false,
-            tasks       = [];
-
-        var cargo = {
-            tasks: tasks,
-            payload: payload,
-            saturated: null,
-            empty: null,
-            drain: null,
-            push: function (data, callback) {
-                if(data.constructor !== Array) {
-                    data = [data];
-                }
-                _each(data, function(task) {
-                    tasks.push({
-                        data: task,
-                        callback: typeof callback === 'function' ? callback : null
-                    });
-                    if (cargo.saturated && tasks.length === payload) {
-                        cargo.saturated();
-                    }
-                });
-                async.setImmediate(cargo.process);
-            },
-            process: function process() {
-                if (working) return;
-                if (tasks.length === 0) {
-                    if(cargo.drain) cargo.drain();
-                    return;
-                }
-
-                var ts = typeof payload === 'number'
-                            ? tasks.splice(0, payload)
-                            : tasks.splice(0);
-
-                var ds = _map(ts, function (task) {
-                    return task.data;
-                });
-
-                if(cargo.empty) cargo.empty();
-                working = true;
-                worker(ds, function () {
-                    working = false;
-
-                    var args = arguments;
-                    _each(ts, function (data) {
-                        if (data.callback) {
-                            data.callback.apply(null, args);
-                        }
-                    });
-
-                    process();
-                });
-            },
-            length: function () {
-                return tasks.length;
-            },
-            running: function () {
-                return working;
-            }
-        };
-        return cargo;
-    };
-
-    var _console_fn = function (name) {
-        return function (fn) {
-            var args = Array.prototype.slice.call(arguments, 1);
-            fn.apply(null, args.concat([function (err) {
-                var args = Array.prototype.slice.call(arguments, 1);
-                if (typeof console !== 'undefined') {
-                    if (err) {
-                        if (console.error) {
-                            console.error(err);
-                        }
-                    }
-                    else if (console[name]) {
-                        _each(args, function (x) {
-                            console[name](x);
-                        });
-                    }
-                }
-            }]));
-        };
-    };
-    async.log = _console_fn('log');
-    async.dir = _console_fn('dir');
-    /*async.info = _console_fn('info');
-    async.warn = _console_fn('warn');
-    async.error = _console_fn('error');*/
-
-    async.memoize = function (fn, hasher) {
-        var memo = {};
-        var queues = {};
-        hasher = hasher || function (x) {
-            return x;
-        };
-        var memoized = function () {
-            var args = Array.prototype.slice.call(arguments);
-            var callback = args.pop();
-            var key = hasher.apply(null, args);
-            if (key in memo) {
-                callback.apply(null, memo[key]);
-            }
-            else if (key in queues) {
-                queues[key].push(callback);
-            }
-            else {
-                queues[key] = [callback];
-                fn.apply(null, args.concat([function () {
-                    memo[key] = arguments;
-                    var q = queues[key];
-                    delete queues[key];
-                    for (var i = 0, l = q.length; i < l; i++) {
-                      q[i].apply(null, arguments);
-                    }
-                }]));
-            }
-        };
-        memoized.memo = memo;
-        memoized.unmemoized = fn;
-        return memoized;
-    };
-
-    async.unmemoize = function (fn) {
-      return function () {
-        return (fn.unmemoized || fn).apply(null, arguments);
-      };
-    };
-
-    async.times = function (count, iterator, callback) {
-        var counter = [];
-        for (var i = 0; i < count; i++) {
-            counter.push(i);
-        }
-        return async.map(counter, iterator, callback);
-    };
-
-    async.timesSeries = function (count, iterator, callback) {
-        var counter = [];
-        for (var i = 0; i < count; i++) {
-            counter.push(i);
-        }
-        return async.mapSeries(counter, iterator, callback);
-    };
-
-    async.compose = function (/* functions... */) {
-        var fns = Array.prototype.reverse.call(arguments);
-        return function () {
-            var that = this;
-            var args = Array.prototype.slice.call(arguments);
-            var callback = args.pop();
-            async.reduce(fns, args, function (newargs, fn, cb) {
-                fn.apply(that, newargs.concat([function () {
-                    var err = arguments[0];
-                    var nextargs = Array.prototype.slice.call(arguments, 1);
-                    cb(err, nextargs);
-                }]))
-            },
-            function (err, results) {
-                callback.apply(that, [err].concat(results));
-            });
-        };
-    };
-
-    var _applyEach = function (eachfn, fns /*args...*/) {
-        var go = function () {
-            var that = this;
-            var args = Array.prototype.slice.call(arguments);
-            var callback = args.pop();
-            return eachfn(fns, function (fn, cb) {
-                fn.apply(that, args.concat([cb]));
-            },
-            callback);
-        };
-        if (arguments.length > 2) {
-            var args = Array.prototype.slice.call(arguments, 2);
-            return go.apply(this, args);
-        }
-        else {
-            return go;
-        }
-    };
-    async.applyEach = doParallel(_applyEach);
-    async.applyEachSeries = doSeries(_applyEach);
-
-    async.forever = function (fn, callback) {
-        function next(err) {
-            if (err) {
-                if (callback) {
-                    return callback(err);
-                }
-                throw err;
-            }
-            fn(next);
-        }
-        next();
-    };
-
-    // AMD / RequireJS
-    if (typeof define !== 'undefined' && define.amd) {
-        define([], function () {
-            return async;
-        });
-    }
-    // Node.js
-    else if (typeof module !== 'undefined' && module.exports) {
-        module.exports = async;
-    }
-    // included directly via <script> tag
-    else {
-        root.async = async;
-    }
-
-}());
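The hunk above deletes the whole of async.js, including `async.waterfall` (L102816 region), which chains tasks by handing each one the previous task's results plus a continuation callback, short-circuiting to the final callback on the first error. As an editorial aside between the two file diffs, here is a minimal self-contained sketch of that waterfall pattern; this is an illustrative reimplementation written for this note, not the deleted library code, and the `waterfall` name is reused only for clarity:

```javascript
// Sketch of the waterfall control-flow pattern: run tasks in series,
// feeding each task the previous task's result values plus a "next"
// callback. The first error passed to next() goes straight to the
// final callback and no further tasks run.
function waterfall(tasks, callback) {
  var i = 0;
  function next(err) {
    if (err) return callback(err);
    // Collect the result values (everything after the error argument).
    var args = Array.prototype.slice.call(arguments, 1);
    if (i === tasks.length) {
      return callback.apply(null, [null].concat(args));
    }
    var task = tasks[i++];
    task.apply(null, args.concat([next]));
  }
  next(null);
}

waterfall([
  function (cb) { cb(null, 2); },        // produce an initial value
  function (x, cb) { cb(null, x * 3); }  // receive 2, pass on 6
], function (err, result) {
  console.log(result); // 6
});
```

The real `async.waterfall` additionally validated that `tasks` is an array and deferred each step with `async.setImmediate` to avoid stack growth on long synchronous chains; this sketch keeps only the core chaining behaviour.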
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,47 +0,0 @@
-{
-  "name": "async",
-  "description": "Higher-order functions and common patterns for asynchronous code",
-  "main": "./lib/async",
-  "author": {
-    "name": "Caolan McMahon"
-  },
-  "version": "0.2.9",
-  "repository": {
-    "type": "git",
-    "url": "https://github.com/caolan/async.git"
-  },
-  "bugs": {
-    "url": "https://github.com/caolan/async/issues"
-  },
-  "licenses": [
-    {
-      "type": "MIT",
-      "url": "https://github.com/caolan/async/raw/master/LICENSE"
-    }
-  ],
-  "devDependencies": {
-    "nodeunit": ">0.0.0",
-    "uglify-js": "1.2.x",
-    "nodelint": ">0.0.0"
-  },
-  "jam": {
-    "main": "lib/async.js",
-    "include": [
-      "lib/async.js",
-      "README.md",
-      "LICENSE"
-    ]
-  },
-  "scripts": {
-    "test": "nodeunit test/test-async.js"
-  },
-  "readme": "# Async.js\n\nAsync is a utility module which provides straight-forward, powerful functions\nfor working with asynchronous JavaScript. Although originally designed for\nuse with [node.js](http://nodejs.org), it can also be used directly in the\nbrowser. Also supports [component](https://github.com/component/component).\n\nAsync provides around 20 functions that include the usual 'functional'\nsuspects (map, reduce, filter, each…) as well as some common patterns\nfor asynchronous control flow (parallel, series, waterfall…). All these\nfunctions assume you follow the node.js convention of providing a single\ncallback as the last argument of your async function.\n\n\n## Quick Examples\n\n```javascript\nasync.map(['file1','file2','file3'], fs.stat, function(err, results){\n    // results is now an array of stats for each file\n});\n\nasync.filter(['file1','file2','file3'], fs.exists, function(results){\n    // results now equals an array of the existing files\n});\n\nasync.parallel([\n    function(){ ... },\n    function(){ ... }\n], callback);\n\nasync.series([\n    function(){ ... },\n    function(){ ... }\n]);\n```\n\nThere are many more functions available so take a look at the docs below for a\nfull list. This module aims to be comprehensive, so if you feel anything is\nmissing please create a GitHub issue for it.\n\n## Common Pitfalls\n\n### Binding a context to an iterator\n\nThis section is really about bind, not about async. 
If you are wondering how to\nmake async execute your iterators in a given context, or are confused as to why\na method of another library isn't working as an iterator, study this example:\n\n```js\n// Here is a simple object with an (unnecessarily roundabout) squaring method\nvar AsyncSquaringLibrary = {\n  squareExponent: 2,\n  square: function(number, callback){ \n    var result = Math.pow(number, this.squareExponent);\n    setTimeout(function(){\n      callback(null, result);\n    }, 200);\n  }\n};\n\nasync.map([1, 2, 3], AsyncSquaringLibrary.square, function(err, result){\n  // result is [NaN, NaN, NaN]\n  // This fails because the `this.squareExponent` expression in the square\n  // function is not evaluated in the context of AsyncSquaringLibrary, and is\n  // therefore undefined.\n});\n\nasync.map([1, 2, 3], AsyncSquaringLibrary.square.bind(AsyncSquaringLibrary), function(err, result){\n  // result is [1, 4, 9]\n  // With the help of bind we can attach a context to the iterator before\n  // passing it to async. Now the square function will be executed in its \n  // 'home' AsyncSquaringLibrary context and the value of `this.squareExponent`\n  // will be as expected.\n});\n```\n\n## Download\n\nThe source is available for download from\n[GitHub](http://github.com/caolan/async).\nAlternatively, you can install using Node Package Manager (npm):\n\n    npm install async\n\n__Development:__ [async.js](https://github.com/caolan/async/raw/master/lib/async.js) - 29.6kb Uncompressed\n\n## In the Browser\n\nSo far it's been tested in IE6, IE7, IE8, FF3.6 and Chrome 5. 
Usage:\n\n```html\n<script type=\"text/javascript\" src=\"async.js\"></script>\n<script type=\"text/javascript\">\n\n    async.map(data, asyncProcess, function(err, results){\n        alert(results);\n    });\n\n</script>\n```\n\n## Documentation\n\n### Collections\n\n* [each](#each)\n* [map](#map)\n* [filter](#filter)\n* [reject](#reject)\n* [reduce](#reduce)\n* [detect](#detect)\n* [sortBy](#sortBy)\n* [some](#some)\n* [every](#every)\n* [concat](#concat)\n\n### Control Flow\n\n* [series](#series)\n* [parallel](#parallel)\n* [whilst](#whilst)\n* [doWhilst](#doWhilst)\n* [until](#until)\n* [doUntil](#doUntil)\n* [forever](#forever)\n* [waterfall](#waterfall)\n* [compose](#compose)\n* [applyEach](#applyEach)\n* [queue](#queue)\n* [cargo](#cargo)\n* [auto](#auto)\n* [iterator](#iterator)\n* [apply](#apply)\n* [nextTick](#nextTick)\n* [times](#times)\n* [timesSeries](#timesSeries)\n\n### Utils\n\n* [memoize](#memoize)\n* [unmemoize](#unmemoize)\n* [log](#log)\n* [dir](#dir)\n* [noConflict](#noConflict)\n\n\n## Collections\n\n<a name=\"forEach\" />\n<a name=\"each\" />\n### each(arr, iterator, callback)\n\nApplies an iterator function to each item in an array, in parallel.\nThe iterator is called with an item from the list and a callback for when it\nhas finished. If the iterator passes an error to this callback, the main\ncallback for the each function is immediately called with the error.\n\nNote, that since this function applies the iterator to each item in parallel\nthere is no guarantee that the iterator functions will complete in order.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A function to apply to each item in the array.\n  The iterator is passed a callback(err) which must be called once it has \n  completed. 
If no error has occured, the callback should be run without \n  arguments or with an explicit null argument.\n* callback(err) - A callback which is called after all the iterator functions\n  have finished, or an error has occurred.\n\n__Example__\n\n```js\n// assuming openFiles is an array of file names and saveFile is a function\n// to save the modified contents of that file:\n\nasync.each(openFiles, saveFile, function(err){\n    // if any of the saves produced an error, err would equal that error\n});\n```\n\n---------------------------------------\n\n<a name=\"forEachSeries\" />\n<a name=\"eachSeries\" />\n### eachSeries(arr, iterator, callback)\n\nThe same as each only the iterator is applied to each item in the array in\nseries. The next iterator is only called once the current one has completed\nprocessing. This means the iterator functions will complete in order.\n\n\n---------------------------------------\n\n<a name=\"forEachLimit\" />\n<a name=\"eachLimit\" />\n### eachLimit(arr, limit, iterator, callback)\n\nThe same as each only no more than \"limit\" iterators will be simultaneously \nrunning at any time.\n\nNote that the items are not processed in batches, so there is no guarantee that\n the first \"limit\" iterator functions will complete before any others are \nstarted.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* limit - The maximum number of iterators to run at any time.\n* iterator(item, callback) - A function to apply to each item in the array.\n  The iterator is passed a callback(err) which must be called once it has \n  completed. 
If no error has occured, the callback should be run without \n  arguments or with an explicit null argument.\n* callback(err) - A callback which is called after all the iterator functions\n  have finished, or an error has occurred.\n\n__Example__\n\n```js\n// Assume documents is an array of JSON objects and requestApi is a\n// function that interacts with a rate-limited REST api.\n\nasync.eachLimit(documents, 20, requestApi, function(err){\n    // if any of the saves produced an error, err would equal that error\n});\n```\n\n---------------------------------------\n\n<a name=\"map\" />\n### map(arr, iterator, callback)\n\nProduces a new array of values by mapping each value in the given array through\nthe iterator function. The iterator is called with an item from the array and a\ncallback for when it has finished processing. The callback takes 2 arguments, \nan error and the transformed item from the array. If the iterator passes an\nerror to this callback, the main callback for the map function is immediately\ncalled with the error.\n\nNote, that since this function applies the iterator to each item in parallel\nthere is no guarantee that the iterator functions will complete in order, however\nthe results array will be in the same order as the original array.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A function to apply to each item in the array.\n  The iterator is passed a callback(err, transformed) which must be called once \n  it has completed with an error (which can be null) and a transformed item.\n* callback(err, results) - A callback which is called after all the iterator\n  functions have finished, or an error has occurred. 
Results is an array of the\n  transformed items from the original array.\n\n__Example__\n\n```js\nasync.map(['file1','file2','file3'], fs.stat, function(err, results){\n    // results is now an array of stats for each file\n});\n```\n\n---------------------------------------\n\n<a name=\"mapSeries\" />\n### mapSeries(arr, iterator, callback)\n\nThe same as map only the iterator is applied to each item in the array in\nseries. The next iterator is only called once the current one has completed\nprocessing. The results array will be in the same order as the original.\n\n\n---------------------------------------\n\n<a name=\"mapLimit\" />\n### mapLimit(arr, limit, iterator, callback)\n\nThe same as map only no more than \"limit\" iterators will be simultaneously \nrunning at any time.\n\nNote that the items are not processed in batches, so there is no guarantee that\n the first \"limit\" iterator functions will complete before any others are \nstarted.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* limit - The maximum number of iterators to run at any time.\n* iterator(item, callback) - A function to apply to each item in the array.\n  The iterator is passed a callback(err, transformed) which must be called once \n  it has completed with an error (which can be null) and a transformed item.\n* callback(err, results) - A callback which is called after all the iterator\n  functions have finished, or an error has occurred. 
Results is an array of the\n  transformed items from the original array.\n\n__Example__\n\n```js\nasync.map(['file1','file2','file3'], 1, fs.stat, function(err, results){\n    // results is now an array of stats for each file\n});\n```\n\n---------------------------------------\n\n<a name=\"filter\" />\n### filter(arr, iterator, callback)\n\n__Alias:__ select\n\nReturns a new array of all the values which pass an async truth test.\n_The callback for each iterator call only accepts a single argument of true or\nfalse, it does not accept an error argument first!_ This is in-line with the\nway node libraries work with truth tests like fs.exists. This operation is\nperformed in parallel, but the results array will be in the same order as the\noriginal.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A truth test to apply to each item in the array.\n  The iterator is passed a callback(truthValue) which must be called with a \n  boolean argument once it has completed.\n* callback(results) - A callback which is called after all the iterator\n  functions have finished.\n\n__Example__\n\n```js\nasync.filter(['file1','file2','file3'], fs.exists, function(results){\n    // results now equals an array of the existing files\n});\n```\n\n---------------------------------------\n\n<a name=\"filterSeries\" />\n### filterSeries(arr, iterator, callback)\n\n__alias:__ selectSeries\n\nThe same as filter only the iterator is applied to each item in the array in\nseries. The next iterator is only called once the current one has completed\nprocessing. The results array will be in the same order as the original.\n\n---------------------------------------\n\n<a name=\"reject\" />\n### reject(arr, iterator, callback)\n\nThe opposite of filter. 
Removes values that pass an async truth test.\n\n---------------------------------------\n\n<a name=\"rejectSeries\" />\n### rejectSeries(arr, iterator, callback)\n\nThe same as reject, only the iterator is applied to each item in the array\nin series.\n\n\n---------------------------------------\n\n<a name=\"reduce\" />\n### reduce(arr, memo, iterator, callback)\n\n__aliases:__ inject, foldl\n\nReduces a list of values into a single value using an async iterator to return\neach successive step. Memo is the initial state of the reduction. This\nfunction only operates in series. For performance reasons, it may make sense to\nsplit a call to this function into a parallel map, then use the normal\nArray.prototype.reduce on the results. This function is for situations where\neach step in the reduction needs to be async, if you can get the data before\nreducing it then it's probably a good idea to do so.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* memo - The initial state of the reduction.\n* iterator(memo, item, callback) - A function applied to each item in the\n  array to produce the next step in the reduction. The iterator is passed a\n  callback(err, reduction) which accepts an optional error as its first \n  argument, and the state of the reduction as the second. If an error is \n  passed to the callback, the reduction is stopped and the main callback is \n  immediately called with the error.\n* callback(err, result) - A callback which is called after all the iterator\n  functions have finished. 
Result is the reduced value.\n\n__Example__\n\n```js\nasync.reduce([1,2,3], 0, function(memo, item, callback){\n    // pointless async:\n    process.nextTick(function(){\n        callback(null, memo + item)\n    });\n}, function(err, result){\n    // result is now equal to the last value of memo, which is 6\n});\n```\n\n---------------------------------------\n\n<a name=\"reduceRight\" />\n### reduceRight(arr, memo, iterator, callback)\n\n__Alias:__ foldr\n\nSame as reduce, only operates on the items in the array in reverse order.\n\n\n---------------------------------------\n\n<a name=\"detect\" />\n### detect(arr, iterator, callback)\n\nReturns the first value in a list that passes an async truth test. The\niterator is applied in parallel, meaning the first iterator to return true will\nfire the detect callback with that result. That means the result might not be\nthe first item in the original array (in terms of order) that passes the test.\n\nIf order within the original array is important then look at detectSeries.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A truth test to apply to each item in the array.\n  The iterator is passed a callback(truthValue) which must be called with a \n  boolean argument once it has completed.\n* callback(result) - A callback which is called as soon as any iterator returns\n  true, or after all the iterator functions have finished. Result will be\n  the first item in the array that passes the truth test (iterator) or the\n  value undefined if none passed.\n\n__Example__\n\n```js\nasync.detect(['file1','file2','file3'], fs.exists, function(result){\n    // result now equals the first file in the list that exists\n});\n```\n\n---------------------------------------\n\n<a name=\"detectSeries\" />\n### detectSeries(arr, iterator, callback)\n\nThe same as detect, only the iterator is applied to each item in the array\nin series. 
This means the result is always the first in the original array (in\nterms of array order) that passes the truth test.\n\n\n---------------------------------------\n\n<a name=\"sortBy\" />\n### sortBy(arr, iterator, callback)\n\nSorts a list by the results of running each value through an async iterator.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A function to apply to each item in the array.\n  The iterator is passed a callback(err, sortValue) which must be called once it\n  has completed with an error (which can be null) and a value to use as the sort\n  criteria.\n* callback(err, results) - A callback which is called after all the iterator\n  functions have finished, or an error has occurred. Results is the items from\n  the original array sorted by the values returned by the iterator calls.\n\n__Example__\n\n```js\nasync.sortBy(['file1','file2','file3'], function(file, callback){\n    fs.stat(file, function(err, stats){\n        callback(err, stats.mtime);\n    });\n}, function(err, results){\n    // results is now the original array of files sorted by\n    // modified date\n});\n```\n\n---------------------------------------\n\n<a name=\"some\" />\n### some(arr, iterator, callback)\n\n__Alias:__ any\n\nReturns true if at least one element in the array satisfies an async test.\n_The callback for each iterator call only accepts a single argument of true or\nfalse, it does not accept an error argument first!_ This is in-line with the\nway node libraries work with truth tests like fs.exists. 
Once any iterator\ncall returns true, the main callback is immediately called.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A truth test to apply to each item in the array.\n  The iterator is passed a callback(truthValue) which must be called with a \n  boolean argument once it has completed.\n* callback(result) - A callback which is called as soon as any iterator returns\n  true, or after all the iterator functions have finished. Result will be\n  either true or false depending on the values of the async tests.\n\n__Example__\n\n```js\nasync.some(['file1','file2','file3'], fs.exists, function(result){\n    // if result is true then at least one of the files exists\n});\n```\n\n---------------------------------------\n\n<a name=\"every\" />\n### every(arr, iterator, callback)\n\n__Alias:__ all\n\nReturns true if every element in the array satisfies an async test.\n_The callback for each iterator call only accepts a single argument of true or\nfalse, it does not accept an error argument first!_ This is in-line with the\nway node libraries work with truth tests like fs.exists.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A truth test to apply to each item in the array.\n  The iterator is passed a callback(truthValue) which must be called with a \n  boolean argument once it has completed.\n* callback(result) - A callback which is called after all the iterator\n  functions have finished. Result will be either true or false depending on\n  the values of the async tests.\n\n__Example__\n\n```js\nasync.every(['file1','file2','file3'], fs.exists, function(result){\n    // if result is true then every file exists\n});\n```\n\n---------------------------------------\n\n<a name=\"concat\" />\n### concat(arr, iterator, callback)\n\nApplies an iterator to each item in a list, concatenating the results. Returns the\nconcatenated list. 
The iterators are called in parallel, and the results are\nconcatenated as they return. There is no guarantee that the results array will\nbe returned in the original order of the arguments passed to the iterator function.\n\n__Arguments__\n\n* arr - An array to iterate over\n* iterator(item, callback) - A function to apply to each item in the array.\n  The iterator is passed a callback(err, results) which must be called once it \n  has completed with an error (which can be null) and an array of results.\n* callback(err, results) - A callback which is called after all the iterator\n  functions have finished, or an error has occurred. Results is an array containing\n  the concatenated results of the iterator function.\n\n__Example__\n\n```js\nasync.concat(['dir1','dir2','dir3'], fs.readdir, function(err, files){\n    // files is now a list of filenames that exist in the 3 directories\n});\n```\n\n---------------------------------------\n\n<a name=\"concatSeries\" />\n### concatSeries(arr, iterator, callback)\n\nSame as async.concat, but executes in series instead of parallel.\n\n\n## Control Flow\n\n<a name=\"series\" />\n### series(tasks, [callback])\n\nRun an array of functions in series, each one running once the previous\nfunction has completed. If any function in the series passes an error to its\ncallback, no more functions are run and the callback for the series is\nimmediately called with the value of the error. Once the tasks have completed,\nthe results are passed to the final callback as an array.\n\nIt is also possible to use an object instead of an array. Each property will be\nrun as a function and the results will be passed to the final callback as an object\ninstead of an array. 
This can be a more readable way of handling results from\nasync.series.\n\n\n__Arguments__\n\n* tasks - An array or object containing functions to run, each function is passed\n  a callback(err, result) it must call on completion with an error (which can\n  be null) and an optional result value.\n* callback(err, results) - An optional callback to run once all the functions\n  have completed. This function gets a results array (or object) containing all \n  the result arguments passed to the task callbacks.\n\n__Example__\n\n```js\nasync.series([\n    function(callback){\n        // do some stuff ...\n        callback(null, 'one');\n    },\n    function(callback){\n        // do some more stuff ...\n        callback(null, 'two');\n    }\n],\n// optional callback\nfunction(err, results){\n    // results is now equal to ['one', 'two']\n});\n\n\n// an example using an object instead of an array\nasync.series({\n    one: function(callback){\n        setTimeout(function(){\n            callback(null, 1);\n        }, 200);\n    },\n    two: function(callback){\n        setTimeout(function(){\n            callback(null, 2);\n        }, 100);\n    }\n},\nfunction(err, results) {\n    // results is now equal to: {one: 1, two: 2}\n});\n```\n\n---------------------------------------\n\n<a name=\"parallel\" />\n### parallel(tasks, [callback])\n\nRun an array of functions in parallel, without waiting until the previous\nfunction has completed. If any of the functions pass an error to its\ncallback, the main callback is immediately called with the value of the error.\nOnce the tasks have completed, the results are passed to the final callback as an\narray.\n\nIt is also possible to use an object instead of an array. Each property will be\nrun as a function and the results will be passed to the final callback as an object\ninstead of an array. 
This can be a more readable way of handling results from\nasync.parallel.\n\n\n__Arguments__\n\n* tasks - An array or object containing functions to run, each function is passed \n  a callback(err, result) it must call on completion with an error (which can\n  be null) and an optional result value.\n* callback(err, results) - An optional callback to run once all the functions\n  have completed. This function gets a results array (or object) containing all \n  the result arguments passed to the task callbacks.\n\n__Example__\n\n```js\nasync.parallel([\n    function(callback){\n        setTimeout(function(){\n            callback(null, 'one');\n        }, 200);\n    },\n    function(callback){\n        setTimeout(function(){\n            callback(null, 'two');\n        }, 100);\n    }\n],\n// optional callback\nfunction(err, results){\n    // the results array will equal ['one','two'] even though\n    // the second function had a shorter timeout.\n});\n\n\n// an example using an object instead of an array\nasync.parallel({\n    one: function(callback){\n        setTimeout(function(){\n            callback(null, 1);\n        }, 200);\n    },\n    two: function(callback){\n        setTimeout(function(){\n            callback(null, 2);\n        }, 100);\n    }\n},\nfunction(err, results) {\n    // results is now equal to: {one: 1, two: 2}\n});\n```\n\n---------------------------------------\n\n<a name=\"parallelLimit\" />\n### parallelLimit(tasks, limit, [callback])\n\nThe same as parallel only the tasks are executed in parallel with a maximum of \"limit\" \ntasks executing at any time.\n\nNote that the tasks are not executed in batches, so there is no guarantee that \nthe first \"limit\" tasks will complete before any others are started.\n\n__Arguments__\n\n* tasks - An array or object containing functions to run, each function is passed \n  a callback(err, result) it must call on completion with an error (which can\n  be null) and an optional result value.\n* limit - The 
maximum number of tasks to run at any time.\n* callback(err, results) - An optional callback to run once all the functions\n  have completed. This function gets a results array (or object) containing all \n  the result arguments passed to the task callbacks.\n\n---------------------------------------\n\n<a name=\"whilst\" />\n### whilst(test, fn, callback)\n\nRepeatedly call fn, while test returns true. Calls the callback when stopped,\nor an error occurs.\n\n__Arguments__\n\n* test() - synchronous truth test to perform before each execution of fn.\n* fn(callback) - A function to call each time the test passes. The function is\n  passed a callback(err) which must be called once it has completed with an \n  optional error argument.\n* callback(err) - A callback which is called after the test fails and repeated\n  execution of fn has stopped.\n\n__Example__\n\n```js\nvar count = 0;\n\nasync.whilst(\n    function () { return count < 5; },\n    function (callback) {\n        count++;\n        setTimeout(callback, 1000);\n    },\n    function (err) {\n        // 5 seconds have passed\n    }\n);\n```\n\n---------------------------------------\n\n<a name=\"doWhilst\" />\n### doWhilst(fn, test, callback)\n\nThe post check version of whilst. To reflect the difference in the order of operations `test` and `fn` arguments are switched. `doWhilst` is to `whilst` as `do while` is to `while` in plain JavaScript.\n\n---------------------------------------\n\n<a name=\"until\" />\n### until(test, fn, callback)\n\nRepeatedly call fn, until test returns true. Calls the callback when stopped,\nor an error occurs.\n\nThe inverse of async.whilst.\n\n---------------------------------------\n\n<a name=\"doUntil\" />\n### doUntil(fn, test, callback)\n\nLike doWhilst except the test is inverted. 
Note the argument ordering differs from `until`.\n\n---------------------------------------\n\n<a name=\"forever\" />\n### forever(fn, callback)\n\nCalls the asynchronous function 'fn' repeatedly, in series, indefinitely.\nIf an error is passed to fn's callback then 'callback' is called with the\nerror, otherwise it will never be called.\n\n---------------------------------------\n\n<a name=\"waterfall\" />\n### waterfall(tasks, [callback])\n\nRuns an array of functions in series, each passing their results to the next in\nthe array. However, if any of the functions pass an error to the callback, the\nnext function is not executed and the main callback is immediately called with\nthe error.\n\n__Arguments__\n\n* tasks - An array of functions to run, each function is passed a \n  callback(err, result1, result2, ...) it must call on completion. The first\n  argument is an error (which can be null) and any further arguments will be \n  passed as arguments in order to the next task.\n* callback(err, [results]) - An optional callback to run once all the functions\n  have completed. This will be passed the results of the last task's callback.\n\n\n\n__Example__\n\n```js\nasync.waterfall([\n    function(callback){\n        callback(null, 'one', 'two');\n    },\n    function(arg1, arg2, callback){\n        callback(null, 'three');\n    },\n    function(arg1, callback){\n        // arg1 now equals 'three'\n        callback(null, 'done');\n    }\n], function (err, result) {\n   // result now equals 'done'    \n});\n```\n\n---------------------------------------\n<a name=\"compose\" />\n### compose(fn1, fn2...)\n\nCreates a function which is a composition of the passed asynchronous\nfunctions. Each function consumes the return value of the function that\nfollows. 
Composing functions f(), g() and h() would produce the result of\nf(g(h())), only this version uses callbacks to obtain the return values.\n\nEach function is executed with the `this` binding of the composed function.\n\n__Arguments__\n\n* functions... - the asynchronous functions to compose\n\n\n__Example__\n\n```js\nfunction add1(n, callback) {\n    setTimeout(function () {\n        callback(null, n + 1);\n    }, 10);\n}\n\nfunction mul3(n, callback) {\n    setTimeout(function () {\n        callback(null, n * 3);\n    }, 10);\n}\n\nvar add1mul3 = async.compose(mul3, add1);\n\nadd1mul3(4, function (err, result) {\n   // result now equals 15\n});\n```\n\n---------------------------------------\n<a name=\"applyEach\" />\n### applyEach(fns, args..., callback)\n\nApplies the provided arguments to each function in the array, calling the\ncallback after all functions have completed. If you only provide the first\nargument then it will return a function which lets you pass in the\narguments as if it were a single function call.\n\n__Arguments__\n\n* fns - the asynchronous functions to all call with the same arguments\n* args... - any number of separate arguments to pass to the function\n* callback - the final argument should be the callback, called when all\n  functions have completed processing\n\n\n__Example__\n\n```js\nasync.applyEach([enableSearch, updateSchema], 'bucket', callback);\n\n// partial application example:\nasync.each(\n    buckets,\n    async.applyEach([enableSearch, updateSchema]),\n    callback\n);\n```\n\n---------------------------------------\n\n<a name=\"applyEachSeries\" />\n### applyEachSeries(arr, iterator, callback)\n\nThe same as applyEach only the functions are applied in series.\n\n---------------------------------------\n\n<a name=\"queue\" />\n### queue(worker, concurrency)\n\nCreates a queue object with the specified concurrency. Tasks added to the\nqueue will be processed in parallel (up to the concurrency limit). 
If all\nworkers are in progress, the task is queued until one is available. Once\na worker has completed a task, the task's callback is called.\n\n__Arguments__\n\n* worker(task, callback) - An asynchronous function for processing a queued\n  task, which must call its callback(err) argument when finished, with an \n  optional error as an argument.\n* concurrency - An integer for determining how many worker functions should be\n  run in parallel.\n\n__Queue objects__\n\nThe queue object returned by this function has the following properties and\nmethods:\n\n* length() - a function returning the number of items waiting to be processed.\n* concurrency - an integer for determining how many worker functions should be\n  run in parallel. This property can be changed after a queue is created to\n  alter the concurrency on-the-fly.\n* push(task, [callback]) - add a new task to the queue, the callback is called\n  once the worker has finished processing the task.\n  instead of a single task, an array of tasks can be submitted. 
the respective callback is used for every task in the list.\n* unshift(task, [callback]) - add a new task to the front of the queue.\n* saturated - a callback that is called when the queue length hits the concurrency and further tasks will be queued\n* empty - a callback that is called when the last item from the queue is given to a worker\n* drain - a callback that is called when the last item from the queue has returned from the worker\n\n__Example__\n\n```js\n// create a queue object with concurrency 2\n\nvar q = async.queue(function (task, callback) {\n    console.log('hello ' + task.name);\n    callback();\n}, 2);\n\n\n// assign a callback\nq.drain = function() {\n    console.log('all items have been processed');\n}\n\n// add some items to the queue\n\nq.push({name: 'foo'}, function (err) {\n    console.log('finished processing foo');\n});\nq.push({name: 'bar'}, function (err) {\n    console.log('finished processing bar');\n});\n\n// add some items to the queue (batch-wise)\n\nq.push([{name: 'baz'},{name: 'bay'},{name: 'bax'}], function (err) {\n    console.log('finished processing bar');\n});\n\n// add some items to the front of the queue\n\nq.unshift({name: 'bar'}, function (err) {\n    console.log('finished processing bar');\n});\n```\n\n---------------------------------------\n\n<a name=\"cargo\" />\n### cargo(worker, [payload])\n\nCreates a cargo object with the specified payload. Tasks added to the\ncargo will be processed altogether (up to the payload limit). If the\nworker is in progress, the task is queued until it is available. 
Once\nthe worker has completed some tasks, each callback of those tasks is called.\n\n__Arguments__\n\n* worker(tasks, callback) - An asynchronous function for processing an array of\n  queued tasks, which must call its callback(err) argument when finished, with \n  an optional error as an argument.\n* payload - An optional integer for determining how many tasks should be\n  processed per round; if omitted, the default is unlimited.\n\n__Cargo objects__\n\nThe cargo object returned by this function has the following properties and\nmethods:\n\n* length() - a function returning the number of items waiting to be processed.\n* payload - an integer for determining how many tasks should be\n  processed per round. This property can be changed after a cargo is created to\n  alter the payload on-the-fly.\n* push(task, [callback]) - add a new task to the queue, the callback is called\n  once the worker has finished processing the task.\n  instead of a single task, an array of tasks can be submitted. 
the respective callback is used for every task in the list.\n* saturated - a callback that is called when the queue length hits the concurrency and further tasks will be queued\n* empty - a callback that is called when the last item from the queue is given to a worker\n* drain - a callback that is called when the last item from the queue has returned from the worker\n\n__Example__\n\n```js\n// create a cargo object with payload 2\n\nvar cargo = async.cargo(function (tasks, callback) {\n    for(var i=0; i<tasks.length; i++){\n      console.log('hello ' + tasks[i].name);\n    }\n    callback();\n}, 2);\n\n\n// add some items\n\ncargo.push({name: 'foo'}, function (err) {\n    console.log('finished processing foo');\n});\ncargo.push({name: 'bar'}, function (err) {\n    console.log('finished processing bar');\n});\ncargo.push({name: 'baz'}, function (err) {\n    console.log('finished processing baz');\n});\n```\n\n---------------------------------------\n\n<a name=\"auto\" />\n### auto(tasks, [callback])\n\nDetermines the best order for running functions based on their requirements.\nEach function can optionally depend on other functions being completed first,\nand each function is run as soon as its requirements are satisfied. If any of\nthe functions pass an error to their callback, that function will not complete\n(so any other functions depending on it will not run) and the main callback\nwill be called immediately with the error. Functions also receive an object\ncontaining the results of functions which have completed so far.\n\nNote, all functions are called with a results object as a second argument, \nso it is unsafe to pass functions in the tasks object which cannot handle the\nextra argument. 
For example, this snippet of code:\n\n```js\nasync.auto({\n  readData: async.apply(fs.readFile, 'data.txt', 'utf-8')\n}, callback);\n```\n\nwill have the effect of calling readFile with the results object as the last\nargument, which will fail:\n\n```js\nfs.readFile('data.txt', 'utf-8', cb, {});\n```\n\nInstead, wrap the call to readFile in a function which does not forward the \nresults object:\n\n```js\nasync.auto({\n  readData: function(cb, results){\n    fs.readFile('data.txt', 'utf-8', cb);\n  }\n}, callback);\n```\n\n__Arguments__\n\n* tasks - An object literal containing named functions or an array of\n  requirements, with the function itself the last item in the array. The key\n  used for each function or array is used when specifying requirements. The \n  function receives two arguments: (1) a callback(err, result) which must be \n  called when finished, passing an error (which can be null) and the result of \n  the function's execution, and (2) a results object, containing the results of\n  the previously executed functions.\n* callback(err, results) - An optional callback which is called when all the\n  tasks have been completed. The callback will receive an error as an argument\n  if any tasks pass an error to their callback. 
Results will always be passed\n\tbut if an error occurred, no other tasks will be performed, and the results\n\tobject will only contain partial results.\n  \n\n__Example__\n\n```js\nasync.auto({\n    get_data: function(callback){\n        // async code to get some data\n    },\n    make_folder: function(callback){\n        // async code to create a directory to store a file in\n        // this is run at the same time as getting the data\n    },\n    write_file: ['get_data', 'make_folder', function(callback){\n        // once there is some data and the directory exists,\n        // write the data to a file in the directory\n        callback(null, filename);\n    }],\n    email_link: ['write_file', function(callback, results){\n        // once the file is written let's email a link to it...\n        // results.write_file contains the filename returned by write_file.\n    }]\n});\n```\n\nThis is a fairly trivial example, but to do this using the basic parallel and\nseries functions would look like this:\n\n```js\nasync.parallel([\n    function(callback){\n        // async code to get some data\n    },\n    function(callback){\n        // async code to create a directory to store a file in\n        // this is run at the same time as getting the data\n    }\n],\nfunction(err, results){\n    async.series([\n        function(callback){\n            // once there is some data and the directory exists,\n            // write the data to a file in the directory\n        },\n        function(callback){\n            // once the file is written let's email a link to it...\n        }\n    ]);\n});\n```\n\nFor a complicated series of async tasks using the auto function makes adding\nnew tasks much easier and makes the code more readable.\n\n\n---------------------------------------\n\n<a name=\"iterator\" />\n### iterator(tasks)\n\nCreates an iterator function which calls the next function in the array,\nreturning a continuation to call the next one after that. 
It's also possible to\n'peek' the next iterator by doing iterator.next().\n\nThis function is used internally by the async module but can be useful when\nyou want to manually control the flow of functions in series.\n\n__Arguments__\n\n* tasks - An array of functions to run.\n\n__Example__\n\n```js\nvar iterator = async.iterator([\n    function(){ sys.p('one'); },\n    function(){ sys.p('two'); },\n    function(){ sys.p('three'); }\n]);\n\nnode> var iterator2 = iterator();\n'one'\nnode> var iterator3 = iterator2();\n'two'\nnode> iterator3();\n'three'\nnode> var nextfn = iterator2.next();\nnode> nextfn();\n'three'\n```\n\n---------------------------------------\n\n<a name=\"apply\" />\n### apply(function, arguments..)\n\nCreates a continuation function with some arguments already applied, a useful\nshorthand when combined with other control flow functions. Any arguments\npassed to the returned function are added to the arguments originally passed\nto apply.\n\n__Arguments__\n\n* function - The function you want to eventually apply all arguments to.\n* arguments... - Any number of arguments to automatically apply when the\n  continuation is called.\n\n__Example__\n\n```js\n// using apply\n\nasync.parallel([\n    async.apply(fs.writeFile, 'testfile1', 'test1'),\n    async.apply(fs.writeFile, 'testfile2', 'test2'),\n]);\n\n\n// the same process without using apply\n\nasync.parallel([\n    function(callback){\n        fs.writeFile('testfile1', 'test1', callback);\n    },\n    function(callback){\n        fs.writeFile('testfile2', 'test2', callback);\n    }\n]);\n```\n\nIt's possible to pass any number of additional arguments when calling the\ncontinuation:\n\n```js\nnode> var fn = async.apply(sys.puts, 'one');\nnode> fn('two', 'three');\none\ntwo\nthree\n```\n\n---------------------------------------\n\n<a name=\"nextTick\" />\n### nextTick(callback)\n\nCalls the callback on a later loop around the event loop. 
In node.js this just\ncalls process.nextTick, in the browser it falls back to setImmediate(callback)\nif available, otherwise setTimeout(callback, 0), which means other higher priority\nevents may precede the execution of the callback.\n\nThis is used internally for browser-compatibility purposes.\n\n__Arguments__\n\n* callback - The function to call on a later loop around the event loop.\n\n__Example__\n\n```js\nvar call_order = [];\nasync.nextTick(function(){\n    call_order.push('two');\n    // call_order now equals ['one','two']\n});\ncall_order.push('one')\n```\n\n<a name=\"times\" />\n### times(n, callback)\n\nCalls the callback n times and accumulates results in the same manner\nyou would use with async.map.\n\n__Arguments__\n\n* n - The number of times to run the function.\n* callback - The function to call n times.\n\n__Example__\n\n```js\n// Pretend this is some complicated async factory\nvar createUser = function(id, callback) {\n  callback(null, {\n    id: 'user' + id\n  })\n}\n// generate 5 users\nasync.times(5, function(n, next){\n    createUser(n, function(err, user) {\n      next(err, user)\n    })\n}, function(err, users) {\n  // we should now have 5 users\n});\n```\n\n<a name=\"timesSeries\" />\n### timesSeries(n, callback)\n\nThe same as times only the iterator is applied to each item in the array in\nseries. The next iterator is only called once the current one has completed\nprocessing. The results array will be in the same order as the original.\n\n\n## Utils\n\n<a name=\"memoize\" />\n### memoize(fn, [hasher])\n\nCaches the results of an async function. 
When creating a hash to store function\nresults against, the callback is omitted from the hash and an optional hash\nfunction can be used.\n\nThe cache of results is exposed as the `memo` property of the function returned\nby `memoize`.\n\n__Arguments__\n\n* fn - the function you want to proxy and cache results from.\n* hasher - an optional function for generating a custom hash for storing\n  results, it has all the arguments applied to it apart from the callback, and\n  must be synchronous.\n\n__Example__\n\n```js\nvar slow_fn = function (name, callback) {\n    // do something\n    callback(null, result);\n};\nvar fn = async.memoize(slow_fn);\n\n// fn can now be used as if it were slow_fn\nfn('some name', function () {\n    // callback\n});\n```\n\n<a name=\"unmemoize\" />\n### unmemoize(fn)\n\nUndoes a memoized function, reverting it to the original, unmemoized\nform. Comes in handy in tests.\n\n__Arguments__\n\n* fn - the memoized function\n\n<a name=\"log\" />\n### log(function, arguments)\n\nLogs the result of an async function to the console. Only works in node.js or\nin browsers that support console.log and console.error (such as FF and Chrome).\nIf multiple arguments are returned from the async function, console.log is\ncalled on each argument in order.\n\n__Arguments__\n\n* function - The function you want to eventually apply all arguments to.\n* arguments... - Any number of arguments to apply to the function.\n\n__Example__\n\n```js\nvar hello = function(name, callback){\n    setTimeout(function(){\n        callback(null, 'hello ' + name);\n    }, 1000);\n};\n```\n```js\nnode> async.log(hello, 'world');\n'hello world'\n```\n\n---------------------------------------\n\n<a name=\"dir\" />\n### dir(function, arguments)\n\nLogs the result of an async function to the console using console.dir to\ndisplay the properties of the resulting object. 
Only works in node.js or\nin browsers that support console.dir and console.error (such as FF and Chrome).\nIf multiple arguments are returned from the async function, console.dir is\ncalled on each argument in order.\n\n__Arguments__\n\n* function - The function you want to eventually apply all arguments to.\n* arguments... - Any number of arguments to apply to the function.\n\n__Example__\n\n```js\nvar hello = function(name, callback){\n    setTimeout(function(){\n        callback(null, {hello: name});\n    }, 1000);\n};\n```\n```js\nnode> async.dir(hello, 'world');\n{hello: 'world'}\n```\n\n---------------------------------------\n\n<a name=\"noConflict\" />\n### noConflict()\n\nChanges the value of async back to its original value, returning a reference to the\nasync object.\n",
-  "readmeFilename": "README.md",
-  "homepage": "https://github.com/caolan/async",
-  "_id": "async@0.2.9",
-  "dist": {
-    "shasum": "32ed8c2d04e85d6918e618b1b7857233841f6447"
-  },
-  "_from": "async@~0.2.9",
-  "_resolved": "https://registry.npmjs.org/async/-/async-0.2.9.tgz"
-}
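The deleted async README above spends most of its length on the series/parallel control-flow contract: run tasks, collect each result, and short-circuit to the final callback on the first error. As a rough, hypothetical sketch of that pattern (not the library's actual implementation; the name `series` here is only borrowed from the docs), the core sequential loop can be written in plain JavaScript:

```javascript
// Minimal sketch of the series() control flow described in the deleted
// README. NOT async's real implementation -- just the core idea: run
// tasks one at a time, collect each result, stop at the first error.
function series(tasks, callback) {
  var results = [];
  var i = 0;
  function next(err, result) {
    if (err) return callback(err, results);       // short-circuit on error
    if (arguments.length > 0) results.push(result); // skip the priming call
    if (i === tasks.length) return callback(null, results);
    tasks[i++](next);                             // run the next task
  }
  next(); // prime the loop
}

// Usage mirroring the README's example; these tasks call back
// synchronously, so the final callback fires immediately.
series([
  function (cb) { cb(null, 'one'); },
  function (cb) { cb(null, 'two'); }
], function (err, results) {
  console.log(results); // ['one', 'two']
});
```

The real library also accepts an object of named tasks and defers invocation safely; this sketch only shows the error-short-circuiting sequential loop.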
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,3 +0,0 @@
-*.un~
-/node_modules
-/test/tmp
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/License	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-Copyright (c) 2011 Debuggable Limited <felix@debuggable.com>
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/Makefile	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,7 +0,0 @@
-SHELL := /bin/bash
-
-test:
-	@./test/run.js
-
-.PHONY: test
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/Readme.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,132 +0,0 @@
-# combined-stream
-
-A stream that emits multiple other streams one after another.
-
-## Installation
-
-``` bash
-npm install combined-stream
-```
-
-## Usage
-
-Here is a simple example that shows how you can use combined-stream to combine
-two files into one:
-
-``` javascript
-var CombinedStream = require('combined-stream');
-var fs = require('fs');
-
-var combinedStream = CombinedStream.create();
-combinedStream.append(fs.createReadStream('file1.txt'));
-combinedStream.append(fs.createReadStream('file2.txt'));
-
-combinedStream.pipe(fs.createWriteStream('combined.txt'));
-```
-
-While the example above works great, it will pause all source streams until
-they are needed. If you don't want that to happen, you can set `pauseStreams`
-to `false`:
-
-``` javascript
-var CombinedStream = require('combined-stream');
-var fs = require('fs');
-
-var combinedStream = CombinedStream.create({pauseStreams: false});
-combinedStream.append(fs.createReadStream('file1.txt'));
-combinedStream.append(fs.createReadStream('file2.txt'));
-
-combinedStream.pipe(fs.createWriteStream('combined.txt'));
-```
-
-However, what if you don't have all the source streams yet, or you don't want
-to allocate the resources (file descriptors, memory, etc.) for them right away?
-Well, in that case you can simply provide a callback that supplies the stream
-by calling a `next()` function:
-
-``` javascript
-var CombinedStream = require('combined-stream');
-var fs = require('fs');
-
-var combinedStream = CombinedStream.create();
-combinedStream.append(function(next) {
-  next(fs.createReadStream('file1.txt'));
-});
-combinedStream.append(function(next) {
-  next(fs.createReadStream('file2.txt'));
-});
-
-combinedStream.pipe(fs.createWriteStream('combined.txt'));
-```
-
-## API
-
-### CombinedStream.create([options])
-
-Returns a new combined stream object. Available options are:
-
-* `maxDataSize`
-* `pauseStreams`
-
-The effect of those options is described below.
-
-### combinedStream.pauseStreams = true
-
-Whether to apply back pressure to the underlying streams. If set to `false`,
-the underlying streams will never be paused. If set to `true`, the
-underlying streams will be paused right after being appended, as well as when
-`delayedStream.pipe()` wants to throttle.
-
-### combinedStream.maxDataSize = 2 * 1024 * 1024
-
-The maximum amount of bytes (or characters) to buffer for all source streams.
-If this value is exceeded, `combinedStream` emits an `'error'` event.
-
-### combinedStream.dataSize = 0
-
-The amount of bytes (or characters) currently buffered by `combinedStream`.
-
-### combinedStream.append(stream)
-
-Appends the given `stream` to the combinedStream object. If `pauseStreams` is
-set to `true`, this stream will also be paused right away.
-
-`stream` can also be a function that takes one parameter called `next`. `next`
-is a function that must be invoked in order to provide the `next` stream, see
-example above.
-
-Regardless of how the `stream` is appended, combined-stream always attaches an
-`'error'` listener to it, so you don't have to do that manually.
-
-Special case: `stream` can also be a String or Buffer.
-
-### combinedStream.write(data)
-
-You should not call this; `combinedStream` takes care of piping the appended
-streams into itself for you.
-
-### combinedStream.resume()
-
-Causes `combinedStream` to start draining the streams it manages. The function is
-idempotent, and also emits a `'resume'` event each time, which usually goes to
-the stream that is currently being drained.
-
-### combinedStream.pause();
-
-If `combinedStream.pauseStreams` is set to `false`, this does nothing.
-Otherwise a `'pause'` event is emitted; it goes to the stream that is
-currently being drained, so you can use it to apply back pressure.
-
-### combinedStream.end();
-
-Sets `combinedStream.writable` to false, emits an `'end'` event, and removes
-all streams from the queue.
-
-### combinedStream.destroy();
-
-Same as `combinedStream.end()`, except it emits a `'close'` event instead of
-`'end'`.
-
-## License
-
-combined-stream is licensed under the MIT license.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/lib/combined_stream.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,185 +0,0 @@
-var util = require('util');
-var Stream = require('stream').Stream;
-var DelayedStream = require('delayed-stream');
-
-module.exports = CombinedStream;
-function CombinedStream() {
-  this.writable = false;
-  this.readable = true;
-  this.dataSize = 0;
-  this.maxDataSize = 2 * 1024 * 1024;
-  this.pauseStreams = true;
-
-  this._released = false;
-  this._streams = [];
-  this._currentStream = null;
-}
-util.inherits(CombinedStream, Stream);
-
-CombinedStream.create = function(options) {
-  var combinedStream = new this();
-
-  options = options || {};
-  for (var option in options) {
-    combinedStream[option] = options[option];
-  }
-
-  return combinedStream;
-};
-
-CombinedStream.isStreamLike = function(stream) {
-  return (typeof stream !== 'function')
-    && (typeof stream !== 'string')
-    && (typeof stream !== 'boolean')
-    && (typeof stream !== 'number')
-    && (!Buffer.isBuffer(stream));
-};
-
-CombinedStream.prototype.append = function(stream) {
-  var isStreamLike = CombinedStream.isStreamLike(stream);
-
-  if (isStreamLike) {
-    if (!(stream instanceof DelayedStream)) {
-      stream.on('data', this._checkDataSize.bind(this));
-
-      stream = DelayedStream.create(stream, {
-        maxDataSize: Infinity,
-        pauseStream: this.pauseStreams,
-      });
-    }
-
-    this._handleErrors(stream);
-
-    if (this.pauseStreams) {
-      stream.pause();
-    }
-  }
-
-  this._streams.push(stream);
-  return this;
-};
-
-CombinedStream.prototype.pipe = function(dest, options) {
-  Stream.prototype.pipe.call(this, dest, options);
-  this.resume();
-  return dest;
-};
-
-CombinedStream.prototype._getNext = function() {
-  this._currentStream = null;
-  var stream = this._streams.shift();
-
-  if (typeof stream == 'undefined') {
-    this.end();
-    return;
-  }
-
-  if (typeof stream !== 'function') {
-    this._pipeNext(stream);
-    return;
-  }
-
-  var getStream = stream;
-  getStream(function(stream) {
-    var isStreamLike = CombinedStream.isStreamLike(stream);
-    if (isStreamLike) {
-      stream.on('data', this._checkDataSize.bind(this));
-      this._handleErrors(stream);
-    }
-
-    this._pipeNext(stream);
-  }.bind(this));
-};
-
-CombinedStream.prototype._pipeNext = function(stream) {
-  this._currentStream = stream;
-
-  var isStreamLike = CombinedStream.isStreamLike(stream);
-  if (isStreamLike) {
-    stream.on('end', this._getNext.bind(this));
-    stream.pipe(this, {end: false});
-    return;
-  }
-
-  var value = stream;
-  this.write(value);
-  this._getNext();
-};
-
-CombinedStream.prototype._handleErrors = function(stream) {
-  var self = this;
-  stream.on('error', function(err) {
-    self._emitError(err);
-  });
-};
-
-CombinedStream.prototype.write = function(data) {
-  this.emit('data', data);
-};
-
-CombinedStream.prototype.pause = function() {
-  if (!this.pauseStreams) {
-    return;
-  }
-
-  this.emit('pause');
-};
-
-CombinedStream.prototype.resume = function() {
-  if (!this._released) {
-    this._released = true;
-    this.writable = true;
-    this._getNext();
-  }
-
-  this.emit('resume');
-};
-
-CombinedStream.prototype.end = function() {
-  this._reset();
-  this.emit('end');
-};
-
-CombinedStream.prototype.destroy = function() {
-  this._reset();
-  this.emit('close');
-};
-
-CombinedStream.prototype._reset = function() {
-  this.writable = false;
-  this._streams = [];
-  this._currentStream = null;
-};
-
-CombinedStream.prototype._checkDataSize = function() {
-  this._updateDataSize();
-  if (this.dataSize <= this.maxDataSize) {
-    return;
-  }
-
-  var message =
-    'CombinedStream#maxDataSize of ' + this.maxDataSize + ' bytes exceeded.';
-  this._emitError(new Error(message));
-};
-
-CombinedStream.prototype._updateDataSize = function() {
-  this.dataSize = 0;
-
-  var self = this;
-  this._streams.forEach(function(stream) {
-    if (!stream.dataSize) {
-      return;
-    }
-
-    self.dataSize += stream.dataSize;
-  });
-
-  if (this._currentStream && this._currentStream.dataSize) {
-    this.dataSize += this._currentStream.dataSize;
-  }
-};
-
-CombinedStream.prototype._emitError = function(err) {
-  this._reset();
-  this.emit('error', err);
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2 +0,0 @@
-*.un~
-/node_modules/*
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/License	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-Copyright (c) 2011 Debuggable Limited <felix@debuggable.com>
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/Makefile	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,7 +0,0 @@
-SHELL := /bin/bash
-
-test:
-	@./test/run.js
-
-.PHONY: test
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/Readme.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,154 +0,0 @@
-# delayed-stream
-
-Buffers events from a stream until you are ready to handle them.
-
-## Installation
-
-``` bash
-npm install delayed-stream
-```
-
-## Usage
-
-The following example shows how to write an HTTP echo server that delays its
-response by 1000 ms.
-
-``` javascript
-var DelayedStream = require('delayed-stream');
-var http = require('http');
-
-http.createServer(function(req, res) {
-  var delayed = DelayedStream.create(req);
-
-  setTimeout(function() {
-    res.writeHead(200);
-    delayed.pipe(res);
-  }, 1000);
-});
-```
-
-If you are not using `Stream#pipe`, you can also manually release the buffered
-events by calling `delayedStream.resume()`:
-
-``` javascript
-var delayed = DelayedStream.create(req);
-
-setTimeout(function() {
-  // Emit all buffered events and resume the underlying source
-  delayed.resume();
-}, 1000);
-```
-
-## Implementation
-
-In order to use this meta stream properly, here are a few things you should
-know about the implementation.
-
-### Event Buffering / Proxying
-
-All events of the `source` stream are hijacked by overwriting the `source.emit`
-method. Until node implements a catch-all event listener, this is the only way.
-
-However, delayed-stream still continues to emit all events it captures on the
-`source`, regardless of whether you have released the delayed stream yet or
-not.
-
-Upon creation, delayed-stream captures all `source` events and stores them in
-an internal event buffer. Once `delayedStream.release()` is called, all
-buffered events are emitted on the `delayedStream`, and the event buffer is
-cleared. After that, delayed-stream merely acts as a proxy for the underlying
-source.
-
-### Error handling
-
-Error events on `source` are buffered / proxied just like any other events.
-However, `delayedStream.create` attaches a no-op `'error'` listener to the
-`source`. This way you only have to handle errors on the `delayedStream`
-object, rather than in two places.
-
-### Buffer limits
-
-delayed-stream provides a `maxDataSize` property that can be used to limit
-the amount of data being buffered. In order to protect you from bad `source`
-streams that don't react to `source.pause()`, this feature is enabled by
-default.
-
-## API
-
-### DelayedStream.create(source, [options])
-
-Returns a new `delayedStream`. Available options are:
-
-* `pauseStream`
-* `maxDataSize`
-
-The description for those properties can be found below.
-
-### delayedStream.source
-
-The `source` stream managed by this object. This is useful if you are
-passing your `delayedStream` around, and you still want to access properties
-on the `source` object.
-
-### delayedStream.pauseStream = true
-
-Whether to pause the underlying `source` when calling
-`DelayedStream.create()`. Modifying this property afterwards has no effect.
-
-### delayedStream.maxDataSize = 1024 * 1024
-
-The amount of data to buffer before emitting an `error`.
-
-If the underlying source is emitting `Buffer` objects, the `maxDataSize`
-refers to bytes.
-
-If the underlying source is emitting JavaScript strings, the size refers to
-characters.
-
-If you know what you are doing, you can set this property to `Infinity` to
-disable this feature. You can also modify this property at runtime.
-
-### delayedStream.dataSize = 0
-
-The amount of data buffered so far.
-
-### delayedStream.readable
-
-An ECMA5 getter that returns the value of `source.readable`.
-
-### delayedStream.resume()
-
-If the `delayedStream` has not been released so far, `delayedStream.release()`
-is called.
-
-In either case, `source.resume()` is called.
-
-### delayedStream.pause()
-
-Calls `source.pause()`.
-
-### delayedStream.pipe(dest)
-
-Calls `delayedStream.resume()` and then proxies the arguments to `source.pipe`.
-
-### delayedStream.release()
-
-Emits and clears all events that have been buffered up so far. This does not
-resume the underlying source; use `delayedStream.resume()` for that.
-
-## License
-
-delayed-stream is licensed under the MIT license.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/lib/delayed_stream.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,99 +0,0 @@
-var Stream = require('stream').Stream;
-var util = require('util');
-
-module.exports = DelayedStream;
-function DelayedStream() {
-  this.source = null;
-  this.dataSize = 0;
-  this.maxDataSize = 1024 * 1024;
-  this.pauseStream = true;
-
-  this._maxDataSizeExceeded = false;
-  this._released = false;
-  this._bufferedEvents = [];
-}
-util.inherits(DelayedStream, Stream);
-
-DelayedStream.create = function(source, options) {
-  var delayedStream = new this();
-
-  options = options || {};
-  for (var option in options) {
-    delayedStream[option] = options[option];
-  }
-
-  delayedStream.source = source;
-
-  var realEmit = source.emit;
-  source.emit = function() {
-    delayedStream._handleEmit(arguments);
-    return realEmit.apply(source, arguments);
-  };
-
-  source.on('error', function() {});
-  if (delayedStream.pauseStream) {
-    source.pause();
-  }
-
-  return delayedStream;
-};
-
-DelayedStream.prototype.__defineGetter__('readable', function() {
-  return this.source.readable;
-});
-
-DelayedStream.prototype.resume = function() {
-  if (!this._released) {
-    this.release();
-  }
-
-  this.source.resume();
-};
-
-DelayedStream.prototype.pause = function() {
-  this.source.pause();
-};
-
-DelayedStream.prototype.release = function() {
-  this._released = true;
-
-  this._bufferedEvents.forEach(function(args) {
-    this.emit.apply(this, args);
-  }.bind(this));
-  this._bufferedEvents = [];
-};
-
-DelayedStream.prototype.pipe = function() {
-  var r = Stream.prototype.pipe.apply(this, arguments);
-  this.resume();
-  return r;
-};
-
-DelayedStream.prototype._handleEmit = function(args) {
-  if (this._released) {
-    this.emit.apply(this, args);
-    return;
-  }
-
-  if (args[0] === 'data') {
-    this.dataSize += args[1].length;
-    this._checkIfMaxDataSizeExceeded();
-  }
-
-  this._bufferedEvents.push(args);
-};
-
-DelayedStream.prototype._checkIfMaxDataSizeExceeded = function() {
-  if (this._maxDataSizeExceeded) {
-    return;
-  }
-
-  if (this.dataSize <= this.maxDataSize) {
-    return;
-  }
-
-  this._maxDataSizeExceeded = true;
-  var message =
-    'DelayedStream#maxDataSize of ' + this.maxDataSize + ' bytes exceeded.';
-  this.emit('error', new Error(message));
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-{
-  "author": {
-    "name": "Felix Geisendörfer",
-    "email": "felix@debuggable.com",
-    "url": "http://debuggable.com/"
-  },
-  "name": "delayed-stream",
-  "description": "Buffers events from a stream until you are ready to handle them.",
-  "version": "0.0.5",
-  "homepage": "https://github.com/felixge/node-delayed-stream",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/felixge/node-delayed-stream.git"
-  },
-  "main": "./lib/delayed_stream",
-  "engines": {
-    "node": ">=0.4.0"
-  },
-  "dependencies": {},
-  "devDependencies": {
-    "fake": "0.2.0",
-    "far": "0.0.1"
-  },
-  "readme": "# delayed-stream\n\nBuffers events from a stream until you are ready to handle them.\n\n## Installation\n\n``` bash\nnpm install delayed-stream\n```\n\n## Usage\n\nThe following example shows how to write a http echo server that delays its\nresponse by 1000 ms.\n\n``` javascript\nvar DelayedStream = require('delayed-stream');\nvar http = require('http');\n\nhttp.createServer(function(req, res) {\n  var delayed = DelayedStream.create(req);\n\n  setTimeout(function() {\n    res.writeHead(200);\n    delayed.pipe(res);\n  }, 1000);\n});\n```\n\nIf you are not using `Stream#pipe`, you can also manually release the buffered\nevents by calling `delayedStream.resume()`:\n\n``` javascript\nvar delayed = DelayedStream.create(req);\n\nsetTimeout(function() {\n  // Emit all buffered events and resume underlaying source\n  delayed.resume();\n}, 1000);\n```\n\n## Implementation\n\nIn order to use this meta stream properly, here are a few things you should\nknow about the implementation.\n\n### Event Buffering / Proxying\n\nAll events of the `source` stream are hijacked by overwriting the `source.emit`\nmethod. Until node implements a catch-all event listener, this is the only way.\n\nHowever, delayed-stream still continues to emit all events it captures on the\n`source`, regardless of whether you have released the delayed stream yet or\nnot.\n\nUpon creation, delayed-stream captures all `source` events and stores them in\nan internal event buffer. Once `delayedStream.release()` is called, all\nbuffered events are emitted on the `delayedStream`, and the event buffer is\ncleared. After that, delayed-stream merely acts as a proxy for the underlaying\nsource.\n\n### Error handling\n\nError events on `source` are buffered / proxied just like any other events.\nHowever, `delayedStream.create` attaches a no-op `'error'` listener to the\n`source`. 
This way you only have to handle errors on the `delayedStream`\nobject, rather than in two places.\n\n### Buffer limits\n\ndelayed-stream provides a `maxDataSize` property that can be used to limit\nthe amount of data being buffered. In order to protect you from bad `source`\nstreams that don't react to `source.pause()`, this feature is enabled by\ndefault.\n\n## API\n\n### DelayedStream.create(source, [options])\n\nReturns a new `delayedStream`. Available options are:\n\n* `pauseStream`\n* `maxDataSize`\n\nThe description for those properties can be found below.\n\n### delayedStream.source\n\nThe `source` stream managed by this object. This is useful if you are\npassing your `delayedStream` around, and you still want to access properties\non the `source` object.\n\n### delayedStream.pauseStream = true\n\nWhether to pause the underlaying `source` when calling\n`DelayedStream.create()`. Modifying this property afterwards has no effect.\n\n### delayedStream.maxDataSize = 1024 * 1024\n\nThe amount of data to buffer before emitting an `error`.\n\nIf the underlaying source is emitting `Buffer` objects, the `maxDataSize`\nrefers to bytes.\n\nIf the underlaying source is emitting JavaScript strings, the size refers to\ncharacters.\n\nIf you know what you are doing, you can set this property to `Infinity` to\ndisable this feature. 
You can also modify this property during runtime.\n\n### delayedStream.maxDataSize = 1024 * 1024\n\nThe amount of data to buffer before emitting an `error`.\n\nIf the underlaying source is emitting `Buffer` objects, the `maxDataSize`\nrefers to bytes.\n\nIf the underlaying source is emitting JavaScript strings, the size refers to\ncharacters.\n\nIf you know what you are doing, you can set this property to `Infinity` to\ndisable this feature.\n\n### delayedStream.dataSize = 0\n\nThe amount of data buffered so far.\n\n### delayedStream.readable\n\nAn ECMA5 getter that returns the value of `source.readable`.\n\n### delayedStream.resume()\n\nIf the `delayedStream` has not been released so far, `delayedStream.release()`\nis called.\n\nIn either case, `source.resume()` is called.\n\n### delayedStream.pause()\n\nCalls `source.pause()`.\n\n### delayedStream.pipe(dest)\n\nCalls `delayedStream.resume()` and then proxies the arguments to `source.pipe`.\n\n### delayedStream.release()\n\nEmits and clears all events that have been buffered up so far. This does not\nresume the underlaying source, use `delayedStream.resume()` instead.\n\n## License\n\ndelayed-stream is licensed under the MIT license.\n",
-  "readmeFilename": "Readme.md",
-  "bugs": {
-    "url": "https://github.com/felixge/node-delayed-stream/issues"
-  },
-  "_id": "delayed-stream@0.0.5",
-  "dist": {
-    "shasum": "052618e1471edc6f9affd1d89c0b0503c716f5a4"
-  },
-  "_from": "delayed-stream@0.0.5",
-  "_resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-0.0.5.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/combined-stream/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,36 +0,0 @@
-{
-  "author": {
-    "name": "Felix Geisendörfer",
-    "email": "felix@debuggable.com",
-    "url": "http://debuggable.com/"
-  },
-  "name": "combined-stream",
-  "description": "A stream that emits multiple other streams one after another.",
-  "version": "0.0.4",
-  "homepage": "https://github.com/felixge/node-combined-stream",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/felixge/node-combined-stream.git"
-  },
-  "main": "./lib/combined_stream",
-  "engines": {
-    "node": "*"
-  },
-  "dependencies": {
-    "delayed-stream": "0.0.5"
-  },
-  "devDependencies": {
-    "far": "0.0.1"
-  },
-  "readme": "# combined-stream\n\nA stream that emits multiple other streams one after another.\n\n## Installation\n\n``` bash\nnpm install combined-stream\n```\n\n## Usage\n\nHere is a simple example that shows how you can use combined-stream to combine\ntwo files into one:\n\n``` javascript\nvar CombinedStream = require('combined-stream');\nvar fs = require('fs');\n\nvar combinedStream = CombinedStream.create();\ncombinedStream.append(fs.createReadStream('file1.txt'));\ncombinedStream.append(fs.createReadStream('file2.txt'));\n\ncombinedStream.pipe(fs.createWriteStream('combined.txt'));\n```\n\nWhile the example above works great, it will pause all source streams until\nthey are needed. If you don't want that to happen, you can set `pauseStreams`\nto `false`:\n\n``` javascript\nvar CombinedStream = require('combined-stream');\nvar fs = require('fs');\n\nvar combinedStream = CombinedStream.create({pauseStreams: false});\ncombinedStream.append(fs.createReadStream('file1.txt'));\ncombinedStream.append(fs.createReadStream('file2.txt'));\n\ncombinedStream.pipe(fs.createWriteStream('combined.txt'));\n```\n\nHowever, what if you don't have all the source streams yet, or you don't want\nto allocate the resources (file descriptors, memory, etc.) for them right away?\nWell, in that case you can simply provide a callback that supplies the stream\nby calling a `next()` function:\n\n``` javascript\nvar CombinedStream = require('combined-stream');\nvar fs = require('fs');\n\nvar combinedStream = CombinedStream.create();\ncombinedStream.append(function(next) {\n  next(fs.createReadStream('file1.txt'));\n});\ncombinedStream.append(function(next) {\n  next(fs.createReadStream('file2.txt'));\n});\n\ncombinedStream.pipe(fs.createWriteStream('combined.txt'));\n```\n\n## API\n\n### CombinedStream.create([options])\n\nReturns a new combined stream object. 
Available options are:\n\n* `maxDataSize`\n* `pauseStreams`\n\nThe effect of those options is described below.\n\n### combinedStream.pauseStreams = true\n\nWhether to apply back pressure to the underlaying streams. If set to `false`,\nthe underlaying streams will never be paused. If set to `true`, the\nunderlaying streams will be paused right after being appended, as well as when\n`delayedStream.pipe()` wants to throttle.\n\n### combinedStream.maxDataSize = 2 * 1024 * 1024\n\nThe maximum amount of bytes (or characters) to buffer for all source streams.\nIf this value is exceeded, `combinedStream` emits an `'error'` event.\n\n### combinedStream.dataSize = 0\n\nThe amount of bytes (or characters) currently buffered by `combinedStream`.\n\n### combinedStream.append(stream)\n\nAppends the given `stream` to the combinedStream object. If `pauseStreams` is\nset to `true, this stream will also be paused right away.\n\n`streams` can also be a function that takes one parameter called `next`. `next`\nis a function that must be invoked in order to provide the `next` stream, see\nexample above.\n\nRegardless of how the `stream` is appended, combined-stream always attaches an\n`'error'` listener to it, so you don't have to do that manually.\n\nSpecial case: `stream` can also be a String or Buffer.\n\n### combinedStream.write(data)\n\nYou should not call this, `combinedStream` takes care of piping the appended\nstreams into itself for you.\n\n### combinedStream.resume()\n\nCauses `combinedStream` to start drain the streams it manages. 
The function is\nidempotent, and also emits a `'resume'` event each time which usually goes to\nthe stream that is currently being drained.\n\n### combinedStream.pause();\n\nIf `combinedStream.pauseStreams` is set to `false`, this does nothing.\nOtherwise a `'pause'` event is emitted, this goes to the stream that is\ncurrently being drained, so you can use it to apply back pressure.\n\n### combinedStream.end();\n\nSets `combinedStream.writable` to false, emits an `'end'` event, and removes\nall streams from the queue.\n\n### combinedStream.destroy();\n\nSame as `combinedStream.end()`, except it emits a `'close'` event instead of\n`'end'`.\n\n## License\n\ncombined-stream is licensed under the MIT license.\n",
-  "readmeFilename": "Readme.md",
-  "bugs": {
-    "url": "https://github.com/felixge/node-combined-stream/issues"
-  },
-  "_id": "combined-stream@0.0.4",
-  "dist": {
-    "shasum": "818920f2e68d41ae5fac5f154dfbed98b675d34f"
-  },
-  "_from": "combined-stream@~0.0.4",
-  "_resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-0.0.4.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/form-data/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,46 +0,0 @@
-{
-  "author": {
-    "name": "Felix Geisendörfer",
-    "email": "felix@debuggable.com",
-    "url": "http://debuggable.com/"
-  },
-  "name": "form-data",
-  "description": "A module to create readable \"multipart/form-data\" streams.  Can be used to submit forms and file uploads to other web applications.",
-  "version": "0.1.2",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/felixge/node-form-data.git"
-  },
-  "main": "./lib/form_data",
-  "scripts": {
-    "test": "node test/run.js"
-  },
-  "engines": {
-    "node": ">= 0.6"
-  },
-  "dependencies": {
-    "combined-stream": "~0.0.4",
-    "mime": "~1.2.11",
-    "async": "~0.2.9"
-  },
-  "licenses": [
-    {
-      "type": "MIT",
-      "url": "https://raw.github.com/felixge/node-form-data/master/License"
-    }
-  ],
-  "devDependencies": {
-    "fake": "~0.2.2",
-    "far": "~0.0.7",
-    "formidable": "~1.0.14",
-    "request": "~2.27.0"
-  },
-  "readme": "# Form-Data [![Build Status](https://travis-ci.org/felixge/node-form-data.png?branch=master)](https://travis-ci.org/felixge/node-form-data) [![Dependency Status](https://gemnasium.com/felixge/node-form-data.png)](https://gemnasium.com/felixge/node-form-data)\n\nA module to create readable ```\"multipart/form-data\"``` streams. Can be used to submit forms and file uploads to other web applications.\n\nThe API of this module is inspired by the [XMLHttpRequest-2 FormData Interface][xhr2-fd].\n\n[xhr2-fd]: http://dev.w3.org/2006/webapi/XMLHttpRequest-2/Overview.html#the-formdata-interface\n[streams2-thing]: http://nodejs.org/api/stream.html#stream_compatibility_with_older_node_versions\n\n## Install\n\n```\nnpm install form-data\n```\n\n## Usage\n\nIn this example we are constructing a form with 3 fields that contain a string,\na buffer and a file stream.\n\n``` javascript\nvar FormData = require('form-data');\nvar fs = require('fs');\n\nvar form = new FormData();\nform.append('my_field', 'my value');\nform.append('my_buffer', new Buffer(10));\nform.append('my_file', fs.createReadStream('/foo/bar.jpg'));\n```\n\nAlso you can use http-response stream:\n\n``` javascript\nvar FormData = require('form-data');\nvar http = require('http');\n\nvar form = new FormData();\n\nhttp.request('http://nodejs.org/images/logo.png', function(response) {\n  form.append('my_field', 'my value');\n  form.append('my_buffer', new Buffer(10));\n  form.append('my_logo', response);\n});\n```\n\nOr @mikeal's request stream:\n\n``` javascript\nvar FormData = require('form-data');\nvar request = require('request');\n\nvar form = new FormData();\n\nform.append('my_field', 'my value');\nform.append('my_buffer', new Buffer(10));\nform.append('my_logo', request('http://nodejs.org/images/logo.png'));\n```\n\nIn order to submit this form to a web application, call ```submit(url, [callback])``` method:\n\n``` javascript\nform.submit('http://example.org/', function(err, res) {\n  // res – 
response object (http.IncomingMessage)  //\n  res.resume(); // for node-0.10.x\n});\n\n```\n\nFor more advanced request manipulations ```submit()``` method returns ```http.ClientRequest``` object, or you can choose from one of the alternative submission methods.\n\n### Alternative submission methods\n\nYou can use node's http client interface:\n\n``` javascript\nvar http = require('http');\n\nvar request = http.request({\n  method: 'post',\n  host: 'example.org',\n  path: '/upload',\n  headers: form.getHeaders()\n});\n\nform.pipe(request);\n\nrequest.on('response', function(res) {\n  console.log(res.statusCode);\n});\n```\n\nOr if you would prefer the `'Content-Length'` header to be set for you:\n\n``` javascript\nform.submit('example.org/upload', function(err, res) {\n  console.log(res.statusCode);\n});\n```\n\nTo use custom headers and pre-known length in parts:\n\n``` javascript\nvar CRLF = '\\r\\n';\nvar form = new FormData();\n\nvar options = {\n  header: CRLF + '--' + form.getBoundary() + CRLF + 'X-Custom-Header: 123' + CRLF + CRLF,\n  knownLength: 1\n};\n\nform.append('my_buffer', buffer, options);\n\nform.submit('http://example.com/', function(err, res) {\n  if (err) throw err;\n  console.log('Done');\n});\n```\n\nForm-Data can recognize and fetch all the required information from common types of streams (```fs.readStream```, ```http.response``` and ```mikeal's request```), for some other types of streams you'd need to provide \"file\"-related information manually:\n\n``` javascript\nsomeModule.stream(function(err, stdout, stderr) {\n  if (err) throw err;\n\n  var form = new FormData();\n\n  form.append('file', stdout, {\n    filename: 'unicycle.jpg',\n    contentType: 'image/jpg',\n    knownLength: 19806\n  });\n\n  form.submit('http://example.com/', function(err, res) {\n    if (err) throw err;\n    console.log('Done');\n  });\n});\n```\n\nFor edge cases, like POST request to URL with query string or to pass HTTP auth credentials, object can be passed to 
`form.submit()` as first parameter:\n\n``` javascript\nform.submit({\n  host: 'example.com',\n  path: '/probably.php?extra=params',\n  auth: 'username:password'\n}, function(err, res) {\n  console.log(res.statusCode);\n});\n```\n\n## Notes\n\n- ```getLengthSync()``` method DOESN'T calculate length for streams, use ```knownLength``` options as workaround.\n- If it feels like FormData hangs after submit and you're on ```node-0.10```, please check [Compatibility with Older Node Versions][streams2-thing]\n\n## TODO\n\n- Add new streams (0.10) support and try really hard not to break it for 0.8.x.\n\n## License\n\nForm-Data is licensed under the MIT license.\n",
-  "readmeFilename": "Readme.md",
-  "bugs": {
-    "url": "https://github.com/felixge/node-form-data/issues"
-  },
-  "homepage": "https://github.com/felixge/node-form-data",
-  "_id": "form-data@0.1.2",
-  "_from": "form-data@~0.1.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-.idea
-*.iml
-npm-debug.log
-dump.rdb
-node_modules
-results.tap
-results.xml
-npm-shrinkwrap.json
-config.json
-.DS_Store
-*/.DS_Store
-*/*/.DS_Store
-._*
-*/._*
-*/*/._*
-coverage.*
-lib-cov
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/.travis.yml	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-language: node_js
-
-node_js:
-  - 0.10
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-Copyright (c) 2012-2013, Eran Hammer.
-All rights reserved.
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions are met:
-    * Redistributions of source code must retain the above copyright
-      notice, this list of conditions and the following disclaimer.
-    * Redistributions in binary form must reproduce the above copyright
-      notice, this list of conditions and the following disclaimer in the
-      documentation and/or other materials provided with the distribution.
-    * Neither the name of Eran Hammer nor the
-      names of its contributors may be used to endorse or promote products
-      derived from this software without specific prior written permission.
-
-THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
-WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
-DISCLAIMED. IN NO EVENT SHALL ERAN HAMMER BE LIABLE FOR ANY
-DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
-(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
-LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
-ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
-SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/Makefile	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,10 +0,0 @@
-test:
-	@node node_modules/lab/bin/lab
-test-cov: 
-	@node node_modules/lab/bin/lab -r threshold -t 100
-test-cov-html:
-	@node node_modules/lab/bin/lab -r html -o coverage.html
-complexity:
-	@node node_modules/complexity-report/src/cli.js -o complexity.md -f markdown lib
-
-.PHONY: test test-cov test-cov-html complexity
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,627 +0,0 @@
-![hawk Logo](https://raw.github.com/hueniverse/hawk/master/images/hawk.png)
-
-<img align="right" src="https://raw.github.com/hueniverse/hawk/master/images/logo.png" /> **Hawk** is an HTTP authentication scheme using a message authentication code (MAC) algorithm to provide partial
-HTTP request cryptographic verification. For more complex use cases such as access delegation, see [Oz](https://github.com/hueniverse/oz).
-
-Current version: **1.0**
-
-[![Build Status](https://secure.travis-ci.org/hueniverse/hawk.png)](http://travis-ci.org/hueniverse/hawk)
-
-# Table of Contents
-
-- [**Introduction**](#introduction)
-  - [Replay Protection](#replay-protection)
-  - [Usage Example](#usage-example)
-  - [Protocol Example](#protocol-example)
-    - [Payload Validation](#payload-validation)
-    - [Response Payload Validation](#response-payload-validation)
-  - [Browser Support and Considerations](#browser-support-and-considerations)
-<p></p>
-- [**Single URI Authorization**](#single-uri-authorization)
-  - [Usage Example](#bewit-usage-example)
-<p></p>
-- [**Security Considerations**](#security-considerations)
-  - [MAC Keys Transmission](#mac-keys-transmission)
-  - [Confidentiality of Requests](#confidentiality-of-requests)
-  - [Spoofing by Counterfeit Servers](#spoofing-by-counterfeit-servers)
-  - [Plaintext Storage of Credentials](#plaintext-storage-of-credentials)
-  - [Entropy of Keys](#entropy-of-keys)
-  - [Coverage Limitations](#coverage-limitations)
-  - [Future Time Manipulation](#future-time-manipulation)
-  - [Client Clock Poisoning](#client-clock-poisoning)
-  - [Bewit Limitations](#bewit-limitations)
-  - [Host Header Forgery](#host-header-forgery)
-<p></p>
-- [**Frequently Asked Questions**](#frequently-asked-questions)
-<p></p>
-- [**Acknowledgements**](#acknowledgements)
-
-# Introduction
-
-**Hawk** is an HTTP authentication scheme providing mechanisms for making authenticated HTTP requests with
-partial cryptographic verification of the request and response, covering the HTTP method, request URI, host,
-and optionally the request payload.
-
-Similar to the HTTP [Digest access authentication schemes](http://www.ietf.org/rfc/rfc2617.txt), **Hawk** uses a set of
-client credentials which include an identifier (e.g. username) and key (e.g. password). Likewise, just as with the Digest scheme,
-the key is never included in authenticated requests. Instead, it is used to calculate a request MAC value which is
-included in its place.
-
-However, **Hawk** has several differences from Digest. In particular, while both use a nonce to limit the possibility of
-replay attacks, in **Hawk** the client generates the nonce and uses it in combination with a timestamp, leading to less
-"chattiness" (interaction with the server).
-
-Also unlike Digest, this scheme is not intended to protect the key itself (the password in Digest) because
-the client and server must both have access to the key material in the clear.
-
-The primary design goals of this scheme are to:
-* simplify and improve HTTP authentication for services that are unwilling or unable to deploy TLS for all resources,
-* secure credentials against leakage (e.g., when the client uses some form of dynamic configuration to determine where
-  to send an authenticated request), and
-* avoid the exposure of credentials sent to a malicious server over an unauthenticated secure channel due to client
-  failure to validate the server's identity as part of its TLS handshake.
-
-In addition, **Hawk** supports a method for granting third-parties temporary access to individual resources using
-a query parameter called _bewit_ (in falconry, a leather strap used to attach a tracking device to the leg of a hawk).
-
-The **Hawk** scheme requires the establishment of a shared symmetric key between the client and the server,
-which is beyond the scope of this module. Typically, the shared credentials are established via an initial
-TLS-protected phase or derived from some other shared confidential information available to both the client
-and the server.
-
-
-## Replay Protection
-
-Without replay protection, an attacker can use a compromised (but otherwise valid and authenticated) request more 
-than once, gaining access to a protected resource. To mitigate this, clients include both a nonce and a timestamp when 
-making requests. This gives the server enough information to prevent replay attacks.
-
-The nonce is generated by the client, and is a string unique across all requests with the same timestamp and
-key identifier combination. 
-
-The timestamp enables the server to restrict the validity period of the credentials, where requests occurring afterwards
-are rejected. It also removes the need for the server to retain an unbounded number of nonce values for future checks.
-By default, **Hawk** uses a time window of 1 minute to allow for time skew between the client and server (which in
-practice translates to a maximum of 2 minutes as the skew can be positive or negative).
-
-Using a timestamp requires the client's clock to be in sync with the server's clock. **Hawk** requires both the client
-clock and the server clock to use NTP to ensure synchronization. However, given the limitations of some client types
-(e.g. browsers) to deploy NTP, the server provides the client with its current time (in seconds precision) in response
-to a bad timestamp.
-
-There is no expectation that the client will adjust its system clock to match the server (in fact, this would be a
-potential attack vector). Instead, the client uses the server's time only to calculate an offset used
-for communications with that particular server. The protocol rewards clients with synchronized clocks by reducing
-the number of round trips required to authenticate the first request.
-
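The offset arithmetic described above can be sketched as follows; `calculateOffsetSec` and `timestampSec` are hypothetical helpers for illustration only, not part of the Hawk API:

```javascript
// Sketch of the client-side offset calculation described above.
// The client never adjusts its own clock; it only remembers a
// per-server offset derived from the server-supplied time.
function calculateOffsetSec(serverTimeSec, localTimeMsec) {
    return serverTimeSec - Math.floor(localTimeMsec / 1000);
}

// The timestamp sent in subsequent requests to that server.
function timestampSec(offsetSec, localTimeMsec) {
    return Math.floor(localTimeMsec / 1000) + offsetSec;
}

// Example: local clock is 90 seconds behind the server.
var offset = calculateOffsetSec(1353832234, 1353832144000);
var ts = timestampSec(offset, 1353832144000);
```

The offset is stored per server; requests to other servers are unaffected, which is what makes the server-supplied time safe to use.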
-
-## Usage Example
-
-Server code:
-
-```javascript
-var Http = require('http');
-var Hawk = require('hawk');
-
-
-// Credentials lookup function
-
-var credentialsFunc = function (id, callback) {
-
-    var credentials = {
-        key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',
-        algorithm: 'sha256',
-        user: 'Steve'
-    };
-
-    return callback(null, credentials);
-};
-
-// Create HTTP server
-
-var handler = function (req, res) {
-
-    // Authenticate incoming request
-
-    Hawk.server.authenticate(req, credentialsFunc, {}, function (err, credentials, artifacts) {
-
-        // Prepare response
-
-        var payload = (!err ? 'Hello ' + credentials.user + ' ' + artifacts.ext : 'Shoosh!');
-        var headers = { 'Content-Type': 'text/plain' };
-
-        // Generate Server-Authorization response header
-
-        var header = Hawk.server.header(credentials, artifacts, { payload: payload, contentType: headers['Content-Type'] });
-        headers['Server-Authorization'] = header;
-
-        // Send the response back
-
-        res.writeHead(!err ? 200 : 401, headers);
-        res.end(payload);
-    });
-};
-
-// Start server
-
-Http.createServer(handler).listen(8000, 'example.com');
-```
-
-Client code:
-
-```javascript
-var Request = require('request');
-var Hawk = require('hawk');
-
-
-// Client credentials
-
-var credentials = {
-    id: 'dh37fgj492je',
-    key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',
-    algorithm: 'sha256'
-}
-
-// Request options
-
-var requestOptions = {
-    uri: 'http://example.com:8000/resource/1?b=1&a=2',
-    method: 'GET',
-    headers: {}
-};
-
-// Generate Authorization request header
-
-var header = Hawk.client.header('http://example.com:8000/resource/1?b=1&a=2', 'GET', { credentials: credentials, ext: 'some-app-data' });
-requestOptions.headers.Authorization = header.field;
-
-// Send authenticated request
-
-Request(requestOptions, function (error, response, body) {
-
-    // Authenticate the server's response
-
-    var isValid = Hawk.client.authenticate(response, credentials, header.artifacts, { payload: body });
-
-    // Output results
-
-    console.log(response.statusCode + ': ' + body + (isValid ? ' (valid)' : ' (invalid)'));
-});
-```
-
-**Hawk** utilizes the [**SNTP**](https://github.com/hueniverse/sntp) module for time sync management. By default, the local
-machine time is used. To automatically retrieve and synchronize the clock within the application, use the SNTP `start()` method.
-
-```javascript
-Hawk.sntp.start();
-```
-
-
-## Protocol Example
-
-The client attempts to access a protected resource without authentication, sending the following HTTP request to
-the resource server:
-
-```
-GET /resource/1?b=1&a=2 HTTP/1.1
-Host: example.com:8000
-```
-
-The resource server returns an authentication challenge.
-
-```
-HTTP/1.1 401 Unauthorized
-WWW-Authenticate: Hawk
-```
-
-The client has previously obtained a set of **Hawk** credentials for accessing resources on the "http://example.com/"
-server. The **Hawk** credentials issued to the client include the following attributes:
-
-* Key identifier: dh37fgj492je
-* Key: werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn
-* Algorithm: sha256
-
-The client generates the authentication header by calculating a timestamp (e.g. the number of seconds since January 1,
-1970 00:00:00 GMT), generating a nonce, and constructing the normalized request string (each value followed by a newline
-character):
-
-```
-hawk.1.header
-1353832234
-j4h3g2
-GET
-/resource/1?b=1&a=2
-example.com
-8000
-
-some-app-ext-data
-
-```
-
-The request MAC is calculated using HMAC with the specified hash algorithm "sha256" and the key over the normalized request string.
-The result is base64-encoded to produce the request MAC:
-
-```
-6R4rV5iE+NPoym+WwjeHzjAGXUtLNIxmo1vpMofpLAE=
-```
-
-The client includes the **Hawk** key identifier, timestamp, nonce, application specific data, and request MAC with the request using
-the HTTP `Authorization` request header field:
-
-```
-GET /resource/1?b=1&a=2 HTTP/1.1
-Host: example.com:8000
-Authorization: Hawk id="dh37fgj492je", ts="1353832234", nonce="j4h3g2", ext="some-app-ext-data", mac="6R4rV5iE+NPoym+WwjeHzjAGXUtLNIxmo1vpMofpLAE="
-```
-
-The server validates the request by calculating the request MAC again based on the request received and verifies the validity
-and scope of the **Hawk** credentials. If valid, the server responds with the requested resource.
-
-
-### Payload Validation
-
-**Hawk** provides optional payload validation. When generating the authentication header, the client calculates a payload hash
-using the specified hash algorithm. The hash is calculated over the concatenated value of (each followed by a newline character):
-* `hawk.1.payload`
-* the content-type in lowercase, without any parameters (e.g. `application/json`)
-* the request payload prior to any content encoding (the exact representation requirements should be specified by the server for payloads other than simple single-part ascii to ensure interoperability)
-
-For example:
-
-* Payload: `Thank you for flying Hawk`
-* Content Type: `text/plain`
-* Hash (sha256): `Yi9LfIIFRtBEPt74PVmbTF/xVAwPn7ub15ePICfgnuY=`
-
-Results in the following input to the payload hash function (newline terminated values):
-
-```
-hawk.1.payload
-text/plain
-Thank you for flying Hawk
-
-```
-
-Which produces the following hash value:
-
-```
-Yi9LfIIFRtBEPt74PVmbTF/xVAwPn7ub15ePICfgnuY=
-```
-
-The client constructs the normalized request string (newline terminated values):
-
-```
-hawk.1.header
-1353832234
-j4h3g2
-POST
-/resource/1?a=1&b=2
-example.com
-8000
-Yi9LfIIFRtBEPt74PVmbTF/xVAwPn7ub15ePICfgnuY=
-some-app-ext-data
-
-```
-
-Then calculates the request MAC and includes the **Hawk** key identifier, timestamp, nonce, payload hash, application specific data,
-and request MAC, with the request using the HTTP `Authorization` request header field:
-
-```
-POST /resource/1?a=1&b=2 HTTP/1.1
-Host: example.com:8000
-Authorization: Hawk id="dh37fgj492je", ts="1353832234", nonce="j4h3g2", hash="Yi9LfIIFRtBEPt74PVmbTF/xVAwPn7ub15ePICfgnuY=", ext="some-app-ext-data", mac="aSe1DERmZuRl3pI36/9BdZmnErTw3sNzOOAUlfeKjVw="
-```
-
-It is up to the server if and when it validates the payload for any given request, based solely on its security policy
-and the nature of the data included.
-
-If the payload is available at the time of authentication, the server uses the hash value provided by the client to construct
-the normalized string and validates the MAC. If the MAC is valid, the server calculates the payload hash and compares the value
-with the provided payload hash in the header. In many cases, checking the MAC first is faster than calculating the payload hash.
-
-However, if the payload is not available at authentication time (e.g. too large to fit in memory, streamed elsewhere, or processed
-at a different stage in the application), the server may choose to defer payload validation for later by retaining the hash value
-provided by the client after validating the MAC.
-
-It is important to note that MAC validation does not mean the hash value provided by the client is valid, only that the value
-included in the header was not modified. Without calculating the payload hash on the server and comparing it to the value provided
-by the client, the payload may be modified by an attacker.
-
-
-## Response Payload Validation
-
-**Hawk** provides partial response payload validation. The server includes the `Server-Authorization` response header which enables the
-client to authenticate the response and ensure it is talking to the right server. **Hawk** defines the HTTP `Server-Authorization` header
-as a response header using the exact same syntax as the `Authorization` request header field.
-
-The header is constructed using the same process as the client's request header. The server uses the same credentials and other
-artifacts provided by the client to construct the normalized request string. The `ext` and `hash` values are replaced with
-new values based on the server response. The rest are identical to those used by the client.
-
-The result MAC digest is included with the optional `hash` and `ext` values:
-
-```
-Server-Authorization: Hawk mac="XIJRsMl/4oL+nn+vKoeVZPdCHXB4yJkNnBbTbHFZUYE=", hash="f9cDF/TDm7TkYRLnGwRMfeDzT6LixQVLvrIKhh0vgmM=", ext="response-specific"
-```
-
-
-## Browser Support and Considerations
-
-A browser script is provided for inclusion via a `<script>` tag in [lib/browser.js](/lib/browser.js).
-
-**Hawk** relies on the _Server-Authorization_ and _WWW-Authenticate_ headers in its response to communicate with the client.
-Therefore, in case of CORS requests, it is important to consider sending _Access-Control-Expose-Headers_ with the value
-_"WWW-Authenticate, Server-Authorization"_ on each response from your server. As explained in the
-[specifications](http://www.w3.org/TR/cors/#access-control-expose-headers-response-header), it will indicate that these headers
-can safely be accessed by the client (using getResponseHeader() on the XmlHttpRequest object). Otherwise you will be met with a
-["simple response header"](http://www.w3.org/TR/cors/#simple-response-header) which excludes these fields and would prevent the
-Hawk client from authenticating the requests. You can read more about the why and how in this
-[article](http://www.html5rocks.com/en/tutorials/cors/#toc-adding-cors-support-to-the-server).
-
-
-# Single URI Authorization
-
-There are cases in which limited and short-term access to a protected resource is granted to a third party which does not
-have access to the shared credentials. For example, displaying a protected image on a web page accessed by anyone. **Hawk**
-provides limited support for such URIs in the form of a _bewit_ - a URI query parameter appended to the request URI which contains
-the necessary credentials to authenticate the request.
-
-Because of the significant security risks involved in issuing such access, bewit usage is purposely limited only to GET requests
-and for a finite period of time. Both the client and server can issue bewit credentials; however, the server should not use the same
-credentials as the client, in order to maintain clear traceability as to who issued which credentials.
-
-In order to simplify implementation, bewit credentials do not support single-use policy and can be replayed multiple times within
-the granted access timeframe. 
-
-
-## Bewit Usage Example
-
-Server code:
-
-```javascript
-var Http = require('http');
-var Hawk = require('hawk');
-
-
-// Credentials lookup function
-
-var credentialsFunc = function (id, callback) {
-
-    var credentials = {
-        key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',
-        algorithm: 'sha256'
-    };
-
-    return callback(null, credentials);
-};
-
-// Create HTTP server
-
-var handler = function (req, res) {
-
-    Hawk.uri.authenticate(req, credentialsFunc, {}, function (err, credentials, attributes) {
-
-        res.writeHead(!err ? 200 : 401, { 'Content-Type': 'text/plain' });
-        res.end(!err ? 'Access granted' : 'Shoosh!');
-    });
-};
-
-Http.createServer(handler).listen(8000, 'example.com');
-```
-
-Bewit code generation:
-
-```javascript
-var Request = require('request');
-var Hawk = require('hawk');
-
-
-// Client credentials
-
-var credentials = {
-    id: 'dh37fgj492je',
-    key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',
-    algorithm: 'sha256'
-}
-
-// Generate bewit
-
-var duration = 60 * 5;      // 5 Minutes
-var bewit = Hawk.uri.getBewit('http://example.com:8000/resource/1?b=1&a=2', { credentials: credentials, ttlSec: duration, ext: 'some-app-data' });
-var uri = 'http://example.com:8000/resource/1?b=1&a=2' + '&bewit=' + bewit;
-```
-
-
-# Security Considerations
-
-The greatest sources of security risks are usually found not in **Hawk** but in the policies and procedures surrounding its use.
-Implementers are strongly encouraged to assess how this module addresses their security requirements. This section includes
-an incomplete list of security considerations that must be reviewed and understood before deploying **Hawk** on the server.
-Many of the protections provided in **Hawk** depend on whether and how they are used.
-
-### MAC Keys Transmission
-
-**Hawk** does not provide any mechanism for obtaining or transmitting the set of shared credentials required. Any mechanism used
-to obtain **Hawk** credentials must ensure that these transmissions are protected using transport-layer mechanisms such as TLS.
-
-### Confidentiality of Requests
-
-While **Hawk** provides a mechanism for verifying the integrity of HTTP requests, it provides no guarantee of request
-confidentiality. Unless other precautions are taken, eavesdroppers will have full access to the request content. Servers should
-carefully consider the types of data likely to be sent as part of such requests, and employ transport-layer security mechanisms
-to protect sensitive resources.
-
-### Spoofing by Counterfeit Servers
-
-**Hawk** provides limited verification of server authenticity. When responding to a request, the server
-may choose to include a `Server-Authorization` response header which the client can use to verify the response. However, it is up to
-the server to determine when such a measure is included, and up to the client to enforce that policy.
-
-A hostile party could take advantage of this by intercepting the client's requests and returning misleading or otherwise
-incorrect responses. Service providers should consider such attacks when developing services using this protocol, and should
-require transport-layer security for any requests where the authenticity of the resource server or of server responses is an issue.
-
-### Plaintext Storage of Credentials
-
-The **Hawk** key functions the same way passwords do in traditional authentication systems. In order to compute the request MAC,
-the server must have access to the key in plaintext form. This is in contrast, for example, to modern operating systems, which
-store only a one-way hash of user credentials.
-
-If an attacker were to gain access to these keys - or worse, to the server's database of all such keys - he or she would be able
-to perform any action on behalf of any resource owner. Accordingly, it is critical that servers protect these keys from unauthorized
-access.
-
-### Entropy of Keys
-
-Unless a transport-layer security protocol is used, eavesdroppers will have full access to authenticated requests and request
-MAC values, and will thus be able to mount offline brute-force attacks to recover the key used. Servers should be careful to
-assign keys which are long enough, and random enough, to resist such attacks for at least the length of time that the **Hawk**
-credentials are valid.
-
-For example, if the credentials are valid for two weeks, servers should ensure that it is not possible to mount a brute force
-attack that recovers the key in less than two weeks. Of course, servers are urged to err on the side of caution, and use the
-longest key reasonable.
-
-It is equally important that the pseudo-random number generator (PRNG) used to generate these keys be of sufficiently high
-quality. Many PRNG implementations generate number sequences that may appear to be random, but which nevertheless exhibit
-patterns or other weaknesses which make cryptanalysis or brute force attacks easier. Implementers should be careful to use
-cryptographically secure PRNGs to avoid these problems.
-
-### Coverage Limitations
-
-The request MAC only covers the HTTP `Host` header and optionally the `Content-Type` header. It does not cover any other headers
-which can often affect how the request body is interpreted by the server. If the server behavior is influenced by the presence
-or value of such headers, an attacker can manipulate the request headers without being detected. Implementers should use the
-`ext` feature to pass application-specific information via the `Authorization` header which is protected by the request MAC.
-
-The response authentication, when performed, only covers the response payload, content-type, and the request information 
-provided by the client in its request (method, resource, timestamp, nonce, etc.). It does not cover the HTTP status code or
-any other response header field (e.g. Location) which can affect the client's behaviour.
-
-### Future Time Manipulation
-
-The protocol relies on a clock sync between the client and server. To accomplish this, the server informs the client of its
-current time when an invalid timestamp is received.
-
-If an attacker is able to manipulate this information and cause the client to use an incorrect time, it would be able to cause
-the client to generate authenticated requests using time in the future. Such requests will fail when sent by the client, and will
-not likely leave a trace on the server (given the common implementation of nonce, if at all enforced). The attacker will then
-be able to replay the request at the correct time without detection.
-
-The client must only use the time information provided by the server if:
-* it was delivered over a TLS connection and the server identity has been verified, or
-* the `tsm` MAC digest calculated using the same client credentials over the timestamp has been verified.
-
-### Client Clock Poisoning
-
-When receiving a request with a bad timestamp, the server provides the client with its current time. The client must never use
-the time received from the server to adjust its own clock, and must only use it to calculate an offset for communicating with
-that particular server.
-
-### Bewit Limitations
-
-Special care must be taken when issuing bewit credentials to third parties. Bewit credentials are valid until expiration and cannot
-be revoked or limited without using other means. Whatever resource they grant access to will be completely exposed to anyone with
-access to the bewit credentials which act as bearer credentials for that particular resource. While bewit usage is limited to GET
-requests only and therefore cannot be used to perform transactions or change server state, it can still be used to expose private
-and sensitive information.
-
-### Host Header Forgery
-
-Hawk validates the incoming request MAC against the incoming HTTP Host header. However, unless the optional `host` and `port`
-options are used with `server.authenticate()`, a malicious client can mint new host names pointing to the server's IP address and
-use them to craft an attack, sending a valid request intended for a hostname other than the one used by the server. Server
-implementers must manually verify that the received Host header matches their expectation (or use the options mentioned above).
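A minimal sketch of such a manual check; `hostHeaderMatches` and its parameters are hypothetical application-level code, not Hawk options:

```javascript
// Verify the incoming Host header against the host and port the
// application expects to be serving, before trusting the request MAC.
function hostHeaderMatches(req, expectedHost, expectedPort) {
    var host = req.headers.host || '';
    var parts = host.split(':');
    var name = parts[0].toLowerCase();
    var port = parts[1] ? parseInt(parts[1], 10) : 80;
    return name === expectedHost && port === expectedPort;
}

// Example with a fake request object.
var fakeReq = { headers: { host: 'example.com:8000' } };
hostHeaderMatches(fakeReq, 'example.com', 8000);  // true
```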
-
-# Frequently Asked Questions
-
-### Where is the protocol specification?
-
-If you are looking for some prose explaining how all this works, **this is it**. **Hawk** is being developed as an open source
-project instead of a standard. In other words, the [code](/hueniverse/hawk/tree/master/lib) is the specification. Not sure about
-something? Open an issue!
-
-### Is it done?
-
-As of version 0.10.0, **Hawk** is feature-complete. However, until this module reaches version 1.0.0 it is considered experimental
-and is likely to change. This also means your feedback and contribution are very welcome. Feel free to open issues with questions
-and suggestions.
-
-### Where can I find **Hawk** implementations in other languages?
-
-**Hawk**'s only reference implementation is provided in JavaScript as a node.js module. However, others are actively porting it to other
-platforms. There are already [PHP](https://github.com/alexbilbie/PHP-Hawk),
-[.NET](https://github.com/pcibraro/hawknet), and [Java](https://github.com/wealdtech/hawk) libraries available. The full list
-is maintained [here](https://github.com/hueniverse/hawk/issues?labels=port). Please add an issue if you are working on another
-port. A cross-platform test-suite is in the works.
-
-### Why isn't the algorithm part of the challenge or dynamically negotiated?
-
-The algorithm used is closely related to the key issued as different algorithms require different key sizes (and other
-requirements). While some keys can be used for multiple algorithms, the protocol is designed to closely bind the key and algorithm
-together as part of the issued credentials.
-
-### Why is Host and Content-Type the only headers covered by the request MAC?
-
-It is really hard to include other headers. Headers can be changed by proxies and other intermediaries and there is no
-well-established way to normalize them. Many platforms change the case of header field names and values. The only
-straightforward solution is to include the headers in some blob (say, base64 encoded JSON) and include that with the request,
-an approach taken by JWT and other such formats. However, that design violates the HTTP header boundaries, repeats information,
-and introduces other security issues because firewalls will not be aware of these "hidden" headers. In addition, any information
-repeated must be compared to the duplicated information in the header and therefore only moves the problem elsewhere.
-
-### Why not just use HTTP Digest?
-
-Digest requires pre-negotiation to establish a nonce. This means you can't just make a request - you must first send
-a protocol handshake to the server. This pattern has become unacceptable for most web services, especially mobile
-where extra round trips are costly.
-
-### Why bother with all this nonce and timestamp business?
-
-**Hawk** is an attempt to find a reasonable, practical compromise between security and usability. OAuth 1.0 got timestamp
-and nonces halfway right but failed when it came to scalability and consistent developer experience. **Hawk** addresses
-this by requiring the client to sync its clock, and provides it with the tools to accomplish that.
-
-In general, replay protection is a matter of application-specific threat model. It is less of an issue on a TLS-protected
-system where the clients are implemented using best practices and are under the control of the server. Instead of dropping
-replay protection, **Hawk** offers a required time window and an optional nonce verification. Together, these provide developers
-with the ability to decide how to enforce their security policy without impacting the client's implementation.
-
-### What are `app` and `dlg` in the authorization header and normalized mac string?
-
-The original motivation for **Hawk** was to replace the OAuth 1.0 use cases. This included both a simple client-server mode which
-this module is specifically designed for, and a delegated access mode which is being developed separately in
-[Oz](https://github.com/hueniverse/oz). In addition to the **Hawk** use cases, Oz requires another attribute: the application id `app`.
-This provides binding between the credentials and the application in a way that prevents an attacker from tricking an application
-to use credentials issued to someone else. It also has an optional 'delegated-by' attribute `dlg` which is the application id of the
-application the credentials were directly issued to. The goal of these two additions is to allow Oz to utilize **Hawk** directly,
-but with the additional security of delegated credentials.
-
-### What is the purpose of the static strings used in each normalized MAC input?
-
-When calculating a hash or MAC, a static prefix (tag) is added. The prefix is used to prevent MAC values from being
-used or reused for a purpose other than what they were created for (i.e. prevents switching MAC values between request,
-response, and bewit use cases). It also protects against exploits created after a potential change in how the protocol
-creates the normalized string. For example, if a future version would switch the order of nonce and timestamp, it
-can create an exploit opportunity for cases where the nonce is similar in format to a timestamp.
-
-### Does **Hawk** have anything to do with OAuth?
-
-Short answer: no.
-
-**Hawk** was originally proposed as the OAuth MAC Token specification. However, the OAuth working group in its consistent
-incompetence failed to produce a final, usable solution to address one of the most popular use cases of OAuth 1.0 - using it
-to authenticate simple client-server transactions (i.e. two-legged). As you can guess, the OAuth working group is still hard
-at work to produce more garbage.
-
-**Hawk** provides a simple HTTP authentication scheme for making client-server requests. It does not address the OAuth use case
-of delegating access to a third party. If you are looking for an OAuth alternative, check out [Oz](https://github.com/hueniverse/oz).
-
-
-# Acknowledgements
-
-**Hawk** is a derivative work of the [HTTP MAC Authentication Scheme](http://tools.ietf.org/html/draft-hammer-oauth-v2-mac-token-05) proposal
-co-authored by Ben Adida, Adam Barth, and Eran Hammer, which in turn was based on the OAuth 1.0 community specification.
-
-Special thanks to Ben Laurie for his always insightful feedback and advice.
-
-The **Hawk** logo was created by [Chris Carrasco](http://chriscarrasco.com).
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/example/usage.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,78 +0,0 @@
-// Load modules
-
-var Http = require('http');
-var Request = require('request');
-var Hawk = require('../lib');
-
-
-// Declare internals
-
-var internals = {
-    credentials: {
-        dh37fgj492je: {
-            id: 'dh37fgj492je',                                             // Required by Hawk.client.header 
-            key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',
-            algorithm: 'sha256',
-            user: 'Steve'
-        }
-    }
-};
-
-
-// Credentials lookup function
-
-var credentialsFunc = function (id, callback) {
-
-    return callback(null, internals.credentials[id]);
-};
-
-
-// Create HTTP server
-
-var handler = function (req, res) {
-
-    Hawk.server.authenticate(req, credentialsFunc, {}, function (err, credentials, artifacts) {
-
-        var payload = (!err ? 'Hello ' + credentials.user + ' ' + artifacts.ext : 'Shoosh!');
-        var headers = {
-            'Content-Type': 'text/plain',
-            'Server-Authorization': Hawk.server.header(credentials, artifacts, { payload: payload, contentType: 'text/plain' })
-        };
-
-        res.writeHead(!err ? 200 : 401, headers);
-        res.end(payload);
-    });
-};
-
-Http.createServer(handler).listen(8000, '127.0.0.1');
-
-
-// Send unauthenticated request
-
-Request('http://127.0.0.1:8000/resource/1?b=1&a=2', function (error, response, body) {
-
-    console.log(response.statusCode + ': ' + body);
-});
-
-
-// Send authenticated request
-
-credentialsFunc('dh37fgj492je', function (err, credentials) {
-
-    var header = Hawk.client.header('http://127.0.0.1:8000/resource/1?b=1&a=2', 'GET', { credentials: credentials, ext: 'and welcome!' });
-    var options = {
-        uri: 'http://127.0.0.1:8000/resource/1?b=1&a=2',
-        method: 'GET',
-        headers: {
-            authorization: header.field
-        }
-    };
-
-    Request(options, function (error, response, body) {
-
-        var isValid = Hawk.client.authenticate(response, credentials, header.artifacts, { payload: body });
-        console.log(response.statusCode + ': ' + body + (isValid ? ' (valid)' : ' (invalid)'));
-        process.exit(0);
-    });
-});
-
Binary file node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/images/hawk.png has changed
Binary file node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/images/logo.png has changed
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require('./lib');
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/browser.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,485 +0,0 @@
-/*
-    HTTP Hawk Authentication Scheme
-    Copyright (c) 2012-2013, Eran Hammer <eran@hueniverse.com>
-    MIT Licensed
-*/
-
-
-// Declare namespace
-
-var hawk = {};
-
-
-// Export if used as a module
-
-if (typeof module !== "undefined" && module.exports) {
-    module.exports = hawk;
-}
-
-hawk.client = {
-
-    // Generate an Authorization header for a given request
-
-    /*
-        uri: 'http://example.com/resource?a=b'
-        method: HTTP verb (e.g. 'GET', 'POST')
-        options: {
-    
-            // Required
-    
-            credentials: {
-                id: 'dh37fgj492je',
-                key: 'aoijedoaijsdlaksjdl',
-                algorithm: 'sha256'                                 // 'sha1', 'sha256'
-            },
-    
-            // Optional
-    
-            ext: 'application-specific',                        // Application specific data sent via the ext attribute
-            timestamp: Date.now() / 1000,                       // A pre-calculated timestamp in seconds
-            nonce: '2334f34f',                                  // A pre-generated nonce
-            localtimeOffsetMsec: 400,                           // Time offset to sync with server time (ignored if timestamp provided)
-            payload: '{"some":"payload"}',                      // UTF-8 encoded string for body hash generation (ignored if hash provided)
-            contentType: 'application/json',                    // Payload content-type (ignored if hash provided)
-            hash: 'U4MKKSmiVxk37JCCrAVIjV=',                    // Pre-calculated payload hash
-            app: '24s23423f34dx',                               // Oz application id
-            dlg: '234sz34tww3sd'                                // Oz delegated-by application id
-        }
-    */
-
-    header: function (uri, method, options) {
-
-        var result = {
-            field: '',
-            artifacts: {}
-        };
-
-        // Validate inputs
-
-        if (!uri || (typeof uri !== 'string' && typeof uri !== 'object') ||
-            !method || typeof method !== 'string' ||
-            !options || typeof options !== 'object') {
-
-            return result;
-        }
-
-        // Application time
-
-        var timestamp = options.timestamp || Math.floor((hawk.utils.now() + (options.localtimeOffsetMsec || 0)) / 1000);
-
-        // Validate credentials
-
-        var credentials = options.credentials;
-        if (!credentials ||
-            !credentials.id ||
-            !credentials.key ||
-            !credentials.algorithm) {
-
-            // Invalid credential object
-            return result;
-        }
-
-        if (hawk.crypto.algorithms.indexOf(credentials.algorithm) === -1) {
-            return result;
-        }
-
-        // Parse URI
-
-        if (typeof uri === 'string') {
-            uri = hawk.utils.parseUri(uri);
-        }
-
-        // Calculate signature
-
-        var artifacts = {
-            ts: timestamp,
-            nonce: options.nonce || hawk.utils.randomString(6),
-            method: method,
-            resource: uri.relative,
-            host: uri.hostname,
-            port: uri.port,
-            hash: options.hash,
-            ext: options.ext,
-            app: options.app,
-            dlg: options.dlg
-        };
-
-        result.artifacts = artifacts;
-
-        // Calculate payload hash
-
-        if (!artifacts.hash &&
-            options.hasOwnProperty('payload')) {
-
-            artifacts.hash = hawk.crypto.calculatePayloadHash(options.payload, credentials.algorithm, options.contentType);
-        }
-
-        var mac = hawk.crypto.calculateMac('header', credentials, artifacts);
-
-        // Construct header
-
-        var hasExt = artifacts.ext !== null && artifacts.ext !== undefined && artifacts.ext !== '';       // Other falsey values allowed
-        var header = 'Hawk id="' + credentials.id +
-                     '", ts="' + artifacts.ts +
-                     '", nonce="' + artifacts.nonce +
-                     (artifacts.hash ? '", hash="' + artifacts.hash : '') +
-                     (hasExt ? '", ext="' + hawk.utils.escapeHeaderAttribute(artifacts.ext) : '') +
-                     '", mac="' + mac + '"';
-
-        if (artifacts.app) {
-            header += ', app="' + artifacts.app +
-                      (artifacts.dlg ? '", dlg="' + artifacts.dlg : '') + '"';
-        }
-
-        result.field = header;
-
-        return result;
-    },
-
-
-    // Validate server response
-
-    /*
-        request:    object created via 'new XMLHttpRequest()' after response received
-        artifacts:  object received from header().artifacts
-        options: {
-            payload:    optional payload received
-            required:   specifies if a Server-Authorization header is required. Defaults to 'false'
-        }
-    */
-
-    authenticate: function (request, credentials, artifacts, options) {
-
-        options = options || {};
-
-        if (request.getResponseHeader('www-authenticate')) {
-
-            // Parse HTTP WWW-Authenticate header
-
-            var attributes = hawk.utils.parseAuthorizationHeader(request.getResponseHeader('www-authenticate'), ['ts', 'tsm', 'error']);
-            if (!attributes) {
-                return false;
-            }
-
-            if (attributes.ts) {
-                var tsm = hawk.crypto.calculateTsMac(attributes.ts, credentials);
-                if (tsm !== attributes.tsm) {
-                    return false;
-                }
-
-                hawk.utils.setNtpOffset(attributes.ts - Math.floor(Date.now() / 1000));     // Keep offset at 1 second precision
-            }
-        }
-
-        // Parse HTTP Server-Authorization header
-
-        if (!request.getResponseHeader('server-authorization') &&
-            !options.required) {
-
-            return true;
-        }
-
-        var attributes = hawk.utils.parseAuthorizationHeader(request.getResponseHeader('server-authorization'), ['mac', 'ext', 'hash']);
-        if (!attributes) {
-            return false;
-        }
-
-        var modArtifacts = {
-            ts: artifacts.ts,
-            nonce: artifacts.nonce,
-            method: artifacts.method,
-            resource: artifacts.resource,
-            host: artifacts.host,
-            port: artifacts.port,
-            hash: attributes.hash,
-            ext: attributes.ext,
-            app: artifacts.app,
-            dlg: artifacts.dlg
-        };
-
-        var mac = hawk.crypto.calculateMac('response', credentials, modArtifacts);
-        if (mac !== attributes.mac) {
-            return false;
-        }
-
-        if (!options.hasOwnProperty('payload')) {
-            return true;
-        }
-
-        if (!attributes.hash) {
-            return false;
-        }
-
-        var calculatedHash = hawk.crypto.calculatePayloadHash(options.payload, credentials.algorithm, request.getResponseHeader('content-type'));
-        return (calculatedHash === attributes.hash);
-    },
-
-    message: function (host, port, message, options) {
-
-        // Validate inputs
-
-        if (!host || typeof host !== 'string' ||
-            !port || typeof port !== 'number' ||
-            message === null || message === undefined || typeof message !== 'string' ||
-            !options || typeof options !== 'object') {
-
-            return null;
-        }
-
-        // Application time
-
-        var timestamp = options.timestamp || Math.floor((hawk.utils.now() + (options.localtimeOffsetMsec || 0)) / 1000);
-
-        // Validate credentials
-
-        var credentials = options.credentials;
-        if (!credentials ||
-            !credentials.id ||
-            !credentials.key ||
-            !credentials.algorithm) {
-
-            // Invalid credential object
-            return null;
-        }
-
-        if (hawk.crypto.algorithms.indexOf(credentials.algorithm) === -1) {
-            return null;
-        }
-
-        // Calculate signature
-
-        var artifacts = {
-            ts: timestamp,
-            nonce: options.nonce || hawk.utils.randomString(6),
-            host: host,
-            port: port,
-            hash: hawk.crypto.calculatePayloadHash(message, credentials.algorithm)
-        };
-
-        // Construct authorization
-
-        var result = {
-            id: credentials.id,
-            ts: artifacts.ts,
-            nonce: artifacts.nonce,
-            hash: artifacts.hash,
-            mac: hawk.crypto.calculateMac('message', credentials, artifacts)
-        };
-
-        return result;
-    }
-};
-
-
-hawk.crypto = {
-
-    headerVersion: '1',
-
-    algorithms: ['sha1', 'sha256'],
-
-    calculateMac: function (type, credentials, options) {
-
-        var normalized = hawk.crypto.generateNormalizedString(type, options);
-
-        var hmac = CryptoJS['Hmac' + credentials.algorithm.toUpperCase()](normalized, credentials.key);
-        return hmac.toString(CryptoJS.enc.Base64);
-    },
-
-    generateNormalizedString: function (type, options) {
-
-        var normalized = 'hawk.' + hawk.crypto.headerVersion + '.' + type + '\n' +
-                         options.ts + '\n' +
-                         options.nonce + '\n' +
-                         (options.method || '').toUpperCase() + '\n' +
-                         (options.resource || '') + '\n' +
-                         options.host.toLowerCase() + '\n' +
-                         options.port + '\n' +
-                         (options.hash || '') + '\n';
-
-        if (options.ext) {
-            normalized += options.ext.replace('\\', '\\\\').replace('\n', '\\n');
-        }
-
-        normalized += '\n';
-
-        if (options.app) {
-            normalized += options.app + '\n' +
-                          (options.dlg || '') + '\n';
-        }
-
-        return normalized;
-    },
-
-    calculatePayloadHash: function (payload, algorithm, contentType) {
-
-        var hash = CryptoJS.algo[algorithm.toUpperCase()].create();
-        hash.update('hawk.' + hawk.crypto.headerVersion + '.payload\n');
-        hash.update(hawk.utils.parseContentType(contentType) + '\n');
-        hash.update(payload || '');
-        hash.update('\n');
-        return hash.finalize().toString(CryptoJS.enc.Base64);
-    },
-
-    calculateTsMac: function (ts, credentials) {
-
-        var hash = CryptoJS['Hmac' + credentials.algorithm.toUpperCase()]('hawk.' + hawk.crypto.headerVersion + '.ts\n' + ts + '\n', credentials.key);
-        return hash.toString(CryptoJS.enc.Base64);
-    }
-};
-
-
-hawk.utils = {
-
-    storage: {                                      // localStorage compatible interface
-        _cache: {},
-        setItem: function (key, value) {
-
-            hawk.utils.storage._cache[key] = value;
-        },
-        getItem: function (key) {
-
-            return hawk.utils.storage._cache[key];
-        }
-    },
-
-    setStorage: function (storage) {
-
-        var ntpOffset = hawk.utils.getNtpOffset() || 0;
-        hawk.utils.storage = storage;
-        hawk.utils.setNtpOffset(ntpOffset);
-    },
-
-    setNtpOffset: function (offset) {
-
-        hawk.utils.storage.setItem('hawk_ntp_offset', offset);
-    },
-
-    getNtpOffset: function () {
-
-        return parseInt(hawk.utils.storage.getItem('hawk_ntp_offset') || '0', 10);
-    },
-
-    now: function () {
-
-        return Date.now() + hawk.utils.getNtpOffset();
-    },
-
-    escapeHeaderAttribute: function (attribute) {
-
-        return attribute.replace(/\\/g, '\\\\').replace(/\"/g, '\\"');
-    },
-
-    parseContentType: function (header) {
-
-        if (!header) {
-            return '';
-        }
-
-        return header.split(';')[0].trim().toLowerCase();
-    },
-
-    parseAuthorizationHeader: function (header, keys) {
-
-        if (!header) {
-            return null;
-        }
-
-        var headerParts = header.match(/^(\w+)(?:\s+(.*))?$/);       // Header: scheme[ something]
-        if (!headerParts) {
-            return null;
-        }
-
-        var scheme = headerParts[1];
-        if (scheme.toLowerCase() !== 'hawk') {
-            return null;
-        }
-
-        var attributesString = headerParts[2];
-        if (!attributesString) {
-            return null;
-        }
-
-        var attributes = {};
-        var verify = attributesString.replace(/(\w+)="([^"\\]*)"\s*(?:,\s*|$)/g, function ($0, $1, $2) {
-
-            // Check valid attribute names
-
-            if (keys.indexOf($1) === -1) {
-                return;
-            }
-
-            // Allowed attribute value characters: !#$%&'()*+,-./:;<=>?@[]^_`{|}~ and space, a-z, A-Z, 0-9
-
-            if ($2.match(/^[ \w\!#\$%&'\(\)\*\+,\-\.\/\:;<\=>\?@\[\]\^`\{\|\}~]+$/) === null) {
-                return;
-            }
-
-            // Check for duplicates
-
-            if (attributes.hasOwnProperty($1)) {
-                return;
-            }
-
-            attributes[$1] = $2;
-            return '';
-        });
-
-        if (verify !== '') {
-            return null;
-        }
-
-        return attributes;
-    },
-
-    randomString: function (size) {
-
-        var randomSource = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789';
-        var len = randomSource.length;
-
-        var result = [];
-        for (var i = 0; i < size; ++i) {
-            result[i] = randomSource[Math.floor(Math.random() * len)];
-        }
-
-        return result.join('');
-    },
-
-    parseUri: function (input) {
-
-        // Based on: parseURI 1.2.2
-        // http://blog.stevenlevithan.com/archives/parseuri
-        // (c) Steven Levithan <stevenlevithan.com>
-        // MIT License
-
-        var keys = ['source', 'protocol', 'authority', 'userInfo', 'user', 'password', 'hostname', 'port', 'resource', 'relative', 'pathname', 'directory', 'file', 'query', 'fragment'];
-
-        var uriRegex = /^(?:([^:\/?#]+):)?(?:\/\/((?:(([^:@]*)(?::([^:@]*))?)?@)?([^:\/?#]*)(?::(\d*))?))?(((((?:[^?#\/]*\/)*)([^?#]*))(?:\?([^#]*))?)(?:#(.*))?)/;
-        var uriByNumber = uriRegex.exec(input);
-        var uri = {};
-
-        var i = 15;
-        while (i--) {
-            uri[keys[i]] = uriByNumber[i] || '';
-        }
-
-        if (uri.port === null ||
-            uri.port === '') {
-
-            uri.port = (uri.protocol.toLowerCase() === 'http' ? '80' : (uri.protocol.toLowerCase() === 'https' ? '443' : ''));
-        }
-
-        return uri;
-    }
-};
-
-
-// Based on: Crypto-JS v3.1.2
-// Copyright (c) 2009-2013, Jeff Mott. All rights reserved.
-// http://code.google.com/p/crypto-js/
-// http://code.google.com/p/crypto-js/wiki/License
-
-var CryptoJS=CryptoJS||function(h,r){var k={},l=k.lib={},n=function(){},f=l.Base={extend:function(a){n.prototype=this;var b=new n;a&&b.mixIn(a);b.hasOwnProperty("init")||(b.init=function(){b.$super.init.apply(this,arguments)});b.init.prototype=b;b.$super=this;return b},create:function(){var a=this.extend();a.init.apply(a,arguments);return a},init:function(){},mixIn:function(a){for(var b in a)a.hasOwnProperty(b)&&(this[b]=a[b]);a.hasOwnProperty("toString")&&(this.toString=a.toString)},clone:function(){return this.init.prototype.extend(this)}},j=l.WordArray=f.extend({init:function(a,b){a=this.words=a||[];this.sigBytes=b!=r?b:4*a.length},toString:function(a){return(a||s).stringify(this)},concat:function(a){var b=this.words,d=a.words,c=this.sigBytes;a=a.sigBytes;this.clamp();if(c%4)for(var e=0;e<a;e++)b[c+e>>>2]|=(d[e>>>2]>>>24-8*(e%4)&255)<<24-8*((c+e)%4);else if(65535<d.length)for(e=0;e<a;e+=4)b[c+e>>>2]=d[e>>>2];else b.push.apply(b,d);this.sigBytes+=a;return this},clamp:function(){var a=this.words,b=this.sigBytes;a[b>>>2]&=4294967295<<32-8*(b%4);a.length=h.ceil(b/4)},clone:function(){var a=f.clone.call(this);a.words=this.words.slice(0);return a},random:function(a){for(var b=[],d=0;d<a;d+=4)b.push(4294967296*h.random()|0);return new j.init(b,a)}}),m=k.enc={},s=m.Hex={stringify:function(a){var b=a.words;a=a.sigBytes;for(var d=[],c=0;c<a;c++){var e=b[c>>>2]>>>24-8*(c%4)&255;d.push((e>>>4).toString(16));d.push((e&15).toString(16))}return d.join("")},parse:function(a){for(var b=a.length,d=[],c=0;c<b;c+=2)d[c>>>3]|=parseInt(a.substr(c,2),16)<<24-4*(c%8);return new j.init(d,b/2)}},p=m.Latin1={stringify:function(a){var b=a.words;a=a.sigBytes;for(var d=[],c=0;c<a;c++)d.push(String.fromCharCode(b[c>>>2]>>>24-8*(c%4)&255));return d.join("")},parse:function(a){for(var b=a.length,d=[],c=0;c<b;c++)d[c>>>2]|=(a.charCodeAt(c)&255)<<24-8*(c%4);return new j.init(d,b)}},t=m.Utf8={stringify:function(a){try{return decodeURIComponent(escape(p.stringify(a)))}catch(b){throw 
Error("Malformed UTF-8 data");}},parse:function(a){return p.parse(unescape(encodeURIComponent(a)))}},q=l.BufferedBlockAlgorithm=f.extend({reset:function(){this._data=new j.init;this._nDataBytes=0},_append:function(a){"string"==typeof a&&(a=t.parse(a));this._data.concat(a);this._nDataBytes+=a.sigBytes},_process:function(a){var b=this._data,d=b.words,c=b.sigBytes,e=this.blockSize,f=c/(4*e),f=a?h.ceil(f):h.max((f|0)-this._minBufferSize,0);a=f*e;c=h.min(4*a,c);if(a){for(var g=0;g<a;g+=e)this._doProcessBlock(d,g);g=d.splice(0,a);b.sigBytes-=c}return new j.init(g,c)},clone:function(){var a=f.clone.call(this);a._data=this._data.clone();return a},_minBufferSize:0});l.Hasher=q.extend({cfg:f.extend(),init:function(a){this.cfg=this.cfg.extend(a);this.reset()},reset:function(){q.reset.call(this);this._doReset()},update:function(a){this._append(a);this._process();return this},finalize:function(a){a&&this._append(a);return this._doFinalize()},blockSize:16,_createHelper:function(a){return function(b,d){return(new a.init(d)).finalize(b)}},_createHmacHelper:function(a){return function(b,d){return(new u.HMAC.init(a,d)).finalize(b)}}});var u=k.algo={};return k}(Math);
-(function () { var k = CryptoJS, b = k.lib, m = b.WordArray, l = b.Hasher, d = [], b = k.algo.SHA1 = l.extend({ _doReset: function () { this._hash = new m.init([1732584193, 4023233417, 2562383102, 271733878, 3285377520]) }, _doProcessBlock: function (n, p) { for (var a = this._hash.words, e = a[0], f = a[1], h = a[2], j = a[3], b = a[4], c = 0; 80 > c; c++) { if (16 > c) d[c] = n[p + c] | 0; else { var g = d[c - 3] ^ d[c - 8] ^ d[c - 14] ^ d[c - 16]; d[c] = g << 1 | g >>> 31 } g = (e << 5 | e >>> 27) + b + d[c]; g = 20 > c ? g + ((f & h | ~f & j) + 1518500249) : 40 > c ? g + ((f ^ h ^ j) + 1859775393) : 60 > c ? g + ((f & h | f & j | h & j) - 1894007588) : g + ((f ^ h ^ j) - 899497514); b = j; j = h; h = f << 30 | f >>> 2; f = e; e = g } a[0] = a[0] + e | 0; a[1] = a[1] + f | 0; a[2] = a[2] + h | 0; a[3] = a[3] + j | 0; a[4] = a[4] + b | 0 }, _doFinalize: function () { var b = this._data, d = b.words, a = 8 * this._nDataBytes, e = 8 * b.sigBytes; d[e >>> 5] |= 128 << 24 - e % 32; d[(e + 64 >>> 9 << 4) + 14] = Math.floor(a / 4294967296); d[(e + 64 >>> 9 << 4) + 15] = a; b.sigBytes = 4 * d.length; this._process(); return this._hash }, clone: function () { var b = l.clone.call(this); b._hash = this._hash.clone(); return b } }); k.SHA1 = l._createHelper(b); k.HmacSHA1 = l._createHmacHelper(b) })();
-(function (k) { for (var g = CryptoJS, h = g.lib, v = h.WordArray, j = h.Hasher, h = g.algo, s = [], t = [], u = function (q) { return 4294967296 * (q - (q | 0)) | 0 }, l = 2, b = 0; 64 > b;) { var d; a: { d = l; for (var w = k.sqrt(d), r = 2; r <= w; r++) if (!(d % r)) { d = !1; break a } d = !0 } d && (8 > b && (s[b] = u(k.pow(l, 0.5))), t[b] = u(k.pow(l, 1 / 3)), b++); l++ } var n = [], h = h.SHA256 = j.extend({ _doReset: function () { this._hash = new v.init(s.slice(0)) }, _doProcessBlock: function (q, h) { for (var a = this._hash.words, c = a[0], d = a[1], b = a[2], k = a[3], f = a[4], g = a[5], j = a[6], l = a[7], e = 0; 64 > e; e++) { if (16 > e) n[e] = q[h + e] | 0; else { var m = n[e - 15], p = n[e - 2]; n[e] = ((m << 25 | m >>> 7) ^ (m << 14 | m >>> 18) ^ m >>> 3) + n[e - 7] + ((p << 15 | p >>> 17) ^ (p << 13 | p >>> 19) ^ p >>> 10) + n[e - 16] } m = l + ((f << 26 | f >>> 6) ^ (f << 21 | f >>> 11) ^ (f << 7 | f >>> 25)) + (f & g ^ ~f & j) + t[e] + n[e]; p = ((c << 30 | c >>> 2) ^ (c << 19 | c >>> 13) ^ (c << 10 | c >>> 22)) + (c & d ^ c & b ^ d & b); l = j; j = g; g = f; f = k + m | 0; k = b; b = d; d = c; c = m + p | 0 } a[0] = a[0] + c | 0; a[1] = a[1] + d | 0; a[2] = a[2] + b | 0; a[3] = a[3] + k | 0; a[4] = a[4] + f | 0; a[5] = a[5] + g | 0; a[6] = a[6] + j | 0; a[7] = a[7] + l | 0 }, _doFinalize: function () { var d = this._data, b = d.words, a = 8 * this._nDataBytes, c = 8 * d.sigBytes; b[c >>> 5] |= 128 << 24 - c % 32; b[(c + 64 >>> 9 << 4) + 14] = k.floor(a / 4294967296); b[(c + 64 >>> 9 << 4) + 15] = a; d.sigBytes = 4 * b.length; this._process(); return this._hash }, clone: function () { var b = j.clone.call(this); b._hash = this._hash.clone(); return b } }); g.SHA256 = j._createHelper(h); g.HmacSHA256 = j._createHmacHelper(h) })(Math);
-(function(){var c=CryptoJS,k=c.enc.Utf8;c.algo.HMAC=c.lib.Base.extend({init:function(a,b){a=this._hasher=new a.init;"string"==typeof b&&(b=k.parse(b));var c=a.blockSize,e=4*c;b.sigBytes>e&&(b=a.finalize(b));b.clamp();for(var f=this._oKey=b.clone(),g=this._iKey=b.clone(),h=f.words,j=g.words,d=0;d<c;d++)h[d]^=1549556828,j[d]^=909522486;f.sigBytes=g.sigBytes=e;this.reset()},reset:function(){var a=this._hasher;a.reset();a.update(this._iKey)},update:function(a){this._hasher.update(a);return this},finalize:function(a){var b=this._hasher;a=b.finalize(a);b.reset();return b.finalize(this._oKey.clone().concat(a))}})})();
-(function(){var h=CryptoJS,j=h.lib.WordArray;h.enc.Base64={stringify:function(b){var e=b.words,f=b.sigBytes,c=this._map;b.clamp();b=[];for(var a=0;a<f;a+=3)for(var d=(e[a>>>2]>>>24-8*(a%4)&255)<<16|(e[a+1>>>2]>>>24-8*((a+1)%4)&255)<<8|e[a+2>>>2]>>>24-8*((a+2)%4)&255,g=0;4>g&&a+0.75*g<f;g++)b.push(c.charAt(d>>>6*(3-g)&63));if(e=c.charAt(64))for(;b.length%4;)b.push(e);return b.join("")},parse:function(b){var e=b.length,f=this._map,c=f.charAt(64);c&&(c=b.indexOf(c),-1!=c&&(e=c));for(var c=[],a=0,d=0;d<e;d++)if(d%4){var g=f.indexOf(b.charAt(d-1))<<2*(d%4),h=f.indexOf(b.charAt(d))>>>6-2*(d%4);c[a>>>2]|=(g|h)<<24-8*(a%4);a++}return j.create(c,a)},_map:"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/="}})();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/client.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,367 +0,0 @@
-// Load modules
-
-var Url = require('url');
-var Hoek = require('hoek');
-var Cryptiles = require('cryptiles');
-var Crypto = require('./crypto');
-var Utils = require('./utils');
-
-
-// Declare internals
-
-var internals = {};
-
-
-// Generate an Authorization header for a given request
-
-/*
-    uri: 'http://example.com/resource?a=b' or object from Url.parse()
-    method: HTTP verb (e.g. 'GET', 'POST')
-    options: {
-
-        // Required
-
-        credentials: {
-            id: 'dh37fgj492je',
-            key: 'aoijedoaijsdlaksjdl',
-            algorithm: 'sha256'                                 // 'sha1', 'sha256'
-        },
-
-        // Optional
-
-        ext: 'application-specific',                        // Application specific data sent via the ext attribute
-        timestamp: Date.now(),                              // A pre-calculated timestamp
-        nonce: '2334f34f',                                  // A pre-generated nonce
-        localtimeOffsetMsec: 400,                           // Time offset to sync with server time (ignored if timestamp provided)
-        payload: '{"some":"payload"}',                      // UTF-8 encoded string for body hash generation (ignored if hash provided)
-        contentType: 'application/json',                    // Payload content-type (ignored if hash provided)
-        hash: 'U4MKKSmiVxk37JCCrAVIjV=',                    // Pre-calculated payload hash
-        app: '24s23423f34dx',                               // Oz application id
-        dlg: '234sz34tww3sd'                                // Oz delegated-by application id
-    }
-*/
-
-exports.header = function (uri, method, options) {
-
-    var result = {
-        field: '',
-        artifacts: {}
-    };
-
-    // Validate inputs
-
-    if (!uri || (typeof uri !== 'string' && typeof uri !== 'object') ||
-        !method || typeof method !== 'string' ||
-        !options || typeof options !== 'object') {
-
-        return result;
-    }
-
-    // Application time
-
-    var timestamp = options.timestamp || Math.floor((Utils.now() + (options.localtimeOffsetMsec || 0)) / 1000);
-
-    // Validate credentials
-
-    var credentials = options.credentials;
-    if (!credentials ||
-        !credentials.id ||
-        !credentials.key ||
-        !credentials.algorithm) {
-
-        // Invalid credential object
-        return result;
-    }
-
-    if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) {
-        return result;
-    }
-
-    // Parse URI
-
-    if (typeof uri === 'string') {
-        uri = Url.parse(uri);
-    }
-
-    // Calculate signature
-
-    var artifacts = {
-        ts: timestamp,
-        nonce: options.nonce || Cryptiles.randomString(6),
-        method: method,
-        resource: uri.pathname + (uri.search || ''),                            // Maintain trailing '?'
-        host: uri.hostname,
-        port: uri.port || (uri.protocol === 'http:' ? 80 : 443),
-        hash: options.hash,
-        ext: options.ext,
-        app: options.app,
-        dlg: options.dlg
-    };
-
-    result.artifacts = artifacts;
-
-    // Calculate payload hash
-
-    if (!artifacts.hash &&
-        options.hasOwnProperty('payload')) {
-
-        artifacts.hash = Crypto.calculatePayloadHash(options.payload, credentials.algorithm, options.contentType);
-    }
-
-    var mac = Crypto.calculateMac('header', credentials, artifacts);
-
-    // Construct header
-
-    var hasExt = artifacts.ext !== null && artifacts.ext !== undefined && artifacts.ext !== '';       // Other falsey values allowed
-    var header = 'Hawk id="' + credentials.id +
-                 '", ts="' + artifacts.ts +
-                 '", nonce="' + artifacts.nonce +
-                 (artifacts.hash ? '", hash="' + artifacts.hash : '') +
-                 (hasExt ? '", ext="' + Utils.escapeHeaderAttribute(artifacts.ext) : '') +
-                 '", mac="' + mac + '"';
-
-    if (artifacts.app) {
-        header += ', app="' + artifacts.app +
-                  (artifacts.dlg ? '", dlg="' + artifacts.dlg : '') + '"';
-    }
-
-    result.field = header;
-
-    return result;
-};
-
-
-// Validate server response
-
-/*
-    res:        node's response object
-    artifacts:  object received from header().artifacts
-    options: {
-        payload:    optional payload received
-        required:   specifies if a Server-Authorization header is required. Defaults to 'false'
-    }
-*/
-
-exports.authenticate = function (res, credentials, artifacts, options) {
-
-    artifacts = Hoek.clone(artifacts);
-    options = options || {};
-
-    if (res.headers['www-authenticate']) {
-
-        // Parse HTTP WWW-Authenticate header
-
-        var attributes = Utils.parseAuthorizationHeader(res.headers['www-authenticate'], ['ts', 'tsm', 'error']);
-        if (attributes instanceof Error) {
-            return false;
-        }
-
-        if (attributes.ts) {
-            var tsm = Crypto.calculateTsMac(attributes.ts, credentials);
-            if (tsm !== attributes.tsm) {
-                return false;
-            }
-        }
-    }
-
-    // Parse HTTP Server-Authorization header
-
-    if (!res.headers['server-authorization'] &&
-        !options.required) {
-
-        return true;
-    }
-
-    var attributes = Utils.parseAuthorizationHeader(res.headers['server-authorization'], ['mac', 'ext', 'hash']);
-    if (attributes instanceof Error) {
-        return false;
-    }
-
-    artifacts.ext = attributes.ext;
-    artifacts.hash = attributes.hash;
-
-    var mac = Crypto.calculateMac('response', credentials, artifacts);
-    if (mac !== attributes.mac) {
-        return false;
-    }
-
-    if (!options.hasOwnProperty('payload')) {
-        return true;
-    }
-
-    if (!attributes.hash) {
-        return false;
-    }
-
-    var calculatedHash = Crypto.calculatePayloadHash(options.payload, credentials.algorithm, res.headers['content-type']);
-    return (calculatedHash === attributes.hash);
-};
-
-
-// Generate a bewit value for a given URI
-
-/*
- * credentials is an object with the following keys: 'id', 'key', 'algorithm'.
- * options is an object with the following optional keys: 'ext', 'localtimeOffsetMsec'
- */
-/*
-    uri: 'http://example.com/resource?a=b' or object from Url.parse()
-    options: {
-
-        // Required
-
-        credentials: {
-            id: 'dh37fgj492je',
-            key: 'aoijedoaijsdlaksjdl',
-            algorithm: 'sha256'                             // 'sha1', 'sha256'
-        },
-        ttlSec: 60 * 60,                                    // TTL in seconds
-
-        // Optional
-
-        ext: 'application-specific',                        // Application specific data sent via the ext attribute
-        localtimeOffsetMsec: 400                            // Time offset to sync with server time
-    };
-*/
-
-exports.getBewit = function (uri, options) {
-
-    // Validate inputs
-
-    if (!uri ||
-        (typeof uri !== 'string' && typeof uri !== 'object') ||
-        !options ||
-        typeof options !== 'object' ||
-        !options.ttlSec) {
-
-        return '';
-    }
-
-    options.ext = (options.ext === null || options.ext === undefined ? '' : options.ext);       // Zero is valid value
-
-    // Application time
-
-    var now = Utils.now() + (options.localtimeOffsetMsec || 0);
-
-    // Validate credentials
-
-    var credentials = options.credentials;
-    if (!credentials ||
-        !credentials.id ||
-        !credentials.key ||
-        !credentials.algorithm) {
-
-        return '';
-    }
-
-    if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) {
-        return '';
-    }
-
-    // Parse URI
-
-    if (typeof uri === 'string') {
-        uri = Url.parse(uri);
-    }
-
-    // Calculate signature
-
-    var exp = Math.floor(now / 1000) + options.ttlSec;
-    var mac = Crypto.calculateMac('bewit', credentials, {
-        ts: exp,
-        nonce: '',
-        method: 'GET',
-        resource: uri.pathname + (uri.search || ''),                            // Maintain trailing '?'
-        host: uri.hostname,
-        port: uri.port || (uri.protocol === 'http:' ? 80 : 443),
-        ext: options.ext
-    });
-
-    // Construct bewit: id\exp\mac\ext
-
-    var bewit = credentials.id + '\\' + exp + '\\' + mac + '\\' + options.ext;
-    return Utils.base64urlEncode(bewit);
-};
-
-
-// Generate an authorization string for a message
-
-/*
-    host: 'example.com',
-    port: 8000,
-    message: '{"some":"payload"}',                          // UTF-8 encoded string for body hash generation
-    options: {
-
-        // Required
-
-        credentials: {
-            id: 'dh37fgj492je',
-            key: 'aoijedoaijsdlaksjdl',
-            algorithm: 'sha256'                             // 'sha1', 'sha256'
-        },
-
-        // Optional
-
-        timestamp: Date.now(),                              // A pre-calculated timestamp
-        nonce: '2334f34f',                                  // A pre-generated nonce
-        localtimeOffsetMsec: 400,                           // Time offset to sync with server time (ignored if timestamp provided)
-    }
-*/
-
-exports.message = function (host, port, message, options) {
-
-    // Validate inputs
-
-    if (!host || typeof host !== 'string' ||
-        !port || typeof port !== 'number' ||
-        message === null || message === undefined || typeof message !== 'string' ||
-        !options || typeof options !== 'object') {
-
-        return null;
-    }
-
-    // Application time
-
-    var timestamp = options.timestamp || Math.floor((Utils.now() + (options.localtimeOffsetMsec || 0)) / 1000);
-
-    // Validate credentials
-
-    var credentials = options.credentials;
-    if (!credentials ||
-        !credentials.id ||
-        !credentials.key ||
-        !credentials.algorithm) {
-
-        // Invalid credential object
-        return null;
-    }
-
-    if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) {
-        return null;
-    }
-
-    // Calculate signature
-
-    var artifacts = {
-        ts: timestamp,
-        nonce: options.nonce || Cryptiles.randomString(6),
-        host: host,
-        port: port,
-        hash: Crypto.calculatePayloadHash(message, credentials.algorithm)
-    };
-
-    // Construct authorization
-
-    var result = {
-        id: credentials.id,
-        ts: artifacts.ts,
-        nonce: artifacts.nonce,
-        hash: artifacts.hash,
-        mac: Crypto.calculateMac('message', credentials, artifacts)
-    };
-
-    return result;
-};
-
-
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/crypto.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,111 +0,0 @@
-// Load modules
-
-var Crypto = require('crypto');
-var Url = require('url');
-var Utils = require('./utils');
-
-
-// Declare internals
-
-var internals = {};
-
-
-// MAC normalization format version
-
-exports.headerVersion = '1';                        // Prevent comparison of mac values generated with different normalized string formats
-
-
-// Supported HMAC algorithms
-
-exports.algorithms = ['sha1', 'sha256'];
-
-
-// Calculate the request MAC
-
-/*
-    type: 'header',                                 // 'header', 'bewit', 'response'
-    credentials: {
-        key: 'aoijedoaijsdlaksjdl',
-        algorithm: 'sha256'                         // 'sha1', 'sha256'
-    },
-    options: {
-        method: 'GET',
-        resource: '/resource?a=1&b=2',
-        host: 'example.com',
-        port: 8080,
-        ts: 1357718381034,
-        nonce: 'd3d345f',
-        hash: 'U4MKKSmiVxk37JCCrAVIjV/OhB3y+NdwoCr6RShbVkE=',
-        ext: 'app-specific-data',
-        app: 'hf48hd83qwkj',                        // Application id (Oz)
-        dlg: 'd8djwekds9cj'                         // Delegated by application id (Oz), requires options.app
-    }
-*/
-
-exports.calculateMac = function (type, credentials, options) {
-
-    var normalized = exports.generateNormalizedString(type, options);
-
-    var hmac = Crypto.createHmac(credentials.algorithm, credentials.key).update(normalized);
-    var digest = hmac.digest('base64');
-    return digest;
-};
-
-
-exports.generateNormalizedString = function (type, options) {
-
-    var normalized = 'hawk.' + exports.headerVersion + '.' + type + '\n' +
-                     options.ts + '\n' +
-                     options.nonce + '\n' +
-                     (options.method || '').toUpperCase() + '\n' +
-                     (options.resource || '') + '\n' +
-                     options.host.toLowerCase() + '\n' +
-                     options.port + '\n' +
-                     (options.hash || '') + '\n';
-
-    if (options.ext) {
-        normalized += options.ext.replace('\\', '\\\\').replace('\n', '\\n');
-    }
-
-    normalized += '\n';
-
-    if (options.app) {
-        normalized += options.app + '\n' +
-                      (options.dlg || '') + '\n';
-    }
-
-    return normalized;
-};
-
-
-exports.calculatePayloadHash = function (payload, algorithm, contentType) {
-
-    var hash = exports.initializePayloadHash(algorithm, contentType);
-    hash.update(payload || '');
-    return exports.finalizePayloadHash(hash);
-};
-
-
-exports.initializePayloadHash = function (algorithm, contentType) {
-
-    var hash = Crypto.createHash(algorithm);
-    hash.update('hawk.' + exports.headerVersion + '.payload\n');
-    hash.update(Utils.parseContentType(contentType) + '\n');
-    return hash;
-};
-
-
-exports.finalizePayloadHash = function (hash) {
-
-    hash.update('\n');
-    return hash.digest('base64');
-};
-
-
-exports.calculateTsMac = function (ts, credentials) {
-
-    var hmac = Crypto.createHmac(credentials.algorithm, credentials.key);
-    hmac.update('hawk.' + exports.headerVersion + '.ts\n' + ts + '\n');
-    return hmac.digest('base64');
-};
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,15 +0,0 @@
-// Export sub-modules
-
-exports.error = exports.Error = require('boom');
-exports.sntp = require('sntp');
-exports.server = require('./server');
-exports.client = require('./client');
-exports.crypto = require('./crypto');
-exports.utils = require('./utils');
-
-exports.uri = {
-    authenticate: exports.server.authenticateBewit,
-    getBewit: exports.client.getBewit
-};
-
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/server.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,524 +0,0 @@
-// Load modules
-
-var Boom = require('boom');
-var Hoek = require('hoek');
-var Cryptiles = require('cryptiles');
-var Crypto = require('./crypto');
-var Utils = require('./utils');
-
-
-// Declare internals
-
-var internals = {};
-
-
-// Hawk authentication
-
-/*
-   req:                 node's HTTP request object or an object as follows:
-  
-                        var request = {
-                            method: 'GET',
-                            url: '/resource/4?a=1&b=2',
-                            host: 'example.com',
-                            port: 8080,
-                            authorization: 'Hawk id="dh37fgj492je", ts="1353832234", nonce="j4h3g2", ext="some-app-ext-data", mac="6R4rV5iE+NPoym+WwjeHzjAGXUtLNIxmo1vpMofpLAE="'
-                        };
-  
-   credentialsFunc:     required function to lookup the set of Hawk credentials based on the provided credentials id.
-                        The credentials include the MAC key, MAC algorithm, and other attributes (such as username)
-                        needed by the application. This function is the equivalent of verifying the username and
-                        password in Basic authentication.
-  
-                        var credentialsFunc = function (id, callback) {
-    
-                            // Lookup credentials in database
-                            db.lookup(id, function (err, item) {
-    
-                                if (err || !item) {
-                                    return callback(err);
-                                }
-    
-                                var credentials = {
-                                    // Required
-                                    key: item.key,
-                                    algorithm: item.algorithm,
-                                    // Application specific
-                                    user: item.user
-                                };
-    
-                                return callback(null, credentials);
-                            });
-                        };
-  
-   options: {
-
-        hostHeaderName:        optional header field name, used to override the default 'Host' header when used
-                               behind a cache or a proxy. Apache2 changes the value of the 'Host' header while preserving
-                               the original (which is what the module must verify) in the 'x-forwarded-host' header field.
-                               Only used when passed a node Http.ServerRequest object.
-  
-        nonceFunc:             optional nonce validation function. The function signature is function(nonce, ts, callback)
-                               where 'callback' must be called using the signature function(err).
-  
-        timestampSkewSec:      optional number of seconds of permitted clock skew for incoming timestamps. Defaults to 60 seconds.
-                               Provides a +/- skew which means actual allowed window is double the number of seconds.
-  
-        localtimeOffsetMsec:   optional local clock time offset expressed in milliseconds (positive or negative).
-                               Defaults to 0.
-  
-        payload:               optional payload for validation. The client calculates the hash value and includes it via the 'hash'
-                               header attribute. The server always ensures the value provided has been included in the request
-                               MAC. When this option is provided, it validates the hash value itself. Validation is done by calculating
-                               a hash value over the entire payload (assuming it has already been normalized to the same format and
-                               encoding used by the client to calculate the hash on request). If the payload is not available at the time
-                               of authentication, the authenticatePayload() method can be used by passing it the credentials and
-                               attributes.hash returned in the authenticate callback.
-
-        host:                  optional host name override. Only used when passed a node request object.
-        port:                  optional port override. Only used when passed a node request object.
-    }
-
-    callback: function (err, credentials, artifacts) { }
- */
-
-exports.authenticate = function (req, credentialsFunc, options, callback) {
-
-    callback = Utils.nextTick(callback);
-    
-    // Default options
-
-    options.nonceFunc = options.nonceFunc || function (nonce, ts, nonceCallback) { return nonceCallback(); };   // No validation
-    options.timestampSkewSec = options.timestampSkewSec || 60;                                                  // 60 seconds
-
-    // Application time
-
-    var now = Utils.now() + (options.localtimeOffsetMsec || 0);                 // Measure now before any other processing
-
-    // Convert node Http request object to a request configuration object
-
-    var request = Utils.parseRequest(req, options);
-    if (request instanceof Error) {
-        return callback(Boom.badRequest(request.message));
-    }
-
-    // Parse HTTP Authorization header
-
-    var attributes = Utils.parseAuthorizationHeader(request.authorization);
-    if (attributes instanceof Error) {
-        return callback(attributes);
-    }
-
-    // Construct artifacts container
-
-    var artifacts = {
-        method: request.method,
-        host: request.host,
-        port: request.port,
-        resource: request.url,
-        ts: attributes.ts,
-        nonce: attributes.nonce,
-        hash: attributes.hash,
-        ext: attributes.ext,
-        app: attributes.app,
-        dlg: attributes.dlg,
-        mac: attributes.mac,
-        id: attributes.id
-    };
-
-    // Verify required header attributes
-
-    if (!attributes.id ||
-        !attributes.ts ||
-        !attributes.nonce ||
-        !attributes.mac) {
-
-        return callback(Boom.badRequest('Missing attributes'), null, artifacts);
-    }
-
-    // Fetch Hawk credentials
-
-    credentialsFunc(attributes.id, function (err, credentials) {
-
-        if (err) {
-            return callback(err, credentials || null, artifacts);
-        }
-
-        if (!credentials) {
-            return callback(Boom.unauthorized('Unknown credentials', 'Hawk'), null, artifacts);
-        }
-
-        if (!credentials.key ||
-            !credentials.algorithm) {
-
-            return callback(Boom.internal('Invalid credentials'), credentials, artifacts);
-        }
-
-        if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) {
-            return callback(Boom.internal('Unknown algorithm'), credentials, artifacts);
-        }
-
-        // Calculate MAC
-
-        var mac = Crypto.calculateMac('header', credentials, artifacts);
-        if (!Cryptiles.fixedTimeComparison(mac, attributes.mac)) {
-            return callback(Boom.unauthorized('Bad mac', 'Hawk'), credentials, artifacts);
-        }
-
-        // Check payload hash
-
-        if (options.payload !== null &&
-            options.payload !== undefined) {       // '' is valid
-
-            if (!attributes.hash) {
-                return callback(Boom.unauthorized('Missing required payload hash', 'Hawk'), credentials, artifacts);
-            }
-
-            var hash = Crypto.calculatePayloadHash(options.payload, credentials.algorithm, request.contentType);
-            if (!Cryptiles.fixedTimeComparison(hash, attributes.hash)) {
-                return callback(Boom.unauthorized('Bad payload hash', 'Hawk'), credentials, artifacts);
-            }
-        }
-
-        // Check nonce
-
-        options.nonceFunc(attributes.nonce, attributes.ts, function (err) {
-
-            if (err) {
-                return callback(Boom.unauthorized('Invalid nonce', 'Hawk'), credentials, artifacts);
-            }
-
-            // Check timestamp staleness
-
-            if (Math.abs((attributes.ts * 1000) - now) > (options.timestampSkewSec * 1000)) {
-                var fresh = Math.floor((Utils.now() + (options.localtimeOffsetMsec || 0)) / 1000);            // Get fresh now
-                var tsm = Crypto.calculateTsMac(fresh, credentials);
-                return callback(Boom.unauthorized('Stale timestamp', 'Hawk', { ts: fresh, tsm: tsm }), credentials, artifacts);
-            }
-
-            // Successful authentication
-
-            return callback(null, credentials, artifacts);
-        });
-    });
-};
-
-
-// Authenticate payload hash - used when payload cannot be provided during authenticate()
-
-/*
-    payload:        raw request payload
-    credentials:    from authenticate callback
-    artifacts:      from authenticate callback
-    contentType:    req.headers['content-type']
-*/
-
-exports.authenticatePayload = function (payload, credentials, artifacts, contentType) {
-
-    var calculatedHash = Crypto.calculatePayloadHash(payload, credentials.algorithm, contentType);
-    return Cryptiles.fixedTimeComparison(calculatedHash, artifacts.hash);
-};
-
-
-// Generate a Server-Authorization header for a given response
-
-/*
-    credentials: {},                                        // Object received from authenticate()
-    artifacts: {}                                           // Object received from authenticate(); 'mac', 'hash', and 'ext' - ignored
-    options: {
-        ext: 'application-specific',                        // Application specific data sent via the ext attribute
-        payload: '{"some":"payload"}',                      // UTF-8 encoded string for body hash generation (ignored if hash provided)
-        contentType: 'application/json',                    // Payload content-type (ignored if hash provided)
-        hash: 'U4MKKSmiVxk37JCCrAVIjV='                     // Pre-calculated payload hash
-    }
-*/
-
-exports.header = function (credentials, artifacts, options) {
-
-    // Prepare inputs
-
-    options = options || {};
-
-    if (!artifacts ||
-        typeof artifacts !== 'object' ||
-        typeof options !== 'object') {
-
-        return '';
-    }
-
-    artifacts = Hoek.clone(artifacts);
-    delete artifacts.mac;
-    artifacts.hash = options.hash;
-    artifacts.ext = options.ext;
-
-    // Validate credentials
-
-    if (!credentials ||
-        !credentials.key ||
-        !credentials.algorithm) {
-
-        // Invalid credential object
-        return '';
-    }
-
-    if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) {
-        return '';
-    }
-
-    // Calculate payload hash
-
-    if (!artifacts.hash &&
-        options.hasOwnProperty('payload')) {
-
-        artifacts.hash = Crypto.calculatePayloadHash(options.payload, credentials.algorithm, options.contentType);
-    }
-
-    var mac = Crypto.calculateMac('response', credentials, artifacts);
-
-    // Construct header
-
-    var header = 'Hawk mac="' + mac + '"' +
-                 (artifacts.hash ? ', hash="' + artifacts.hash + '"' : '');
-
-    if (artifacts.ext !== null &&
-        artifacts.ext !== undefined &&
-        artifacts.ext !== '') {                       // Other falsey values allowed
-
-        header += ', ext="' + Utils.escapeHeaderAttribute(artifacts.ext) + '"';
-    }
-
-    return header;
-};
-
-
-/*
- * Arguments and options are the same as authenticate() with the exception that the only supported options are:
- * 'hostHeaderName', 'localtimeOffsetMsec', 'host', 'port'
- */
-
-exports.authenticateBewit = function (req, credentialsFunc, options, callback) {
-
-    callback = Utils.nextTick(callback);
-
-    // Application time
-
-    var now = Utils.now() + (options.localtimeOffsetMsec || 0);
-
-    // Convert node Http request object to a request configuration object
-
-    var request = Utils.parseRequest(req, options);
-    if (request instanceof Error) {
-        return callback(Boom.badRequest(request.message));
-    }
-
-    // Extract bewit
-
-    //                                 1     2             3           4     
-    var resource = request.url.match(/^(\/.*)([\?&])bewit\=([^&$]*)(?:&(.+))?$/);
-    if (!resource) {
-        return callback(Boom.unauthorized(null, 'Hawk'));
-    }
-
-    // Bewit not empty
-
-    if (!resource[3]) {
-        return callback(Boom.unauthorized('Empty bewit', 'Hawk'));
-    }
-
-    // Verify method is GET
-
-    if (request.method !== 'GET' &&
-        request.method !== 'HEAD') {
-
-        return callback(Boom.unauthorized('Invalid method', 'Hawk'));
-    }
-
-    // No other authentication
-
-    if (request.authorization) {
-        return callback(Boom.badRequest('Multiple authentications', 'Hawk'));
-    }
-
-    // Parse bewit
-
-    var bewitString = Utils.base64urlDecode(resource[3]);
-    if (bewitString instanceof Error) {
-        return callback(Boom.badRequest('Invalid bewit encoding'));
-    }
-
-    // Bewit format: id\exp\mac\ext ('\' is used because it is a reserved header attribute character)
-
-    var bewitParts = bewitString.split('\\');
-    if (!bewitParts ||
-        bewitParts.length !== 4) {
-
-        return callback(Boom.badRequest('Invalid bewit structure'));
-    }
-
-    var bewit = {
-        id: bewitParts[0],
-        exp: parseInt(bewitParts[1], 10),
-        mac: bewitParts[2],
-        ext: bewitParts[3] || ''
-    };
-
-    if (!bewit.id ||
-        !bewit.exp ||
-        !bewit.mac) {
-
-        return callback(Boom.badRequest('Missing bewit attributes'));
-    }
-
-    // Construct URL without bewit
-
-    var url = resource[1];
-    if (resource[4]) {
-        url += resource[2] + resource[4];
-    }
-
-    // Check expiration
-
-    if (bewit.exp * 1000 <= now) {
-        return callback(Boom.unauthorized('Access expired', 'Hawk'), null, bewit);
-    }
-
-    // Fetch Hawk credentials
-
-    credentialsFunc(bewit.id, function (err, credentials) {
-
-        if (err) {
-            return callback(err, credentials || null, bewit.ext);
-        }
-
-        if (!credentials) {
-            return callback(Boom.unauthorized('Unknown credentials', 'Hawk'), null, bewit);
-        }
-
-        if (!credentials.key ||
-            !credentials.algorithm) {
-
-            return callback(Boom.internal('Invalid credentials'), credentials, bewit);
-        }
-
-        if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) {
-            return callback(Boom.internal('Unknown algorithm'), credentials, bewit);
-        }
-
-        // Calculate MAC
-
-        var mac = Crypto.calculateMac('bewit', credentials, {
-            ts: bewit.exp,
-            nonce: '',
-            method: 'GET',
-            resource: url,
-            host: request.host,
-            port: request.port,
-            ext: bewit.ext
-        });
-
-        if (!Cryptiles.fixedTimeComparison(mac, bewit.mac)) {
-            return callback(Boom.unauthorized('Bad mac', 'Hawk'), credentials, bewit);
-        }
-
-        // Successful authentication
-
-        return callback(null, credentials, bewit);
-    });
-};
-
-
-/*
- *  options are the same as authenticate() with the exception that the only supported options are:
- * 'nonceFunc', 'timestampSkewSec', 'localtimeOffsetMsec'
- */
-
-exports.authenticateMessage = function (host, port, message, authorization, credentialsFunc, options, callback) {
-
-    callback = Utils.nextTick(callback);
-    
-    // Default options
-
-    options.nonceFunc = options.nonceFunc || function (nonce, ts, nonceCallback) { return nonceCallback(); };   // No validation
-    options.timestampSkewSec = options.timestampSkewSec || 60;                                                  // 60 seconds
-
-    // Application time
-
-    var now = Utils.now() + (options.localtimeOffsetMsec || 0);                 // Measure now before any other processing
-
-    // Validate authorization
-    
-    if (!authorization.id ||
-        !authorization.ts ||
-        !authorization.nonce ||
-        !authorization.hash ||
-        !authorization.mac) {
-        
-            return callback(Boom.badRequest('Invalid authorization'));
-    }
-
-    // Fetch Hawk credentials
-
-    credentialsFunc(authorization.id, function (err, credentials) {
-
-        if (err) {
-            return callback(err, credentials || null);
-        }
-
-        if (!credentials) {
-            return callback(Boom.unauthorized('Unknown credentials', 'Hawk'));
-        }
-
-        if (!credentials.key ||
-            !credentials.algorithm) {
-
-            return callback(Boom.internal('Invalid credentials'), credentials);
-        }
-
-        if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) {
-            return callback(Boom.internal('Unknown algorithm'), credentials);
-        }
-
-        // Construct artifacts container
-
-        var artifacts = {
-            ts: authorization.ts,
-            nonce: authorization.nonce,
-            host: host,
-            port: port,
-            hash: authorization.hash
-        };
-
-        // Calculate MAC
-
-        var mac = Crypto.calculateMac('message', credentials, artifacts);
-        if (!Cryptiles.fixedTimeComparison(mac, authorization.mac)) {
-            return callback(Boom.unauthorized('Bad mac', 'Hawk'), credentials);
-        }
-
-        // Check payload hash
-
-        var hash = Crypto.calculatePayloadHash(message, credentials.algorithm);
-        if (!Cryptiles.fixedTimeComparison(hash, authorization.hash)) {
-            return callback(Boom.unauthorized('Bad message hash', 'Hawk'), credentials);
-        }
-
-        // Check nonce
-
-        options.nonceFunc(authorization.nonce, authorization.ts, function (err) {
-
-            if (err) {
-                return callback(Boom.unauthorized('Invalid nonce', 'Hawk'), credentials);
-            }
-
-            // Check timestamp staleness
-
-            if (Math.abs((authorization.ts * 1000) - now) > (options.timestampSkewSec * 1000)) {
-                return callback(Boom.unauthorized('Stale timestamp'), credentials);
-            }
-
-            // Successful authentication
-
-            return callback(null, credentials);
-        });
-    });
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/utils.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,183 +0,0 @@
-// Load modules
-
-var Hoek = require('hoek');
-var Sntp = require('sntp');
-var Boom = require('boom');
-
-
-// Declare internals
-
-var internals = {};
-
-
-// Import Hoek Utilities
-
-internals.import = function () {
-
-    for (var i in Hoek) {
-        if (Hoek.hasOwnProperty(i)) {
-            exports[i] = Hoek[i];
-        }
-    }
-};
-
-internals.import();
-
-
-// Hawk version
-
-exports.version = function () {
-
-    return exports.loadPackage(__dirname + '/..').version;
-};
-
-
-// Extract host and port from request
-
-exports.parseHost = function (req, hostHeaderName) {
-
-    hostHeaderName = (hostHeaderName ? hostHeaderName.toLowerCase() : 'host');
-    var hostHeader = req.headers[hostHeaderName];
-    if (!hostHeader) {
-        return null;
-    }
-
-    var hostHeaderRegex;
-    if (hostHeader[0] === '[') {
-        hostHeaderRegex = /^(?:(?:\r\n)?\s)*(\[[^\]]+\])(?::(\d+))?(?:(?:\r\n)?\s)*$/;      // IPv6
-    }
-    else {
-        hostHeaderRegex = /^(?:(?:\r\n)?\s)*([^:]+)(?::(\d+))?(?:(?:\r\n)?\s)*$/;           // IPv4, hostname
-    }
-    
-    var hostParts = hostHeader.match(hostHeaderRegex);
-
-    if (!hostParts ||
-        hostParts.length !== 3 ||
-        !hostParts[1]) {
-
-        return null;
-    }
-
-    return {
-        name: hostParts[1],
-        port: (hostParts[2] ? hostParts[2] : (req.connection && req.connection.encrypted ? 443 : 80))
-    };
-};
-
-
-// Parse Content-Type header content
-
-exports.parseContentType = function (header) {
-
-    if (!header) {
-        return '';
-    }
-
-    return header.split(';')[0].trim().toLowerCase();
-};
-
-
-// Convert node's Http request object to a request configuration object
-
-exports.parseRequest = function (req, options) {
-
-    if (!req.headers) {
-        return req;
-    }
-    
-    // Obtain host and port information
-
-    if (!options.host || !options.port) {
-        var host = exports.parseHost(req, options.hostHeaderName);
-        if (!host) {
-            return new Error('Invalid Host header');
-        }
-    }
-
-    var request = {
-        method: req.method,
-        url: req.url,
-        host: options.host || host.name,
-        port: options.port || host.port,
-        authorization: req.headers.authorization,
-        contentType: req.headers['content-type'] || ''
-    };
-
-    return request;
-};
-
-
-exports.now = function () {
-
-    return Sntp.now();
-};
-
-
-// Parse Hawk HTTP Authorization header
-
-exports.parseAuthorizationHeader = function (header, keys) {
-
-    keys = keys || ['id', 'ts', 'nonce', 'hash', 'ext', 'mac', 'app', 'dlg'];
-
-    if (!header) {
-        return Boom.unauthorized(null, 'Hawk');
-    }
-
-    var headerParts = header.match(/^(\w+)(?:\s+(.*))?$/);       // Header: scheme[ something]
-    if (!headerParts) {
-        return Boom.badRequest('Invalid header syntax');
-    }
-
-    var scheme = headerParts[1];
-    if (scheme.toLowerCase() !== 'hawk') {
-        return Boom.unauthorized(null, 'Hawk');
-    }
-
-    var attributesString = headerParts[2];
-    if (!attributesString) {
-        return Boom.badRequest('Invalid header syntax');
-    }
-
-    var attributes = {};
-    var errorMessage = '';
-    var verify = attributesString.replace(/(\w+)="([^"\\]*)"\s*(?:,\s*|$)/g, function ($0, $1, $2) {
-
-        // Check valid attribute names
-
-        if (keys.indexOf($1) === -1) {
-            errorMessage = 'Unknown attribute: ' + $1;
-            return;
-        }
-
-        // Allowed attribute value characters: !#$%&'()*+,-./:;<=>?@[]^_`{|}~ and space, a-z, A-Z, 0-9
-
-        if ($2.match(/^[ \w\!#\$%&'\(\)\*\+,\-\.\/\:;<\=>\?@\[\]\^`\{\|\}~]+$/) === null) {
-            errorMessage = 'Bad attribute value: ' + $1;
-            return;
-        }
-
-        // Check for duplicates
-
-        if (attributes.hasOwnProperty($1)) {
-            errorMessage = 'Duplicate attribute: ' + $1;
-            return;
-        }
-
-        attributes[$1] = $2;
-        return '';
-    });
-
-    if (verify !== '') {
-        return Boom.badRequest(errorMessage || 'Bad header format');
-    }
-
-    return attributes;
-};
-
-
-exports.unauthorized = function (message) {
-
-    return Boom.unauthorized(message, 'Hawk');
-};
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-.idea
-*.iml
-npm-debug.log
-dump.rdb
-node_modules
-results.tap
-results.xml
-npm-shrinkwrap.json
-config.json
-.DS_Store
-*/.DS_Store
-*/*/.DS_Store
-._*
-*/._*
-*/*/._*
-coverage.*
-lib-cov
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/.travis.yml	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-language: node_js
-
-node_js:
-  - 0.10
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-Copyright (c) 2012-2013, Walmart.
-All rights reserved.
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions are met:
-    * Redistributions of source code must retain the above copyright
-      notice, this list of conditions and the following disclaimer.
-    * Redistributions in binary form must reproduce the above copyright
-      notice, this list of conditions and the following disclaimer in the
-      documentation and/or other materials provided with the distribution.
-    * Neither the name of Walmart nor the
-      names of its contributors may be used to endorse or promote products
-      derived from this software without specific prior written permission.
-
-THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
-WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
-DISCLAIMED. IN NO EVENT SHALL WALMART BE LIABLE FOR ANY
-DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
-(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
-LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
-ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
-SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/Makefile	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,11 +0,0 @@
-test:
-	@node node_modules/lab/bin/lab
-test-cov: 
-	@node node_modules/lab/bin/lab -r threshold -t 100
-test-cov-html:
-	@node node_modules/lab/bin/lab -r html -o coverage.html
-complexity:
-	@node node_modules/complexity-report/src/cli.js -o complexity.md -f markdown lib
-
-.PHONY: test test-cov test-cov-html complexity
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-<a href="https://github.com/spumko"><img src="https://raw.github.com/spumko/spumko/master/images/from.png" align="right" /></a>
-![boom Logo](https://raw.github.com/spumko/boom/master/images/boom.png)
-
-HTTP-friendly error objects
-
-[![Build Status](https://secure.travis-ci.org/spumko/boom.png)](http://travis-ci.org/spumko/boom)
Binary file node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/images/boom.png has changed
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require('./lib');
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/lib/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,207 +0,0 @@
-// Load modules
-
-var Http = require('http');
-var NodeUtil = require('util');
-var Hoek = require('hoek');
-
-
-// Declare internals
-
-var internals = {};
-
-
-exports = module.exports = internals.Boom = function (/* (new Error) or (code, message) */) {
-
-    var self = this;
-
-    Hoek.assert(this.constructor === internals.Boom, 'Error must be instantiated using new');
-
-    Error.call(this);
-    this.isBoom = true;
-
-    this.response = {
-        code: 0,
-        payload: {},
-        headers: {}
-        // type: 'content-type'
-    };
-
-    if (arguments[0] instanceof Error) {
-
-        // Error
-
-        var error = arguments[0];
-
-        this.data = error;
-        this.response.code = error.code || 500;
-        if (error.message) {
-            this.message = error.message;
-        }
-    }
-    else {
-
-        // code, message
-
-        var code = arguments[0];
-        var message = arguments[1];
-
-        Hoek.assert(!isNaN(parseFloat(code)) && isFinite(code) && code >= 400, 'First argument must be a number (400+)');
-
-        this.response.code = code;
-        if (message) {
-            this.message = message;
-        }
-    }
-
-    // Response format
-
-    this.reformat();
-
-    return this;
-};
-
-NodeUtil.inherits(internals.Boom, Error);
-
-
-internals.Boom.prototype.reformat = function () {
-
-    this.response.payload.code = this.response.code;
-    this.response.payload.error = Http.STATUS_CODES[this.response.code] || 'Unknown';
-    if (this.message) {
-        this.response.payload.message = Hoek.escapeHtml(this.message);         // Prevent XSS from error message
-    }
-};
-
-
-// Utilities
-
-internals.Boom.badRequest = function (message) {
-
-    return new internals.Boom(400, message);
-};
-
-
-internals.Boom.unauthorized = function (message, scheme, attributes) {          // Or function (message, wwwAuthenticate[])
-
-    var err = new internals.Boom(401, message);
-
-    if (!scheme) {
-        return err;
-    }
-
-    var wwwAuthenticate = '';
-
-    if (typeof scheme === 'string') {
-
-        // function (message, scheme, attributes)
-
-        wwwAuthenticate = scheme;
-        if (attributes) {
-            var names = Object.keys(attributes);
-            for (var i = 0, il = names.length; i < il; ++i) {
-                if (i) {
-                    wwwAuthenticate += ',';
-                }
-
-                var value = attributes[names[i]];
-                if (value === null ||
-                    value === undefined) {              // Value can be zero
-
-                    value = '';
-                }
-                wwwAuthenticate += ' ' + names[i] + '="' + Hoek.escapeHeaderAttribute(value.toString()) + '"';
-            }
-        }
-
-        if (message) {
-            if (attributes) {
-                wwwAuthenticate += ',';
-            }
-            wwwAuthenticate += ' error="' + Hoek.escapeHeaderAttribute(message) + '"';
-        }
-        else {
-            err.isMissing = true;
-        }
-    }
-    else {
-
-        // function (message, wwwAuthenticate[])
-
-        var wwwArray = scheme;
-        for (var i = 0, il = wwwArray.length; i < il; ++i) {
-            if (i) {
-                wwwAuthenticate += ', ';
-            }
-
-            wwwAuthenticate += wwwArray[i];
-        }
-    }
-
-    err.response.headers['WWW-Authenticate'] = wwwAuthenticate;
-
-    return err;
-};
-
-
-internals.Boom.clientTimeout = function (message) {
-
-    return new internals.Boom(408, message);
-};
-
-
-internals.Boom.serverTimeout = function (message) {
-
-    return new internals.Boom(503, message);
-};
-
-
-internals.Boom.forbidden = function (message) {
-
-    return new internals.Boom(403, message);
-};
-
-
-internals.Boom.notFound = function (message) {
-
-    return new internals.Boom(404, message);
-};
-
-
-internals.Boom.internal = function (message, data) {
-
-    var err = new internals.Boom(500, message);
-
-    if (data && data.stack) {
-        err.trace = data.stack.split('\n');
-        err.outterTrace = Hoek.displayStack(1);
-    }
-    else {
-        err.trace = Hoek.displayStack(1);
-    }
-
-    err.data = data;
-    err.response.payload.message = 'An internal server error occurred';                     // Hide actual error from user
-
-    return err;
-};
-
-
-internals.Boom.passThrough = function (code, payload, contentType, headers) {
-
-    var err = new internals.Boom(500, 'Pass-through');                                      // 500 code is only used to initialize
-
-    err.data = {
-        code: code,
-        payload: payload,
-        type: contentType
-    };
-
-    err.response.code = code;
-    err.response.type = contentType;
-    err.response.headers = headers;
-    err.response.payload = payload;
-
-    return err;
-};
-
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,51 +0,0 @@
-{
-  "name": "boom",
-  "description": "HTTP-friendly error objects",
-  "version": "0.4.2",
-  "author": {
-    "name": "Eran Hammer",
-    "email": "eran@hueniverse.com",
-    "url": "http://hueniverse.com"
-  },
-  "contributors": [],
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/spumko/boom"
-  },
-  "main": "index",
-  "keywords": [
-    "error",
-    "http"
-  ],
-  "engines": {
-    "node": ">=0.8.0"
-  },
-  "dependencies": {
-    "hoek": "0.9.x"
-  },
-  "devDependencies": {
-    "lab": "0.1.x",
-    "complexity-report": "0.x.x"
-  },
-  "scripts": {
-    "test": "make test-cov"
-  },
-  "licenses": [
-    {
-      "type": "BSD",
-      "url": "http://github.com/spumko/boom/raw/master/LICENSE"
-    }
-  ],
-  "readme": "<a href=\"https://github.com/spumko\"><img src=\"https://raw.github.com/spumko/spumko/master/images/from.png\" align=\"right\" /></a>\n![boom Logo](https://raw.github.com/spumko/boom/master/images/boom.png)\n\nHTTP-friendly error objects\n\n[![Build Status](https://secure.travis-ci.org/spumko/boom.png)](http://travis-ci.org/spumko/boom)\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/spumko/boom/issues"
-  },
-  "homepage": "https://github.com/spumko/boom",
-  "_id": "boom@0.4.2",
-  "dist": {
-    "shasum": "f0f4575f078f5fe7a3587dd9f999b6fedd482f4c"
-  },
-  "_from": "boom@0.4.x",
-  "_resolved": "https://registry.npmjs.org/boom/-/boom-0.4.2.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-.idea
-*.iml
-npm-debug.log
-dump.rdb
-node_modules
-results.tap
-results.xml
-npm-shrinkwrap.json
-config.json
-.DS_Store
-*/.DS_Store
-*/*/.DS_Store
-._*
-*/._*
-*/*/._*
-coverage.*
-lib-cov
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/.travis.yml	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-language: node_js
-
-node_js:
-  - 0.10
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-Copyright (c) 2012-2013, Eran Hammer.
-All rights reserved.
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions are met:
-    * Redistributions of source code must retain the above copyright
-      notice, this list of conditions and the following disclaimer.
-    * Redistributions in binary form must reproduce the above copyright
-      notice, this list of conditions and the following disclaimer in the
-      documentation and/or other materials provided with the distribution.
-    * Neither the name of Eran Hammer nor the
-      names of its contributors may be used to endorse or promote products
-      derived from this software without specific prior written permission.
-
-THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
-WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
-DISCLAIMED. IN NO EVENT SHALL ERAN HAMMER BE LIABLE FOR ANY
-DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
-(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
-LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
-ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
-SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/Makefile	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,11 +0,0 @@
-test:
-	@./node_modules/.bin/lab
-test-cov: 
-	@./node_modules/.bin/lab -r threshold -t 100
-test-cov-html:
-	@./node_modules/.bin/lab -r html -o coverage.html
-complexity:
-	@./node_modules/.bin/cr -o complexity.md -f markdown lib
-
-.PHONY: test test-cov test-cov-html complexity
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-cryptiles
-=========
-
-General purpose crypto utilities
-
-[![Build Status](https://secure.travis-ci.org/hueniverse/cryptiles.png)](http://travis-ci.org/hueniverse/cryptiles)
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require('./lib');
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/lib/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,68 +0,0 @@
-// Load modules
-
-var Crypto = require('crypto');
-var Boom = require('boom');
-
-
-// Declare internals
-
-var internals = {};
-
-
-// Generate a cryptographically strong pseudo-random data
-
-exports.randomString = function (size) {
-
-    var buffer = exports.randomBits((size + 1) * 6);
-    if (buffer instanceof Error) {
-        return buffer;
-    }
-
-    var string = buffer.toString('base64').replace(/\+/g, '-').replace(/\//g, '_').replace(/\=/g, '');
-    return string.slice(0, size);
-};
-
-
-exports.randomBits = function (bits) {
-
-    if (!bits ||
-        bits < 0) {
-
-        return Boom.internal('Invalid random bits count');
-    }
-
-    var bytes = Math.ceil(bits / 8);
-    try {
-        return Crypto.randomBytes(bytes);
-    }
-    catch (err) {
-        return Boom.internal('Failed generating random bits: ' + err.message);
-    }
-};
-
-
-// Compare two strings using fixed time algorithm (to prevent time-based analysis of MAC digest match)
-
-exports.fixedTimeComparison = function (a, b) {
-
-    if (typeof a !== 'string' ||
-        typeof b !== 'string') {
-
-        return false;
-    }
-
-    var mismatch = (a.length === b.length ? 0 : 1);
-    if (mismatch) {
-        b = a;
-    }
-
-    for (var i = 0, il = a.length; i < il; ++i) {
-        var ac = a.charCodeAt(i);
-        var bc = b.charCodeAt(i);
-        mismatch += (ac === bc ? 0 : 1);
-    }
-
-    return (mismatch === 0);
-};
-
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,52 +0,0 @@
-{
-  "name": "cryptiles",
-  "description": "General purpose crypto utilities",
-  "version": "0.2.2",
-  "author": {
-    "name": "Eran Hammer",
-    "email": "eran@hueniverse.com",
-    "url": "http://hueniverse.com"
-  },
-  "contributors": [],
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/hueniverse/cryptiles"
-  },
-  "main": "index",
-  "keywords": [
-    "cryptography",
-    "security",
-    "utilites"
-  ],
-  "engines": {
-    "node": ">=0.8.0"
-  },
-  "dependencies": {
-    "boom": "0.4.x"
-  },
-  "devDependencies": {
-    "lab": "0.1.x",
-    "complexity-report": "0.x.x"
-  },
-  "scripts": {
-    "test": "make test-cov"
-  },
-  "licenses": [
-    {
-      "type": "BSD",
-      "url": "http://github.com/hueniverse/cryptiles/raw/master/LICENSE"
-    }
-  ],
-  "readme": "cryptiles\n=========\n\nGeneral purpose crypto utilities\n\n[![Build Status](https://secure.travis-ci.org/hueniverse/cryptiles.png)](http://travis-ci.org/hueniverse/cryptiles)\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/hueniverse/cryptiles/issues"
-  },
-  "homepage": "https://github.com/hueniverse/cryptiles",
-  "_id": "cryptiles@0.2.2",
-  "dist": {
-    "shasum": "f063b421b587cb91b0738de6db92e02b9fc30923"
-  },
-  "_from": "cryptiles@0.2.x",
-  "_resolved": "https://registry.npmjs.org/cryptiles/-/cryptiles-0.2.2.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-.idea
-*.iml
-npm-debug.log
-dump.rdb
-node_modules
-results.tap
-results.xml
-npm-shrinkwrap.json
-config.json
-.DS_Store
-*/.DS_Store
-*/*/.DS_Store
-._*
-*/._*
-*/*/._*
-coverage.*
-lib-cov
-complexity.md
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/.travis.yml	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-language: node_js
-
-node_js:
-  - 0.10
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,33 +0,0 @@
-Copyright (c) 2011-2013, Walmart.
-All rights reserved.
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions are met:
-    * Redistributions of source code must retain the above copyright
-      notice, this list of conditions and the following disclaimer.
-    * Redistributions in binary form must reproduce the above copyright
-      notice, this list of conditions and the following disclaimer in the
-      documentation and/or other materials provided with the distribution.
-    * Neither the name of Walmart nor the
-      names of its contributors may be used to endorse or promote products
-      derived from this software without specific prior written permission.
-
-THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
-WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
-DISCLAIMED. IN NO EVENT SHALL WALMART BE LIABLE FOR ANY
-DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
-(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
-LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
-ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
-SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-
-
-                                  *   *   *
-
-
-Portions of this project were initially based on Postmile, Copyright (c) 2011, Yahoo Inc.
-Postmile is published at https://github.com/yahoo/postmile and its licensing terms are
-published at https://github.com/yahoo/postmile/blob/master/LICENSE.
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/Makefile	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,10 +0,0 @@
-test:
-	@node node_modules/lab/bin/lab
-test-cov: 
-	@node node_modules/lab/bin/lab -r threshold -t 100
-test-cov-html:
-	@node node_modules/lab/bin/lab -r html -o coverage.html
-complexity:
-	@node node_modules/complexity-report/src/cli.js -o complexity.md -f markdown lib
-
-.PHONY: test test-cov test-cov-html complexity
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,436 +0,0 @@
-<a href="https://github.com/spumko"><img src="https://raw.github.com/spumko/spumko/master/images/from.png" align="right" /></a>
-![hoek Logo](https://raw.github.com/spumko/hoek/master/images/hoek.png)
-
-General purpose node utilities
-
-[![Build Status](https://secure.travis-ci.org/spumko/hoek.png)](http://travis-ci.org/spumko/hoek)
-
-# Table of Contents
-
-* [Introduction](#introduction "Introduction")
-* [Object](#object "Object")
-  * [clone](#cloneobj "clone")
-  * [merge](#mergetarget-source-isnulloverride-ismergearrays "merge")
-  * [applyToDefaults](#applytodefaultsdefaults-options "applyToDefaults")
-  * [unique](#uniquearray-key "unique")
-  * [mapToObject](#maptoobjectarray-key "mapToObject")
-  * [intersect](#intersectarray1-array2 "intersect")
-  * [matchKeys](#matchkeysobj-keys "matchKeys")
-  * [flatten](#flattenarray-target "flatten")
-  * [removeKeys](#removekeysobject-keys "removeKeys")
-  * [reach](#reachobj-chain "reach")
-  * [inheritAsync](#inheritasyncself-obj-keys "inheritAsync")
-  * [rename](#renameobj-from-to "rename")
-* [Timer](#timer "Timer")
-* [Binary Encoding/Decoding](#binary "Binary Encoding/Decoding")
-  * [base64urlEncode](#binary64urlEncodevalue "binary64urlEncode")
-  * [base64urlDecode](#binary64urlDecodevalue "binary64urlDecode")
-* [Escaping Characters](#escaped "Escaping Characters")
-  * [escapeHtml](#escapeHtmlstring "escapeHtml")
-  * [escapeHeaderAttribute](#escapeHeaderAttributeattribute "escapeHeaderAttribute")
-  * [escapeRegex](#escapeRegexstring "escapeRegex")
-* [Errors](#errors "Errors")
-  * [assert](#assertmessage "assert")
-  * [abort](#abortmessage "abort")
-  * [displayStack](#displayStackslice "displayStack")
-  * [callStack](#callStackslice "callStack")
-  * [toss](#tosscondition "toss")
-* [Load files](#load-files "Load Files")
-  * [loadPackage](#loadPackagedir "loadpackage")
-  * [loadDirModules](#loadDirModulespath-excludefiles-target "loaddirmodules")
-
-
-
-# Introduction
-
-The *Hoek* general purpose node utilities library aids development in a variety of ways. It comes with useful methods for Arrays (clone, merge, applyToDefaults), Objects (removeKeys, copy), asserting, and more.
-
-For example, to use Hoek to set configuration with default options:
-```javascript
-var Hoek = require('hoek');
-
-var defaults = {url : "www.github.com", port : "8000", debug : true}
-
-var config = Hoek.applyToDefaults(defaults, {port : "3000", admin : true});
-
-// In this case, config would be { url: 'www.github.com', port: '3000', debug: true, admin: true }
-```
-
-Under each of the sections below (such as Object), there are subsections that correspond to Hoek methods; each explains how to use the corresponding method. In each js excerpt below, the line `var Hoek = require('hoek')` is omitted for brevity.
-
-## Object
-
-Hoek provides several helpful methods for objects and arrays.
-
-### clone(obj)
-
-This method is used to clone an object or an array. A *deep copy* is made (duplicates everything, including values that are objects). 
-
-```javascript
-
-var nestedObj = {
-        w: /^something$/ig,
-        x: {
-            a: [1, 2, 3],
-            b: 123456,
-            c: new Date()
-        },
-        y: 'y',
-        z: new Date()
-    };
-
-var copy = Hoek.clone(nestedObj);
-
-copy.x.b = 100;
-
-console.log(copy.y)        // results in 'y'
-console.log(nestedObj.x.b) // results in 123456
-console.log(copy.x.b)      // results in 100
-```
-
-### merge(target, source, isNullOverride, isMergeArrays)
-isNullOverride, isMergeArrays default to true
-
-Merges all the properties of source into target; source wins in conflicts, and by default null and undefined values from source are applied
-
-
-```javascript
-
-var target = {a: 1, b : 2}
-var source = {a: 0, c: 5}
-var source2 = {a: null, c: 5}
-
-var targetArray = [1, 2, 3];
-var sourceArray = [4, 5];
-
-var newTarget = Hoek.merge(target, source);     // results in {a: 0, b: 2, c: 5}
-newTarget = Hoek.merge(target, source2);        // results in {a: null, b: 2, c: 5}
-newTarget = Hoek.merge(target, source2, false); // results in {a: 1, b: 2, c: 5}
-
-newTarget = Hoek.merge(targetArray, sourceArray)              // results in [1, 2, 3, 4, 5]
-newTarget = Hoek.merge(targetArray, sourceArray, true, false) // results in [4, 5]
-
-
-
-
-```
-
-### applyToDefaults(defaults, options)
-
-Apply options to a copy of the defaults
-
-```javascript
-
-var defaults = {host: "localhost", port: 8000};
-var options = {port: 8080};
-
-var config = Hoek.applyToDefaults(defaults, options); // results in {host: "localhost", port: 8080};
-
-
-```
-
-### unique(array, key)
-
-Remove duplicate items from Array
-
-```javascript
-
-var array = [1, 2, 2, 3, 3, 4, 5, 6];
-
-var newArray = Hoek.unique(array); // results in [1,2,3,4,5,6];
-
-array = [{id: 1}, {id: 1}, {id: 2}];
-
-newArray = Hoek.unique(array, "id") // results in [{id: 1}, {id: 2}]
-
-```
-
-### mapToObject(array, key)
-
-Convert an Array into an Object
-
-```javascript
-
-var array = [1,2,3];
-var newObject = Hoek.mapToObject(array); // results in {"1": true, "2": true, "3": true}
-
-array = [{id: 1}, {id: 2}];
-newObject = Hoek.mapToObject(array, "id") // results in {"1": true, "2": true}
-
-```
-### intersect(array1, array2)
-
-Find the common unique items in two arrays
-
-```javascript
-
-var array1 = [1, 2, 3];
-var array2 = [1, 4, 5];
-
-var newArray = Hoek.intersect(array1, array2) // results in [1]
-
-```
-
-### matchKeys(obj, keys) 
-
-Find which keys are present
-
-```javascript
-
-var obj = {a: 1, b: 2, c: 3};
-var keys = ["a", "e"];
-
-Hoek.matchKeys(obj, keys) // returns ["a"]
-
-```
-
-### flatten(array, target)
-
-Flatten an array
-
-```javascript
-
-var array = [1, 2, 3];
-var target = [4, 5]; 
-
-var flattenedArray = Hoek.flatten(array, target) // results in [4, 5, 1, 2, 3];
-
-```
-
-### removeKeys(object, keys)
-
-Remove keys
-
-```javascript
-
-var object = {a: 1, b: 2, c: 3, d: 4};
-
-var keys = ["a", "b"];
-
-Hoek.removeKeys(object, keys) // object is now {c: 3, d: 4}
-
-```
-
-### reach(obj, chain)
-
-Converts an object key chain string into the value it references
-
-```javascript
-
-var chain = 'a.b.c';
-var obj = {a : {b : { c : 1}}};
-
-Hoek.reach(obj, chain) // returns 1
-
-```
-
-### inheritAsync(self, obj, keys) 
-
-Inherits a selected set of methods from an object, wrapping functions in asynchronous syntax and catching errors
-
-```javascript
-
-var targetFunc = function () { };
-
-var proto = {
-                a: function () {
-                    return 'a!';
-                },
-                b: function () {
-                    return 'b!';
-                },
-                c: function () {
-                    throw new Error('c!');
-                }
-            };
-
-var keys = ['a', 'c'];
-
-Hoek.inheritAsync(targetFunc, proto, ['a', 'c']);
-
-var target = new targetFunc();
-
-target.a(function (err, result) { console.log(result); });  // logs 'a!'
-
-target.c(function (err, result) { console.log(result); });  // logs undefined (the thrown error is passed as err)
-
-target.b(function (err, result) { console.log(result); });  // TypeError: object has no method 'b'
-
-```
-
-### rename(obj, from, to)
-
-Rename a key of an object
-
-```javascript
-
-var obj = {a : 1, b : 2};
-
-Hoek.rename(obj, "a", "c");     // obj is now {c : 1, b : 2}
-
-```
-
-
-# Timer
-
-A Timer object. Initializing a new timer object sets the ts to the number of milliseconds elapsed since 1 January 1970 00:00:00 UTC.
-
-```javascript
-
-
-
-
-
-var timerObj = new Hoek.Timer();
-console.log("Time is now: " + timerObj.ts)
-console.log("Elapsed time from initialization: " + timerObj.elapsed() + ' milliseconds')
-
-```
-
-# Binary Encoding/Decoding
-
-### base64urlEncode(value)
-
-Encodes value into a URL-safe Base64 (base64url) string
-
-### base64urlDecode(value)
-
-Decodes a URL-safe Base64 (base64url) string.
-# Escaping Characters
-
-Hoek provides convenient methods for escaping HTML characters. The escaped characters are as follows:
-
-```javascript
-
-internals.htmlEscaped = {
-    '&': '&amp;',
-    '<': '&lt;',
-    '>': '&gt;',
-    '"': '&quot;',
-    "'": '&#x27;',
-    '`': '&#x60;'
-};
-
-```
-
-### escapeHtml(string)
-
-```javascript
-
-var string = '<html> hey </html>';
-var escapedString = Hoek.escapeHtml(string); // returns &lt;html&gt; hey &lt;/html&gt;
-
-```
-
-### escapeHeaderAttribute(attribute)
-
-Escape attribute value for use in HTTP header
-
-```javascript
-
-var a = Hoek.escapeHeaderAttribute('I said "go w\\o me"');  //returns I said \"go w\\o me\"
-
-
-```
-
-
-### escapeRegex(string)
-
-Escape string for Regex construction
-
-```javascript
-
-var a = Hoek.escapeRegex('4^f$s.4*5+-_?%=#!:@|~\\/`"(>)[<]d{}s,');  // returns 4\^f\$s\.4\*5\+\-_\?%\=#\!\:@\|~\\\/`"\(>\)\[<\]d\{\}s\,
-
-
-
-```
-
-# Errors
-
-### assert(message)
-
-```javascript
-
-var a = 1, b =2;
-
-Hoek.assert(a === b, 'a should equal b');  // ABORT: a should equal b
-
-```
-
-### abort(message)
-
-First checks whether process.env.NODE_ENV === 'test' and, if so, throws the error message. Otherwise,
-displays the most recent stack and then exits the process.
-
-
-
-### displayStack(slice)
-
-Displays the trace stack
-
-```javascript
-
-var stack = Hoek.displayStack();
-console.log(stack) // returns something like:
-
-[ 'null (/Users/user/Desktop/hoek/test.js:4:18)',
-  'Module._compile (module.js:449:26)',
-  'Module._extensions..js (module.js:467:10)',
-  'Module.load (module.js:356:32)',
-  'Module._load (module.js:312:12)',
-  'Module.runMain (module.js:492:10)',
-  'startup.processNextTick.process._tickCallback (node.js:244:9)' ]
-
-```
-
-### callStack(slice)
-
-Returns a trace stack array.
-
-```javascript
-
-var stack = Hoek.callStack();
-console.log(stack)  // returns something like:
-
-[ [ '/Users/user/Desktop/hoek/test.js', 4, 18, null, false ],
-  [ 'module.js', 449, 26, 'Module._compile', false ],
-  [ 'module.js', 467, 10, 'Module._extensions..js', false ],
-  [ 'module.js', 356, 32, 'Module.load', false ],
-  [ 'module.js', 312, 12, 'Module._load', false ],
-  [ 'module.js', 492, 10, 'Module.runMain', false ],
-  [ 'node.js',
-    244,
-    9,
-    'startup.processNextTick.process._tickCallback',
-    false ] ]
-
-
-```
-
-### toss(condition)
-
-toss(condition /*, [message], callback */)
-
-Return an error as first argument of a callback
-
-
-# Load Files
-
-### loadPackage(dir)
-
-Loads and parses package.json from the process root or a given directory
-
-```javascript
-
-var pack = Hoek.loadPackage();  // pack.name === 'hoek'
-
-```
-
-### loadDirModules(path, excludeFiles, target) 
-
-Loads modules from a given path, with an option to exclude files (array).
-
-
-
-
Binary file node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/images/hoek.png has changed
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require('./lib');
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/lib/escape.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,132 +0,0 @@
-// Declare internals
-
-var internals = {};
-
-
-exports.escapeJavaScript = function (input) {
-
-    if (!input) {
-        return '';
-    }
-
-    var escaped = '';
-
-    for (var i = 0, il = input.length; i < il; ++i) {
-
-        var charCode = input.charCodeAt(i);
-
-        if (internals.isSafe(charCode)) {
-            escaped += input[i];
-        }
-        else {
-            escaped += internals.escapeJavaScriptChar(charCode);
-        }
-    }
-
-    return escaped;
-};
-
-
-exports.escapeHtml = function (input) {
-
-    if (!input) {
-        return '';
-    }
-
-    var escaped = '';
-
-    for (var i = 0, il = input.length; i < il; ++i) {
-
-        var charCode = input.charCodeAt(i);
-
-        if (internals.isSafe(charCode)) {
-            escaped += input[i];
-        }
-        else {
-            escaped += internals.escapeHtmlChar(charCode);
-        }
-    }
-
-    return escaped;
-};
-
-
-internals.escapeJavaScriptChar = function (charCode) {
-
-    if (charCode >= 256) {
-        return '\\u' + internals.padLeft(charCode.toString(16), 4);
-    }
-
-    var hexValue = new Buffer(String.fromCharCode(charCode), 'ascii').toString('hex');
-    return '\\x' + internals.padLeft(hexValue, 2);
-};
-
-
-internals.escapeHtmlChar = function (charCode) {
-
-    var namedEscape = internals.namedHtml[charCode];
-    if (typeof namedEscape !== 'undefined') {
-        return namedEscape;
-    }
-
-    if (charCode >= 256) {
-        return '&#' + charCode + ';';
-    }
-
-    var hexValue = new Buffer(String.fromCharCode(charCode), 'ascii').toString('hex');
-    return '&#x' + internals.padLeft(hexValue, 2) + ';';
-};
-
-
-internals.padLeft = function (str, len) {
-
-    while (str.length < len) {
-        str = '0' + str;
-    }
-
-    return str;
-};
-
-
-internals.isSafe = function (charCode) {
-
-    return (typeof internals.safeCharCodes[charCode] !== 'undefined');
-};
-
-
-internals.namedHtml = {
-    '38': '&amp;',
-    '60': '&lt;',
-    '62': '&gt;',
-    '34': '&quot;',
-    '160': '&nbsp;',
-    '162': '&cent;',
-    '163': '&pound;',
-    '164': '&curren;',
-    '169': '&copy;',
-    '174': '&reg;'
-};
-
-
-internals.safeCharCodes = (function () {
-
-    var safe = {};
-
-    for (var i = 32; i < 123; ++i) {
-
-        if ((i >= 97 && i <= 122) ||         // a-z
-            (i >= 65 && i <= 90) ||          // A-Z
-            (i >= 48 && i <= 57) ||          // 0-9
-            i === 32 ||                      // space
-            i === 46 ||                      // .
-            i === 44 ||                      // ,
-            i === 45 ||                      // -
-            i === 58 ||                      // :
-            i === 95) {                      // _
-
-            safe[i] = null;
-        }
-    }
-
-    return safe;
-}());
\ No newline at end of file
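The deleted escape.js above whitelists known-safe character codes and entity-encodes everything else. A compact sketch of that whitelist-then-encode approach (an illustration of the technique, not the removed module):

```javascript
// Whitelist-based HTML escaping in the style of the deleted escape.js:
// characters outside a known-safe set are replaced with entities,
// preferring a named entity where one is defined.
var namedHtml = { 38: '&amp;', 60: '&lt;', 62: '&gt;', 34: '&quot;' };

function escapeHtml(input) {
    var escaped = '';
    for (var i = 0; i < input.length; ++i) {
        var code = input.charCodeAt(i);
        if (/[a-zA-Z0-9 .,:\-_]/.test(input[i])) {
            escaped += input[i];                          // safe as-is
        }
        else if (namedHtml[code]) {
            escaped += namedHtml[code];                   // named entity
        }
        else {
            escaped += '&#x' + code.toString(16) + ';';   // hex entity
        }
    }
    return escaped;
}

console.log(escapeHtml('<b>hi</b>')); // prints "&lt;b&gt;hi&lt;&#x2f;b&gt;"
```

Note that, as in the original, `/` is not in the safe set, so it comes out as a hex entity rather than a literal character.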
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/lib/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,585 +0,0 @@
-// Load modules
-
-var Fs = require('fs');
-var Escape = require('./escape');
-
-
-// Declare internals
-
-var internals = {};
-
-
-// Clone object or array
-
-exports.clone = function (obj, seen) {
-
-    if (typeof obj !== 'object' ||
-        obj === null) {
-
-        return obj;
-    }
-
-    seen = seen || { orig: [], copy: [] };
-
-    var lookup = seen.orig.indexOf(obj);
-    if (lookup !== -1) {
-        return seen.copy[lookup];
-    }
-
-    var newObj = (obj instanceof Array) ? [] : {};
-
-    seen.orig.push(obj);
-    seen.copy.push(newObj);
-
-    for (var i in obj) {
-        if (obj.hasOwnProperty(i)) {
-            if (obj[i] instanceof Buffer) {
-                newObj[i] = new Buffer(obj[i]);
-            }
-            else if (obj[i] instanceof Date) {
-                newObj[i] = new Date(obj[i].getTime());
-            }
-            else if (obj[i] instanceof RegExp) {
-                var flags = '' + (obj[i].global ? 'g' : '') + (obj[i].ignoreCase ? 'i' : '') + (obj[i].multiline ? 'm' : '');
-                newObj[i] = new RegExp(obj[i].source, flags);
-            }
-            else {
-                newObj[i] = exports.clone(obj[i], seen);
-            }
-        }
-    }
-
-    return newObj;
-};
-
-
-// Merge all the properties of source into target; source wins in conflicts, and by default null and undefined from source are applied
-
-exports.merge = function (target, source, isNullOverride /* = true */, isMergeArrays /* = true */) {
-
-    exports.assert(target && typeof target == 'object', 'Invalid target value: must be an object');
-    exports.assert(source === null || source === undefined || typeof source === 'object', 'Invalid source value: must be null, undefined, or an object');
-
-    if (!source) {
-        return target;
-    }
-
-    if (source instanceof Array) {
-        exports.assert(target instanceof Array, 'Cannot merge array onto an object');
-        if (isMergeArrays === false) {                                                  // isMergeArrays defaults to true
-            target.length = 0;                                                          // Must not change target assignment
-        }
-
-        for (var i = 0, il = source.length; i < il; ++i) {
-            target.push(source[i]);
-        }
-
-        return target;
-    }
-
-    var keys = Object.keys(source);
-    for (var k = 0, kl = keys.length; k < kl; ++k) {
-        var key = keys[k];
-        var value = source[key];
-        if (value &&
-            typeof value === 'object') {
-
-            if (!target[key] ||
-                typeof target[key] !== 'object') {
-
-                target[key] = exports.clone(value);
-            }
-            else {
-                exports.merge(target[key], source[key], isNullOverride, isMergeArrays);
-            }
-        }
-        else {
-            if (value !== null && value !== undefined) {            // Explicit to preserve empty strings
-                target[key] = value;
-            }
-            else if (isNullOverride !== false) {                    // Defaults to true
-                target[key] = value;
-            }
-        }
-    }
-
-    return target;
-};
-
-
-// Apply options to a copy of the defaults
-
-exports.applyToDefaults = function (defaults, options) {
-
-    exports.assert(defaults && typeof defaults == 'object', 'Invalid defaults value: must be an object');
-    exports.assert(!options || options === true || typeof options === 'object', 'Invalid options value: must be true, falsy or an object');
-
-    if (!options) {                                                 // If no options, return null
-        return null;
-    }
-
-    var copy = exports.clone(defaults);
-
-    if (options === true) {                                         // If options is set to true, use defaults
-        return copy;
-    }
-
-    return exports.merge(copy, options, false, false);
-};
-
-
-// Remove duplicate items from array
-
-exports.unique = function (array, key) {
-
-    var index = {};
-    var result = [];
-
-    for (var i = 0, il = array.length; i < il; ++i) {
-        var id = (key ? array[i][key] : array[i]);
-        if (index[id] !== true) {
-
-            result.push(array[i]);
-            index[id] = true;
-        }
-    }
-
-    return result;
-};
-
-
-// Convert array into object
-
-exports.mapToObject = function (array, key) {
-
-    if (!array) {
-        return null;
-    }
-
-    var obj = {};
-    for (var i = 0, il = array.length; i < il; ++i) {
-        if (key) {
-            if (array[i][key]) {
-                obj[array[i][key]] = true;
-            }
-        }
-        else {
-            obj[array[i]] = true;
-        }
-    }
-
-    return obj;
-};
-
-
-// Find the common unique items in two arrays
-
-exports.intersect = function (array1, array2, justFirst) {
-
-    if (!array1 || !array2) {
-        return [];
-    }
-
-    var common = [];
-    var hash = (array1 instanceof Array ? exports.mapToObject(array1) : array1);
-    var found = {};
-    for (var i = 0, il = array2.length; i < il; ++i) {
-        if (hash[array2[i]] && !found[array2[i]]) {
-            if (justFirst) {
-                return array2[i];
-            }
-
-            common.push(array2[i]);
-            found[array2[i]] = true;
-        }
-    }
-
-    return (justFirst ? null : common);
-};
-
-
-// Find which keys are present
-
-exports.matchKeys = function (obj, keys) {
-
-    var matched = [];
-    for (var i = 0, il = keys.length; i < il; ++i) {
-        if (obj.hasOwnProperty(keys[i])) {
-            matched.push(keys[i]);
-        }
-    }
-    return matched;
-};
-
-
-// Flatten array
-
-exports.flatten = function (array, target) {
-
-    var result = target || [];
-
-    for (var i = 0, il = array.length; i < il; ++i) {
-        if (Array.isArray(array[i])) {
-            exports.flatten(array[i], result);
-        }
-        else {
-            result.push(array[i]);
-        }
-    }
-
-    return result;
-};
-
-
-// Remove keys
-
-exports.removeKeys = function (object, keys) {
-
-    for (var i = 0, il = keys.length; i < il; i++) {
-        delete object[keys[i]];
-    }
-};
-
-
-// Convert an object key chain string ('a.b.c') to reference (object[a][b][c])
-
-exports.reach = function (obj, chain) {
-
-    var path = chain.split('.');
-    var ref = obj;
-    for (var i = 0, il = path.length; i < il; ++i) {
-        if (ref) {
-            ref = ref[path[i]];
-        }
-    }
-
-    return ref;
-};
-
-
-// Inherits a selected set of methods from an object, wrapping functions in asynchronous syntax and catching errors
-
-exports.inheritAsync = function (self, obj, keys) {
-
-    keys = keys || null;
-
-    for (var i in obj) {
-        if (obj.hasOwnProperty(i)) {
-            if (keys instanceof Array &&
-                keys.indexOf(i) < 0) {
-
-                continue;
-            }
-
-            self.prototype[i] = (function (fn) {
-
-                return function (next) {
-
-                    var result = null;
-                    try {
-                        result = fn();
-                    }
-                    catch (err) {
-                        return next(err);
-                    }
-
-                    return next(null, result);
-                };
-            })(obj[i]);
-        }
-    }
-};
-
-
-exports.formatStack = function (stack) {
-
-    var trace = [];
-    for (var i = 0, il = stack.length; i < il; ++i) {
-        var item = stack[i];
-        trace.push([item.getFileName(), item.getLineNumber(), item.getColumnNumber(), item.getFunctionName(), item.isConstructor()]);
-    }
-
-    return trace;
-};
-
-
-exports.formatTrace = function (trace) {
-
-    var display = [];
-
-    for (var i = 0, il = trace.length; i < il; ++i) {
-        var row = trace[i];
-        display.push((row[4] ? 'new ' : '') + row[3] + ' (' + row[0] + ':' + row[1] + ':' + row[2] + ')');
-    }
-
-    return display;
-};
-
-
-exports.callStack = function (slice) {
-
-    // http://code.google.com/p/v8/wiki/JavaScriptStackTraceApi
-
-    var v8 = Error.prepareStackTrace;
-    Error.prepareStackTrace = function (err, stack) {
-
-        return stack;
-    };
-
-    var capture = {};
-    Error.captureStackTrace(capture, arguments.callee);
-    var stack = capture.stack;
-
-    Error.prepareStackTrace = v8;
-
-    var trace = exports.formatStack(stack);
-
-    if (slice) {
-        return trace.slice(slice);
-    }
-
-    return trace;
-};
-
-
-exports.displayStack = function (slice) {
-
-    var trace = exports.callStack(slice === undefined ? 1 : slice + 1);
-
-    return exports.formatTrace(trace);
-};
-
-
-exports.abortThrow = false;
-
-
-exports.abort = function (message, hideStack) {
-
-    if (process.env.NODE_ENV === 'test' || exports.abortThrow === true) {
-        throw new Error(message || 'Unknown error');
-    }
-
-    var stack = '';
-    if (!hideStack) {
-        stack = exports.displayStack(1).join('\n\t');
-    }
-    console.log('ABORT: ' + message + '\n\t' + stack);
-    process.exit(1);
-};
-
-
-exports.assert = function (condition /*, msg1, msg2, msg3 */) {
-
-    if (condition) {
-        return;
-    }
-
-    var msgs = Array.prototype.slice.call(arguments, 1);
-    msgs = msgs.map(function (msg) {
-
-        return typeof msg === 'string' ? msg : msg instanceof Error ? msg.message : JSON.stringify(msg);
-    });
-    throw new Error(msgs.join(' ') || 'Unknown error');
-};
-
-
-exports.loadDirModules = function (path, excludeFiles, target) {      // target(filename, name, capName)
-
-    var exclude = {};
-    for (var i = 0, il = excludeFiles.length; i < il; ++i) {
-        exclude[excludeFiles[i] + '.js'] = true;
-    }
-
-    var files = Fs.readdirSync(path);
-    for (i = 0, il = files.length; i < il; ++i) {
-        var filename = files[i];
-        if (/\.js$/.test(filename) &&
-            !exclude[filename]) {
-
-            var name = filename.substr(0, filename.lastIndexOf('.'));
-            var capName = name.charAt(0).toUpperCase() + name.substr(1).toLowerCase();
-
-            if (typeof target !== 'function') {
-                target[capName] = require(path + '/' + name);
-            }
-            else {
-                target(path + '/' + name, name, capName);
-            }
-        }
-    }
-};
-
-
-exports.rename = function (obj, from, to) {
-
-    obj[to] = obj[from];
-    delete obj[from];
-};
-
-
-exports.Timer = function () {
-
-    this.reset();
-};
-
-
-exports.Timer.prototype.reset = function () {
-
-    this.ts = Date.now();
-};
-
-
-exports.Timer.prototype.elapsed = function () {
-
-    return Date.now() - this.ts;
-};
-
-
-// Load and parse package.json from the process root or a given directory
-
-exports.loadPackage = function (dir) {
-
-    var result = {};
-    var filepath = (dir || process.env.PWD) + '/package.json';
-    if (Fs.existsSync(filepath)) {
-        try {
-            result = JSON.parse(Fs.readFileSync(filepath));
-        }
-        catch (e) { }
-    }
-
-    return result;
-};
-
-
-// Escape string for Regex construction
-
-exports.escapeRegex = function (string) {
-
-    // Escape ^$.*+-?=!:|\/()[]{},
-    return string.replace(/[\^\$\.\*\+\-\?\=\!\:\|\\\/\(\)\[\]\{\}\,]/g, '\\$&');
-};
-
-
-// Return an error as first argument of a callback
-
-exports.toss = function (condition /*, [message], next */) {
-
-    var message = (arguments.length === 3 ? arguments[1] : '');
-    var next = (arguments.length === 3 ? arguments[2] : arguments[1]);
-
-    var err = (message instanceof Error ? message : (message ? new Error(message) : (condition instanceof Error ? condition : new Error())));
-
-    if (condition instanceof Error ||
-        !condition) {
-
-        return next(err);
-    }
-};
-
-
-// Base64url (RFC 4648) encode
-
-exports.base64urlEncode = function (value) {
-
-    return (new Buffer(value, 'binary')).toString('base64').replace(/\+/g, '-').replace(/\//g, '_').replace(/\=/g, '');
-};
-
-
-// Base64url (RFC 4648) decode
-
-exports.base64urlDecode = function (encoded) {
-
-    if (encoded &&
-        !encoded.match(/^[\w\-]*$/)) {
-
-        return new Error('Invalid character');
-    }
-
-    try {
-        return (new Buffer(encoded.replace(/-/g, '+').replace(/_/g, '/'), 'base64')).toString('binary');
-    }
-    catch (err) {
-        return err;
-    }
-};
-
-
-// Escape attribute value for use in HTTP header
-
-exports.escapeHeaderAttribute = function (attribute) {
-
-    // Allowed value characters: !#$%&'()*+,-./:;<=>?@[]^_`{|}~ and space, a-z, A-Z, 0-9, \, "
-
-    exports.assert(attribute.match(/^[ \w\!#\$%&'\(\)\*\+,\-\.\/\:;<\=>\?@\[\]\^`\{\|\}~\"\\]*$/), 'Bad attribute value (' + attribute + ')');
-
-    return attribute.replace(/\\/g, '\\\\').replace(/\"/g, '\\"');                             // Escape quotes and slash
-};
-
-
-exports.escapeHtml = function (string) {
-
-    return Escape.escapeHtml(string);
-};
-
-
-exports.escapeJavaScript = function (string) {
-
-    return Escape.escapeJavaScript(string);
-};
-
-
-/*
-var event = {
-    timestamp: now.getTime(),
-    tags: ['tag'],
-    data: { some: 'data' }
-};
-*/
-
-exports.consoleFunc = console.log;
-
-exports.printEvent = function (event) {
-
-    var pad = function (value) {
-
-        return (value < 10 ? '0' : '') + value;
-    };
-
-    var now = new Date(event.timestamp);
-    var timestring = (now.getYear() - 100).toString() +
-        pad(now.getMonth() + 1) +
-        pad(now.getDate()) +
-        '/' +
-        pad(now.getHours()) +
-        pad(now.getMinutes()) +
-        pad(now.getSeconds()) +
-        '.' +
-        now.getMilliseconds();
-
-    var data = event.data;
-    if (typeof event.data !== 'string') {
-        try {
-            data = JSON.stringify(event.data);
-        }
-        catch (e) {
-            data = 'JSON Error: ' + e.message;
-        }
-    }
-
-    var output = timestring + ', ' + event.tags[0] + ', ' + data;
-    exports.consoleFunc(output);
-};
-
-
-exports.nextTick = function (callback) {
-
-    return function () {
-
-        var args = arguments;
-        process.nextTick(function () {
-
-            callback.apply(null, args);
-        });
-    };
-};
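The `base64urlEncode`/`base64urlDecode` pair in the deleted index.js implements RFC 4648 base64url by swapping `+`/`/` for `-`/`_` and dropping `=` padding. A small self-contained sketch of the same round trip (using the non-deprecated `Buffer.from` rather than the `new Buffer` constructor the 2013 code relied on):

```javascript
// Sketch of the base64url (RFC 4648) encode/decode pair: standard base64
// with '+' -> '-', '/' -> '_' and trailing '=' padding removed.
function base64urlEncode(value) {
    return Buffer.from(value, 'binary').toString('base64')
        .replace(/\+/g, '-').replace(/\//g, '_').replace(/=/g, '');
}

function base64urlDecode(encoded) {
    if (encoded && !/^[\w\-]*$/.test(encoded)) {
        return new Error('Invalid character');   // same error-return style as hoek
    }
    return Buffer.from(encoded.replace(/-/g, '+').replace(/_/g, '/'), 'base64')
        .toString('binary');
}

console.log(base64urlDecode(base64urlEncode('hello?>'))); // prints "hello?>"
```

Node's base64 decoder tolerates the missing padding, so the stripped `=` characters do not need to be restored before decoding.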
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-{
-  "name": "hoek",
-  "description": "General purpose node utilities",
-  "version": "0.9.1",
-  "author": {
-    "name": "Eran Hammer",
-    "email": "eran@hueniverse.com",
-    "url": "http://hueniverse.com"
-  },
-  "contributors": [
-    {
-      "name": "Van Nguyen",
-      "email": "the.gol.effect@gmail.com"
-    }
-  ],
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/spumko/hoek"
-  },
-  "main": "index",
-  "keywords": [
-    "utilities"
-  ],
-  "engines": {
-    "node": ">=0.8.0"
-  },
-  "dependencies": {},
-  "devDependencies": {
-    "lab": "0.1.x",
-    "complexity-report": "0.x.x"
-  },
-  "scripts": {
-    "test": "make test-cov"
-  },
-  "licenses": [
-    {
-      "type": "BSD",
-      "url": "http://github.com/spumko/hoek/raw/master/LICENSE"
-    }
-  ],
-  "readme": "<a href=\"https://github.com/spumko\"><img src=\"https://raw.github.com/spumko/spumko/master/images/from.png\" align=\"right\" /></a>\r\n![hoek Logo](https://raw.github.com/spumko/hoek/master/images/hoek.png)\r\n\r\nGeneral purpose node utilities\r\n\r\n[![Build Status](https://secure.travis-ci.org/spumko/hoek.png)](http://travis-ci.org/spumko/hoek)\r\n\r\n# Table of Contents\r\n\r\n* [Introduction](#introduction \"Introduction\")\r\n* [Object](#object \"Object\")\r\n  * [clone](#cloneobj \"clone\")\r\n  * [merge](#mergetarget-source-isnulloverride-ismergearrays \"merge\")\r\n  * [applyToDefaults](#applytodefaultsdefaults-options \"applyToDefaults\")\r\n  * [unique](#uniquearray-key \"unique\")\r\n  * [mapToObject](#maptoobjectarray-key \"mapToObject\")\r\n  * [intersect](#intersectarray1-array2 \"intersect\")\r\n  * [matchKeys](#matchkeysobj-keys \"matchKeys\")\r\n  * [flatten](#flattenarray-target \"flatten\")\r\n  * [removeKeys](#removekeysobject-keys \"removeKeys\")\r\n  * [reach](#reachobj-chain \"reach\")\r\n  * [inheritAsync](#inheritasyncself-obj-keys \"inheritAsync\")\r\n  * [rename](#renameobj-from-to \"rename\")\r\n* [Timer](#timer \"Timer\")\r\n* [Binary Encoding/Decoding](#binary \"Binary Encoding/Decoding\")\r\n  * [base64urlEncode](#binary64urlEncodevalue \"binary64urlEncode\")\r\n  * [base64urlDecode](#binary64urlDecodevalue \"binary64urlDecode\")\r\n* [Escaping Characters](#escaped \"Escaping Characters\")\r\n  * [escapeHtml](#escapeHtmlstring \"escapeHtml\")\r\n  * [escapeHeaderAttribute](#escapeHeaderAttributeattribute \"escapeHeaderAttribute\")\r\n  * [escapeRegex](#escapeRegexstring \"escapeRegex\")\r\n* [Errors](#errors \"Errors\")\r\n  * [assert](#assertmessage \"assert\")\r\n  * [abort](#abortmessage \"abort\")\r\n  * [displayStack](#displayStackslice \"displayStack\")\r\n  * [callStack](#callStackslice \"callStack\")\r\n  * [toss](#tosscondition \"toss\")\r\n* [Load files](#load-files \"Load Files\")\r\n  * 
[loadPackage](#loadPackagedir \"loadpackage\")\r\n  * [loadDirModules](#loadDirModulespath-excludefiles-target \"loaddirmodules\")\r\n\r\n\r\n\r\n# Introduction\r\n\r\nThe *Hoek* general purpose node utilities library is used to aid in a variety of manners. It comes with useful methods for Arrays (clone, merge, applyToDefaults), Objects (removeKeys, copy), Asserting and more. \r\n\r\nFor example, to use Hoek to set configuration with default options:\r\n```javascript\r\nvar Hoek = require('hoek');\r\n\r\nvar default = {url : \"www.github.com\", port : \"8000\", debug : true}\r\n\r\nvar config = Hoek.applyToDefaults(default, {port : \"3000\", admin : true});\r\n\r\n// In this case, config would be { url: 'www.github.com', port: '3000', debug: true, admin: true }\r\n```\r\n\r\nUnder each of the sections (such as Array), there are subsections which correspond to Hoek methods. Each subsection will explain how to use the corresponding method. In each js excerpt below, the var Hoek = require('hoek') is omitted for brevity.\r\n\r\n## Object\r\n\r\nHoek provides several helpful methods for objects and arrays.\r\n\r\n### clone(obj)\r\n\r\nThis method is used to clone an object or an array. A *deep copy* is made (duplicates everything, including values that are objects). 
\r\n\r\n```javascript\r\n\r\nvar nestedObj = {\r\n        w: /^something$/ig,\r\n        x: {\r\n            a: [1, 2, 3],\r\n            b: 123456,\r\n            c: new Date()\r\n        },\r\n        y: 'y',\r\n        z: new Date()\r\n    };\r\n\r\nvar copy = Hoek.clone(nestedObj);\r\n\r\ncopy.x.b = 100;\r\n\r\nconsole.log(copy.y)        // results in 'y'\r\nconsole.log(nestedObj.x.b) // results in 123456\r\nconsole.log(copy.x.b)      // results in 100\r\n```\r\n\r\n### merge(target, source, isNullOverride, isMergeArrays)\r\nisNullOverride, isMergeArrays default to true\r\n\r\nMerge all the properties of source into target, source wins in conflic, and by default null and undefined from source are applied\r\n\r\n\r\n```javascript\r\n\r\nvar target = {a: 1, b : 2}\r\nvar source = {a: 0, c: 5}\r\nvar source2 = {a: null, c: 5}\r\n\r\nvar targetArray = [1, 2, 3];\r\nvar sourceArray = [4, 5];\r\n\r\nvar newTarget = Hoek.merge(target, source);     // results in {a: 0, b: 2, c: 5}\r\nnewTarget = Hoek.merge(target, source2);        // results in {a: null, b: 2, c: 5}\r\nnewTarget = Hoek.merge(target, source2, false); // results in {a: 1, b: 2, c: 5}\r\n\r\nnewTarget = Hoek.merge(targetArray, sourceArray)              // results in [1, 2, 3, 4, 5]\r\nnewTarget = Hoek.merge(targetArray, sourceArray, true, false) // results in [4, 5]\r\n\r\n\r\n\r\n\r\n```\r\n\r\n### applyToDefaults(defaults, options)\r\n\r\nApply options to a copy of the defaults\r\n\r\n```javascript\r\n\r\nvar defaults = {host: \"localhost\", port: 8000};\r\nvar options = {port: 8080};\r\n\r\nvar config = Hoek.applyToDefaults(defaults, options); // results in {host: \"localhost\", port: 8080};\r\n\r\n\r\n```\r\n\r\n### unique(array, key)\r\n\r\nRemove duplicate items from Array\r\n\r\n```javascript\r\n\r\nvar array = [1, 2, 2, 3, 3, 4, 5, 6];\r\n\r\nvar newArray = Hoek.unique(array); // results in [1,2,3,4,5,6];\r\n\r\narray = [{id: 1}, {id: 1}, {id: 2}];\r\n\r\nnewArray = Hoek.unique(array, \"id\") // 
results in [{id: 1}, {id: 2}]\r\n\r\n```\r\n\r\n### mapToObject(array, key)\r\n\r\nConvert an Array into an Object\r\n\r\n```javascript\r\n\r\nvar array = [1,2,3];\r\nvar newObject = Hoek.mapToObject(array); // results in [{\"1\": true}, {\"2\": true}, {\"3\": true}]\r\n\r\narray = [{id: 1}, {id: 2}];\r\nnewObject = Hoek.mapToObject(array, \"id\") // results in [{\"id\": 1}, {\"id\": 2}]\r\n\r\n```\r\n### intersect(array1, array2)\r\n\r\nFind the common unique items in two arrays\r\n\r\n```javascript\r\n\r\nvar array1 = [1, 2, 3];\r\nvar array2 = [1, 4, 5];\r\n\r\nvar newArray = Hoek.intersect(array1, array2) // results in [1]\r\n\r\n```\r\n\r\n### matchKeys(obj, keys) \r\n\r\nFind which keys are present\r\n\r\n```javascript\r\n\r\nvar obj = {a: 1, b: 2, c: 3};\r\nvar keys = [\"a\", \"e\"];\r\n\r\nHoek.matchKeys(obj, keys) // returns [\"a\"]\r\n\r\n```\r\n\r\n### flatten(array, target)\r\n\r\nFlatten an array\r\n\r\n```javascript\r\n\r\nvar array = [1, 2, 3];\r\nvar target = [4, 5]; \r\n\r\nvar flattenedArray = Hoek.flatten(array, target) // results in [4, 5, 1, 2, 3];\r\n\r\n```\r\n\r\n### removeKeys(object, keys)\r\n\r\nRemove keys\r\n\r\n```javascript\r\n\r\nvar object = {a: 1, b: 2, c: 3, d: 4};\r\n\r\nvar keys = [\"a\", \"b\"];\r\n\r\nHoek.removeKeys(object, keys) // object is now {c: 3, d: 4}\r\n\r\n```\r\n\r\n### reach(obj, chain)\r\n\r\nConverts an object key chain string to reference\r\n\r\n```javascript\r\n\r\nvar chain = 'a.b.c';\r\nvar obj = {a : {b : { c : 1}}};\r\n\r\nHoek.reach(obj, chain) // returns 1\r\n\r\n```\r\n\r\n### inheritAsync(self, obj, keys) \r\n\r\nInherits a selected set of methods from an object, wrapping functions in asynchronous syntax and catching errors\r\n\r\n```javascript\r\n\r\nvar targetFunc = function () { };\r\n\r\nvar proto = {\r\n                a: function () {\r\n                    return 'a!';\r\n                },\r\n                b: function () {\r\n                    return 'b!';\r\n                },\r\n          
      c: function () {\r\n                    throw new Error('c!');\r\n                }\r\n            };\r\n\r\nvar keys = ['a', 'c'];\r\n\r\nHoek.inheritAsync(targetFunc, proto, ['a', 'c']);\r\n\r\nvar target = new targetFunc();\r\n\r\ntarget.a(function(err, result){console.log(result)}         // returns 'a!'       \r\n\r\ntarget.c(function(err, result){console.log(result)}         // returns undefined\r\n\r\ntarget.b(function(err, result){console.log(result)}         // gives error: Object [object Object] has no method 'b'\r\n\r\n```\r\n\r\n### rename(obj, from, to)\r\n\r\nRename a key of an object\r\n\r\n```javascript\r\n\r\nvar obj = {a : 1, b : 2};\r\n\r\nHoek.rename(obj, \"a\", \"c\");     // obj is now {c : 1, b : 2}\r\n\r\n```\r\n\r\n\r\n# Timer\r\n\r\nA Timer object. Initializing a new timer object sets the ts to the number of milliseconds elapsed since 1 January 1970 00:00:00 UTC.\r\n\r\n```javascript\r\n\r\n\r\nexample : \r\n\r\n\r\nvar timerObj = new Hoek.Timer();\r\nconsole.log(\"Time is now: \" + timerObj.ts)\r\nconsole.log(\"Elapsed time from initialization: \" + timerObj.elapsed() + 'milliseconds')\r\n\r\n```\r\n\r\n# Binary Encoding/Decoding\r\n\r\n### base64urlEncode(value)\r\n\r\nEncodes value in Base64 or URL encoding\r\n\r\n### base64urlDecode(value)\r\n\r\nDecodes data in Base64 or URL encoding.\r\n# Escaping Characters\r\n\r\nHoek provides convenient methods for escaping html characters. 
The escaped characters are as followed:\r\n\r\n```javascript\r\n\r\ninternals.htmlEscaped = {\r\n    '&': '&amp;',\r\n    '<': '&lt;',\r\n    '>': '&gt;',\r\n    '\"': '&quot;',\r\n    \"'\": '&#x27;',\r\n    '`': '&#x60;'\r\n};\r\n\r\n```\r\n\r\n### escapeHtml(string)\r\n\r\n```javascript\r\n\r\nvar string = '<html> hey </html>';\r\nvar escapedString = Hoek.escapeHtml(string); // returns &lt;html&gt; hey &lt;/html&gt;\r\n\r\n```\r\n\r\n### escapeHeaderAttribute(attribute)\r\n\r\nEscape attribute value for use in HTTP header\r\n\r\n```javascript\r\n\r\nvar a = Hoek.escapeHeaderAttribute('I said \"go w\\\\o me\"');  //returns I said \\\"go w\\\\o me\\\"\r\n\r\n\r\n```\r\n\r\n\r\n### escapeRegex(string)\r\n\r\nEscape string for Regex construction\r\n\r\n```javascript\r\n\r\nvar a = Hoek.escapeRegex('4^f$s.4*5+-_?%=#!:@|~\\\\/`\"(>)[<]d{}s,');  // returns 4\\^f\\$s\\.4\\*5\\+\\-_\\?%\\=#\\!\\:@\\|~\\\\\\/`\"\\(>\\)\\[<\\]d\\{\\}s\\,\r\n\r\n\r\n\r\n```\r\n\r\n# Errors\r\n\r\n### assert(message)\r\n\r\n```javascript\r\n\r\nvar a = 1, b =2;\r\n\r\nHoek.assert(a === b, 'a should equal b');  // ABORT: a should equal b\r\n\r\n```\r\n\r\n### abort(message)\r\n\r\nFirst checks if process.env.NODE_ENV === 'test', and if so, throws error message. 
Otherwise,\r\ndisplays most recent stack and then exits process.\r\n\r\n\r\n\r\n### displayStack(slice)\r\n\r\nDisplays the trace stack\r\n\r\n```javascript\r\n\r\nvar stack = Hoek.displayStack();\r\nconsole.log(stack) // returns something like:\r\n\r\n[ 'null (/Users/user/Desktop/hoek/test.js:4:18)',\r\n  'Module._compile (module.js:449:26)',\r\n  'Module._extensions..js (module.js:467:10)',\r\n  'Module.load (module.js:356:32)',\r\n  'Module._load (module.js:312:12)',\r\n  'Module.runMain (module.js:492:10)',\r\n  'startup.processNextTick.process._tickCallback (node.js:244:9)' ]\r\n\r\n```\r\n\r\n### callStack(slice)\r\n\r\nReturns a trace stack array.\r\n\r\n```javascript\r\n\r\nvar stack = Hoek.callStack();\r\nconsole.log(stack)  // returns something like:\r\n\r\n[ [ '/Users/user/Desktop/hoek/test.js', 4, 18, null, false ],\r\n  [ 'module.js', 449, 26, 'Module._compile', false ],\r\n  [ 'module.js', 467, 10, 'Module._extensions..js', false ],\r\n  [ 'module.js', 356, 32, 'Module.load', false ],\r\n  [ 'module.js', 312, 12, 'Module._load', false ],\r\n  [ 'module.js', 492, 10, 'Module.runMain', false ],\r\n  [ 'node.js',\r\n    244,\r\n    9,\r\n    'startup.processNextTick.process._tickCallback',\r\n    false ] ]\r\n\r\n\r\n```\r\n\r\n### toss(condition)\r\n\r\ntoss(condition /*, [message], callback */)\r\n\r\nReturn an error as first argument of a callback\r\n\r\n\r\n# Load Files\r\n\r\n### loadPackage(dir)\r\n\r\nLoad and parse package.json process root or given directory\r\n\r\n```javascript\r\n\r\nvar pack = Hoek.loadPackage();  // pack.name === 'hoek'\r\n\r\n```\r\n\r\n### loadDirModules(path, excludeFiles, target) \r\n\r\nLoads modules from a given path; option to exclude files (array).\r\n\r\n\r\n\r\n\r\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/spumko/hoek/issues"
-  },
-  "homepage": "https://github.com/spumko/hoek",
-  "_id": "hoek@0.9.1",
-  "dist": {
-    "shasum": "396f2118033eabc93ae5c2cd6ca75f0a89c03592"
-  },
-  "_from": "hoek@0.9.x",
-  "_resolved": "https://registry.npmjs.org/hoek/-/hoek-0.9.1.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-.idea
-*.iml
-npm-debug.log
-dump.rdb
-node_modules
-results.tap
-results.xml
-npm-shrinkwrap.json
-config.json
-.DS_Store
-*/.DS_Store
-*/*/.DS_Store
-._*
-*/._*
-*/*/._*
-coverage.*
-lib-cov
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/.travis.yml	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-language: node_js
-
-node_js:
-  - 0.10
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-Copyright (c) 2012-2013, Eran Hammer.
-All rights reserved.
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions are met:
-    * Redistributions of source code must retain the above copyright
-      notice, this list of conditions and the following disclaimer.
-    * Redistributions in binary form must reproduce the above copyright
-      notice, this list of conditions and the following disclaimer in the
-      documentation and/or other materials provided with the distribution.
-    * Neither the name of Eran Hammer nor the
-      names of its contributors may be used to endorse or promote products
-      derived from this software without specific prior written permission.
-
-THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
-WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
-DISCLAIMED. IN NO EVENT SHALL ERAN HAMMER BE LIABLE FOR ANY
-DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
-(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
-LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
-ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
-SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/Makefile	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,11 +0,0 @@
-test:
-	@./node_modules/.bin/lab
-test-cov: 
-	@./node_modules/.bin/lab -r threshold -t 100
-test-cov-html:
-	@./node_modules/.bin/lab -r html -o coverage.html
-complexity:
-	@./node_modules/.bin/cr -o complexity.md -f markdown lib
-
-.PHONY: test test-cov test-cov-html complexity
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,68 +0,0 @@
-# sntp
-
-An SNTP v4 client (RFC4330) for node. Simply connects to the NTP or SNTP server requested and returns the server time
-along with the roundtrip duration and clock offset. To adjust the local time to the NTP time, add the returned `t` offset
-to the local time.
-
-[![Build Status](https://secure.travis-ci.org/hueniverse/sntp.png)](http://travis-ci.org/hueniverse/sntp)
-
-# Usage
-
-```javascript
-var Sntp = require('sntp');
-
-// All options are optional
-
-var options = {
-    host: 'nist1-sj.ustiming.org',  // Defaults to pool.ntp.org
-    port: 123,                      // Defaults to 123 (NTP)
-    resolveReference: true,         // Defaults to false (not resolving)
-    timeout: 1000                   // Defaults to zero (no timeout)
-};
-
-// Request server time
-
-Sntp.time(options, function (err, time) {
-
-    if (err) {
-        console.log('Failed: ' + err.message);
-        process.exit(1);
-    }
-
-    console.log('Local clock is off by: ' + time.t + ' milliseconds');
-    process.exit(0);
-});
-```
-
-If an application needs to maintain continuous time synchronization, the module provides a stateful method for
-querying the current offset only when the last one is too old (defaults to daily).
-
-```javascript
-// Request offset once
-
-Sntp.offset(function (err, offset) {
-
-    console.log(offset);                    // New (served fresh)
-
-    // Request offset again
-
-    Sntp.offset(function (err, offset) {
-
-        console.log(offset);                // Identical (served from cache)
-    });
-});
-```
-
-To set a background offset refresh, start the interval and use the provided now() method. If for any reason the
-client fails to obtain an up-to-date offset, the current system clock is used.
-
-```javascript
-var before = Sntp.now();                    // System time without offset
-
-Sntp.start(function () {
-
-    var now = Sntp.now();                   // With offset
-    Sntp.stop();
-});
-```
-
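The README above says to adjust the local clock by adding the returned `t` offset to the local time; a minimal sketch of that adjustment (outside the deleted module — `offsetMs` is an assumed stand-in for the `time.t` value returned by `Sntp.time()`):

```javascript
// Apply an SNTP clock offset to the local clock. `offsetMs` stands in
// for the `t` value returned by Sntp.time(); it is negative when the
// local clock runs ahead of the NTP reference.
function adjustedNow(offsetMs) {
    return Date.now() + offsetMs;       // NTP-corrected wall-clock time
}

var offsetMs = -125;                    // e.g. local clock runs 125 ms fast
console.log(adjustedNow(offsetMs) <= Date.now());   // → true
```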
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/examples/offset.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
-var Sntp = require('../lib');
-
-// Request offset once
-
-Sntp.offset(function (err, offset) {
-
-    console.log(offset);                    // New (served fresh)
-
-    // Request offset again
-
-    Sntp.offset(function (err, offset) {
-
-        console.log(offset);                // Identical (served from cache)
-    });
-});
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/examples/time.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-var Sntp = require('../lib');
-
-// All options are optional
-
-var options = {
-    host: 'nist1-sj.ustiming.org',  // Defaults to pool.ntp.org
-    port: 123,                      // Defaults to 123 (NTP)
-    resolveReference: true,         // Defaults to false (not resolving)
-    timeout: 1000                   // Defaults to zero (no timeout)
-};
-
-// Request server time
-
-Sntp.time(options, function (err, time) {
-
-    if (err) {
-        console.log('Failed: ' + err.message);
-        process.exit(1);
-    }
-
-    console.log(time);
-    console.log('Local clock is off by: ' + time.t + ' milliseconds');
-    process.exit(0);
-});
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require('./lib');
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/lib/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,409 +0,0 @@
-// Load modules
-
-var Dgram = require('dgram');
-var Dns = require('dns');
-var Hoek = require('hoek');
-
-
-// Declare internals
-
-var internals = {};
-
-
-exports.time = function (options, callback) {
-
-    if (arguments.length !== 2) {
-        callback = arguments[0];
-        options = {};
-    }
-
-    var settings = Hoek.clone(options);
-    settings.host = settings.host || 'pool.ntp.org';
-    settings.port = settings.port || 123;
-    settings.resolveReference = settings.resolveReference || false;
-
-    // Declare variables used by callback
-
-    var timeoutId = 0;
-    var sent = 0;
-
-    // Ensure callback is only called once
-
-    var isFinished = false;
-    var finish = function (err, result) {
-
-        if (timeoutId) {
-            clearTimeout(timeoutId);
-            timeoutId = 0;
-        }
-
-        if (!isFinished) {
-            isFinished = true;
-            socket.removeAllListeners();
-            socket.close();
-            return callback(err, result);
-        }
-    };
-
-    // Create UDP socket
-
-    var socket = Dgram.createSocket('udp4');
-
-    socket.once('error', function (err) {
-
-        return finish(err);
-    });
-
-    // Listen to incoming messages
-
-    socket.on('message', function (buffer, rinfo) {
-
-        var received = Date.now();
-
-        var message = new internals.NtpMessage(buffer);
-        if (!message.isValid) {
-            return finish(new Error('Invalid server response'), message);
-        }
-
-        if (message.originateTimestamp !== sent) {
-            return finish(new Error('Wrong originate timestamp'), message);
-        }
-
-        // Timestamp Name          ID   When Generated
-        // ------------------------------------------------------------
-        // Originate Timestamp     T1   time request sent by client
-        // Receive Timestamp       T2   time request received by server
-        // Transmit Timestamp      T3   time reply sent by server
-        // Destination Timestamp   T4   time reply received by client
-        //
-        // The roundtrip delay d and system clock offset t are defined as:
-        //
-        // d = (T4 - T1) - (T3 - T2)     t = ((T2 - T1) + (T3 - T4)) / 2
-
-        var T1 = message.originateTimestamp;
-        var T2 = message.receiveTimestamp;
-        var T3 = message.transmitTimestamp;
-        var T4 = received;
-
-        message.d = (T4 - T1) - (T3 - T2);
-        message.t = ((T2 - T1) + (T3 - T4)) / 2;
-        message.receivedLocally = received;
-
-        if (!settings.resolveReference ||
-            message.stratum !== 'secondary') {
-
-            return finish(null, message);
-        }
-
-        // Resolve reference IP address
-
-        Dns.reverse(message.referenceId, function (err, domains) {
-
-            if (!err) {
-                message.referenceHost = domains[0];
-            }
-
-            return finish(null, message);
-        });
-    });
-
-    // Set timeout
-
-    if (settings.timeout) {
-        timeoutId = setTimeout(function () {
-
-            timeoutId = 0;
-            return finish(new Error('Timeout'));
-        }, settings.timeout);
-    }
-
-    // Construct NTP message
-
-    var message = new Buffer(48);
-    for (var i = 0; i < 48; i++) {                      // Zero message
-        message[i] = 0;
-    }
-
-    message[0] = (0 << 6) + (4 << 3) + (3 << 0);        // Set version number to 4 and Mode to 3 (client)
-    sent = Date.now();
-    internals.fromMsecs(sent, message, 40);               // Set transmit timestamp (returns as originate)
-
-    // Send NTP request
-
-    socket.send(message, 0, message.length, settings.port, settings.host, function (err, bytes) {
-
-        if (err ||
-            bytes !== 48) {
-
-            return finish(err || new Error('Could not send entire message'));
-        }
-    });
-};
-
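The timestamp comment block inside `exports.time` above defines `d = (T4 - T1) - (T3 - T2)` and `t = ((T2 - T1) + (T3 - T4)) / 2`; a small standalone check of that arithmetic with made-up millisecond timestamps (server clock 100 ms ahead, 40 ms transit each way, 10 ms server processing):

```javascript
// Roundtrip delay d and clock offset t from the four NTP timestamps,
// mirroring the formulas in the comment block above (milliseconds).
function delayAndOffset(T1, T2, T3, T4) {
    return {
        d: (T4 - T1) - (T3 - T2),        // time spent on the wire only
        t: ((T2 - T1) + (T3 - T4)) / 2   // estimated local clock offset
    };
}

// T1 client send, T2 server receive, T3 server send, T4 client receive
var r = delayAndOffset(1000, 1140, 1150, 1090);
console.log(r);                          // → { d: 80, t: 100 }
```

The recovered `t` of 100 ms matches the assumed server lead, and `d` of 80 ms matches the two 40 ms transits.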
-
-internals.NtpMessage = function (buffer) {
-
-    this.isValid = false;
-
-    // Validate
-
-    if (buffer.length !== 48) {
-        return;
-    }
-
-    // Leap indicator
-
-    var li = (buffer[0] >> 6);
-    switch (li) {
-        case 0: this.leapIndicator = 'no-warning'; break;
-        case 1: this.leapIndicator = 'last-minute-61'; break;
-        case 2: this.leapIndicator = 'last-minute-59'; break;
-        case 3: this.leapIndicator = 'alarm'; break;
-    }
-
-    // Version
-
-    var vn = ((buffer[0] & 0x38) >> 3);
-    this.version = vn;
-
-    // Mode
-
-    var mode = (buffer[0] & 0x7);
-    switch (mode) {
-        case 1: this.mode = 'symmetric-active'; break;
-        case 2: this.mode = 'symmetric-passive'; break;
-        case 3: this.mode = 'client'; break;
-        case 4: this.mode = 'server'; break;
-        case 5: this.mode = 'broadcast'; break;
-        case 0:
-        case 6:
-        case 7: this.mode = 'reserved'; break;
-    }
-
-    // Stratum
-
-    var stratum = buffer[1];
-    if (stratum === 0) {
-        this.stratum = 'death';
-    }
-    else if (stratum === 1) {
-        this.stratum = 'primary';
-    }
-    else if (stratum <= 15) {
-        this.stratum = 'secondary';
-    }
-    else {
-        this.stratum = 'reserved';
-    }
-
-    // Poll interval (msec)
-
-    this.pollInterval = Math.round(Math.pow(2, buffer[2])) * 1000;
-
-    // Precision (msecs)
-
-    this.precision = Math.pow(2, buffer[3]) * 1000;
-
-    // Root delay (msecs)
-
-    var rootDelay = 256 * (256 * (256 * buffer[4] + buffer[5]) + buffer[6]) + buffer[7];
-    this.rootDelay = 1000 * (rootDelay / 0x10000);
-
-    // Root dispersion (msecs)
-
-    this.rootDispersion = ((buffer[8] << 8) + buffer[9] + ((buffer[10] << 8) + buffer[11]) / Math.pow(2, 16)) * 1000;
-
-    // Reference identifier
-
-    this.referenceId = '';
-    switch (this.stratum) {
-        case 'death':
-        case 'primary':
-            this.referenceId = String.fromCharCode(buffer[12]) + String.fromCharCode(buffer[13]) + String.fromCharCode(buffer[14]) + String.fromCharCode(buffer[15]);
-            break;
-        case 'secondary':
-            this.referenceId = '' + buffer[12] + '.' + buffer[13] + '.' + buffer[14] + '.' + buffer[15];
-            break;
-    }
-
-    // Reference timestamp
-
-    this.referenceTimestamp = internals.toMsecs(buffer, 16);
-
-    // Originate timestamp
-
-    this.originateTimestamp = internals.toMsecs(buffer, 24);
-
-    // Receive timestamp
-
-    this.receiveTimestamp = internals.toMsecs(buffer, 32);
-
-    // Transmit timestamp
-
-    this.transmitTimestamp = internals.toMsecs(buffer, 40);
-
-    // Validate
-
-    if (this.version === 4 &&
-        this.stratum !== 'reserved' &&
-        this.mode === 'server' &&
-        this.originateTimestamp &&
-        this.receiveTimestamp &&
-        this.transmitTimestamp) {
-
-        this.isValid = true;
-    }
-
-    return this;
-};
-
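The leap indicator, version, and mode decoded by the constructor above all live in the first header byte; a minimal pack/unpack sketch of that bit layout, using the same masks as the parser:

```javascript
// First NTP header byte: LI (2 bits) | VN (3 bits) | Mode (3 bits),
// matching the shifts and masks used by the NtpMessage parser above.
function packHeader(li, vn, mode) {
    return (li << 6) | (vn << 3) | mode;
}

function unpackHeader(byte) {
    return { li: byte >> 6, vn: (byte & 0x38) >> 3, mode: byte & 0x07 };
}

var b = packHeader(0, 4, 3);             // a version-4 client request
console.log(b, unpackHeader(b));         // → 35 { li: 0, vn: 4, mode: 3 }
```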
-
-internals.toMsecs = function (buffer, offset) {
-
-    var seconds = 0;
-    var fraction = 0;
-
-    for (var i = 0; i < 4; ++i) {
-        seconds = (seconds * 256) + buffer[offset + i];
-    }
-
-    for (i = 4; i < 8; ++i) {
-        fraction = (fraction * 256) + buffer[offset + i];
-    }
-
-    return ((seconds - 2208988800 + (fraction / Math.pow(2, 32))) * 1000);
-};
-
-
-internals.fromMsecs = function (ts, buffer, offset) {
-
-    var seconds = Math.floor(ts / 1000) + 2208988800;
-    var fraction = Math.round((ts % 1000) / 1000 * Math.pow(2, 32));
-
-    buffer[offset + 0] = (seconds & 0xFF000000) >> 24;
-    buffer[offset + 1] = (seconds & 0x00FF0000) >> 16;
-    buffer[offset + 2] = (seconds & 0x0000FF00) >> 8;
-    buffer[offset + 3] = (seconds & 0x000000FF);
-
-    buffer[offset + 4] = (fraction & 0xFF000000) >> 24;
-    buffer[offset + 5] = (fraction & 0x00FF0000) >> 16;
-    buffer[offset + 6] = (fraction & 0x0000FF00) >> 8;
-    buffer[offset + 7] = (fraction & 0x000000FF);
-};
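`toMsecs` and `fromMsecs` above are inverses up to fraction rounding; a round-trip sketch using the same 2208988800-second offset between the NTP (1900) and Unix (1970) epochs, without the byte-buffer packing:

```javascript
// Round-trip a Unix-epoch millisecond timestamp through the NTP
// representation (seconds since 1900 plus a 32-bit binary fraction),
// mirroring the arithmetic in fromMsecs/toMsecs above.
var NTP_UNIX_DELTA = 2208988800;         // seconds from 1900 to 1970

function toNtp(ts) {
    return {
        seconds: Math.floor(ts / 1000) + NTP_UNIX_DELTA,
        fraction: Math.round((ts % 1000) / 1000 * Math.pow(2, 32))
    };
}

function fromNtp(ntp) {
    return (ntp.seconds - NTP_UNIX_DELTA + ntp.fraction / Math.pow(2, 32)) * 1000;
}

var ts = 1385134758250;                  // .250 s is exactly representable
console.log(fromNtp(toNtp(ts)) === ts);  // → true
```

Sub-millisecond error can appear for fractions that are not exact binary fractions of a second, which is why the original code rounds the recovered offset.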
-
-
-// Offset singleton
-
-internals.last = {
-    offset: 0,
-    expires: 0,
-    host: '',
-    port: 0
-};
-
-
-exports.offset = function (options, callback) {
-
-    if (arguments.length !== 2) {
-        callback = arguments[0];
-        options = {};
-    }
-
-    var now = Date.now();
-    var clockSyncRefresh = options.clockSyncRefresh || 24 * 60 * 60 * 1000;                    // Daily
-
-    if (internals.last.offset &&
-        internals.last.host === options.host &&
-        internals.last.port === options.port &&
-        now < internals.last.expires) {
-
-        process.nextTick(function () {
-                
-            callback(null, internals.last.offset);
-        });
-
-        return;
-    }
-
-    exports.time(options, function (err, time) {
-
-        if (err) {
-            return callback(err, 0);
-        }
-
-        internals.last = {
-            offset: Math.round(time.t),
-            expires: now + clockSyncRefresh,
-            host: options.host,
-            port: options.port
-        };
-
-        return callback(null, internals.last.offset);
-    });
-};
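`exports.offset` above caches the last result and serves it until `clockSyncRefresh` elapses; a minimal sketch of that expiring-cache pattern in isolation (`compute` is a hypothetical callback-style producer, not part of sntp):

```javascript
// Minimal sketch of the expiring-cache pattern used by exports.offset
// above: serve the stored value until it expires, then recompute.
// `compute` is a hypothetical callback-style producer.
function makeCached(compute, ttlMs) {
    var last = { value: null, expires: 0 };
    return function (callback) {
        var now = Date.now();
        if (now < last.expires) {
            // Cache hit: defer so the callback is always asynchronous
            return process.nextTick(function () {
                callback(null, last.value);
            });
        }
        compute(function (err, value) {
            if (err) {
                return callback(err);
            }
            last = { value: value, expires: now + ttlMs };
            return callback(null, value);
        });
    };
}

var calls = 0;
var cachedAnswer = makeCached(function (cb) { calls++; cb(null, 42); }, 60000);
cachedAnswer(function (err, v) {
    cachedAnswer(function (err2, v2) {
        console.log(v, v2, calls);       // → 42 42 1
    });
});
```

The `process.nextTick` on the cache-hit path keeps the API consistently asynchronous, as the original does.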
-
-
-// Now singleton
-
-internals.now = {
-    intervalId: 0
-};
-
-
-exports.start = function (options, callback) {
-
-    if (arguments.length !== 2) {
-        callback = arguments[0];
-        options = {};
-    }
-
-    if (internals.now.intervalId) {
-        process.nextTick(function () {
-            
-            callback();
-        });
-        
-        return;
-    }
-
-    exports.offset(options, function (err, offset) {
-
-        internals.now.intervalId = setInterval(function () {
-
-            exports.offset(options, function () { });
-        }, options.clockSyncRefresh || 24 * 60 * 60 * 1000);                                // Daily
-
-        return callback();
-    });
-};
-
-
-exports.stop = function () {
-
-    if (!internals.now.intervalId) {
-        return;
-    }
-
-    clearInterval(internals.now.intervalId);
-    internals.now.intervalId = 0;
-};
-
-
-exports.isLive = function () {
-
-    return !!internals.now.intervalId;
-};
-
-
-exports.now = function () {
-
-    var now = Date.now();
-    if (!exports.isLive() ||
-        now >= internals.last.expires) {
-
-        return now;
-    }
-
-    return now + internals.last.offset;
-};
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,52 +0,0 @@
-{
-  "name": "sntp",
-  "description": "SNTP Client",
-  "version": "0.2.4",
-  "author": {
-    "name": "Eran Hammer",
-    "email": "eran@hueniverse.com",
-    "url": "http://hueniverse.com"
-  },
-  "contributors": [],
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/hueniverse/sntp"
-  },
-  "main": "index",
-  "keywords": [
-    "sntp",
-    "ntp",
-    "time"
-  ],
-  "engines": {
-    "node": ">=0.8.0"
-  },
-  "dependencies": {
-    "hoek": "0.9.x"
-  },
-  "devDependencies": {
-    "lab": "0.1.x",
-    "complexity-report": "0.x.x"
-  },
-  "scripts": {
-    "test": "make test-cov"
-  },
-  "licenses": [
-    {
-      "type": "BSD",
-      "url": "http://github.com/hueniverse/sntp/raw/master/LICENSE"
-    }
-  ],
-  "readme": "# sntp\n\nAn SNTP v4 client (RFC4330) for node. Simply connects to the NTP or SNTP server requested and returns the server time\nalong with the roundtrip duration and clock offset. To adjust the local time to the NTP time, add the returned `t` offset\nto the local time.\n\n[![Build Status](https://secure.travis-ci.org/hueniverse/sntp.png)](http://travis-ci.org/hueniverse/sntp)\n\n# Usage\n\n```javascript\nvar Sntp = require('sntp');\n\n// All options are optional\n\nvar options = {\n    host: 'nist1-sj.ustiming.org',  // Defaults to pool.ntp.org\n    port: 123,                      // Defaults to 123 (NTP)\n    resolveReference: true,         // Defaults to false (not resolving)\n    timeout: 1000                   // Defaults to zero (no timeout)\n};\n\n// Request server time\n\nSntp.time(options, function (err, time) {\n\n    if (err) {\n        console.log('Failed: ' + err.message);\n        process.exit(1);\n    }\n\n    console.log('Local clock is off by: ' + time.t + ' milliseconds');\n    process.exit(0);\n});\n```\n\nIf an application needs to maintain continuous time synchronization, the module provides a stateful method for\nquerying the current offset only when the last one is too old (defaults to daily).\n\n```javascript\n// Request offset once\n\nSntp.offset(function (err, offset) {\n\n    console.log(offset);                    // New (served fresh)\n\n    // Request offset again\n\n    Sntp.offset(function (err, offset) {\n\n        console.log(offset);                // Identical (served from cache)\n    });\n});\n```\n\nTo set a background offset refresh, start the interval and use the provided now() method. If for any reason the\nclient fails to obtain an up-to-date offset, the current system clock is used.\n\n```javascript\nvar before = Sntp.now();                    // System time without offset\n\nSntp.start(function () {\n\n    var now = Sntp.now();                   // With offset\n    Sntp.stop();\n});\n```\n\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/hueniverse/sntp/issues"
-  },
-  "homepage": "https://github.com/hueniverse/sntp",
-  "_id": "sntp@0.2.4",
-  "dist": {
-    "shasum": "80612680552b9c47a4ba312d125f366be68ad8f7"
-  },
-  "_from": "sntp@0.2.x",
-  "_resolved": "https://registry.npmjs.org/sntp/-/sntp-0.2.4.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/hawk/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-{
-  "name": "hawk",
-  "description": "HTTP Hawk Authentication Scheme",
-  "version": "1.0.0",
-  "author": {
-    "name": "Eran Hammer",
-    "email": "eran@hueniverse.com",
-    "url": "http://hueniverse.com"
-  },
-  "contributors": [],
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/hueniverse/hawk"
-  },
-  "main": "index",
-  "keywords": [
-    "http",
-    "authentication",
-    "scheme",
-    "hawk"
-  ],
-  "engines": {
-    "node": ">=0.8.0"
-  },
-  "dependencies": {
-    "hoek": "0.9.x",
-    "boom": "0.4.x",
-    "cryptiles": "0.2.x",
-    "sntp": "0.2.x"
-  },
-  "devDependencies": {
-    "lab": "0.1.x",
-    "complexity-report": "0.x.x",
-    "localStorage": "1.0.x"
-  },
-  "scripts": {
-    "test": "make test-cov"
-  },
-  "licenses": [
-    {
-      "type": "BSD",
-      "url": "http://github.com/hueniverse/hawk/raw/master/LICENSE"
-    }
-  ],
-  "readme": "![hawk Logo](https://raw.github.com/hueniverse/hawk/master/images/hawk.png)\n\n<img align=\"right\" src=\"https://raw.github.com/hueniverse/hawk/master/images/logo.png\" /> **Hawk** is an HTTP authentication scheme using a message authentication code (MAC) algorithm to provide partial\nHTTP request cryptographic verification. For more complex use cases such as access delegation, see [Oz](https://github.com/hueniverse/oz).\n\nCurrent version: **1.0**\n\n[![Build Status](https://secure.travis-ci.org/hueniverse/hawk.png)](http://travis-ci.org/hueniverse/hawk)\n\n# Table of Content\n\n- [**Introduction**](#introduction)\n  - [Replay Protection](#replay-protection)\n  - [Usage Example](#usage-example)\n  - [Protocol Example](#protocol-example)\n    - [Payload Validation](#payload-validation)\n    - [Response Payload Validation](#response-payload-validation)\n  - [Browser Support and Considerations](#browser-support-and-considerations)\n<p></p>\n- [**Single URI Authorization**](#single-uri-authorization)\n  - [Usage Example](#bewit-usage-example)\n<p></p>\n- [**Security Considerations**](#security-considerations)\n  - [MAC Keys Transmission](#mac-keys-transmission)\n  - [Confidentiality of Requests](#confidentiality-of-requests)\n  - [Spoofing by Counterfeit Servers](#spoofing-by-counterfeit-servers)\n  - [Plaintext Storage of Credentials](#plaintext-storage-of-credentials)\n  - [Entropy of Keys](#entropy-of-keys)\n  - [Coverage Limitations](#coverage-limitations)\n  - [Future Time Manipulation](#future-time-manipulation)\n  - [Client Clock Poisoning](#client-clock-poisoning)\n  - [Bewit Limitations](#bewit-limitations)\n  - [Host Header Forgery](#host-header-forgery)\n<p></p>\n- [**Frequently Asked Questions**](#frequently-asked-questions)\n<p></p>\n- [**Acknowledgements**](#acknowledgements)\n\n# Introduction\n\n**Hawk** is an HTTP authentication scheme providing mechanisms for making authenticated HTTP requests with\npartial cryptographic verification of 
the request and response, covering the HTTP method, request URI, host,\nand optionally the request payload.\n\nSimilar to the HTTP [Digest access authentication schemes](http://www.ietf.org/rfc/rfc2617.txt), **Hawk** uses a set of\nclient credentials which include an identifier (e.g. username) and key (e.g. password). Likewise, just as with the Digest scheme,\nthe key is never included in authenticated requests. Instead, it is used to calculate a request MAC value which is\nincluded in its place.\n\nHowever, **Hawk** has several differences from Digest. In particular, while both use a nonce to limit the possibility of\nreplay attacks, in **Hawk** the client generates the nonce and uses it in combination with a timestamp, leading to less\n\"chattiness\" (interaction with the server).\n\nAlso unlike Digest, this scheme is not intended to protect the key itself (the password in Digest) because\nthe client and server must both have access to the key material in the clear.\n\nThe primary design goals of this scheme are to:\n* simplify and improve HTTP authentication for services that are unwilling or unable to deploy TLS for all resources,\n* secure credentials against leakage (e.g., when the client uses some form of dynamic configuration to determine where\n  to send an authenticated request), and\n* avoid the exposure of credentials sent to a malicious server over an unauthenticated secure channel due to client\n  failure to validate the server's identity as part of its TLS handshake.\n\nIn addition, **Hawk** supports a method for granting third-parties temporary access to individual resources using\na query parameter called _bewit_ (in falconry, a leather strap used to attach a tracking device to the leg of a hawk).\n\nThe **Hawk** scheme requires the establishment of a shared symmetric key between the client and the server,\nwhich is beyond the scope of this module. 
Typically, the shared credentials are established via an initial\nTLS-protected phase or derived from some other shared confidential information available to both the client\nand the server.\n\n\n## Replay Protection\n\nWithout replay protection, an attacker can use a compromised (but otherwise valid and authenticated) request more \nthan once, gaining access to a protected resource. To mitigate this, clients include both a nonce and a timestamp when \nmaking requests. This gives the server enough information to prevent replay attacks.\n\nThe nonce is generated by the client, and is a string unique across all requests with the same timestamp and\nkey identifier combination. \n\nThe timestamp enables the server to restrict the validity period of the credentials where requests occuring afterwards\nare rejected. It also removes the need for the server to retain an unbounded number of nonce values for future checks.\nBy default, **Hawk** uses a time window of 1 minute to allow for time skew between the client and server (which in\npractice translates to a maximum of 2 minutes as the skew can be positive or negative).\n\nUsing a timestamp requires the client's clock to be in sync with the server's clock. **Hawk** requires both the client\nclock and the server clock to use NTP to ensure synchronization. However, given the limitations of some client types\n(e.g. browsers) to deploy NTP, the server provides the client with its current time (in seconds precision) in response\nto a bad timestamp.\n\nThere is no expectation that the client will adjust its system clock to match the server (in fact, this would be a\npotential attack vector). Instead, the client only uses the server's time to calculate an offset used only\nfor communications with that particular server. 
The protocol rewards clients with synchronized clocks by reducing\nthe number of round trips required to authenticate the first request.\n\n\n## Usage Example\n\nServer code:\n\n```javascript\nvar Http = require('http');\nvar Hawk = require('hawk');\n\n\n// Credentials lookup function\n\nvar credentialsFunc = function (id, callback) {\n\n    var credentials = {\n        key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',\n        algorithm: 'sha256',\n        user: 'Steve'\n    };\n\n    return callback(null, credentials);\n};\n\n// Create HTTP server\n\nvar handler = function (req, res) {\n\n    // Authenticate incoming request\n\n    Hawk.server.authenticate(req, credentialsFunc, {}, function (err, credentials, artifacts) {\n\n        // Prepare response\n\n        var payload = (!err ? 'Hello ' + credentials.user + ' ' + artifacts.ext : 'Shoosh!');\n        var headers = { 'Content-Type': 'text/plain' };\n\n        // Generate Server-Authorization response header\n\n        var header = Hawk.server.header(credentials, artifacts, { payload: payload, contentType: headers['Content-Type'] });\n        headers['Server-Authorization'] = header;\n\n        // Send the response back\n\n        res.writeHead(!err ? 
200 : 401, headers);\n        res.end(payload);\n    });\n};\n\n// Start server\n\nHttp.createServer(handler).listen(8000, 'example.com');\n```\n\nClient code:\n\n```javascript\nvar Request = require('request');\nvar Hawk = require('hawk');\n\n\n// Client credentials\n\nvar credentials = {\n    id: 'dh37fgj492je',\n    key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',\n    algorithm: 'sha256'\n}\n\n// Request options\n\nvar requestOptions = {\n    uri: 'http://example.com:8000/resource/1?b=1&a=2',\n    method: 'GET',\n    headers: {}\n};\n\n// Generate Authorization request header\n\nvar header = Hawk.client.header('http://example.com:8000/resource/1?b=1&a=2', 'GET', { credentials: credentials, ext: 'some-app-data' });\nrequestOptions.headers.Authorization = header.field;\n\n// Send authenticated request\n\nRequest(requestOptions, function (error, response, body) {\n\n    // Authenticate the server's response\n\n    var isValid = Hawk.client.authenticate(response, credentials, header.artifacts, { payload: body });\n\n    // Output results\n\n    console.log(response.statusCode + ': ' + body + (isValid ? ' (valid)' : ' (invalid)'));\n});\n```\n\n**Hawk** utilizes the [**SNTP**](https://github.com/hueniverse/sntp) module for time sync management. By default, the local\nmachine time is used. To automatically retrieve and synchronize the clock within the application, use the SNTP 'start()' method.\n\n```javascript\nHawk.sntp.start();\n```\n\n\n## Protocol Example\n\nThe client attempts to access a protected resource without authentication, sending the following HTTP request to\nthe resource server:\n\n```\nGET /resource/1?b=1&a=2 HTTP/1.1\nHost: example.com:8000\n```\n\nThe resource server returns an authentication challenge.\n\n```\nHTTP/1.1 401 Unauthorized\nWWW-Authenticate: Hawk\n```\n\nThe client has previously obtained a set of **Hawk** credentials for accessing resources on the \"http://example.com/\"\nserver. 
The **Hawk** credentials issued to the client include the following attributes:\n\n* Key identifier: dh37fgj492je\n* Key: werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn\n* Algorithm: sha256\n\nThe client generates the authentication header by calculating a timestamp (e.g. the number of seconds since January 1,\n1970 00:00:00 GMT), generating a nonce, and constructing the normalized request string (each value followed by a newline\ncharacter):\n\n```\nhawk.1.header\n1353832234\nj4h3g2\nGET\n/resource/1?b=1&a=2\nexample.com\n8000\n\nsome-app-ext-data\n\n```\n\nThe request MAC is calculated using HMAC with the specified hash algorithm \"sha256\" and the key over the normalized request string.\nThe result is base64-encoded to produce the request MAC:\n\n```\n6R4rV5iE+NPoym+WwjeHzjAGXUtLNIxmo1vpMofpLAE=\n```\n\nThe client includes the **Hawk** key identifier, timestamp, nonce, application specific data, and request MAC with the request using\nthe HTTP `Authorization` request header field:\n\n```\nGET /resource/1?b=1&a=2 HTTP/1.1\nHost: example.com:8000\nAuthorization: Hawk id=\"dh37fgj492je\", ts=\"1353832234\", nonce=\"j4h3g2\", ext=\"some-app-ext-data\", mac=\"6R4rV5iE+NPoym+WwjeHzjAGXUtLNIxmo1vpMofpLAE=\"\n```\n\nThe server validates the request by calculating the request MAC again based on the request received and verifies the validity\nand scope of the **Hawk** credentials. If valid, the server responds with the requested resource.\n\n\n### Payload Validation\n\n**Hawk** provides optional payload validation. When generating the authentication header, the client calculates a payload hash\nusing the specified hash algorithm. The hash is calculated over the concatenated value of (each followed by a newline character):\n* `hawk.1.payload`\n* the content-type in lowercase, without any parameters (e.g. 
`application/json`)\n* the request payload prior to any content encoding (the exact representation requirements should be specified by the server for payloads other than simple single-part ascii to ensure interoperability)\n\nFor example:\n\n* Payload: `Thank you for flying Hawk`\n* Content Type: `text/plain`\n* Hash (sha256): `Yi9LfIIFRtBEPt74PVmbTF/xVAwPn7ub15ePICfgnuY=`\n\nResults in the following input to the payload hash function (newline terminated values):\n\n```\nhawk.1.payload\ntext/plain\nThank you for flying Hawk\n\n```\n\nWhich produces the following hash value:\n\n```\nYi9LfIIFRtBEPt74PVmbTF/xVAwPn7ub15ePICfgnuY=\n```\n\nThe client constructs the normalized request string (newline terminated values):\n\n```\nhawk.1.header\n1353832234\nj4h3g2\nPOST\n/resource/1?a=1&b=2\nexample.com\n8000\nYi9LfIIFRtBEPt74PVmbTF/xVAwPn7ub15ePICfgnuY=\nsome-app-ext-data\n\n```\n\nThen calculates the request MAC and includes the **Hawk** key identifier, timestamp, nonce, payload hash, application specific data,\nand request MAC, with the request using the HTTP `Authorization` request header field:\n\n```\nPOST /resource/1?a=1&b=2 HTTP/1.1\nHost: example.com:8000\nAuthorization: Hawk id=\"dh37fgj492je\", ts=\"1353832234\", nonce=\"j4h3g2\", hash=\"Yi9LfIIFRtBEPt74PVmbTF/xVAwPn7ub15ePICfgnuY=\", ext=\"some-app-ext-data\", mac=\"aSe1DERmZuRl3pI36/9BdZmnErTw3sNzOOAUlfeKjVw=\"\n```\n\nIt is up to the server if and when it validates the payload for any given request, based solely on it's security policy\nand the nature of the data included.\n\nIf the payload is available at the time of authentication, the server uses the hash value provided by the client to construct\nthe normalized string and validates the MAC. If the MAC is valid, the server calculates the payload hash and compares the value\nwith the provided payload hash in the header. 
In many cases, checking the MAC first is faster than calculating the payload hash.\n\nHowever, if the payload is not available at authentication time (e.g. too large to fit in memory, streamed elsewhere, or processed\nat a different stage in the application), the server may choose to defer payload validation for later by retaining the hash value\nprovided by the client after validating the MAC.\n\nIt is important to note that MAC validation does not mean the hash value provided by the client is valid, only that the value\nincluded in the header was not modified. Without calculating the payload hash on the server and comparing it to the value provided\nby the client, the payload may be modified by an attacker.\n\n\n## Response Payload Validation\n\n**Hawk** provides partial response payload validation. The server includes the `Server-Authorization` response header which enables the\nclient to authenticate the response and ensure it is talking to the right server. **Hawk** defines the HTTP `Server-Authorization` header\nas a response header using the exact same syntax as the `Authorization` request header field.\n\nThe header is constructed using the same process as the client's request header. The server uses the same credentials and other\nartifacts provided by the client to construct the normalized request string. The `ext` and `hash` values are replaced with\nnew values based on the server response. 
The rest are identical to those used by the client.\n\nThe resulting MAC digest is included with the optional `hash` and `ext` values:\n\n```\nServer-Authorization: Hawk mac=\"XIJRsMl/4oL+nn+vKoeVZPdCHXB4yJkNnBbTbHFZUYE=\", hash=\"f9cDF/TDm7TkYRLnGwRMfeDzT6LixQVLvrIKhh0vgmM=\", ext=\"response-specific\"\n```\n\n\n## Browser Support and Considerations\n\nA browser script is provided for inclusion using a `<script>` tag in [lib/browser.js](/lib/browser.js).\n\n**Hawk** relies on the _Server-Authorization_ and _WWW-Authenticate_ headers in its response to communicate with the client.\nTherefore, in case of CORS requests, it is important to consider sending _Access-Control-Expose-Headers_ with the value\n_\"WWW-Authenticate, Server-Authorization\"_ on each response from your server. As explained in the\n[specifications](http://www.w3.org/TR/cors/#access-control-expose-headers-response-header), it will indicate that these headers\ncan safely be accessed by the client (using getResponseHeader() on the XmlHttpRequest object). Otherwise you will be met with a\n[\"simple response header\"](http://www.w3.org/TR/cors/#simple-response-header) which excludes these fields and would prevent the\nHawk client from authenticating the requests. You can read more about the why and how in this\n[article](http://www.html5rocks.com/en/tutorials/cors/#toc-adding-cors-support-to-the-server).\n\n\n# Single URI Authorization\n\nThere are cases in which limited and short-term access to a protected resource is granted to a third party which does not\nhave access to the shared credentials. For example, displaying a protected image on a web page accessed by anyone. 
**Hawk**\nprovides limited support for such URIs in the form of a _bewit_ - a URI query parameter appended to the request URI which contains\nthe necessary credentials to authenticate the request.\n\nBecause of the significant security risks involved in issuing such access, bewit usage is purposely limited only to GET requests\nand for a finite period of time. Both the client and server can issue bewit credentials, however, the server should not use the same\ncredentials as the client to maintain clear traceability as to who issued which credentials.\n\nIn order to simplify implementation, bewit credentials do not support single-use policy and can be replayed multiple times within\nthe granted access timeframe. \n\n\n## Bewit Usage Example\n\nServer code:\n\n```javascript\nvar Http = require('http');\nvar Hawk = require('hawk');\n\n\n// Credentials lookup function\n\nvar credentialsFunc = function (id, callback) {\n\n    var credentials = {\n        key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',\n        algorithm: 'sha256'\n    };\n\n    return callback(null, credentials);\n};\n\n// Create HTTP server\n\nvar handler = function (req, res) {\n\n    Hawk.uri.authenticate(req, credentialsFunc, {}, function (err, credentials, attributes) {\n\n        res.writeHead(!err ? 200 : 401, { 'Content-Type': 'text/plain' });\n        res.end(!err ? 
'Access granted' : 'Shoosh!');\n    });\n};\n\nHttp.createServer(handler).listen(8000, 'example.com');\n```\n\nBewit code generation:\n\n```javascript\nvar Request = require('request');\nvar Hawk = require('hawk');\n\n\n// Client credentials\n\nvar credentials = {\n    id: 'dh37fgj492je',\n    key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',\n    algorithm: 'sha256'\n};\n\n// Generate bewit\n\nvar duration = 60 * 5;      // 5 Minutes\nvar bewit = Hawk.uri.getBewit('http://example.com:8000/resource/1?b=1&a=2', { credentials: credentials, ttlSec: duration, ext: 'some-app-data' });\nvar uri = 'http://example.com:8000/resource/1?b=1&a=2' + '&bewit=' + bewit;\n```\n\n\n# Security Considerations\n\nThe greatest sources of security risks are usually found not in **Hawk** but in the policies and procedures surrounding its use.\nImplementers are strongly encouraged to assess how this module addresses their security requirements. This section includes\nan incomplete list of security considerations that must be reviewed and understood before deploying **Hawk** on the server.\nMany of the protections provided in **Hawk** depend on whether and how they are used.\n\n### MAC Keys Transmission\n\n**Hawk** does not provide any mechanism for obtaining or transmitting the set of shared credentials required. Any mechanism used\nto obtain **Hawk** credentials must ensure that these transmissions are protected using transport-layer mechanisms such as TLS.\n\n### Confidentiality of Requests\n\nWhile **Hawk** provides a mechanism for verifying the integrity of HTTP requests, it provides no guarantee of request\nconfidentiality. Unless other precautions are taken, eavesdroppers will have full access to the request content. 
Servers should\ncarefully consider the types of data likely to be sent as part of such requests, and employ transport-layer security mechanisms\nto protect sensitive resources.\n\n### Spoofing by Counterfeit Servers\n\n**Hawk** provides limited verification of server authenticity. When returning a response, the server\nmay choose to include a `Server-Authorization` header which the client can use to verify the response. However, it is up to\nthe server to determine when such a measure is included, and up to the client to enforce that policy.\n\nA hostile party could take advantage of this by intercepting the client's requests and returning misleading or otherwise\nincorrect responses. Service providers should consider such attacks when developing services using this protocol, and should\nrequire transport-layer security for any requests where the authenticity of the resource server or of server responses is an issue.\n\n### Plaintext Storage of Credentials\n\nThe **Hawk** key functions the same way passwords do in traditional authentication systems. In order to compute the request MAC,\nthe server must have access to the key in plaintext form. This is in contrast, for example, to modern operating systems, which\nstore only a one-way hash of user credentials.\n\nIf an attacker were to gain access to these keys - or worse, to the server's database of all such keys - he or she would be able\nto perform any action on behalf of any resource owner. Accordingly, it is critical that servers protect these keys from unauthorized\naccess.\n\n### Entropy of Keys\n\nUnless a transport-layer security protocol is used, eavesdroppers will have full access to authenticated requests and request\nMAC values, and will thus be able to mount offline brute-force attacks to recover the key used. 
Servers should be careful to\nassign keys which are long enough, and random enough, to resist such attacks for at least the length of time that the **Hawk**\ncredentials are valid.\n\nFor example, if the credentials are valid for two weeks, servers should ensure that it is not possible to mount a brute force\nattack that recovers the key in less than two weeks. Of course, servers are urged to err on the side of caution, and use the\nlongest key reasonable.\n\nIt is equally important that the pseudo-random number generator (PRNG) used to generate these keys be of sufficiently high\nquality. Many PRNG implementations generate number sequences that may appear to be random, but which nevertheless exhibit\npatterns or other weaknesses which make cryptanalysis or brute force attacks easier. Implementers should be careful to use\ncryptographically secure PRNGs to avoid these problems.\n\n### Coverage Limitations\n\nThe request MAC only covers the HTTP `Host` header and optionally the `Content-Type` header. It does not cover any other headers\nwhich can often affect how the request body is interpreted by the server. If the server behavior is influenced by the presence\nor value of such headers, an attacker can manipulate the request headers without being detected. Implementers should use the\n`ext` feature to pass application-specific information via the `Authorization` header which is protected by the request MAC.\n\nThe response authentication, when performed, only covers the response payload, content-type, and the request information\nprovided by the client in its request (method, resource, timestamp, nonce, etc.). It does not cover the HTTP status code or\nany other response header field (e.g. Location) which can affect the client's behaviour.\n\n### Future Time Manipulation\n\nThe protocol relies on a clock sync between the client and server. 
To accomplish this, the server informs the client of its\ncurrent time when an invalid timestamp is received.\n\nIf an attacker is able to manipulate this information and cause the client to use an incorrect time, it would be able to cause\nthe client to generate authenticated requests using time in the future. Such requests will fail when sent by the client, and will\nnot likely leave a trace on the server (given the common implementation of nonce, if at all enforced). The attacker will then\nbe able to replay the request at the correct time without detection.\n\nThe client must only use the time information provided by the server if:\n* it was delivered over a TLS connection and the server identity has been verified, or\n* the `tsm` MAC digest calculated using the same client credentials over the timestamp has been verified.\n\n### Client Clock Poisoning\n\nWhen receiving a request with a bad timestamp, the server provides the client with its current time. The client must never use\nthe time received from the server to adjust its own clock, and must only use it to calculate an offset for communicating with\nthat particular server.\n\n### Bewit Limitations\n\nSpecial care must be taken when issuing bewit credentials to third parties. Bewit credentials are valid until expiration and cannot\nbe revoked or limited without using other means. Whatever resource they grant access to will be completely exposed to anyone with\naccess to the bewit credentials which act as bearer credentials for that particular resource. While bewit usage is limited to GET\nrequests only and therefore cannot be used to perform transactions or change server state, it can still be used to expose private\nand sensitive information.\n\n### Host Header Forgery\n\nHawk validates the incoming request MAC against the incoming HTTP Host header. 
However, unless the optional `host` and `port`\noptions are used with `server.authenticate()`, a malicious client can mint new host names pointing to the server's IP address and\nuse that to craft an attack by sending a valid request intended for a hostname other than the one used by the server. Server\nimplementors must manually verify that the host header received matches their expectation (or use the options mentioned above).\n\n# Frequently Asked Questions\n\n### Where is the protocol specification?\n\nIf you are looking for some prose explaining how all this works, **this is it**. **Hawk** is being developed as an open source\nproject instead of a standard. In other words, the [code](/hueniverse/hawk/tree/master/lib) is the specification. Not sure about\nsomething? Open an issue!\n\n### Is it done?\n\nAs of version 0.10.0, **Hawk** is feature-complete. However, until this module reaches version 1.0.0 it is considered experimental\nand is likely to change. This also means your feedback and contributions are very welcome. Feel free to open issues with questions\nand suggestions.\n\n### Where can I find **Hawk** implementations in other languages?\n\n**Hawk**'s only reference implementation is provided in JavaScript as a node.js module. However, others are actively porting it to other\nplatforms. There are already [PHP](https://github.com/alexbilbie/PHP-Hawk),\n[.NET](https://github.com/pcibraro/hawknet), and [Java](https://github.com/wealdtech/hawk) libraries available. The full list\nis maintained [here](https://github.com/hueniverse/hawk/issues?labels=port). Please add an issue if you are working on another\nport. A cross-platform test-suite is in the works.\n\n### Why isn't the algorithm part of the challenge or dynamically negotiated?\n\nThe algorithm used is closely related to the key issued as different algorithms require different key sizes (and other\nrequirements). 
While some keys can be used for multiple algorithms, the protocol is designed to closely bind the key and algorithm\ntogether as part of the issued credentials.\n\n### Why are Host and Content-Type the only headers covered by the request MAC?\n\nIt is really hard to include other headers. Headers can be changed by proxies and other intermediaries and there is no\nwell-established way to normalize them. Many platforms change the case of header field names and values. The only\nstraight-forward solution is to include the headers in some blob (say, base64 encoded JSON) and include that with the request,\nan approach taken by JWT and other such formats. However, that design violates the HTTP header boundaries, repeats information,\nand introduces other security issues because firewalls will not be aware of these \"hidden\" headers. In addition, any information\nrepeated must be compared to the duplicated information in the header and therefore only moves the problem elsewhere.\n\n### Why not just use HTTP Digest?\n\nDigest requires pre-negotiation to establish a nonce. This means you can't just make a request - you must first send\na protocol handshake to the server. This pattern has become unacceptable for most web services, especially mobile\nwhere extra round-trips are costly.\n\n### Why bother with all this nonce and timestamp business?\n\n**Hawk** is an attempt to find a reasonable, practical compromise between security and usability. OAuth 1.0 got timestamps\nand nonces halfway right but failed when it came to scalability and consistent developer experience. **Hawk** addresses\nthis by requiring the client to sync its clock, but provides it with tools to accomplish it.\n\nIn general, replay protection is a matter of application-specific threat model. It is less of an issue on a TLS-protected\nsystem where the clients are implemented using best practices and are under the control of the server. 
Instead of dropping\nreplay protection, **Hawk** offers a required time window and an optional nonce verification. Together, they provide developers\nwith the ability to decide how to enforce their security policy without impacting the client's implementation.\n\n### What are `app` and `dlg` in the authorization header and normalized mac string?\n\nThe original motivation for **Hawk** was to replace the OAuth 1.0 use cases. This included both a simple client-server mode which\nthis module is specifically designed for, and a delegated access mode which is being developed separately in\n[Oz](https://github.com/hueniverse/oz). In addition to the **Hawk** use cases, Oz requires another attribute: the application id `app`.\nThis provides binding between the credentials and the application in a way that prevents an attacker from tricking an application\ninto using credentials issued to someone else. It also has an optional 'delegated-by' attribute `dlg` which is the application id of the\napplication the credentials were directly issued to. The goal of these two additions is to allow Oz to utilize **Hawk** directly,\nbut with the additional security of delegated credentials.\n\n### What is the purpose of the static strings used in each normalized MAC input?\n\nWhen calculating a hash or MAC, a static prefix (tag) is added. The prefix is used to prevent MAC values from being\nused or reused for a purpose other than what they were created for (i.e. prevents switching MAC values between request,\nresponse, and bewit use cases). It also protects against exploits created after a potential change in how the protocol\ncreates the normalized string. For example, if a future version were to switch the order of nonce and timestamp, it\ncould create an exploit opportunity for cases where the nonce is similar in format to a timestamp.\n\n### Does **Hawk** have anything to do with OAuth?\n\nShort answer: no.\n\n**Hawk** was originally proposed as the OAuth MAC Token specification. 
However, the OAuth working group in its consistent\nincompetence failed to produce a final, usable solution to address one of the most popular use cases of OAuth 1.0 - using it\nto authenticate simple client-server transactions (i.e. two-legged). As you can guess, the OAuth working group is still hard\nat work to produce more garbage.\n\n**Hawk** provides a simple HTTP authentication scheme for making client-server requests. It does not address the OAuth use case\nof delegating access to a third party. If you are looking for an OAuth alternative, check out [Oz](https://github.com/hueniverse/oz).\n\n\n# Acknowledgements\n\n**Hawk** is a derivative work of the [HTTP MAC Authentication Scheme](http://tools.ietf.org/html/draft-hammer-oauth-v2-mac-token-05) proposal\nco-authored by Ben Adida, Adam Barth, and Eran Hammer, which in turn was based on the OAuth 1.0 community specification.\n\nSpecial thanks to Ben Laurie for his always insightful feedback and advice.\n\nThe **Hawk** logo was created by [Chris Carrasco](http://chriscarrasco.com).\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/hueniverse/hawk/issues"
-  },
-  "homepage": "https://github.com/hueniverse/hawk",
-  "_id": "hawk@1.0.0",
-  "_from": "hawk@~1.0.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/.dir-locals.el	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-((nil . ((indent-tabs-mode . nil)
-         (tab-width . 8)
-         (fill-column . 80)))
- (js-mode . ((js-indent-level . 2)
-             (indent-tabs-mode . nil)
-             )))
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,7 +0,0 @@
-.gitmodules
-deps
-docs
-Makefile
-node_modules
-test
-tools
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-Copyright Joyent, Inc. All rights reserved.
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to
-deal in the Software without restriction, including without limitation the
-rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
-sell copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
-IN THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,75 +0,0 @@
-# node-http-signature
-
-node-http-signature is a node.js library that has client and server components
-for Joyent's [HTTP Signature Scheme](http_signing.md).
-
-## Usage
-
-Note the example below signs a request with the same key/cert used to start an
-HTTP server. This is almost certainly not what you actually want, but is just
-used to illustrate the API calls; you will need to provide your own key
-management in addition to this library.
-
-### Client
-
-    var fs = require('fs');
-    var https = require('https');
-    var httpSignature = require('http-signature');
-
-    var key = fs.readFileSync('./key.pem', 'ascii');
-
-    var options = {
-      host: 'localhost',
-      port: 8443,
-      path: '/',
-      method: 'GET',
-      headers: {}
-    };
-
-    // Adds a 'Date' header in, signs it, and adds the
-    // 'Authorization' header in.
-    var req = https.request(options, function(res) {
-      console.log(res.statusCode);
-    });
-
-
-    httpSignature.sign(req, {
-      key: key,
-      keyId: './cert.pem'
-    });
-
-    req.end();
-
-### Server
-
-    var fs = require('fs');
-    var https = require('https');
-    var httpSignature = require('http-signature');
-
-    var options = {
-      key: fs.readFileSync('./key.pem'),
-      cert: fs.readFileSync('./cert.pem')
-    };
-
-    https.createServer(options, function (req, res) {
-      var rc = 200;
-      var parsed = httpSignature.parseRequest(req);
-      var pub = fs.readFileSync(parsed.keyId, 'ascii');
-      if (!httpSignature.verifySignature(parsed, pub))
-        rc = 401;
-
-      res.writeHead(rc);
-      res.end();
-    }).listen(8443);
-
-## Installation
-
-    npm install http-signature
-
-## License
-
-MIT.
-
-## Bugs
-
-See <https://github.com/joyent/node-http-signature/issues>.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/http_signing.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,296 +0,0 @@
-# Abstract
-
-This document describes a way to add origin authentication, message integrity,
-and replay resistance to HTTP REST requests.  It is intended to be used over
-the HTTPS protocol.
-
-# Copyright Notice
-
-Copyright (c) 2011 Joyent, Inc. and the persons identified as document authors.
-All rights reserved.
-
-Code Components extracted from this document must include MIT License text.
-
-# Introduction
-
-This protocol is intended to provide a standard way for clients to sign HTTP
-requests.  RFC2617 (HTTP Authentication) defines Basic and Digest authentication
-mechanisms, and RFC5246 (TLS 1.2) defines client-auth, both of which are widely
-employed on the Internet today.  However, it is commonplace that the burdens of
-PKI prevent web service operators from deploying that methodology, and so many
-fall back to Basic authentication, which has poor security characteristics.
-
-Additionally, OAuth provides a fully-specified alternative for authorization
-of web service requests, but is not (always) ideal for machine to machine
-communication, as the key acquisition steps (generally) imply a fixed
-infrastructure that may not make sense to a service provider (e.g., symmetric
-keys).
-
-Several web service providers have invented their own schemes for signing
-HTTP requests, but to date, none have been placed in the public domain as a
-standard.  This document serves that purpose.  There are no techniques in this
-proposal that are novel beyond previous art; however, this aims to be a simple
-mechanism for signing these requests.
-
-# Signature Authentication Scheme
-
-The "signature" authentication scheme is based on the model that the client must
-authenticate itself with a digital signature produced by either a private
-asymmetric key (e.g., RSA) or a shared symmetric key (e.g., HMAC).  The scheme
-is parameterized enough such that it is not bound to any particular key type or
-signing algorithm.  However, it does explicitly assume that clients can send an
-HTTP `Date` header.
-
-## Authorization Header
-
-The client is expected to send an Authorization header (as defined in RFC 2617)
-with the following parameterization:
-
-    credentials := "Signature" params
-    params := 1#(keyId | algorithm | [headers] | [ext] | signature)
-    digitalSignature := plain-string
-
-    keyId := "keyId" "=" <"> plain-string <">
-    algorithm := "algorithm" "=" <"> plain-string <">
-    headers := "headers" "=" <"> 1#headers-value <">
-    ext := "ext" "=" <"> plain-string <">
-    signature := "signature" "=" <"> plain-string <">
-
-    headers-value := plain-string
-    plain-string   = 1*( %x20-21 / %x23-5B / %x5D-7E )
-
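As a rough illustration of the grammar above, a naive scan can pull the quoted parameters out of the header value. `parseSignatureHeader` is a hypothetical helper for this sketch, not part of any library:

```javascript
// Hypothetical helper: extract the Signature scheme parameters
// (keyId, algorithm, optional headers/ext, signature) from an
// Authorization header value, per the grammar above.
function parseSignatureHeader(value) {
  if (value.indexOf('Signature ') !== 0)
    throw new Error('not a Signature authorization header');

  var params = {};
  var re = /(\w+)="([^"]*)"/g;   // plain-string never contains a quote
  var m;
  while ((m = re.exec(value)) !== null)
    params[m[1]] = m[2];

  if (!params.keyId || !params.algorithm || !params.signature)
    throw new Error('missing required parameter');

  return params;
}

var parsed = parseSignatureHeader(
  'Signature keyId="rsa-key-1",algorithm="rsa-sha256",' +
  'headers="request-line date",signature="Base64(...)"');
// parsed.keyId === 'rsa-key-1', parsed.headers === 'request-line date'
```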
-### Signature Parameters
-
-#### keyId
-
-REQUIRED.  The `keyId` field is an opaque string that the server can use to look
-up the component they need to validate the signature.  It could be an SSH key
-fingerprint, an LDAP DN, etc.  Management of keys and assignment of `keyId` is
-out of scope for this document.
-
-#### algorithm
-
-REQUIRED. The `algorithm` parameter is used if the client and server agree on a
-non-standard digital signature algorithm.  The full list of supported signature
-mechanisms is listed below.
-
-#### headers
-
-OPTIONAL.  The `headers` parameter is used to specify the list of HTTP headers
-used to sign the request.  If specified, it should be a quoted list of HTTP
-header names, separated by a single space character.  By default, only one
-HTTP header is signed, which is the `Date` header.  Note that the list MUST be
-specified in the order the values are concatenated together during signing. To
-include the HTTP request line in the signature calculation, use the special
-`request-line` value.  While this overloads the definition of `headers` in
-HTTP parlance, the request-line is defined in RFC 2616, and as the one
-non-header value useful in signature calculation, it is deemed simpler to use
-`request-line` than to add a separate parameter for it.
-
-#### extensions
-
-OPTIONAL.  The `extensions` parameter is used to include additional information
-which is covered by the request.  The content and format of the string is out of
-scope for this document, and expected to be specified by implementors.
-
-#### signature
-
-REQUIRED.  The `signature` parameter is a `Base64` encoded digital signature
-generated by the client. The client uses the `algorithm` and `headers` request
-parameters to form a canonicalized `signing string`.  This `signing string` is
-then signed with the key associated with `keyId` and the algorithm
-corresponding to `algorithm`.  The `signature` parameter is then set to the
-`Base64` encoding of the signature.
-
-### Signing String Composition
-
-In order to generate the string that is signed with a key, the client MUST take
-the values of each HTTP header specified by `headers` in the order they appear.
-
-1. If the header name is not `request-line` then append the lowercased header
-   name followed with an ASCII colon `:` and an ASCII space ` `.
-2. If the header name is `request-line` then append the HTTP request line,
-   otherwise append the header value.
-3. If value is not the last value then append an ASCII newline `\n`. The string
-   MUST NOT include a trailing ASCII newline.
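The three composition rules can be sketched as a small helper (hypothetical, not this library's API); the input values below come from the Header List example later in this document:

```javascript
// Hypothetical sketch of the composition rules above.  `headers` is the
// ordered list from the `headers` parameter; `requestLine` and
// `headerValues` come from the request being signed.
function buildSigningString(headers, requestLine, headerValues) {
  return headers.map(function (name) {
    if (name === 'request-line')
      return requestLine;                                    // rule 2
    return name.toLowerCase() + ': ' + headerValues[name];   // rule 1
  }).join('\n');                        // rule 3: no trailing newline
}

var signing = buildSigningString(
  ['request-line', 'date', 'content-type', 'content-md5'],
  'POST /foo HTTP/1.1',
  { 'date': 'Tue, 07 Jun 2011 20:51:35 GMT',
    'content-type': 'application/json',
    'content-md5': 'h0auK8hnYJKmHTLhKtMTkQ==' });
```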
-
-# Example Requests
-
-All examples refer to the following request (body omitted):
-
-    POST /foo HTTP/1.1
-    Host: example.org
-    Date: Tue, 07 Jun 2011 20:51:35 GMT
-    Content-Type: application/json
-    Content-MD5: h0auK8hnYJKmHTLhKtMTkQ==
-    Content-Length: 123
-
-The "rsa-key-1" keyId refers to a private key known to the client and a public
-key known to the server. The "hmac-key-1" keyId refers to a key known to the
-client and server.
-
-## Default parameterization
-
-The authorization header and signature would be generated as:
-
-    Authorization: Signature keyId="rsa-key-1",algorithm="rsa-sha256",signature="Base64(RSA-SHA256(signing string))"
-
-The client would compose the signing string as:
-
-    date: Tue, 07 Jun 2011 20:51:35 GMT
-
-## Header List
-
-The authorization header and signature would be generated as:
-
-    Authorization: Signature keyId="rsa-key-1",algorithm="rsa-sha256",headers="request-line date content-type content-md5",signature="Base64(RSA-SHA256(signing string))"
-
-The client would compose the signing string as (`+ "\n"` inserted for
-readability):
-
-    POST /foo HTTP/1.1 + "\n"
-    date: Tue, 07 Jun 2011 20:51:35 GMT + "\n"
-    content-type: application/json + "\n"
-    content-md5: h0auK8hnYJKmHTLhKtMTkQ==
-
-## Algorithm
-
-The authorization header and signature would be generated as:
-
-    Authorization: Signature keyId="hmac-key-1",algorithm="hmac-sha1",signature="Base64(HMAC-SHA1(signing string))"
-
-The client would compose the signing string as:
-
-    date: Tue, 07 Jun 2011 20:51:35 GMT
-
-# Signing Algorithms
-
-Currently supported algorithm names are:
-
-* rsa-sha1
-* rsa-sha256
-* rsa-sha512
-* dsa-sha1
-* hmac-sha1
-* hmac-sha256
-* hmac-sha512
-
-# Security Considerations
-
-## Default Parameters
-
-Note the default parameterization of the `Signature` scheme is only safe if all
-requests are carried over a secure transport (i.e., TLS).  Sending the default
-scheme over a non-secure transport will leave the request vulnerable to
-spoofing, tampering, replay/repudiation, and integrity violations (if using the
-STRIDE threat-modeling methodology).
-
-## Insecure Transports
-
-If sending the request over plain HTTP, service providers SHOULD require clients
-to sign ALL HTTP headers, and the `request-line`.  Additionally, service
-providers SHOULD require `Content-MD5` calculations to be performed to ensure
-against any tampering from clients.
-
-## Nonces
-
-Nonces are out of scope for this document simply because many service providers
-fail to implement them correctly, or do not adopt security specifications
-because of the infrastructure complexity.  Given the `header` parameterization,
-a service provider is fully enabled to add nonce semantics into this scheme by
-using something like an `x-request-nonce` header, and ensuring it is signed
-with the `Date` header.
-
-## Clock Skew
-
-As the default scheme is to sign the `Date` header, service providers SHOULD
-protect against logged replay attacks by enforcing a clock skew.  The server
-SHOULD be synchronized with NTP, and the recommendation in this specification
-is to allow 300s of clock skew (in either direction).
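The recommended 300s window can be sketched as follows (`withinClockSkew` is a hypothetical helper, not part of this specification):

```javascript
// Reject requests whose Date header differs from server time by more
// than 300 seconds in either direction, per the recommendation above.
var ALLOWED_SKEW_SEC = 300;

function withinClockSkew(dateHeader, nowMs) {
  var sent = Date.parse(dateHeader);
  if (isNaN(sent))
    return false;                 // unparseable Date header
  return Math.abs(nowMs - sent) <= ALLOWED_SKEW_SEC * 1000;
}

var now = Date.parse('Tue, 07 Jun 2011 20:51:35 GMT');
withinClockSkew('Tue, 07 Jun 2011 20:51:35 GMT', now + 299 * 1000);  // true
withinClockSkew('Tue, 07 Jun 2011 20:51:35 GMT', now + 301 * 1000);  // false
```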
-
-## Required Headers to Sign
-
-It is out of scope for this document to dictate what headers a service provider
-will want to enforce, but service providers SHOULD at minimum include the
-`Date` header.
-
-# References
-
-## Normative References
-
-* [RFC2616] Hypertext Transfer Protocol -- HTTP/1.1
-* [RFC2617] HTTP Authentication: Basic and Digest Access Authentication
-* [RFC5246] The Transport Layer Security (TLS) Protocol Version 1.2
-
-## Informative References
-
-    Name: Mark Cavage (editor)
-    Company: Joyent, Inc.
-    Email: mark.cavage@joyent.com
-    URI: http://www.joyent.com
-
-# Appendix A - Test Values
-
-The following test data uses the RSA (2048b) keys, which we will refer
-to as `keyId=Test` in the following samples:
-
-    -----BEGIN PUBLIC KEY-----
-    MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDCFENGw33yGihy92pDjZQhl0C3
-    6rPJj+CvfSC8+q28hxA161QFNUd13wuCTUcq0Qd2qsBe/2hFyc2DCJJg0h1L78+6
-    Z4UMR7EOcpfdUE9Hf3m/hs+FUR45uBJeDK1HSFHD8bHKD6kv8FPGfJTotc+2xjJw
-    oYi+1hqp1fIekaxsyQIDAQAB
-    -----END PUBLIC KEY-----
-
-    -----BEGIN RSA PRIVATE KEY-----
-    MIICXgIBAAKBgQDCFENGw33yGihy92pDjZQhl0C36rPJj+CvfSC8+q28hxA161QF
-    NUd13wuCTUcq0Qd2qsBe/2hFyc2DCJJg0h1L78+6Z4UMR7EOcpfdUE9Hf3m/hs+F
-    UR45uBJeDK1HSFHD8bHKD6kv8FPGfJTotc+2xjJwoYi+1hqp1fIekaxsyQIDAQAB
-    AoGBAJR8ZkCUvx5kzv+utdl7T5MnordT1TvoXXJGXK7ZZ+UuvMNUCdN2QPc4sBiA
-    QWvLw1cSKt5DsKZ8UETpYPy8pPYnnDEz2dDYiaew9+xEpubyeW2oH4Zx71wqBtOK
-    kqwrXa/pzdpiucRRjk6vE6YY7EBBs/g7uanVpGibOVAEsqH1AkEA7DkjVH28WDUg
-    f1nqvfn2Kj6CT7nIcE3jGJsZZ7zlZmBmHFDONMLUrXR/Zm3pR5m0tCmBqa5RK95u
-    412jt1dPIwJBANJT3v8pnkth48bQo/fKel6uEYyboRtA5/uHuHkZ6FQF7OUkGogc
-    mSJluOdc5t6hI1VsLn0QZEjQZMEOWr+wKSMCQQCC4kXJEsHAve77oP6HtG/IiEn7
-    kpyUXRNvFsDE0czpJJBvL/aRFUJxuRK91jhjC68sA7NsKMGg5OXb5I5Jj36xAkEA
-    gIT7aFOYBFwGgQAQkWNKLvySgKbAZRTeLBacpHMuQdl1DfdntvAyqpAZ0lY0RKmW
-    G6aFKaqQfOXKCyWoUiVknQJAXrlgySFci/2ueKlIE1QqIiLSZ8V8OlpFLRnb1pzI
-    7U1yQXnTAEFYM560yJlzUpOb1V4cScGd365tiSMvxLOvTA==
-    -----END RSA PRIVATE KEY-----
-
-And all examples use this request:
-
-    POST /foo?param=value&pet=dog HTTP/1.1
-    Host: example.com
-    Date: Thu, 05 Jan 2012 21:31:40 GMT
-    Content-Type: application/json
-    Content-MD5: Sd/dVLAcvNLSq16eXua5uQ==
-    Content-Length: 18
-
-    {"hello": "world"}
-
-### Default
-
-The string to sign would be:
-
-    date: Thu, 05 Jan 2012 21:31:40 GMT
-
-The Authorization header would be:
-
-    Authorization: Signature keyId="Test",algorithm="rsa-sha256",signature="JldXnt8W9t643M2Sce10gqCh/+E7QIYLiI+bSjnFBGCti7s+mPPvOjVb72sbd1FjeOUwPTDpKbrQQORrm+xBYfAwCxF3LBSSzORvyJ5nRFCFxfJ3nlQD6Kdxhw8wrVZX5nSem4A/W3C8qH5uhFTRwF4ruRjh+ENHWuovPgO/HGQ="
-
-### All Headers
-
-Parameterized to include all headers, the string to sign would be (`+ "\n"`
-inserted for readability):
-
-    POST /foo?param=value&pet=dog HTTP/1.1 + "\n"
-    host: example.com + "\n"
-    date: Thu, 05 Jan 2012 21:31:40 GMT + "\n"
-    content-type: application/json + "\n"
-    content-md5: Sd/dVLAcvNLSq16eXua5uQ== + "\n"
-    content-length: 18
-
-The Authorization header would be:
-
-    Authorization: Signature keyId="Test",algorithm="rsa-sha256",headers="request-line host date content-type content-md5 content-length",signature="Gm7W/r+e90REDpWytALMrft4MqZxCmslOTOvwJX17ViEBA5E65QqvWI0vIH3l/vSsGiaMVmuUgzYsJLYMLcm5dGrv1+a+0fCoUdVKPZWHyImQEqpLkopVwqEH67LVECFBqFTAKlQgBn676zrfXQbb+b/VebAsNUtvQMe6cTjnDY="
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-// Copyright 2011 Joyent, Inc.  All rights reserved.
-
-var parser = require('./parser');
-var signer = require('./signer');
-var verify = require('./verify');
-var util = require('./util');
-
-
-
-///--- API
-
-module.exports = {
-
-  parse: parser.parseRequest,
-  parseRequest: parser.parseRequest,
-
-  sign: signer.signRequest,
-  signRequest: signer.signRequest,
-
-  sshKeyToPEM: util.sshKeyToPEM,
-  sshKeyFingerprint: util.fingerprint,
-
-  verify: verify.verifySignature,
-  verifySignature: verify.verifySignature
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/parser.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,304 +0,0 @@
-// Copyright 2012 Joyent, Inc.  All rights reserved.
-
-var assert = require('assert-plus');
-var util = require('util');
-
-
-
-///--- Globals
-
-var Algorithms = {
-  'rsa-sha1': true,
-  'rsa-sha256': true,
-  'rsa-sha512': true,
-  'dsa-sha1': true,
-  'hmac-sha1': true,
-  'hmac-sha256': true,
-  'hmac-sha512': true
-};
-
-var State = {
-  New: 0,
-  Params: 1
-};
-
-var ParamsState = {
-  Name: 0,
-  Quote: 1,
-  Value: 2,
-  Comma: 3
-};
-
-
-
-///--- Specific Errors
-
-function HttpSignatureError(message, caller) {
-  if (Error.captureStackTrace)
-    Error.captureStackTrace(this, caller || HttpSignatureError);
-
-  this.message = message;
-  this.name = caller.name;
-}
-util.inherits(HttpSignatureError, Error);
-
-function ExpiredRequestError(message) {
-  HttpSignatureError.call(this, message, ExpiredRequestError);
-}
-util.inherits(ExpiredRequestError, HttpSignatureError);
-
-
-function InvalidHeaderError(message) {
-  HttpSignatureError.call(this, message, InvalidHeaderError);
-}
-util.inherits(InvalidHeaderError, HttpSignatureError);
-
-
-function InvalidParamsError(message) {
-  HttpSignatureError.call(this, message, InvalidParamsError);
-}
-util.inherits(InvalidParamsError, HttpSignatureError);
-
-
-function MissingHeaderError(message) {
-  HttpSignatureError.call(this, message, MissingHeaderError);
-}
-util.inherits(MissingHeaderError, HttpSignatureError);
-
-
-
-///--- Exported API
-
-module.exports = {
-
-  /**
-   * Parses the 'Authorization' header out of an http.ServerRequest object.
-   *
-   * Note that this API will fully validate the Authorization header, and throw
-   * on any error.  It will not however check the signature, or the keyId format
-   * as those are specific to your environment.  You can use the options object
-   * to pass in extra constraints.
-   *
-   * As a response object you can expect this:
-   *
-   *     {
-   *       "scheme": "Signature",
-   *       "params": {
-   *         "keyId": "foo",
-   *         "algorithm": "rsa-sha256",
-   *         "headers": [
-   *           "date" or "x-date",
-   *           "content-md5"
-   *         ],
-   *         "signature": "base64"
-   *       },
-   *       "signingString": "ready to be passed to crypto.verify()"
-   *     }
-   *
-   * @param {Object} request an http.ServerRequest.
-   * @param {Object} options an optional options object with:
-   *                   - clockSkew: allowed clock skew in seconds (default 300).
-   *                   - headers: required header names (def: date or x-date)
-   *                   - algorithms: algorithms to support (default: all).
-   * @return {Object} parsed out object (see above).
-   * @throws {TypeError} on invalid input.
-   * @throws {InvalidHeaderError} on an invalid Authorization header error.
-   * @throws {InvalidParamsError} if the params in the scheme are invalid.
-   * @throws {MissingHeaderError} if the params indicate a header not present,
-   *                              either in the request headers from the params,
-   *                              or not in the params from a required header
-   *                              in options.
-   * @throws {ExpiredRequestError} if the value of date or x-date exceeds skew.
-   */
-  parseRequest: function parseRequest(request, options) {
-    assert.object(request, 'request');
-    assert.object(request.headers, 'request.headers');
-    if (options === undefined) {
-      options = {};
-    }
-    if (options.headers === undefined) {
-      options.headers = [request.headers['x-date'] ? 'x-date' : 'date'];
-    }
-    assert.object(options, 'options');
-    assert.arrayOfString(options.headers, 'options.headers');
-    assert.optionalNumber(options.clockSkew, 'options.clockSkew');
-
-    if (!request.headers.authorization)
-      throw new MissingHeaderError('no authorization header present in ' +
-                                   'the request');
-
-    options.clockSkew = options.clockSkew || 300;
-
-
-    var i = 0;
-    var state = State.New;
-    var substate = ParamsState.Name;
-    var tmpName = '';
-    var tmpValue = '';
-
-    var parsed = {
-      scheme: '',
-      params: {},
-      signingString: '',
-
-      get algorithm() {
-        return this.params.algorithm.toUpperCase();
-      },
-
-      get keyId() {
-        return this.params.keyId;
-      }
-
-    };
-
-    var authz = request.headers.authorization;
-    for (i = 0; i < authz.length; i++) {
-      var c = authz.charAt(i);
-
-      switch (Number(state)) {
-
-      case State.New:
-        if (c !== ' ') parsed.scheme += c;
-        else state = State.Params;
-        break;
-
-      case State.Params:
-        switch (Number(substate)) {
-
-        case ParamsState.Name:
-          var code = c.charCodeAt(0);
-          // restricted name of A-Z / a-z
-          if ((code >= 0x41 && code <= 0x5a) || // A-Z
-              (code >= 0x61 && code <= 0x7a)) { // a-z
-            tmpName += c;
-          } else if (c === '=') {
-            if (tmpName.length === 0)
-              throw new InvalidHeaderError('bad param format');
-            substate = ParamsState.Quote;
-          } else {
-            throw new InvalidHeaderError('bad param format');
-          }
-          break;
-
-        case ParamsState.Quote:
-          if (c === '"') {
-            tmpValue = '';
-            substate = ParamsState.Value;
-          } else {
-            throw new InvalidHeaderError('bad param format');
-          }
-          break;
-
-        case ParamsState.Value:
-          if (c === '"') {
-            parsed.params[tmpName] = tmpValue;
-            substate = ParamsState.Comma;
-          } else {
-            tmpValue += c;
-          }
-          break;
-
-        case ParamsState.Comma:
-          if (c === ',') {
-            tmpName = '';
-            substate = ParamsState.Name;
-          } else {
-            throw new InvalidHeaderError('bad param format');
-          }
-          break;
-
-        default:
-          throw new Error('Invalid substate');
-        }
-        break;
-
-      default:
-        throw new Error('Invalid substate');
-      }
-
-    }
-
-    if (!parsed.params.headers || parsed.params.headers === '') {
-      if (request.headers['x-date']) {
-        parsed.params.headers = ['x-date'];
-      } else {
-        parsed.params.headers = ['date'];
-      }
-    } else {
-      parsed.params.headers = parsed.params.headers.split(' ');
-    }
-
-    // Minimally validate the parsed object
-    if (!parsed.scheme || parsed.scheme !== 'Signature')
-      throw new InvalidHeaderError('scheme was not "Signature"');
-
-    if (!parsed.params.keyId)
-      throw new InvalidHeaderError('keyId was not specified');
-
-    if (!parsed.params.algorithm)
-      throw new InvalidHeaderError('algorithm was not specified');
-
-    if (!parsed.params.signature)
-      throw new InvalidHeaderError('signature was not specified');
-
-    // Check the algorithm against the official list
-    parsed.params.algorithm = parsed.params.algorithm.toLowerCase();
-    if (!Algorithms[parsed.params.algorithm])
-      throw new InvalidParamsError(parsed.params.algorithm +
-                                   ' is not supported');
-
-    // Build the signingString
-    for (i = 0; i < parsed.params.headers.length; i++) {
-      var h = parsed.params.headers[i].toLowerCase();
-      parsed.params.headers[i] = h;
-
-      if (h !== 'request-line') {
-        var value = request.headers[h];
-        if (!value)
-          throw new MissingHeaderError(h + ' was not in the request');
-        parsed.signingString += h + ': ' + value;
-      } else {
-        parsed.signingString +=
-          request.method + ' ' + request.url + ' HTTP/' + request.httpVersion;
-      }
-
-      if ((i + 1) < parsed.params.headers.length)
-        parsed.signingString += '\n';
-    }
-
-    // Check against the constraints
-    var date;
-    if (request.headers.date || request.headers['x-date']) {
-        if (request.headers['x-date']) {
-          date = new Date(request.headers['x-date']);
-        } else {
-          date = new Date(request.headers.date);
-        }
-      var now = new Date();
-      var skew = Math.abs(now.getTime() - date.getTime());
-
-      if (skew > options.clockSkew * 1000) {
-        throw new ExpiredRequestError('clock skew of ' +
-                                      (skew / 1000) +
-                                      's was greater than ' +
-                                      options.clockSkew + 's');
-      }
-    }
-
-    options.headers.forEach(function (hdr) {
-      // Remember that we already checked any headers in the params
-      // were in the request, so if this passes we're good.
-      if (parsed.params.headers.indexOf(hdr) < 0)
-        throw new MissingHeaderError(hdr + ' was not a signed header');
-    });
-
-    if (options.algorithms) {
-      if (options.algorithms.indexOf(parsed.params.algorithm) === -1)
-        throw new InvalidParamsError(parsed.params.algorithm +
-                                     ' is not a supported algorithm');
-    }
-
-    return parsed;
-  }
-
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/signer.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,179 +0,0 @@
-// Copyright 2012 Joyent, Inc.  All rights reserved.
-
-var assert = require('assert-plus');
-var crypto = require('crypto');
-var http = require('http');
-
-var sprintf = require('util').format;
-
-
-
-///--- Globals
-
-var Algorithms = {
-  'rsa-sha1': true,
-  'rsa-sha256': true,
-  'rsa-sha512': true,
-  'dsa-sha1': true,
-  'hmac-sha1': true,
-  'hmac-sha256': true,
-  'hmac-sha512': true
-};
-
-var Authorization =
-  'Signature keyId="%s",algorithm="%s",headers="%s",signature="%s"';
-
-
-
-///--- Specific Errors
-
-function MissingHeaderError(message) {
-    this.name = 'MissingHeaderError';
-    this.message = message;
-    this.stack = (new Error()).stack;
-}
-MissingHeaderError.prototype = new Error();
-
-
-function InvalidAlgorithmError(message) {
-    this.name = 'InvalidAlgorithmError';
-    this.message = message;
-    this.stack = (new Error()).stack;
-}
-InvalidAlgorithmError.prototype = new Error();
-
-
-
-///--- Internal Functions
-
-function _pad(val) {
-  if (parseInt(val, 10) < 10) {
-    val = '0' + val;
-  }
-  return val;
-}
-
-
-function _rfc1123() {
-  var date = new Date();
-
-  var months = ['Jan',
-                'Feb',
-                'Mar',
-                'Apr',
-                'May',
-                'Jun',
-                'Jul',
-                'Aug',
-                'Sep',
-                'Oct',
-                'Nov',
-                'Dec'];
-  var days = ['Sun', 'Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat'];
-  return days[date.getUTCDay()] + ', ' +
-    _pad(date.getUTCDate()) + ' ' +
-    months[date.getUTCMonth()] + ' ' +
-    date.getUTCFullYear() + ' ' +
-    _pad(date.getUTCHours()) + ':' +
-    _pad(date.getUTCMinutes()) + ':' +
-    _pad(date.getUTCSeconds()) +
-    ' GMT';
-}
-
-
-
-///--- Exported API
-
-module.exports = {
-
-  /**
-   * Adds an 'Authorization' header to an http.ClientRequest object.
-   *
-   * Note that this API will add a Date header if it's not already set. Any
-   * other headers in the options.headers array MUST be present, or this
-   * will throw.
-   *
-   * You shouldn't need to check the return type; it's just there if you want
-   * to be pedantic.
-   *
-   * @param {Object} request an instance of http.ClientRequest.
-   * @param {Object} options signing parameters object:
-   *                   - {String} keyId required.
-   *                   - {String} key required (either a PEM or HMAC key).
-   *                   - {Array} headers optional; defaults to ['date'].
-   *                   - {String} algorithm optional; defaults to 'rsa-sha256'.
-   *                   - {String} httpVersion optional; defaults to '1.1'.
-   * @return {Boolean} true if Authorization (and optionally Date) were added.
-   * @throws {TypeError} on bad parameter types (input).
-   * @throws {InvalidAlgorithmError} if algorithm was bad.
-   * @throws {MissingHeaderError} if a header to be signed was specified but
-   *                              was not present.
-   */
-  signRequest: function signRequest(request, options) {
-    assert.object(request, 'request');
-    assert.object(options, 'options');
-    assert.optionalString(options.algorithm, 'options.algorithm');
-    assert.string(options.keyId, 'options.keyId');
-    assert.optionalArrayOfString(options.headers, 'options.headers');
-    assert.optionalString(options.httpVersion, 'options.httpVersion');
-
-    if (!request.getHeader('Date'))
-      request.setHeader('Date', _rfc1123());
-    if (!options.headers)
-      options.headers = ['date'];
-    if (!options.algorithm)
-      options.algorithm = 'rsa-sha256';
-    if (!options.httpVersion)
-      options.httpVersion = '1.1';
-
-    options.algorithm = options.algorithm.toLowerCase();
-
-    if (!Algorithms[options.algorithm])
-      throw new InvalidAlgorithmError(options.algorithm + ' is not supported');
-
-    var i;
-    var stringToSign = '';
-    for (i = 0; i < options.headers.length; i++) {
-      if (typeof (options.headers[i]) !== 'string')
-        throw new TypeError('options.headers must be an array of Strings');
-
-      var h = options.headers[i].toLowerCase();
-
-      if (h !== 'request-line') {
-        var value = request.getHeader(h);
-        if (!value) {
-          throw new MissingHeaderError(h + ' was not in the request');
-        }
-        stringToSign += h + ': ' + value;
-      } else {
-        // 'request-line' is signed as "METHOD path HTTP/version"
-        stringToSign +=
-          request.method + ' ' + request.path + ' HTTP/' + options.httpVersion;
-      }
-
-      if ((i + 1) < options.headers.length)
-        stringToSign += '\n';
-    }
-
-    var alg = options.algorithm.match(/(hmac|rsa)-(\w+)/);
-    var signature;
-    if (alg[1] === 'hmac') {
-      var hmac = crypto.createHmac(alg[2].toUpperCase(), options.key);
-      hmac.update(stringToSign);
-      signature = hmac.digest('base64');
-    } else {
-      var signer = crypto.createSign(options.algorithm.toUpperCase());
-      signer.update(stringToSign);
-      signature = signer.sign(options.key, 'base64');
-    }
-
-    request.setHeader('Authorization', sprintf(Authorization,
-                                               options.keyId,
-                                               options.algorithm,
-                                               options.headers.join(' '),
-                                               signature));
-
-    return true;
-  }
-
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/util.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,249 +0,0 @@
-// Copyright 2012 Joyent, Inc.  All rights reserved.
-
-var assert = require('assert-plus');
-var crypto = require('crypto');
-
-var asn1 = require('asn1');
-var ctype = require('ctype');
-
-
-
-///--- Helpers
-
-function readNext(buffer, offset) {
-  var len = ctype.ruint32(buffer, 'big', offset);
-  offset += 4;
-
-  var newOffset = offset + len;
-
-  return {
-    data: buffer.slice(offset, newOffset),
-    offset: newOffset
-  };
-}
-
-
-function writeInt(writer, buffer) {
-  writer.writeByte(0x02); // ASN1.Integer
-  writer.writeLength(buffer.length);
-
-  for (var i = 0; i < buffer.length; i++)
-    writer.writeByte(buffer[i]);
-
-  return writer;
-}
-
-
-function rsaToPEM(key) {
-  var buffer;
-  var der;
-  var exponent;
-  var i;
-  var modulus;
-  var newKey = '';
-  var offset = 0;
-  var type;
-  var tmp;
-
-  try {
-    buffer = new Buffer(key.split(' ')[1], 'base64');
-
-    tmp = readNext(buffer, offset);
-    type = tmp.data.toString();
-    offset = tmp.offset;
-
-    if (type !== 'ssh-rsa')
-      throw new Error('Invalid ssh key type: ' + type);
-
-    tmp = readNext(buffer, offset);
-    exponent = tmp.data;
-    offset = tmp.offset;
-
-    tmp = readNext(buffer, offset);
-    modulus = tmp.data;
-  } catch (e) {
-    throw new Error('Invalid ssh key: ' + key);
-  }
-
-  // DER is a subset of BER
-  der = new asn1.BerWriter();
-
-  der.startSequence();
-
-  der.startSequence();
-  der.writeOID('1.2.840.113549.1.1.1');
-  der.writeNull();
-  der.endSequence();
-
-  der.startSequence(0x03); // bit string
-  der.writeByte(0x00);
-
-  // Actual key
-  der.startSequence();
-  writeInt(der, modulus);
-  writeInt(der, exponent);
-  der.endSequence();
-
-  // bit string
-  der.endSequence();
-
-  der.endSequence();
-
-  tmp = der.buffer.toString('base64');
-  for (i = 0; i < tmp.length; i++) {
-    if ((i % 64) === 0)
-      newKey += '\n';
-    newKey += tmp.charAt(i);
-  }
-
-  if (!/\n$/.test(newKey))
-    newKey += '\n';
-
-  return '-----BEGIN PUBLIC KEY-----' + newKey + '-----END PUBLIC KEY-----\n';
-}
-
-
-function dsaToPEM(key) {
-  var buffer;
-  var offset = 0;
-  var tmp;
-  var der;
-  var newKey = '';
-
-  var type;
-  var p;
-  var q;
-  var g;
-  var y;
-
-  try {
-    buffer = new Buffer(key.split(' ')[1], 'base64');
-
-    tmp = readNext(buffer, offset);
-    type = tmp.data.toString();
-    offset = tmp.offset;
-
-    /* JSSTYLED */
-    if (!/^ssh-ds[as].*/.test(type))
-      throw new Error('Invalid ssh key type: ' + type);
-
-    tmp = readNext(buffer, offset);
-    p = tmp.data;
-    offset = tmp.offset;
-
-    tmp = readNext(buffer, offset);
-    q = tmp.data;
-    offset = tmp.offset;
-
-    tmp = readNext(buffer, offset);
-    g = tmp.data;
-    offset = tmp.offset;
-
-    tmp = readNext(buffer, offset);
-    y = tmp.data;
-  } catch (e) {
-    console.log(e.stack);
-    throw new Error('Invalid ssh key: ' + key);
-  }
-
-  // DER is a subset of BER
-  der = new asn1.BerWriter();
-
-  der.startSequence();
-
-  der.startSequence();
-  der.writeOID('1.2.840.10040.4.1');
-
-  der.startSequence();
-  writeInt(der, p);
-  writeInt(der, q);
-  writeInt(der, g);
-  der.endSequence();
-
-  der.endSequence();
-
-  der.startSequence(0x03); // bit string
-  der.writeByte(0x00);
-  writeInt(der, y);
-  der.endSequence();
-
-  der.endSequence();
-
-  tmp = der.buffer.toString('base64');
-  for (var i = 0; i < tmp.length; i++) {
-    if ((i % 64) === 0)
-      newKey += '\n';
-    newKey += tmp.charAt(i);
-  }
-
-  if (!/\n$/.test(newKey))
-    newKey += '\n';
-
-  return '-----BEGIN PUBLIC KEY-----' + newKey + '-----END PUBLIC KEY-----\n';
-}
-
-
-///--- API
-
-module.exports = {
-
-  /**
-   * Converts an OpenSSH public key (rsa only) to a PKCS#8 PEM file.
-   *
-   * The intent of this module is to interoperate with OpenSSL only,
-   * specifically the node crypto module's `verify` method.
-   *
-   * @param {String} key an OpenSSH public key.
-   * @return {String} PEM encoded form of the RSA public key.
-   * @throws {TypeError} on bad input.
-   * @throws {Error} on invalid ssh key formatted data.
-   */
-  sshKeyToPEM: function sshKeyToPEM(key) {
-    assert.string(key, 'ssh_key');
-
-    /* JSSTYLED */
-    if (/^ssh-rsa.*/.test(key))
-      return rsaToPEM(key);
-
-    /* JSSTYLED */
-    if (/^ssh-ds[as].*/.test(key))
-      return dsaToPEM(key);
-
-    throw new Error('Only RSA and DSA public keys are allowed');
-  },
-
-
-  /**
-   * Generates an OpenSSH fingerprint from an ssh public key.
-   *
-   * @param {String} key an OpenSSH public key.
-   * @return {String} key fingerprint.
-   * @throws {TypeError} on bad input.
-   * @throws {Error} if what you passed doesn't look like an ssh public key.
-   */
-  fingerprint: function fingerprint(key) {
-    assert.string(key, 'ssh_key');
-
-    var pieces = key.split(' ');
-    if (!pieces || !pieces.length || pieces.length < 2)
-      throw new Error('invalid ssh key');
-
-    var data = new Buffer(pieces[1], 'base64');
-
-    var hash = crypto.createHash('md5');
-    hash.update(data);
-    var digest = hash.digest('hex');
-
-    var fp = '';
-    for (var i = 0; i < digest.length; i++) {
-      if (i && i % 2 === 0)
-        fp += ':';
-
-      fp += digest[i];
-    }
-
-    return fp;
-  }
-
-
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/verify.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-// Copyright 2011 Joyent, Inc.  All rights reserved.
-
-var assert = require('assert-plus');
-var crypto = require('crypto');
-
-
-
-///--- Exported API
-
-module.exports = {
-
-  /**
-   * Simply wraps up the node crypto operations for you, and returns
-   * true or false.  You are expected to pass in an object that was
-   * returned from `parse()`.
-   *
-   * @param {Object} parsedSignature the object you got from `parse`.
-   * @param {String} key either an RSA private key PEM or HMAC secret.
-   * @return {Boolean} true if valid, false otherwise.
-   * @throws {TypeError} if you pass in bad arguments.
-   */
-  verifySignature: function verifySignature(parsedSignature, key) {
-    assert.object(parsedSignature, 'parsedSignature');
-    assert.string(key, 'key');
-
-    var alg = parsedSignature.algorithm.match(/(HMAC|RSA|DSA)-(\w+)/);
-    if (!alg || alg.length !== 3)
-      throw new TypeError('parsedSignature: unsupported algorithm ' +
-                          parsedSignature.algorithm);
-
-    if (alg[1] === 'HMAC') {
-      var hmac = crypto.createHmac(alg[2].toUpperCase(), key);
-      hmac.update(parsedSignature.signingString);
-      return (hmac.digest('base64') === parsedSignature.params.signature);
-    } else {
-      var verify = crypto.createVerify(alg[0]);
-      verify.update(parsedSignature.signingString);
-      return verify.verify(key, parsedSignature.params.signature, 'base64');
-    }
-  }
-
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2 +0,0 @@
-node_modules
-*.log
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-Copyright (c) 2011 Mark Cavage, All rights reserved.
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-THE SOFTWARE
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,50 +0,0 @@
-node-asn1 is a library for encoding and decoding ASN.1 datatypes in pure JS.
-Currently BER encoding is supported; at some point I'll likely have to do DER.
-
-## Usage
-
-Mostly, if you're *actually* needing to read and write ASN.1, you probably don't
-need this readme to explain what and why.  If you have no idea what ASN.1 is,
-see this: ftp://ftp.rsa.com/pub/pkcs/ascii/layman.asc
-
-The source is pretty much self-explanatory, and has read/write methods for the
-common types out there.
-
-### Decoding
-
-The following reads an ASN.1 sequence with a boolean.
-
-    var Ber = require('asn1').Ber;
-
-    var reader = new Ber.Reader(new Buffer([0x30, 0x03, 0x01, 0x01, 0xff]));
-
-    reader.readSequence();
-    console.log('Sequence len: ' + reader.length);
-    if (reader.peek() === Ber.Boolean)
-      console.log(reader.readBoolean());
-
-### Encoding
-
-The following generates the same payload as above.
-
-    var Ber = require('asn1').Ber;
-
-    var writer = new Ber.Writer();
-
-    writer.startSequence();
-    writer.writeBoolean(true);
-    writer.endSequence();
-
-    console.log(writer.buffer);
-
-## Installation
-
-    npm install asn1
-
-## License
-
-MIT.
-
-## Bugs
-
-See <https://github.com/mcavage/node-asn1/issues>.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/errors.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.
-
-
-module.exports = {
-
-  newInvalidAsn1Error: function(msg) {
-    var e = new Error();
-    e.name = 'InvalidAsn1Error';
-    e.message = msg || '';
-    return e;
-  }
-
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.
-
-var errors = require('./errors');
-var types = require('./types');
-
-var Reader = require('./reader');
-var Writer = require('./writer');
-
-
-///--- Exports
-
-module.exports = {
-
-  Reader: Reader,
-
-  Writer: Writer
-
-};
-
-for (var t in types) {
-  if (types.hasOwnProperty(t))
-    module.exports[t] = types[t];
-}
-for (var e in errors) {
-  if (errors.hasOwnProperty(e))
-    module.exports[e] = errors[e];
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/reader.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,267 +0,0 @@
-// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.
-
-var assert = require('assert');
-
-var ASN1 = require('./types');
-var errors = require('./errors');
-
-
-///--- Globals
-
-var newInvalidAsn1Error = errors.newInvalidAsn1Error;
-
-
-
-///--- API
-
-function Reader(data) {
-  if (!data || !Buffer.isBuffer(data))
-    throw new TypeError('data must be a node Buffer');
-
-  this._buf = data;
-  this._size = data.length;
-
-  // These hold the "current" state
-  this._len = 0;
-  this._offset = 0;
-
-  var self = this;
-  this.__defineGetter__('length', function() { return self._len; });
-  this.__defineGetter__('offset', function() { return self._offset; });
-  this.__defineGetter__('remain', function() {
-    return self._size - self._offset;
-  });
-  this.__defineGetter__('buffer', function() {
-    return self._buf.slice(self._offset);
-  });
-}
-
-
-/**
- * Reads a single byte and advances offset; you can pass in `true` to make this
- * a "peek" operation (i.e., get the byte, but don't advance the offset).
- *
- * @param {Boolean} peek true means don't move offset.
- * @return {Number} the next byte, null if not enough data.
- */
-Reader.prototype.readByte = function(peek) {
-  if (this._size - this._offset < 1)
-    return null;
-
-  var b = this._buf[this._offset] & 0xff;
-
-  if (!peek)
-    this._offset += 1;
-
-  return b;
-};
-
-
-Reader.prototype.peek = function() {
-  return this.readByte(true);
-};
-
-
-/**
- * Reads a (potentially) variable length off the BER buffer.  This call is
- * not really meant to be called directly, as callers have to manipulate
- * the internal buffer afterwards.
- *
- * As a result of this call, `Reader.length` is updated, and remains valid
- * until the next call that performs a readLength.
- *
- * @return {Number} the amount of offset to advance the buffer.
- * @throws {InvalidAsn1Error} on bad ASN.1
- */
-Reader.prototype.readLength = function(offset) {
-  if (offset === undefined)
-    offset = this._offset;
-
-  if (offset >= this._size)
-    return null;
-
-  var lenB = this._buf[offset++] & 0xff;
-  if (lenB === null)
-    return null;
-
-  if ((lenB & 0x80) == 0x80) {
-    lenB &= 0x7f;
-
-    if (lenB == 0)
-      throw newInvalidAsn1Error('Indefinite length not supported');
-
-    if (lenB > 4)
-      throw newInvalidAsn1Error('encoding too long');
-
-    if (this._size - offset < lenB)
-      return null;
-
-    this._len = 0;
-    for (var i = 0; i < lenB; i++)
-      this._len = (this._len << 8) + (this._buf[offset++] & 0xff);
-
-  } else {
-    // Wasn't a variable length
-    this._len = lenB;
-  }
-
-  return offset;
-};
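The definite-length rule that `readLength` implements (short form: a single byte below 0x80; long form: `0x80 | n` followed by `n` big-endian length octets) can be sketched as a standalone helper. The name `decodeBerLength` and the returned object shape are illustrative, not part of this module:

```javascript
// Decode a BER definite length starting at `offset` in an array/Buffer.
// Returns the decoded length and the offset just past the length octets.
function decodeBerLength(buf, offset) {
  var lenB = buf[offset++] & 0xff;
  if ((lenB & 0x80) === 0)
    return { length: lenB, offset: offset };   // short form
  var n = lenB & 0x7f;                         // number of length octets
  if (n === 0)
    throw new Error('Indefinite length not supported');
  if (n > 4)
    throw new Error('encoding too long');
  var length = 0;
  for (var i = 0; i < n; i++)
    length = (length << 8) + (buf[offset++] & 0xff);
  return { length: length, offset: offset };
}
```

For example, `[0x82, 0x01, 0x00]` decodes to length 256 with the cursor advanced past three octets.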
-
-
-/**
- * Parses the next sequence in this BER buffer.
- *
- * To get the length of the sequence, call `Reader.length`.
- *
- * @return {Number} the sequence's tag.
- */
-Reader.prototype.readSequence = function(tag) {
-  var seq = this.peek();
-  if (seq === null)
-    return null;
-  if (tag !== undefined && tag !== seq)
-    throw newInvalidAsn1Error('Expected 0x' + tag.toString(16) +
-                              ': got 0x' + seq.toString(16));
-
-  var o = this.readLength(this._offset + 1); // stored in `length`
-  if (o === null)
-    return null;
-
-  this._offset = o;
-  return seq;
-};
-
-
-Reader.prototype.readInt = function() {
-  return this._readTag(ASN1.Integer);
-};
-
-
-Reader.prototype.readBoolean = function() {
-  return (this._readTag(ASN1.Boolean) === 0 ? false : true);
-};
-
-
-Reader.prototype.readEnumeration = function() {
-  return this._readTag(ASN1.Enumeration);
-};
-
-
-Reader.prototype.readString = function(tag, retbuf) {
-  if (!tag)
-    tag = ASN1.OctetString;
-
-  var b = this.peek();
-  if (b === null)
-    return null;
-
-  if (b !== tag)
-    throw newInvalidAsn1Error('Expected 0x' + tag.toString(16) +
-                              ': got 0x' + b.toString(16));
-
-  var o = this.readLength(this._offset + 1); // stored in `length`
-
-  if (o === null)
-    return null;
-
-  if (this.length > this._size - o)
-    return null;
-
-  this._offset = o;
-
-  if (this.length === 0)
-    return '';
-
-  var str = this._buf.slice(this._offset, this._offset + this.length);
-  this._offset += this.length;
-
-  return retbuf ? str : str.toString('utf8');
-};
-
-Reader.prototype.readOID = function(tag) {
-  if (!tag)
-    tag = ASN1.OID;
-
-  var b = this.peek();
-  if (b === null)
-    return null;
-
-  if (b !== tag)
-    throw newInvalidAsn1Error('Expected 0x' + tag.toString(16) +
-                              ': got 0x' + b.toString(16));
-
-  var o = this.readLength(this._offset + 1); // stored in `length`
-  if (o === null)
-    return null;
-
-  if (this.length > this._size - o)
-    return null;
-
-  this._offset = o;
-
-  var values = [];
-  var value = 0;
-
-  for (var i = 0; i < this.length; i++) {
-    var byte = this._buf[this._offset++] & 0xff;
-
-    value <<= 7;
-    value += byte & 0x7f;
-    if ((byte & 0x80) == 0) {
-      values.push(value);
-      value = 0;
-    }
-  }
-
-  value = values.shift();
-  values.unshift(value % 40);
-  values.unshift((value / 40) >> 0);
-
-  return values.join('.');
-};
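The base-128 arc decoding above, including the split of the first subidentifier into the first two arcs, can be sketched independently of the reader state. `decodeOidBody` is an illustrative helper name, not something this module exports:

```javascript
// Decode the content octets of an ASN.1 OID into dotted notation.
// Each arc is base-128 with the high bit set on all but its final
// octet; the first decoded value packs two arcs as (arc1 * 40 + arc2).
function decodeOidBody(bytes) {
  var values = [];
  var value = 0;
  for (var i = 0; i < bytes.length; i++) {
    var b = bytes[i] & 0xff;
    value = (value << 7) + (b & 0x7f);
    if ((b & 0x80) === 0) {
      values.push(value);
      value = 0;
    }
  }
  var first = values.shift();
  values.unshift(first % 40);
  values.unshift((first / 40) >> 0);
  return values.join('.');
}
```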
-
-
-Reader.prototype._readTag = function(tag) {
-  assert.ok(tag !== undefined);
-
-  var b = this.peek();
-
-  if (b === null)
-    return null;
-
-  if (b !== tag)
-    throw newInvalidAsn1Error('Expected 0x' + tag.toString(16) +
-                              ': got 0x' + b.toString(16));
-
-  var o = this.readLength(this._offset + 1); // stored in `length`
-  if (o === null)
-    return null;
-
-  if (this.length > 4)
-    throw newInvalidAsn1Error('Integer too long: ' + this.length);
-
-  if (this.length > this._size - o)
-    return null;
-  this._offset = o;
-
-  var fb = this._buf[this._offset++];
-  var value = 0;
-
-  value = fb & 0x7F;
-  for (var i = 1; i < this.length; i++) {
-    value <<= 8;
-    value |= (this._buf[this._offset++] & 0xff);
-  }
-
-  if ((fb & 0x80) == 0x80)
-    value = -value;
-
-  return value;
-};
-
-
-
-///--- Exported API
-
-module.exports = Reader;
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/types.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,36 +0,0 @@
-// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.
-
-
-module.exports = {
-  EOC: 0,
-  Boolean: 1,
-  Integer: 2,
-  BitString: 3,
-  OctetString: 4,
-  Null: 5,
-  OID: 6,
-  ObjectDescriptor: 7,
-  External: 8,
-  Real: 9, // float
-  Enumeration: 10,
-  PDV: 11,
-  Utf8String: 12,
-  RelativeOID: 13,
-  Sequence: 16,
-  Set: 17,
-  NumericString: 18,
-  PrintableString: 19,
-  T61String: 20,
-  VideotexString: 21,
-  IA5String: 22,
-  UTCTime: 23,
-  GeneralizedTime: 24,
-  GraphicString: 25,
-  VisibleString: 26,
-  GeneralString: 28,
-  UniversalString: 29,
-  CharacterString: 30,
-  BMPString: 31,
-  Constructor: 32,
-  Context: 128
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/writer.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,317 +0,0 @@
-// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.
-
-var assert = require('assert');
-var ASN1 = require('./types');
-var errors = require('./errors');
-
-
-///--- Globals
-
-var newInvalidAsn1Error = errors.newInvalidAsn1Error;
-
-var DEFAULT_OPTS = {
-  size: 1024,
-  growthFactor: 8
-};
-
-
-///--- Helpers
-
-function merge(from, to) {
-  assert.ok(from);
-  assert.equal(typeof(from), 'object');
-  assert.ok(to);
-  assert.equal(typeof(to), 'object');
-
-  var keys = Object.getOwnPropertyNames(from);
-  keys.forEach(function(key) {
-    if (to[key])
-      return;
-
-    var value = Object.getOwnPropertyDescriptor(from, key);
-    Object.defineProperty(to, key, value);
-  });
-
-  return to;
-}
-
-
-
-///--- API
-
-function Writer(options) {
-  options = merge(DEFAULT_OPTS, options || {});
-
-  this._buf = new Buffer(options.size || 1024);
-  this._size = this._buf.length;
-  this._offset = 0;
-  this._options = options;
-
-  // A list of offsets in the buffer where we need to insert
-  // sequence tag/len pairs.
-  this._seq = [];
-
-  var self = this;
-  this.__defineGetter__('buffer', function() {
-    if (self._seq.length)
-      throw newInvalidAsn1Error(self._seq.length + ' unended sequence(s)');
-
-    return self._buf.slice(0, self._offset);
-  });
-}
-
-
-Writer.prototype.writeByte = function(b) {
-  if (typeof(b) !== 'number')
-    throw new TypeError('argument must be a Number');
-
-  this._ensure(1);
-  this._buf[this._offset++] = b;
-};
-
-
-Writer.prototype.writeInt = function(i, tag) {
-  if (typeof(i) !== 'number')
-    throw new TypeError('argument must be a Number');
-  if (typeof(tag) !== 'number')
-    tag = ASN1.Integer;
-
-  var sz = 4;
-
-  // Shrink to the minimal two's-complement size. Note `& 0xff800000`
-  // yields a signed 32-bit value in JS, so compare against the
-  // sign-extended constant.
-  while ((((i & 0xff800000) === 0) ||
-          ((i & 0xff800000) === (0xff800000 | 0))) && (sz > 1)) {
-    sz--;
-    i <<= 8;
-  }
-
-  if (sz > 4)
-    throw newInvalidAsn1Error('BER ints cannot be > 0xffffffff');
-
-  this._ensure(2 + sz);
-  this._buf[this._offset++] = tag;
-  this._buf[this._offset++] = sz;
-
-  while (sz-- > 0) {
-    this._buf[this._offset++] = ((i & 0xff000000) >> 24);
-    i <<= 8;
-  }
-
-};
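The size-shrinking loop in `writeInt` can be isolated into a helper that answers "how many octets does this integer need?". This sketch is illustrative (not part of the module) and uses the `| 0` coercion described above, since `&` produces a signed 32-bit result in JavaScript:

```javascript
// Minimal number of two's-complement octets for a 32-bit BER integer:
// drop leading octets while the top 9 bits are all zeros or all ones.
function berIntSize(i) {
  var sz = 4;
  while ((((i & 0xff800000) === 0) ||
          ((i & 0xff800000) === (0xff800000 | 0))) && (sz > 1)) {
    sz--;
    i <<= 8;
  }
  return sz;
}
```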
-
-
-Writer.prototype.writeNull = function() {
-  this.writeByte(ASN1.Null);
-  this.writeByte(0x00);
-};
-
-
-Writer.prototype.writeEnumeration = function(i, tag) {
-  if (typeof(i) !== 'number')
-    throw new TypeError('argument must be a Number');
-  if (typeof(tag) !== 'number')
-    tag = ASN1.Enumeration;
-
-  return this.writeInt(i, tag);
-};
-
-
-Writer.prototype.writeBoolean = function(b, tag) {
-  if (typeof(b) !== 'boolean')
-    throw new TypeError('argument must be a Boolean');
-  if (typeof(tag) !== 'number')
-    tag = ASN1.Boolean;
-
-  this._ensure(3);
-  this._buf[this._offset++] = tag;
-  this._buf[this._offset++] = 0x01;
-  this._buf[this._offset++] = b ? 0xff : 0x00;
-};
-
-
-Writer.prototype.writeString = function(s, tag) {
-  if (typeof(s) !== 'string')
-    throw new TypeError('argument must be a string (was: ' + typeof(s) + ')');
-  if (typeof(tag) !== 'number')
-    tag = ASN1.OctetString;
-
-  var len = Buffer.byteLength(s);
-  this.writeByte(tag);
-  this.writeLength(len);
-  if (len) {
-    this._ensure(len);
-    this._buf.write(s, this._offset);
-    this._offset += len;
-  }
-};
-
-
-Writer.prototype.writeBuffer = function(buf, tag) {
-  if (typeof(tag) !== 'number')
-    throw new TypeError('tag must be a number');
-  if (!Buffer.isBuffer(buf))
-    throw new TypeError('argument must be a buffer');
-
-  this.writeByte(tag);
-  this.writeLength(buf.length);
-  this._ensure(buf.length);
-  buf.copy(this._buf, this._offset, 0, buf.length);
-  this._offset += buf.length;
-};
-
-
-Writer.prototype.writeStringArray = function(strings) {
-  if (!(strings instanceof Array))
-    throw new TypeError('argument must be an Array[String]');
-
-  var self = this;
-  strings.forEach(function(s) {
-    self.writeString(s);
-  });
-};
-
-// This is really to solve DER cases, but whatever for now
-Writer.prototype.writeOID = function(s, tag) {
-  if (typeof(s) !== 'string')
-    throw new TypeError('argument must be a string');
-  if (typeof(tag) !== 'number')
-    tag = ASN1.OID;
-
-  if (!/^([0-9]+\.){3,}[0-9]+$/.test(s))
-    throw new Error('argument is not a valid OID string');
-
-  function encodeOctet(bytes, octet) {
-    if (octet < 128) {
-        bytes.push(octet);
-    } else if (octet < 16384) {
-        bytes.push((octet >>> 7) | 0x80);
-        bytes.push(octet & 0x7F);
-    } else if (octet < 2097152) {
-      bytes.push((octet >>> 14) | 0x80);
-      bytes.push(((octet >>> 7) | 0x80) & 0xFF);
-      bytes.push(octet & 0x7F);
-    } else if (octet < 268435456) {
-      bytes.push((octet >>> 21) | 0x80);
-      bytes.push(((octet >>> 14) | 0x80) & 0xFF);
-      bytes.push(((octet >>> 7) | 0x80) & 0xFF);
-      bytes.push(octet & 0x7F);
-    } else {
-      bytes.push(((octet >>> 28) | 0x80) & 0xFF);
-      bytes.push(((octet >>> 21) | 0x80) & 0xFF);
-      bytes.push(((octet >>> 14) | 0x80) & 0xFF);
-      bytes.push(((octet >>> 7) | 0x80) & 0xFF);
-      bytes.push(octet & 0x7F);
-    }
-  }
-
-  var tmp = s.split('.');
-  var bytes = [];
-  bytes.push(parseInt(tmp[0], 10) * 40 + parseInt(tmp[1], 10));
-  tmp.slice(2).forEach(function(b) {
-    encodeOctet(bytes, parseInt(b, 10));
-  });
-
-  var self = this;
-  this._ensure(2 + bytes.length);
-  this.writeByte(tag);
-  this.writeLength(bytes.length);
-  bytes.forEach(function(b) {
-    self.writeByte(b);
-  });
-};
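For comparison, the content octets that `writeOID` produces (tag and length aside) can be computed by a small standalone encoder, the inverse of the decoding walk in the reader. `encodeOidBody` is an illustrative name, not part of this module:

```javascript
// Encode a dotted OID string into its ASN.1 content octets: the first
// two arcs pack into one value; remaining arcs are base-128 with the
// continuation bit set on every octet except the last of each arc.
function encodeOidBody(oid) {
  var parts = oid.split('.').map(Number);
  var bytes = [parts[0] * 40 + parts[1]];
  parts.slice(2).forEach(function (v) {
    var group = [v & 0x7f];            // last octet: no continuation bit
    v = Math.floor(v / 128);
    while (v > 0) {
      group.push((v & 0x7f) | 0x80);   // higher octets: continuation bit
      v = Math.floor(v / 128);
    }
    while (group.length)
      bytes.push(group.pop());         // emit most-significant first
  });
  return bytes;
}
```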
-
-
-Writer.prototype.writeLength = function(len) {
-  if (typeof(len) !== 'number')
-    throw new TypeError('argument must be a Number');
-
-  this._ensure(4);
-
-  if (len <= 0x7f) {
-    this._buf[this._offset++] = len;
-  } else if (len <= 0xff) {
-    this._buf[this._offset++] = 0x81;
-    this._buf[this._offset++] = len;
-  } else if (len <= 0xffff) {
-    this._buf[this._offset++] = 0x82;
-    this._buf[this._offset++] = len >> 8;
-    this._buf[this._offset++] = len;
-  } else if (len <= 0xffffff) {
-    this._buf[this._offset++] = 0x83;
-    this._buf[this._offset++] = len >> 16;
-    this._buf[this._offset++] = len >> 8;
-    this._buf[this._offset++] = len;
-  } else {
-    throw newInvalidAsn1Error('Length too long (> 4 bytes)');
-  }
-};
-
-Writer.prototype.startSequence = function(tag) {
-  if (typeof(tag) !== 'number')
-    tag = ASN1.Sequence | ASN1.Constructor;
-
-  this.writeByte(tag);
-  this._seq.push(this._offset);
-  this._ensure(3);
-  this._offset += 3;
-};
-
-
-Writer.prototype.endSequence = function() {
-  var seq = this._seq.pop();
-  var start = seq + 3;
-  var len = this._offset - start;
-
-  if (len <= 0x7f) {
-    this._shift(start, len, -2);
-    this._buf[seq] = len;
-  } else if (len <= 0xff) {
-    this._shift(start, len, -1);
-    this._buf[seq] = 0x81;
-    this._buf[seq + 1] = len;
-  } else if (len <= 0xffff) {
-    this._buf[seq] = 0x82;
-    this._buf[seq + 1] = len >> 8;
-    this._buf[seq + 2] = len;
-  } else if (len <= 0xffffff) {
-    this._shift(start, len, 1);
-    this._buf[seq] = 0x83;
-    this._buf[seq + 1] = len >> 16;
-    this._buf[seq + 2] = len >> 8;
-    this._buf[seq + 3] = len;
-  } else {
-    throw newInvalidAsn1Error('Sequence too long');
-  }
-};
-
-
-Writer.prototype._shift = function(start, len, shift) {
-  assert.ok(start !== undefined);
-  assert.ok(len !== undefined);
-  assert.ok(shift);
-
-  this._buf.copy(this._buf, start + shift, start, start + len);
-  this._offset += shift;
-};
-
-Writer.prototype._ensure = function(len) {
-  assert.ok(len);
-
-  if (this._size - this._offset < len) {
-    var sz = this._size * this._options.growthFactor;
-    if (sz - this._offset < len)
-      sz += len;
-
-    var buf = new Buffer(sz);
-
-    this._buf.copy(buf, 0, 0, this._offset);
-    this._buf = buf;
-    this._size = sz;
-  }
-};
-
-
-
-///--- Exported API
-
-module.exports = Writer;
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,20 +0,0 @@
-// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.
-
-// If you have no idea what ASN.1 or BER is, see this:
-// ftp://ftp.rsa.com/pub/pkcs/ascii/layman.asc
-
-var Ber = require('./ber/index');
-
-
-
-///--- Exported API
-
-module.exports = {
-
-  Ber: Ber,
-
-  BerReader: Ber.Reader,
-
-  BerWriter: Ber.Writer
-
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,43 +0,0 @@
-{
-  "author": {
-    "name": "Mark Cavage",
-    "email": "mcavage@gmail.com"
-  },
-  "contributors": [
-    {
-      "name": "David Gwynne",
-      "email": "loki@animata.net"
-    },
-    {
-      "name": "Yunong Xiao",
-      "email": "yunong@joyent.com"
-    }
-  ],
-  "name": "asn1",
-  "description": "Contains parsers and serializers for ASN.1 (currently BER only)",
-  "version": "0.1.11",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/mcavage/node-asn1.git"
-  },
-  "main": "lib/index.js",
-  "engines": {
-    "node": ">=0.4.9"
-  },
-  "dependencies": {},
-  "devDependencies": {
-    "tap": "0.1.4"
-  },
-  "scripts": {
-    "pretest": "which gjslint; if [[ \"$?\" = 0 ]] ; then  gjslint --nojsdoc -r lib -r tst; else echo \"Missing gjslint. Skipping lint\"; fi",
-    "test": "./node_modules/.bin/tap ./tst"
-  },
-  "readme": "node-asn1 is a library for encoding and decoding ASN.1 datatypes in pure JS.\nCurrently BER encoding is supported; at some point I'll likely have to do DER.\n\n## Usage\n\nMostly, if you're *actually* needing to read and write ASN.1, you probably don't\nneed this readme to explain what and why.  If you have no idea what ASN.1 is,\nsee this: ftp://ftp.rsa.com/pub/pkcs/ascii/layman.asc\n\nThe source is pretty much self-explanatory, and has read/write methods for the\ncommon types out there.\n\n### Decoding\n\nThe following reads an ASN.1 sequence with a boolean.\n\n    var Ber = require('asn1').Ber;\n\n    var reader = new Ber.Reader(new Buffer([0x30, 0x03, 0x01, 0x01, 0xff]));\n\n    reader.readSequence();\n    console.log('Sequence len: ' + reader.length);\n    if (reader.peek() === Ber.Boolean)\n      console.log(reader.readBoolean());\n\n### Encoding\n\nThe following generates the same payload as above.\n\n    var Ber = require('asn1').Ber;\n\n    var writer = new Ber.Writer();\n\n    writer.startSequence();\n    writer.writeBoolean(true);\n    writer.endSequence();\n\n    console.log(writer.buffer);\n\n## Installation\n\n    npm install asn1\n\n## License\n\nMIT.\n\n## Bugs\n\nSee <https://github.com/mcavage/node-asn1/issues>.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/mcavage/node-asn1/issues"
-  },
-  "homepage": "https://github.com/mcavage/node-asn1",
-  "_id": "asn1@0.1.11",
-  "_from": "asn1@0.1.11"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/tst/ber/reader.test.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,172 +0,0 @@
-// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.
-
-var test = require('tap').test;
-
-
-
-///--- Globals
-
-var BerReader;
-
-
-
-///--- Tests
-
-test('load library', function(t) {
-  BerReader = require('../../lib/index').BerReader;
-  t.ok(BerReader);
-  try {
-    new BerReader();
-    t.fail('Should have thrown');
-  } catch (e) {
-    t.ok(e instanceof TypeError, 'Should have been a type error');
-  }
-  t.end();
-});
-
-
-test('read byte', function(t) {
-  var reader = new BerReader(new Buffer([0xde]));
-  t.ok(reader);
-  t.equal(reader.readByte(), 0xde, 'wrong value');
-  t.end();
-});
-
-
-test('read 1 byte int', function(t) {
-  var reader = new BerReader(new Buffer([0x02, 0x01, 0x03]));
-  t.ok(reader);
-  t.equal(reader.readInt(), 0x03, 'wrong value');
-  t.equal(reader.length, 0x01, 'wrong length');
-  t.end();
-});
-
-
-test('read 2 byte int', function(t) {
-  var reader = new BerReader(new Buffer([0x02, 0x02, 0x7e, 0xde]));
-  t.ok(reader);
-  t.equal(reader.readInt(), 0x7ede, 'wrong value');
-  t.equal(reader.length, 0x02, 'wrong length');
-  t.end();
-});
-
-
-test('read 3 byte int', function(t) {
-  var reader = new BerReader(new Buffer([0x02, 0x03, 0x7e, 0xde, 0x03]));
-  t.ok(reader);
-  t.equal(reader.readInt(), 0x7ede03, 'wrong value');
-  t.equal(reader.length, 0x03, 'wrong length');
-  t.end();
-});
-
-
-test('read 4 byte int', function(t) {
-  var reader = new BerReader(new Buffer([0x02, 0x04, 0x7e, 0xde, 0x03, 0x01]));
-  t.ok(reader);
-  t.equal(reader.readInt(), 0x7ede0301, 'wrong value');
-  t.equal(reader.length, 0x04, 'wrong length');
-  t.end();
-});
-
-
-test('read boolean true', function(t) {
-  var reader = new BerReader(new Buffer([0x01, 0x01, 0xff]));
-  t.ok(reader);
-  t.equal(reader.readBoolean(), true, 'wrong value');
-  t.equal(reader.length, 0x01, 'wrong length');
-  t.end();
-});
-
-
-test('read boolean false', function(t) {
-  var reader = new BerReader(new Buffer([0x01, 0x01, 0x00]));
-  t.ok(reader);
-  t.equal(reader.readBoolean(), false, 'wrong value');
-  t.equal(reader.length, 0x01, 'wrong length');
-  t.end();
-});
-
-
-test('read enumeration', function(t) {
-  var reader = new BerReader(new Buffer([0x0a, 0x01, 0x20]));
-  t.ok(reader);
-  t.equal(reader.readEnumeration(), 0x20, 'wrong value');
-  t.equal(reader.length, 0x01, 'wrong length');
-  t.end();
-});
-
-
-test('read string', function(t) {
-  var dn = 'cn=foo,ou=unit,o=test';
-  var buf = new Buffer(dn.length + 2);
-  buf[0] = 0x04;
-  buf[1] = Buffer.byteLength(dn);
-  buf.write(dn, 2);
-  var reader = new BerReader(buf);
-  t.ok(reader);
-  t.equal(reader.readString(), dn, 'wrong value');
-  t.equal(reader.length, dn.length, 'wrong length');
-  t.end();
-});
-
-
-test('read sequence', function(t) {
-  var reader = new BerReader(new Buffer([0x30, 0x03, 0x01, 0x01, 0xff]));
-  t.ok(reader);
-  t.equal(reader.readSequence(), 0x30, 'wrong value');
-  t.equal(reader.length, 0x03, 'wrong length');
-  t.equal(reader.readBoolean(), true, 'wrong value');
-  t.equal(reader.length, 0x01, 'wrong length');
-  t.end();
-});
-
-
-test('anonymous LDAPv3 bind', function(t) {
-  var BIND = new Buffer(14);
-  BIND[0] = 0x30;  // Sequence
-  BIND[1] = 12;    // len
-  BIND[2] = 0x02;  // ASN.1 Integer
-  BIND[3] = 1;     // len
-  BIND[4] = 0x04;  // msgid (make up 4)
-  BIND[5] = 0x60;  // Bind Request
-  BIND[6] = 7;     // len
-  BIND[7] = 0x02;  // ASN.1 Integer
-  BIND[8] = 1;     // len
-  BIND[9] = 0x03;  // v3
-  BIND[10] = 0x04; // String (bind dn)
-  BIND[11] = 0;    // len
-  BIND[12] = 0x80; // ContextSpecific (choice)
-  BIND[13] = 0;    // simple bind
-
-  // Start testing ^^
-  var ber = new BerReader(BIND);
-  t.equal(ber.readSequence(), 48, 'Not an ASN.1 Sequence');
-  t.equal(ber.length, 12, 'Message length should be 12');
-  t.equal(ber.readInt(), 4, 'Message id should have been 4');
-  t.equal(ber.readSequence(), 96, 'Bind Request should have been 96');
-  t.equal(ber.length, 7, 'Bind length should have been 7');
-  t.equal(ber.readInt(), 3, 'LDAP version should have been 3');
-  t.equal(ber.readString(), '', 'Bind DN should have been empty');
-  t.equal(ber.length, 0, 'string length should have been 0');
-  t.equal(ber.readByte(), 0x80, 'Should have been ContextSpecific (choice)');
-  t.equal(ber.readByte(), 0, 'Should have been simple bind');
-  t.equal(null, ber.readByte(), 'Should be out of data');
-  t.end();
-});
-
-
-test('long string', function(t) {
-  var buf = new Buffer(256);
-  var o;
-  var s =
-    '2;649;CN=Red Hat CS 71GA Demo,O=Red Hat CS 71GA Demo,C=US;' +
-    'CN=RHCS Agent - admin01,UID=admin01,O=redhat,C=US [1] This is ' +
-    'Teena Vradmin\'s description.';
-  buf[0] = 0x04;
-  buf[1] = 0x81;
-  buf[2] = 0x94;
-  buf.write(s, 3);
-  var ber = new BerReader(buf.slice(0, 3 + s.length));
-  t.equal(ber.readString(), s);
-  t.end();
-});
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/tst/ber/writer.test.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,296 +0,0 @@
-// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.
-
-var test = require('tap').test;
-
-///--- Globals
-
-var BerWriter;
-
-var BerReader;
-
-
-///--- Tests
-
-test('load library', function(t) {
-  BerWriter = require('../../lib/index').BerWriter;
-  t.ok(BerWriter);
-  t.ok(new BerWriter());
-  t.end();
-});
-
-
-test('write byte', function(t) {
-  var writer = new BerWriter();
-
-  writer.writeByte(0xC2);
-  var ber = writer.buffer;
-
-  t.ok(ber);
-  t.equal(ber.length, 1, 'Wrong length');
-  t.equal(ber[0], 0xC2, 'value wrong');
-
-  t.end();
-});
-
-
-test('write 1 byte int', function(t) {
-  var writer = new BerWriter();
-
-  writer.writeInt(0x7f);
-  var ber = writer.buffer;
-
-  t.ok(ber);
-  t.equal(ber.length, 3, 'Wrong length for an int: ' + ber.length);
-  t.equal(ber[0], 0x02, 'ASN.1 tag wrong (2) -> ' + ber[0]);
-  t.equal(ber[1], 0x01, 'length wrong(1) -> ' + ber[1]);
-  t.equal(ber[2], 0x7f, 'value wrong(3) -> ' + ber[2]);
-
-  t.end();
-});
-
-
-test('write 2 byte int', function(t) {
-  var writer = new BerWriter();
-
-  writer.writeInt(0x7ffe);
-  var ber = writer.buffer;
-
-  t.ok(ber);
-  t.equal(ber.length, 4, 'Wrong length for an int');
-  t.equal(ber[0], 0x02, 'ASN.1 tag wrong');
-  t.equal(ber[1], 0x02, 'length wrong');
-  t.equal(ber[2], 0x7f, 'value wrong (byte 1)');
-  t.equal(ber[3], 0xfe, 'value wrong (byte 2)');
-
-  t.end();
-});
-
-
-test('write 3 byte int', function(t) {
-  var writer = new BerWriter();
-
-  writer.writeInt(0x7ffffe);
-  var ber = writer.buffer;
-
-  t.ok(ber);
-  t.equal(ber.length, 5, 'Wrong length for an int');
-  t.equal(ber[0], 0x02, 'ASN.1 tag wrong');
-  t.equal(ber[1], 0x03, 'length wrong');
-  t.equal(ber[2], 0x7f, 'value wrong (byte 1)');
-  t.equal(ber[3], 0xff, 'value wrong (byte 2)');
-  t.equal(ber[4], 0xfe, 'value wrong (byte 3)');
-
-  t.end();
-});
-
-
-test('write 4 byte int', function(t) {
-  var writer = new BerWriter();
-
-  writer.writeInt(0x7ffffffe);
-  var ber = writer.buffer;
-
-  t.ok(ber);
-
-  t.equal(ber.length, 6, 'Wrong length for an int');
-  t.equal(ber[0], 0x02, 'ASN.1 tag wrong');
-  t.equal(ber[1], 0x04, 'length wrong');
-  t.equal(ber[2], 0x7f, 'value wrong (byte 1)');
-  t.equal(ber[3], 0xff, 'value wrong (byte 2)');
-  t.equal(ber[4], 0xff, 'value wrong (byte 3)');
-  t.equal(ber[5], 0xfe, 'value wrong (byte 4)');
-
-  t.end();
-});
-
-
-test('write boolean', function(t) {
-  var writer = new BerWriter();
-
-  writer.writeBoolean(true);
-  writer.writeBoolean(false);
-  var ber = writer.buffer;
-
-  t.ok(ber);
-  t.equal(ber.length, 6, 'Wrong length');
-  t.equal(ber[0], 0x01, 'tag wrong');
-  t.equal(ber[1], 0x01, 'length wrong');
-  t.equal(ber[2], 0xff, 'value wrong');
-  t.equal(ber[3], 0x01, 'tag wrong');
-  t.equal(ber[4], 0x01, 'length wrong');
-  t.equal(ber[5], 0x00, 'value wrong');
-
-  t.end();
-});
-
-
-test('write string', function(t) {
-  var writer = new BerWriter();
-  writer.writeString('hello world');
-  var ber = writer.buffer;
-
-  t.ok(ber);
-  t.equal(ber.length, 13, 'wrong length');
-  t.equal(ber[0], 0x04, 'wrong tag');
-  t.equal(ber[1], 11, 'wrong length');
-  t.equal(ber.slice(2).toString('utf8'), 'hello world', 'wrong value');
-
-  t.end();
-});
-
-test('write buffer', function(t) {
-  var writer = new BerWriter();
-  // write some stuff to start with
-  writer.writeString('hello world');
-  var ber = writer.buffer;
-  var buf = new Buffer([0x04, 0x0b, 0x30, 0x09, 0x02, 0x01, 0x0f, 0x01, 0x01,
-     0xff, 0x01, 0x01, 0xff]);
-  writer.writeBuffer(buf.slice(2, buf.length), 0x04);
-  ber = writer.buffer;
-
-  t.ok(ber);
-  t.equal(ber.length, 26, 'wrong length');
-  t.equal(ber[0], 0x04, 'wrong tag');
-  t.equal(ber[1], 11, 'wrong length');
-  t.equal(ber.slice(2, 13).toString('utf8'), 'hello world', 'wrong value');
-  t.equal(ber[13], buf[0], 'wrong tag');
-  t.equal(ber[14], buf[1], 'wrong length');
-  for (var i = 13, j = 0; i < ber.length && j < buf.length; i++, j++) {
-    t.equal(ber[i], buf[j], 'buffer contents not identical');
-  }
-  t.end();
-});
-
-test('write string array', function(t) {
-  var writer = new BerWriter();
-  writer.writeStringArray(['hello world', 'fubar!']);
-  var ber = writer.buffer;
-
-  t.ok(ber);
-
-  t.equal(ber.length, 21, 'wrong length');
-  t.equal(ber[0], 0x04, 'wrong tag');
-  t.equal(ber[1], 11, 'wrong length');
-  t.equal(ber.slice(2, 13).toString('utf8'), 'hello world', 'wrong value');
-
-  t.equal(ber[13], 0x04, 'wrong tag');
-  t.equal(ber[14], 6, 'wrong length');
-  t.equal(ber.slice(15).toString('utf8'), 'fubar!', 'wrong value');
-
-  t.end();
-});
-
-
-test('resize internal buffer', function(t) {
-  var writer = new BerWriter({size: 2});
-  writer.writeString('hello world');
-  var ber = writer.buffer;
-
-  t.ok(ber);
-  t.equal(ber.length, 13, 'wrong length');
-  t.equal(ber[0], 0x04, 'wrong tag');
-  t.equal(ber[1], 11, 'wrong length');
-  t.equal(ber.slice(2).toString('utf8'), 'hello world', 'wrong value');
-
-  t.end();
-});
-
-
-test('sequence', function(t) {
-  var writer = new BerWriter({size: 25});
-  writer.startSequence();
-  writer.writeString('hello world');
-  writer.endSequence();
-  var ber = writer.buffer;
-
-  t.ok(ber);
-  t.equal(ber.length, 15, 'wrong length');
-  t.equal(ber[0], 0x30, 'wrong tag');
-  t.equal(ber[1], 13, 'wrong length');
-  t.equal(ber[2], 0x04, 'wrong tag');
-  t.equal(ber[3], 11, 'wrong length');
-  t.equal(ber.slice(4).toString('utf8'), 'hello world', 'wrong value');
-
-  t.end();
-});
-
-
-test('nested sequence', function(t) {
-  var writer = new BerWriter({size: 25});
-  writer.startSequence();
-  writer.writeString('hello world');
-  writer.startSequence();
-  writer.writeString('hello world');
-  writer.endSequence();
-  writer.endSequence();
-  var ber = writer.buffer;
-
-  t.ok(ber);
-  t.equal(ber.length, 30, 'wrong length');
-  t.equal(ber[0], 0x30, 'wrong tag');
-  t.equal(ber[1], 28, 'wrong length');
-  t.equal(ber[2], 0x04, 'wrong tag');
-  t.equal(ber[3], 11, 'wrong length');
-  t.equal(ber.slice(4, 15).toString('utf8'), 'hello world', 'wrong value');
-  t.equal(ber[15], 0x30, 'wrong tag');
-  t.equal(ber[16], 13, 'wrong length');
-  t.equal(ber[17], 0x04, 'wrong tag');
-  t.equal(ber[18], 11, 'wrong length');
-  t.equal(ber.slice(19, 30).toString('utf8'), 'hello world', 'wrong value');
-
-  t.end();
-});
-
-
-test('LDAP bind message', function(t) {
-  var dn = 'cn=foo,ou=unit,o=test';
-  var writer = new BerWriter();
-  writer.startSequence();
-  writer.writeInt(3);             // msgid = 3
-  writer.startSequence(0x60);     // ldap bind
-  writer.writeInt(3);             // ldap v3
-  writer.writeString(dn);
-  writer.writeByte(0x80);
-  writer.writeByte(0x00);
-  writer.endSequence();
-  writer.endSequence();
-  var ber = writer.buffer;
-
-  t.ok(ber);
-  t.equal(ber.length, 35, 'wrong length (buffer)');
-  t.equal(ber[0], 0x30, 'wrong tag');
-  t.equal(ber[1], 33, 'wrong length');
-  t.equal(ber[2], 0x02, 'wrong tag');
-  t.equal(ber[3], 1, 'wrong length');
-  t.equal(ber[4], 0x03, 'wrong value');
-  t.equal(ber[5], 0x60, 'wrong tag');
-  t.equal(ber[6], 28, 'wrong length');
-  t.equal(ber[7], 0x02, 'wrong tag');
-  t.equal(ber[8], 1, 'wrong length');
-  t.equal(ber[9], 0x03, 'wrong value');
-  t.equal(ber[10], 0x04, 'wrong tag');
-  t.equal(ber[11], dn.length, 'wrong length');
-  t.equal(ber.slice(12, 33).toString('utf8'), dn, 'wrong value');
-  t.equal(ber[33], 0x80, 'wrong tag');
-  t.equal(ber[34], 0x00, 'wrong len');
-
-  t.end();
-});
-
-
-test('Write OID', function(t) {
-  var oid = '1.2.840.113549.1.1.1';
-  var writer = new BerWriter();
-  writer.writeOID(oid);
-
-  var ber = writer.buffer;
-  t.ok(ber);
-  var expected = new Buffer([0x06, 0x09, 0x2a, 0x86, 0x48, 0x86, 0xf7, 0x0d,
-                             0x01, 0x01, 0x01]);
-  t.equal(ber.length, expected.length, 'wrong length');
-  for (var i = 0; i < ber.length; i++)
-    t.equal(ber[i], expected[i], 'wrong value (byte ' + i + ')');
-
-  t.end();
-});
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/assert-plus/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,126 +0,0 @@
-# node-assert-plus
-
-This library is a super small wrapper over node's assert module that has two
-things: (1) the ability to disable assertions with the environment variable
-NODE_NDEBUG, and (2) some API wrappers for argument testing.  Like
-`assert.string(myArg, 'myArg')`.  As a simple example, most of my code looks
-like this:
-
-    var assert = require('assert-plus');
-
-    function fooAccount(options, callback) {
-        assert.object(options, 'options');
-        assert.number(options.id, 'options.id');
-        assert.bool(options.isManager, 'options.isManager');
-        assert.string(options.name, 'options.name');
-        assert.arrayOfString(options.email, 'options.email');
-        assert.func(callback, 'callback');
-
-        // Do stuff
-        callback(null, {});
-    }
-
-# API
-
-All methods that *aren't* part of node's core assert API take an argument,
-and then a string 'name' (a parameter name, not a message); an
-`AssertionError` will be thrown if the assertion fails, with a message like:
-
-    AssertionError: foo (string) is required
-        at test (/home/mark/work/foo/foo.js:3:9)
-        at Object.<anonymous> (/home/mark/work/foo/foo.js:15:1)
-        at Module._compile (module.js:446:26)
-        at Object..js (module.js:464:10)
-        at Module.load (module.js:353:31)
-        at Function._load (module.js:311:12)
-        at Array.0 (module.js:484:10)
-        at EventEmitter._tickCallback (node.js:190:38)
-
-from:
-
-    function test(foo) {
-        assert.string(foo, 'foo');
-    }
-
-There you go.  You can check that arrays are of a homogeneous type with `arrayOf$Type`:
-
-    function test(foo) {
-        assert.arrayOfString(foo, 'foo');
-    }
-
-You can assert IFF an argument is not `undefined` (i.e., an optional arg):
-
-    assert.optionalString(foo, 'foo');
-
-Lastly, you can opt-out of assertion checking altogether by setting the
-environment variable `NODE_NDEBUG=1`.  This is pseudo-useful if you have
-lots of assertions, and don't want to pay `typeof ()` taxes to v8 in
-production.
-
-The complete list of APIs is:
-
-* assert.bool
-* assert.buffer
-* assert.func
-* assert.number
-* assert.object
-* assert.string
-* assert.arrayOfBool
-* assert.arrayOfFunc
-* assert.arrayOfNumber
-* assert.arrayOfObject
-* assert.arrayOfString
-* assert.optionalBool
-* assert.optionalBuffer
-* assert.optionalFunc
-* assert.optionalNumber
-* assert.optionalObject
-* assert.optionalString
-* assert.optionalArrayOfBool
-* assert.optionalArrayOfFunc
-* assert.optionalArrayOfNumber
-* assert.optionalArrayOfObject
-* assert.optionalArrayOfString
-* assert.AssertionError
-* assert.fail
-* assert.ok
-* assert.equal
-* assert.notEqual
-* assert.deepEqual
-* assert.notDeepEqual
-* assert.strictEqual
-* assert.notStrictEqual
-* assert.throws
-* assert.doesNotThrow
-* assert.ifError
-
-# Installation
-
-    npm install assert-plus
-
-## License
-
-The MIT License (MIT)
-Copyright (c) 2012 Mark Cavage
-
-Permission is hereby granted, free of charge, to any person obtaining a copy of
-this software and associated documentation files (the "Software"), to deal in
-the Software without restriction, including without limitation the rights to
-use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
-the Software, and to permit persons to whom the Software is furnished to do so,
-subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-SOFTWARE.
-
-## Bugs
-
-See <https://github.com/mcavage/node-assert-plus/issues>.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/assert-plus/assert.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,196 +0,0 @@
-// Copyright (c) 2012, Mark Cavage. All rights reserved.
-
-var assert = require('assert');
-var Stream = require('stream').Stream;
-var util = require('util');
-
-
-
-///--- Globals
-
-var NDEBUG = process.env.NODE_NDEBUG || false;
-
-
-
-///--- Messages
-
-var ARRAY_TYPE_REQUIRED = '%s ([%s]) required';
-var TYPE_REQUIRED = '%s (%s) is required';
-
-
-
-///--- Internal
-
-function capitalize(str) {
-        return (str.charAt(0).toUpperCase() + str.slice(1));
-}
-
-function uncapitalize(str) {
-        return (str.charAt(0).toLowerCase() + str.slice(1));
-}
-
-function _() {
-        return (util.format.apply(util, arguments));
-}
-
-
-function _assert(arg, type, name, stackFunc) {
-        if (!NDEBUG) {
-                name = name || type;
-                stackFunc = stackFunc || _assert.caller;
-                var t = typeof (arg);
-
-                if (t !== type) {
-                        throw new assert.AssertionError({
-                                message: _(TYPE_REQUIRED, name, type),
-                                actual: t,
-                                expected: type,
-                                operator: '===',
-                                stackStartFunction: stackFunc
-                        });
-                }
-        }
-}
-
-
-
-///--- API
-
-function array(arr, type, name) {
-        if (!NDEBUG) {
-                name = name || type;
-
-                if (!Array.isArray(arr)) {
-                        throw new assert.AssertionError({
-                                message: _(ARRAY_TYPE_REQUIRED, name, type),
-                                actual: typeof (arr),
-                                expected: 'array',
-                                operator: 'Array.isArray',
-                                stackStartFunction: array.caller
-                        });
-                }
-
-                for (var i = 0; i < arr.length; i++) {
-                        _assert(arr[i], type, name, array);
-                }
-        }
-}
-
-
-function bool(arg, name) {
-        _assert(arg, 'boolean', name, bool);
-}
-
-
-function buffer(arg, name) {
-        if (!Buffer.isBuffer(arg)) {
-                throw new assert.AssertionError({
-                        message: _(TYPE_REQUIRED, name, 'buffer'),
-                        actual: typeof (arg),
-                        expected: 'buffer',
-                        operator: 'Buffer.isBuffer',
-                        stackStartFunction: buffer
-                });
-        }
-}
-
-
-function func(arg, name) {
-        _assert(arg, 'function', name);
-}
-
-
-function number(arg, name) {
-        _assert(arg, 'number', name);
-}
-
-
-function object(arg, name) {
-        _assert(arg, 'object', name);
-}
-
-
-function stream(arg, name) {
-        if (!(arg instanceof Stream)) {
-                throw new assert.AssertionError({
-                        message: _(TYPE_REQUIRED, name, 'Stream'),
-                        actual: typeof (arg),
-                        expected: 'Stream',
-                        operator: 'instanceof',
-                        stackStartFunction: stream
-                });
-        }
-}
-
-
-function string(arg, name) {
-        _assert(arg, 'string', name);
-}
-
-
-
-///--- Exports
-
-module.exports = {
-        bool: bool,
-        buffer: buffer,
-        func: func,
-        number: number,
-        object: object,
-        stream: stream,
-        string: string
-};
-
-
-Object.keys(module.exports).forEach(function (k) {
-        if (k === 'buffer')
-                return;
-
-        var name = 'arrayOf' + capitalize(k);
-
-        if (k === 'bool')
-                k = 'boolean';
-        if (k === 'func')
-                k = 'function';
-        module.exports[name] = function (arg, name) {
-                array(arg, k, name);
-        };
-});
-
-Object.keys(module.exports).forEach(function (k) {
-        var _name = 'optional' + capitalize(k);
-        var s = uncapitalize(k.replace('arrayOf', ''));
-        if (s === 'bool')
-                s = 'boolean';
-        if (s === 'func')
-                s = 'function';
-
-        if (k.indexOf('arrayOf') !== -1) {
-          module.exports[_name] = function (arg, name) {
-                  if (!NDEBUG && arg !== undefined) {
-                          array(arg, s, name);
-                  }
-          };
-        } else {
-          module.exports[_name] = function (arg, name) {
-                  if (!NDEBUG && arg !== undefined) {
-                          _assert(arg, s, name);
-                  }
-          };
-        }
-});
-
-
-// Reexport built-in assertions
-Object.keys(assert).forEach(function (k) {
-        if (k === 'AssertionError') {
-                module.exports[k] = assert[k];
-                return;
-        }
-
-        module.exports[k] = function () {
-                if (!NDEBUG) {
-                        assert[k].apply(assert[k], arguments);
-                }
-        };
-});
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/assert-plus/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,20 +0,0 @@
-{
-  "author": {
-    "name": "Mark Cavage",
-    "email": "mcavage@gmail.com"
-  },
-  "name": "assert-plus",
-  "description": "Extra assertions on top of node's assert module",
-  "version": "0.1.2",
-  "main": "./assert.js",
-  "dependencies": {},
-  "devDependencies": {},
-  "optionalDependencies": {},
-  "engines": {
-    "node": ">=0.6"
-  },
-  "readme": "# node-assert-plus\n\nThis library is a super small wrapper over node's assert module that has two\nthings: (1) the ability to disable assertions with the environment variable\nNODE_NDEBUG, and (2) some API wrappers for argument testing.  Like\n`assert.string(myArg, 'myArg')`.  As a simple example, most of my code looks\nlike this:\n\n    var assert = require('assert-plus');\n\n    function fooAccount(options, callback) {\n\t    assert.object(options, 'options');\n\t\tassert.number(options.id, 'options.id);\n\t\tassert.bool(options.isManager, 'options.isManager');\n\t\tassert.string(options.name, 'options.name');\n\t\tassert.arrayOfString(options.email, 'options.email');\n\t\tassert.func(callback, 'callback');\n\n        // Do stuff\n\t\tcallback(null, {});\n    }\n\n# API\n\nAll methods that *aren't* part of node's core assert API are simply assumed to\ntake an argument, and then a string 'name' that's not a message; `AssertionError`\nwill be thrown if the assertion fails with a message like:\n\n    AssertionError: foo (string) is required\n\tat test (/home/mark/work/foo/foo.js:3:9)\n\tat Object.<anonymous> (/home/mark/work/foo/foo.js:15:1)\n\tat Module._compile (module.js:446:26)\n\tat Object..js (module.js:464:10)\n\tat Module.load (module.js:353:31)\n\tat Function._load (module.js:311:12)\n\tat Array.0 (module.js:484:10)\n\tat EventEmitter._tickCallback (node.js:190:38)\n\nfrom:\n\n    function test(foo) {\n\t    assert.string(foo, 'foo');\n    }\n\nThere you go.  You can check that arrays are of a homogenous type with `Arrayof$Type`:\n\n    function test(foo) {\n\t    assert.arrayOfString(foo, 'foo');\n    }\n\nYou can assert IFF an argument is not `undefined` (i.e., an optional arg):\n\n    assert.optionalString(foo, 'foo');\n\nLastly, you can opt-out of assertion checking altogether by setting the\nenvironment variable `NODE_NDEBUG=1`.  
This is pseudo-useful if you have\nlots of assertions, and don't want to pay `typeof ()` taxes to v8 in\nproduction.\n\nThe complete list of APIs is:\n\n* assert.bool\n* assert.buffer\n* assert.func\n* assert.number\n* assert.object\n* assert.string\n* assert.arrayOfBool\n* assert.arrayOfFunc\n* assert.arrayOfNumber\n* assert.arrayOfObject\n* assert.arrayOfString\n* assert.optionalBool\n* assert.optionalBuffer\n* assert.optionalFunc\n* assert.optionalNumber\n* assert.optionalObject\n* assert.optionalString\n* assert.optionalArrayOfBool\n* assert.optionalArrayOfFunc\n* assert.optionalArrayOfNumber\n* assert.optionalArrayOfObject\n* assert.optionalArrayOfString\n* assert.AssertionError\n* assert.fail\n* assert.ok\n* assert.equal\n* assert.notEqual\n* assert.deepEqual\n* assert.notDeepEqual\n* assert.strictEqual\n* assert.notStrictEqual\n* assert.throws\n* assert.doesNotThrow\n* assert.ifError\n\n# Installation\n\n    npm install assert-plus\n\n## License\n\nThe MIT License (MIT)\nCopyright (c) 2012 Mark Cavage\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of\nthe Software, and to permit persons to whom the Software is furnished to do so,\nsubject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n## Bugs\n\nSee <https://github.com/mcavage/node-assert-plus/issues>.\n",
-  "readmeFilename": "README.md",
-  "_id": "assert-plus@0.1.2",
-  "_from": "assert-plus@0.1.2"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/CHANGELOG	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,71 +0,0 @@
-This contains tickets fixed in each version release in reverse chronological
-order. There is one ticket per line. Each commit message has the tickets fixed
-in it. The commit message also has the corresponding github issue. i.e. CTYPE-42
-would be issue 42. Each issue can be found at:
-https://github.com/rmustacc/node-ctype/issues/%d.
-
-CTYPE v0.5.2
-CTYPE-46 Release 0.5.2
-CTYPE-45 error in setEndian logic
-
-v0.5.1
-CTYPE-44 Release 0.5.1
-Contributed by Terin Stock:
-CTYPE-41 CTypeParser.writeStruct should return its offset
-Contributed by Terin Stock:
-CTYPE-42 int64_t returns wrong size
-
-v0.5.0
-CTYPE-40 Release 0.5.0
-CTYPE-39 want > 0.6 engine support
-
-v0.4.0
-CTYPE-37 Release v0.4.0
-CTYPE-6 want additional entry point for write
-CTYPE-20 Add 64-bit int support into core parser
-CTYPE-31 Fix bounds errors node/2129
-CTYPE-33 Update copyright holders
-CTYPE-34 ctf.js confuses sign bit.
-CTYPE-35 Make the README more useful for getting started
-CTYPE-36 want manual page on ctio functions
-
-v0.3.1
-CTYPE-29 Release 0.3.1
-CTYPE-28 Want v0.6 npm support
-
-v0.3.0
-CTYPE-27 Release v0.3.0
-CTYPE-26 Want alternate default char behavior
-
-v0.2.1
-CTYPE-25 Release v0.2.1
-CTYPE-24 Writing structs is busted
-
-v0.2.0:
-CTYPE-23 Release v0.2.0
-CTYPE-21 Add support for CTF JSON data
-CTYPE-22 Add Javascriptlint profile
-CTYPE-15 Pull in ctio updates from node/master
-
-v0.1.0:
-CTYPE-18 Bump version to v0.1.0
-CTYPE-17 Fix nested structures
-CTYPE-16 Remove extraneous logging
-CTYPE-14 toAbs64 and toApprox64 are not exported
-
-v0.0.3:
-CTYPE-12 Bump version to v0.0.3
-CTYPE-11 fix typo in wuint64
-CTYPE-10 Integrate jsstyle
-
-v0.0.2:
-CTYPE-8 dump npm version to v0.0.2
-CTYPE-9 want changelog
-CTYPE-7 fix typo in detypes.
-
-v0.0.1:
-CTYPE-5 Missing from NPM registry
-CTYPE-4 int16_t calls wrong read function
-CTYPE-3 API example types are missing quotes as strings
-CTYPE-2 doc missing 64-bit functions
-CTYPE-1 Need license
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-The following license applies to all files unless the file is specified below.
-Each file specified below has its license information embedded in it:
-
-tools/jsstyle
-
-Copyright 2011, Robert Mustacchi. All rights reserved.
-Copyright 2011, Joyent, Inc. All rights reserved.
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to
-deal in the Software without restriction, including without limitation the
-rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
-sell copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
-IN THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/README	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,82 +0,0 @@
-Node-CType is a way to read and write binary data in a structured and easy to
-use format. Its name comes from the C header file.
-
-To get started, simply clone the repository or use npm to install it. Once it
-is there, require it.
-
-git clone git://github.com/rmustacc/node-ctype
-npm install ctype
-var mod_ctype = require('ctype')
-
-
-There are two APIs that you can use, depending on what abstraction you'd like.
-The low level API lets you read and write individual integers and floats from
-buffers. The higher level API lets you read and write structures of these. To
-illustrate this, let's look at how we would read and write a binary
-encoded x,y point.
-
-In C we would define this structure as follows:
-
-typedef struct point {
-	uint16_t	p_x;
-	uint16_t	p_y;
-} point_t;
-
-To read a binary encoded point from a Buffer, we first need to create a CType
-parser (where we specify the endian and other options) and add the typedef.
-
-var parser = new mod_ctype.Parser({ endian: 'big' });
-parser.typedef('point_t', [
-	{ x: { type: 'uint16_t' } },
-	{ y: { type: 'uint16_t' } }
-]);
-
-From here, given a buffer buf and an offset into it, we can read a point.
-
-var out = parser.readData([ { point: { type: 'point_t' } } ], buf, 0);
-console.log(out);
-{ point: { x: 23, y: 42 } }
-
-Another way to get the same information would be to use the low level methods.
-Note that these require you to manually deal with the offset. Here's how we'd
-get the same values of x and y from the buffer.
-
-var x = mod_ctype.ruint16(buf, 'big', 0);
-var y = mod_ctype.ruint16(buf, 'big', 2);
-console.log(x + ', ' + y);
-23, 42
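For reference, big-endian reads like the ones above map directly onto node's core Buffer API. This is a sketch of the equivalent core calls, not part of node-ctype itself:

```javascript
// Sketch: the same point read with node's core Buffer API, which is what a
// big-endian ruint16(buf, 'big', offset) call boils down to.
var buf = Buffer.from([0x00, 0x17, 0x00, 0x2a]); // x = 23, y = 42, big-endian

var x = buf.readUInt16BE(0);
var y = buf.readUInt16BE(2);
console.log(x + ', ' + y); // 23, 42
```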
-
-The true power of this API comes from the ability to define and nest typedefs,
-just as you would in C. The following types are defined by default. Note that
-they return a Number, unless indicated otherwise.
-
-    * int8_t
-    * int16_t
-    * int32_t
-    * int64_t (returns an array where val[0] << 32 + val[1] would be the value)
-    * uint8_t
-    * uint16_t
-    * uint32_t
-    * uint64_t (returns an array where val[0] << 32 + val[1] would be the value)
-    * float
-    * double
-    * char (either returns a buffer with that character or a uint8_t)
-    * char[] (returns an object with the buffer and the number of characters read which is either the total amount requested or until the first 0)
-
-
-ctf2json integration:
-
-Node-CType supports consuming the output of ctf2json. Once you read in a JSON file,
-all you have to do to add all the definitions it contains is:
-
-var data, parser;
-data = JSON.parse(parsedJSONData);
-parser = mod_ctype.parseCTF(data, { endian: 'big' });
-
-For more documentation, see the file README.old. Full documentation is in the
-process of being rewritten as a series of manual pages which will be available
-in the repository and online for viewing.
-
-To read the ctio manual page, simply run from the root of the workspace:
-
-man -Mman -s 3ctype ctio
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/README.old	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,298 +0,0 @@
-This library provides a way to read and write binary data.
-
-Node CType is a way to read and write binary data in a structured and easy to
-use format. Its name comes from the header file, though it does not share as
-much with it as it perhaps should.
-
-There are two levels of the API. One is the raw API which everything is built on
-top of, while the other provides a much nicer abstraction and is built entirely
-by using the lower level API. The hope is that the low level API is both clear
-and useful. The low level API gets its names from stdint.h (a rather
-appropriate source). The lower level API is presented at the end of this
-document.
-
-Standard CType API
-
-The CType interface is presented as a parser object that controls the
-endianness combined with a series of methods to change that value, parse and
-write out buffers, and a way to provide typedefs.
-
-Standard Types
-
-The CType parser supports the following basic types which return Numbers except
-as indicated:
-
-    * int8_t
-    * int16_t
-    * int32_t
-    * int64_t (returns an array where val[0] << 32 + val[1] would be the value)
-    * uint8_t
-    * uint16_t
-    * uint32_t
-    * uint64_t (returns an array where val[0] << 32 + val[1] would be the value)
-    * float
-    * double
-    * char (returns a buffer with just that single character)
-    * char[] (returns an object with the buffer and the number of characters read which is either the total amount requested or until the first 0)
-
-Specifying Structs
-
-The CType parser also supports the notion of structs. A struct is an array of
-JSON objects that defines an order of keys which have types and values. One
-would build a struct to represent a point (x,y) as follows:
-
-[
-    { x: { type: 'int16_t' }},
-    { y: { type: 'int16_t' }}
-]
-
-When this is passed into the read routine, it would read the first two bytes
-(as defined by int16_t) to determine the Number to use for X, and then it would
-read the next two bytes to determine the value of Y. When read this could
-return something like:
-
-{
-    x: 42,
-    y: -23
-}
-
-When someone wants to write values, we use the same format as above, but with
-additional value field:
-
-[
-    { x: { type: 'int16_t', value: 42 }},
-    { y: { type: 'int16_t', value: -23 }}
-]
-
-Now, the structure above may be optionally annotated with offsets. This tells
-us that, rather than reading continuously, we should read the given value at
-the specified offset. If an offset is provided, it is effectively the equivalent
-of lseek(offset, SEEK_SET). Thus, subsequent values will be read from that
-offset and incremented by the appropriate value. As an example:
-
-[
-    { x: { type: 'int16_t' }},
-    { y: { type: 'int16_t', offset: 20 }},
-    { z: { type: 'int16_t' }}
-]
-
-We would read x from the first starting offset given to us, for the sake of
-example, let's assume that's 0. After reading x, the next offset to read from
-would be 2; however, y specifies an offset, thus we jump directly to that
-offset and read y from byte 20. We would then read z from byte 22.
-
-The same offsets may be used when writing values.
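The offset-jumping behaviour above can be illustrated with core Buffer calls alone (a sketch assuming big-endian data and a starting offset of 0; node-ctype itself is not required):

```javascript
// Sketch of the offset semantics: x reads from byte 0, y jumps to its
// annotated offset of 20, and z continues from byte 22.
var buf = Buffer.alloc(24);
buf.writeInt16BE(7, 0);    // x
buf.writeInt16BE(-3, 20);  // y, at its annotated offset
buf.writeInt16BE(9, 22);   // z, immediately after y

var x = buf.readInt16BE(0);
var y = buf.readInt16BE(20);
var z = buf.readInt16BE(22);
```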
-
-Typedef
-
-The basic set of types, while it covers the basics, is somewhat limiting. To
-make this richer, there is typedef functionality, much like in C. One can use
-typedef to add a new name to an existing type or to define a name to refer to a
-struct. Thus the following are all examples of a typedef:
-
-typedef('size_t', 'uint32_t');
-typedef('ssize_t', 'int32_t');
-typedef('point_t', [
-    { x: { type: 'int16_t' }},
-    { y: { type: 'int16_t' }}
-]);
-
-Once something has been typedef'd it can be used in any of the definitions
-previously shown.
-
-One cannot remove a typedef once created; this is analogous to C.
-
-The set of defined types can be printed with lstypes. The format of this output
-is subject to change, but likely will look something like:
-
-> lstypes();
-{
-    size_t: 'uint32_t',
-    ssize_t: 'int32_t',
-    point_t: [
-        { x: { type: 'int16_t' }},
-        { y: { type: 'int16_t' }}
-    ]
-}
-
-Specifying arrays
-
-Arrays can be specified by appending []s to a type. Arrays must have their
-size specified, which can be done in one of two ways:
-
-    * An explicit non-zero integer size
-    * A name of a previously declared variable in the struct whose value is a
-      number.
-
-Note that when using the name of a variable, it should be the string name for
-the key. This is only valid inside structs, and the variable must be declared
-before the array that uses it. The following are examples:
-
-[
-    { ip_addr4: { type: 'uint8_t[4]' }},
-    { len: { type: 'uint32_t' }},
-    { data: { type: 'uint8_t[len]' }}
-]
-
-Arrays are permitted in typedefs; however, they must have a declared integer
-size. The following are examples of valid and invalid arrays:
-
-typedef('path', 'char[1024]'); /* Good */
-typedef('path', 'char[len]');  /* Bad! */
-
-64 bit values:
-
-Unfortunately, Javascript represents all numbers with a double, so you lose
-precision and the ability to represent integers exactly beyond roughly 2^53. To
-alleviate this, I propose the following for returning 64 bit integers when read:
-
-value[2]: Each entry is a 32 bit number which can be reconstructed to the
-original by the following formula:
-
-value[0] << 32 + value[1] (Note this will not work in Javascript)
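Since `<<` truncates to 32 bits in Javascript, the reconstruction has to use multiplication instead of a shift. A minimal sketch of the idea (omitting the over-2^52 error check that the library's toAbs64, described later, performs):

```javascript
// Sketch: reconstruct a 64-bit value from the [hi, lo] pair. A << shift
// would truncate to 32 bits in Javascript, so multiply by 2^32 instead.
// This only stays exact while the result fits in the 52-bit mantissa.
function toAbs64(val) {
        return (val[0] * Math.pow(2, 32)) + val[1];
}
```

For example, toAbs64([1, 0]) yields 4294967296 (2^32), where (1 << 32) + 0 would not.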
-
-CTF JSON data:
-
-node-ctype can also handle JSON data that matches the format described in the
-documentation of the tool ctf2json. Given the JSON data which specifies type
-information, it will transform that into a parser that understands all of the
-types defined inside of it. This is useful for more complicated structures that
-have a lot of typedefs.
-
-Interface overview
-
-The following is the header-file like interface to the parser object:
-
-/*
- * Create a new instance of the parser. Each parser has its own store of
- * typedefs and endianness. Conf is an object with the following values:
- *
- *      endian          Either 'big' or 'little' to determine the endianness we
- *                      want to read from or write to.
- *
- */
-function CTypeParser(conf);
-
-/*
- * Parses the CTF JSON data and creates a parser that understands all of those
- * types.
- *
- *	data		Parsed JSON data that matches the CTF JSON
- *			specification.
- *
- *	conf		The configuration object to create a new CTypeParser
- *			from.
- */
-CTypeParser parseCTF(data, conf);
-
-/*
- * This is what we were born to do. We read the data from a buffer and return it
- * in an object whose keys match the values from the object.
- *
- *      def             The array definition of the data to read in
- *
- *      buffer          The buffer to read data from
- *
- *      offset          The offset to start reading from
- *
- * Returns an object where each key corresponds to an entry in def and the value
- * is the read value.
- */
-Object CTypeParser.readData(<Type Definition>, buffer, offset);
-
-/*
- * This is the second half of what we were born to do, write out the data
- * itself.
- *
- *      def             The array definition of the data to write out with
- *                      values
- *
- *      buffer          The buffer to write to
- *
- *      offset          The offset in the buffer to write to
- */
-void CTypeParser.writeData(<Type Definition>, buffer, offset);
-
-/*
- * A user has requested to add a type, let us honor their request. Yet, if their
- * request doth spurn us, send them unto the Hells which Dante describes.
- *
- *      name            The string for the type definition we're adding
- *
- *      value           Either a string that is a type/array name or an object
- *                      that describes a struct.
- */
-void CTypeParser.prototype.typedef(name, value);
-
-Object CTypeParser.prototype.lstypes();
-
-/*
- * Get the endian value for the current parser
- */
-String CTypeParser.prototype.getEndian();
-
-/*
- * Sets the current endian value for the Parser. If the value is not valid,
- * throws an Error.
- *
- *      endian          Either 'big' or 'little' to determine the endianness we
- *                      want to read from or write to.
- *
- */
-void CTypeParser.prototype.setEndian(String);
-
-/*
- * Attempts to convert an array of two integers returned from rsint64 / ruint64
- * into an absolute 64 bit number. If however the value would exceed 2^52 this
- * will instead throw an error. The mantissa in a double is a 52 bit number and
- * rather than potentially give you a value that is an approximation this will
- * error. If you would rather an approximation, please see toApprox64.
- *
- *	val		An array of two 32-bit integers
- */
-Number function toAbs64(val)
-
-/*
- * Will return the 64 bit value as returned in an array from rsint64 / ruint64
- * to a value as close as it can. Note that Javascript stores all numbers as a
- * double and the mantissa only has 52 bits. Thus this version may approximate
- * the value.
- *
- *	val		An array of two 32-bit integers
- */
-Number function toApprox64(val)
-
-Low Level API
-
-The following functions are provided at the low level:
-
-Read unsigned integers from a buffer:
-Number ruint8(buffer, endian, offset);
-Number ruint16(buffer, endian, offset);
-Number ruint32(buffer, endian, offset);
-Number[] ruint64(buffer, endian, offset);
-
-Read signed integers from a buffer:
-Number rsint8(buffer, endian, offset);
-Number rsint16(buffer, endian, offset);
-Number rsint32(buffer, endian, offset);
-Number[] rsint64(buffer, endian, offset);
-
-Read floating point numbers from a buffer:
-Number rfloat(buffer, endian, offset);   /* IEEE-754 Single precision */
-Number rdouble(buffer, endian, offset);  /* IEEE-754 Double precision */
-
-Write unsigned integers to a buffer:
-void wuint8(Number, endian, buffer, offset);
-void wuint16(Number, endian, buffer, offset);
-void wuint32(Number, endian, buffer, offset);
-void wuint64(Number[], endian, buffer, offset);
-
-Write signed integers to a buffer:
-void wsint8(Number, endian, buffer, offset);
-void wsint16(Number, endian, buffer, offset);
-void wsint32(Number, endian, buffer, offset);
-void wsint64(Number[], endian, buffer, offset);
-
-Write floating point numbers to a buffer:
-void wfloat(Number, endian, buffer, offset);   /* IEEE-754 Single precision */
-void wdouble(Number, endian, buffer, offset);  /* IEEE-754 Double precision */
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/ctf.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,245 +0,0 @@
-/*
- * ctf.js
- *
- * Understand and parse all of the different JSON formats of CTF data and
- * translate that into a series of node-ctype friendly pieces. The reason for
- * the abstraction is to handle different changes in the file format.
- *
- * We have to be careful here that we don't end up using a name that is already
- * a built in type.
- */
-var mod_assert = require('assert');
-var ASSERT = mod_assert.ok;
-
-var ctf_versions = [ '1.0' ];
-var ctf_entries = [ 'integer', 'float', 'typedef', 'struct' ];
-var ctf_deftypes = [ 'int8_t', 'uint8_t', 'int16_t', 'uint16_t', 'int32_t',
-    'uint32_t', 'float', 'double' ];
-
-function ctfParseInteger(entry, ctype)
-{
-	var name, sign, len, type;
-
-	name = entry['name'];
-	if (!('signed' in entry['integer']))
-		throw (new Error('Malformed CTF JSON: integer missing ' +
-		    'signed value'));
-
-
-	if (!('length' in entry['integer']))
-		throw (new Error('Malformed CTF JSON: integer missing ' +
-		    'length value'));
-
-	sign = entry['integer']['signed'];
-	len = entry['integer']['length'];
-	type = null;
-
-	if (sign && len == 1)
-		type = 'int8_t';
-	else if (len == 1)
-		type = 'uint8_t';
-	else if (sign && len == 2)
-		type = 'int16_t';
-	else if (len == 2)
-		type = 'uint16_t';
-	else if (sign && len == 4)
-		type = 'int32_t';
-	else if (len == 4)
-		type = 'uint32_t';
-	else if (sign && len == 8)
-		type = 'int64_t';
-	else if (len == 8)
-		type = 'uint64_t';
-
-	if (type === null)
-		throw (new Error('Malformed CTF JSON: integer has ' +
-		    'unsupported length and sign - ' + len + '/' + sign));
-
-	/*
-	 * This means that this is the same as one of our built in types. If
-	 * that's the case defining it would be an error. So instead of trying
-	 * to typedef it, we'll return here.
-	 */
-	if (name == type)
-		return;
-
-	if (name == 'char') {
-		ASSERT(type == 'int8_t');
-		return;
-	}
-
-	ctype.typedef(name, type);
-}
-
-function ctfParseFloat(entry, ctype)
-{
-	var name, len;
-
-	name = entry['name'];
-	if (!('length' in entry['float']))
-		throw (new Error('Malformed CTF JSON: float missing ' +
-		    'length value'));
-
-	len = entry['float']['length'];
-	if (len != 4 && len != 8)
-		throw (new Error('Malformed CTF JSON: float has invalid ' +
-		    'length value'));
-
-	if (len == 4) {
-		if (name == 'float')
-			return;
-		ctype.typedef(name, 'float');
-	} else if (len == 8) {
-		if (name == 'double')
-			return;
-		ctype.typedef(name, 'double');
-	}
-}
-
-function ctfParseTypedef(entry, ctype)
-{
-	var name, type, ii;
-
-	name = entry['name'];
-	if (typeof (entry['typedef']) != 'string')
-		throw (new Error('Malformed CTF JSON: typedef value is not ' +
-		    'a string'));
-
-	type = entry['typedef'];
-
-	/*
-	 * We need to ensure that we're not looking at a type that's one of our
-	 * built in types. Traditionally in C a uint32_t would be a typedef to
-	 * some kind of integer. However, those size types are built ins.
-	 */
-	for (ii = 0; ii < ctf_deftypes.length; ii++) {
-		if (name == ctf_deftypes[ii])
-			return;
-	}
-
-	ctype.typedef(name, type);
-}
-
-function ctfParseStruct(entry, ctype)
-{
-	var name, type, ii, val, index, member, push;
-
-	member = [];
-	if (!Array.isArray(entry['struct']))
-		throw (new Error('Malformed CTF JSON: struct value is not ' +
-		    'an array'));
-
-	for (ii = 0; ii < entry['struct'].length; ii++) {
-		val = entry['struct'][ii];
-		if (!('name' in val))
-			throw (new Error('Malformed CTF JSON: struct member ' +
-			    'missing name'));
-
-		if (!('type' in val))
-			throw (new Error('Malformed CTF JSON: struct member ' +
-			    'missing type'));
-
-		if (typeof (val['name']) != 'string')
-			throw (new Error('Malformed CTF JSON: struct member ' +
-			    'name isn\'t a string'));
-
-		if (typeof (val['type']) != 'string')
-			throw (new Error('Malformed CTF JSON: struct member ' +
-			    'type isn\'t a string'));
-
-		/*
-		 * CTF version 2 specifies array names as <type> [<num>],
-		 * whereas node-ctype writes this as <type>[<num>].
-		 */
-		name = val['name'];
-		type = val['type'];
-		index = type.indexOf(' [');
-		if (index != -1) {
-			type = type.substring(0, index) +
-			    type.substring(index + 1, type.length);
-		}
-		push = {};
-		push[name] = { 'type': type };
-		member.push(push);
-	}
-
-	name = entry['name'];
-	ctype.typedef(name, member);
-}
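The space-stripping step above can be pulled out and checked on its own (the function name is hypothetical; the indexOf/substring logic is the same as in ctfParseStruct):

```javascript
// CTF v2 writes array members as "<type> [<n>]"; node-ctype expects
// "<type>[<n>]". Drop the space before the bracket, if present.
function normalizeArrayType(type) {
	var index = type.indexOf(' [');
	if (index == -1)
		return (type);
	return (type.substring(0, index) + type.substring(index + 1, type.length));
}
```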
-
-function ctfParseEntry(entry, ctype)
-{
-	var ii, found = 0;
-
-	if (!('name' in entry))
-		throw (new Error('Malformed CTF JSON: entry missing "name" ' +
-		    'section'));
-
-	for (ii = 0; ii < ctf_entries.length; ii++) {
-		if (ctf_entries[ii] in entry)
-			found++;
-	}
-
-	if (found === 0)
-		throw (new Error('Malformed CTF JSON: found no entries'));
-
-	if (found >= 2)
-		throw (new Error('Malformed CTF JSON: found more than one ' +
-		    'entry'));
-
-	if ('integer' in entry) {
-		ctfParseInteger(entry, ctype);
-		return;
-	}
-
-	if ('float' in entry) {
-		ctfParseFloat(entry, ctype);
-		return;
-	}
-
-	if ('typedef' in entry) {
-		ctfParseTypedef(entry, ctype);
-		return;
-	}
-
-	if ('struct' in entry) {
-		ctfParseStruct(entry, ctype);
-		return;
-	}
-
-	ASSERT(false, 'shouldn\'t reach here');
-}
-
-function ctfParseJson(json, ctype)
-{
-	var version, ii;
-
-	ASSERT(json);
-	ASSERT(ctype);
-	if (!('metadata' in json))
-		throw (new Error('Invalid CTF JSON: missing metadata section'));
-
-	if (!('ctf2json_version' in json['metadata']))
-		throw (new Error('Invalid CTF JSON: missing ctf2json_version'));
-
-	version = json['metadata']['ctf2json_version'];
-	for (ii = 0; ii < ctf_versions.length; ii++) {
-		if (ctf_versions[ii] == version)
-			break;
-	}
-
-	if (ii == ctf_versions.length)
-		throw (new Error('Unsupported ctf2json_version: ' + version));
-
-	if (!('data' in json))
-		throw (new Error('Invalid CTF JSON: missing data section'));
-
-	if (!Array.isArray(json['data']))
-		throw (new Error('Malformed CTF JSON: data section is not ' +
-		    'an array'));
-
-	for (ii = 0; ii < json['data'].length; ii++)
-		ctfParseEntry(json['data'][ii], ctype);
-}
-
-exports.ctfParseJson = ctfParseJson;
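Reading the checks in ctfParseJson back out, the expected input looks roughly like the following (field names are taken from the validation above; the concrete values are illustrative, not from a real ctf2json run):

```javascript
// A minimal document of the shape ctfParseJson validates: a metadata
// section with a supported version, and a data array of entries that
// each carry a name plus exactly one of integer/float/typedef/struct.
var sample = {
	'metadata': { 'ctf2json_version': '1.0' },
	'data': [
		{ 'name': 'id_t', 'typedef': 'int32_t' },
		{ 'name': 'point_t', 'struct': [
			{ 'name': 'x', 'type': 'int32_t' },
			{ 'name': 'y', 'type': 'int32_t' }
		] }
	]
};
```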
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/ctio.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1485 +0,0 @@
-/*
- * rm - Feb 2011
- * ctio.js:
- *
- * A simple way to read and write simple ctypes. Of course, as you'll find the
- * code isn't as simple as it might appear. The following types are currently
- * supported in big and little endian formats:
- *
- * 	uint8_t			int8_t
- * 	uint16_t		int16_t
- * 	uint32_t		int32_t
- *	float (single precision IEEE 754)
- *	double (double precision IEEE 754)
- *
- * This is designed to work in Node and v8. It may in fact work in other
- * Javascript interpreters (that'd be pretty neat), but it hasn't been tested.
- * If you find that it does in fact work, that's pretty cool. Try and pass word
- * back to the original author.
- *
- * Note to the reader: If your tabstop isn't set to 8, parts of this may look
- * weird.
- */
-
-/*
- * Numbers in Javascript have a secret: all numbers must be represented with an
- * IEEE-754 double. The double has a mantissa with a length of 52 bits with an
- * implicit one. Thus the range of integers that can be represented is limited
- * to the size of the mantissa; this makes reading and writing 64-bit integers
- * difficult, but far from impossible.
- *
- * Another side effect of this representation is what happens when you use the
- * bitwise operators, i.e. shift left, shift right, and, or, etc. In Javascript,
- * each operand and the result is cast to a signed 32-bit number. However, in
- * the case of >>> the values are cast to an unsigned number.
- */
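The casting rules described above are easy to confirm directly:

```javascript
// Bitwise operators cast each operand and the result to a signed
// 32-bit number; >>> instead yields an unsigned reinterpretation.
var signed = 0xffffffff | 0;     // -1: the top bit reads as a sign bit
var unsigned = 0xffffffff >>> 0; // 4294967295: unsigned result
```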
-
-/*
- * A reminder on endian related issues:
- *
- * Big Endian: MSB -> First byte
- * Little Endian: MSB -> Last byte
- */
-var mod_assert = require('assert');
-
-/*
- * An 8 bit unsigned integer involves doing no significant work.
- */
-function ruint8(buffer, endian, offset)
-{
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset >= buffer.length)
-		throw (new Error('Trying to read beyond buffer length'));
-
-	return (buffer[offset]);
-}
-
-/*
- * For 16 bit unsigned numbers we can do all the casting that we want to do.
- */
-function rgint16(buffer, endian, offset)
-{
-	var val = 0;
-
-	if (endian == 'big') {
-		val = buffer[offset] << 8;
-		val |=  buffer[offset+1];
-	} else {
-		val = buffer[offset];
-		val |= buffer[offset+1] << 8;
-	}
-
-	return (val);
-}
-
-function ruint16(buffer, endian, offset)
-{
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 1 >= buffer.length)
-		throw (new Error('Trying to read beyond buffer length'));
-
-	return (rgint16(buffer, endian, offset));
-}
-
-/*
- * Because most bitshifting is done using signed numbers, if we would go into
- * the realm where we use that 32nd bit, we'll end up going into the negative
- * range. i.e.:
- * > 200 << 24
- * -939524096
- *
- * Not the value you'd expect. To work around this, we end up having to do some
- * abuse of the JavaScript standard. in this case, we know that a >>> shift is
- * defined to cast our value to an *unsigned* 32-bit number. Because of that, we
- * use that instead to save us some additional math, though it does feel a
- * little weird and it isn't obvious as to why you would want to do this at
- * first.
- */
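The `>>> 0` workaround from the comment above, worked for its own `200 << 24` example:

```javascript
var wrong = 200 << 24;          // -939524096: the shift set the sign bit
var right = (200 << 24) >>> 0;  // 3355443200: >>> 0 casts back to unsigned
```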
-function rgint32(buffer, endian, offset)
-{
-	var val = 0;
-
-	if (endian == 'big') {
-		val = buffer[offset+1] << 16;
-		val |= buffer[offset+2] << 8;
-		val |= buffer[offset+3];
-		val = val + (buffer[offset] << 24 >>> 0);
-	} else {
-		val = buffer[offset+2] << 16;
-		val |= buffer[offset+1] << 8;
-		val |= buffer[offset];
-		val = val + (buffer[offset + 3] << 24 >>> 0);
-	}
-
-	return (val);
-}
-
-function ruint32(buffer, endian, offset)
-{
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 3 >= buffer.length)
-		throw (new Error('Trying to read beyond buffer length'));
-
-	return (rgint32(buffer, endian, offset));
-}
-
-/*
- * Reads a 64-bit unsigned number. The astute observer will note that this
- * doesn't quite work. Javascript has chosen to only have numbers that can be
- * represented by a double. A double only has 52 bits of mantissa with an
- * implicit 1, thus we have up to 53 bits to represent an integer. However, 2^53
- * doesn't quite give us what we want. Isn't 53 bits enough for anyone? What
- * could you have possibly wanted to represent that was larger than that? Oh,
- * maybe a size? You mean we bypassed the 4 GB limit on file sizes, when did
- * that happen?
- *
- * To get around this egregious language issue, we're going to instead construct
- * an array of two 32 bit unsigned integers. Where arr[0] << 32 + arr[1] would
- * give the actual number. However, note that the above code probably won't
- * produce the desired results because of the way Javascript numbers are
- * doubles.
- */
-function rgint64(buffer, endian, offset)
-{
-	var val = new Array(2);
-
-	if (endian == 'big') {
-		val[0] = ruint32(buffer, endian, offset);
-		val[1] = ruint32(buffer, endian, offset+4);
-	} else {
-		val[0] = ruint32(buffer, endian, offset+4);
-		val[1] = ruint32(buffer, endian, offset);
-	}
-
-	return (val);
-}
-
-function ruint64(buffer, endian, offset)
-{
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 7 >= buffer.length)
-		throw (new Error('Trying to read beyond buffer length'));
-
-	return (rgint64(buffer, endian, offset));
-}
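The [hi, lo] pair that rgint64 returns can be recombined exactly only while the result stays under 2^53, which is why callers keep the pair:

```javascript
var pair = [0x1, 0x2];                            // represents 0x100000002
var exact = pair[0] * Math.pow(2, 32) + pair[1];  // still exact: 4294967298
// Past 2^53 the same sum silently rounds, losing low-order bits.
```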
-
-
-/*
- * Signed integer types, yay team! A reminder on how two's complement actually
- * works. The first bit is the signed bit, i.e. tells us whether or not the
- * number should be positive or negative. If the two's complement value is
- * positive, then we're done, as it's equivalent to the unsigned representation.
- *
- * Now if the number is positive, you're pretty much done, you can just leverage
- * the unsigned translations and return those. Unfortunately, negative numbers
- * aren't quite that straightforward.
- *
- * At first glance, one might be inclined to use the traditional formula to
- * translate binary numbers between the positive and negative values in two's
- * complement. (Though it doesn't quite work for the most negative value)
- * Mainly:
- *  - invert all the bits
- *  - add one to the result
- *
- * Of course, this doesn't quite work in Javascript. Take for example the value
- * of -128. This could be represented in 16 bits (big-endian) as 0xff80. But of
- * course, Javascript will do the following:
- *
- * > ~0xff80
- * -65409
- *
- * Whoa there, Javascript, that's not quite right. But wait, according to
- * Javascript that's perfectly correct. When Javascript ends up seeing the
- * constant 0xff80, it has no notion that it is actually a signed number. It
- * assumes that we've input the unsigned value 0xff80. Thus, when it does the
- * binary negation, it casts it into a signed value, (positive 0xff80). Then
- * when you perform binary negation on that, it turns it into a negative number.
- *
- * Instead, we're going to have to use the following general formula, that works
- * in a rather Javascript friendly way. I'm glad we don't support this kind of
- * weird numbering scheme in the kernel.
- *
- * (BIT-MAX - (unsigned)val + 1) * -1
- *
- * The astute observer may think that this doesn't make sense for 8-bit numbers
- * (really it isn't necessary for them). However, when you get 16-bit numbers,
- * you do. Let's go back to our prior example and see how this will look:
- *
- * (0xffff - 0xff80 + 1) * -1
- * (0x007f + 1) * -1
- * (0x0080) * -1
- *
- * Doing it this way ends up allowing us to treat it appropriately in
- * Javascript. Sigh, that's really quite ugly for what should just be a few bit
- * shifts, ~ and &.
- */
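The general formula above, checked for the -128 / 0xff80 example it walks through:

```javascript
// (BIT-MAX - (unsigned)val + 1) * -1 for a 16-bit two's complement value:
var decoded = (0xffff - 0xff80 + 1) * -1;  // -128
```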
-
-/*
- * Endianness doesn't matter for 8-bit signed values. We could in fact optimize
- * this case because the more traditional methods work, but for consistency,
- * we'll keep doing this the same way.
- */
-function rsint8(buffer, endian, offset)
-{
-	var neg;
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset >= buffer.length)
-		throw (new Error('Trying to read beyond buffer length'));
-
-	neg = buffer[offset] & 0x80;
-	if (!neg)
-		return (buffer[offset]);
-
-	return ((0xff - buffer[offset] + 1) * -1);
-}
-
-/*
- * The 16-bit version requires a bit more effort. In this case, we can leverage
- * our unsigned code to generate the value we want to return.
- */
-function rsint16(buffer, endian, offset)
-{
-	var neg, val;
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 1 >= buffer.length)
-		throw (new Error('Trying to read beyond buffer length'));
-
-	val = rgint16(buffer, endian, offset);
-	neg = val & 0x8000;
-	if (!neg)
-		return (val);
-
-	return ((0xffff - val + 1) * -1);
-}
-
-/*
- * We really shouldn't leverage our 32-bit code here and instead utilize the
- * fact that we know that since these are signed numbers, we can do all the
- * shifting and binary anding to generate the 32-bit number. But, for
- * consistency we'll do the same. If we want to do otherwise, we should instead
- * make the 32 bit unsigned code do the optimization. But as long as there
- * aren't floats secretly under the hood for that, we /should/ be okay.
- */
-function rsint32(buffer, endian, offset)
-{
-	var neg, val;
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 3 >= buffer.length)
-		throw (new Error('Trying to read beyond buffer length'));
-
-	val = rgint32(buffer, endian, offset);
-	neg = val & 0x80000000;
-	if (!neg)
-		return (val);
-
-	return ((0xffffffff - val + 1) * -1);
-}
-
-/*
- * The signed version of this code suffers from all of the same problems as the
- * other 64 bit version.
- */
-function rsint64(buffer, endian, offset)
-{
-	var neg, val;
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 7 >= buffer.length)
-		throw (new Error('Trying to read beyond buffer length'));
-
-	val = rgint64(buffer, endian, offset);
-	neg = val[0] & 0x80000000;
-
-	if (!neg)
-		return (val);
-
-	val[0] = (0xffffffff - val[0]) * -1;
-	val[1] = (0xffffffff - val[1] + 1) * -1;
-
-	/*
-	 * If we had the key 0x8000000000000000, that would leave the lower 32
-	 * bits as 0xffffffff, however, since we're going to add one, that would
-	 * actually leave the lower 32-bits as 0x100000000, which would break
-	 * our ability to write back a value that we received. To work around
-	 * this, if we actually get that value, we're going to bump the upper
-	 * portion by 1 and set this to zero.
-	 */
-	mod_assert.ok(val[1] <= 0x100000000);
-	if (val[1] == -0x100000000) {
-		val[1] = 0;
-		val[0]--;
-	}
-
-	return (val);
-}
-
-/*
- * We now move onto IEEE 754: The traditional form for floating point numbers
- * and what is secretly hiding at the heart of everything in this. I really hope
- * that someone is actually using this, as otherwise, this effort is probably
- * wasted.
- *
- * One might be tempted to use parseFloat here, but that wouldn't work at all
- * for several reasons. Mostly due to the way floats actually work, and
- * parseFloat only actually works in base 10. I don't see base 10 anywhere near
- * this file.
- *
- * In this case we'll implement the single and double precision versions. The
- * quadruple precision, while probably useful, wouldn't really be accepted by
- * Javascript, so let's not even waste our time.
- *
- * So let's review what this format looks like. A single precision value is 32
- * bits and has three parts:
- *   -  Sign bit
- *   -  Exponent (Using bias notation)
- *   -  Mantissa
- *
- * |s|eeeeeeee|mmmmmmmmmmmmmmmmmmmmmmmmm|
- * 31| 30-23  |  22    	-       0       |
- *
- * The exponent is stored in biased form. The bias in this case is 127.
- * Therefore, our exponent is equal to the 8-bit value - 127.
- *
- * By default, a number is normalized in IEEE, that means that the mantissa has
- * an implicit one that we don't see. So really the value stored is 1.m.
- * However, if the exponent is all zeros, then instead we have to shift
- * everything to the right one and there is no more implicit one.
- *
- * Special values:
- *  - Positive Infinity:
- *	Sign:		0
- *	Exponent: 	All 1s
- *	Mantissa:	0
- *  - Negative Infinity:
- *	Sign:		1
- *	Exponent: 	All 1s
- *	Mantissa:	0
- *  - NaN:
- *	Sign:		*
- *	Exponent: 	All 1s
- *	Mantissa:	non-zero
- *  - Zero:
- *	Sign:		*
- *	Exponent:	All 0s
- *	Mantissa:	0
- *
- * In the case of zero, the sign bit determines whether we get a positive or
- * negative zero. However, since Javascript cannot determine the difference
- * between the two: i.e. -0 == 0, we just always return 0.
- *
- */
-function rfloat(buffer, endian, offset)
-{
-	var bytes = [];
-	var sign, exponent, mantissa, val;
-	var bias = 127;
-	var maxexp = 0xff;
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 3 >= buffer.length)
-		throw (new Error('Trying to read beyond buffer length'));
-
-	/* Normalize the bytes to be in endian order */
-	if (endian == 'big') {
-		bytes[0] = buffer[offset];
-		bytes[1] = buffer[offset+1];
-		bytes[2] = buffer[offset+2];
-		bytes[3] = buffer[offset+3];
-	} else {
-		bytes[3] = buffer[offset];
-		bytes[2] = buffer[offset+1];
-		bytes[1] = buffer[offset+2];
-		bytes[0] = buffer[offset+3];
-	}
-
-	sign = bytes[0] & 0x80;
-	exponent = (bytes[0] & 0x7f) << 1;
-	exponent |= (bytes[1] & 0x80) >>> 7;
-	mantissa = (bytes[1] & 0x7f) << 16;
-	mantissa |= bytes[2] << 8;
-	mantissa |= bytes[3];
-
-	/* Check for special cases before we do general parsing */
-	if (!sign && exponent == maxexp && mantissa === 0)
-		return (Number.POSITIVE_INFINITY);
-
-	if (sign && exponent == maxexp && mantissa === 0)
-		return (Number.NEGATIVE_INFINITY);
-
-	if (exponent == maxexp && mantissa !== 0)
-		return (Number.NaN);
-
-	/*
-	 * Javascript really doesn't have support for positive or negative zero.
-	 * So we're not going to try and give it to you. That would be just
-	 * plain weird. Besides -0 == 0.
-	 */
-	if (exponent === 0 && mantissa === 0)
-		return (0);
-
-	/*
-	 * Now we can deal with the bias and determine whether the mantissa
-	 * has the implicit one or not.
-	 */
-	exponent -= bias;
-	if (exponent == -bias) {
-		exponent++;
-		val = 0;
-	} else {
-		val = 1;
-	}
-
-	val = (val + mantissa * Math.pow(2, -23)) * Math.pow(2, exponent);
-
-	if (sign)
-		val *= -1;
-
-	return (val);
-}
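Extracting the fields by hand for 1.0f (big-endian bytes 0x3f 0x80 0x00 0x00) uses the same masks and shifts as rfloat above:

```javascript
var bytes = [0x3f, 0x80, 0x00, 0x00];
var sign = bytes[0] & 0x80;                                            // 0
var exponent = ((bytes[0] & 0x7f) << 1) | ((bytes[1] & 0x80) >>> 7);   // 127
var mantissa = ((bytes[1] & 0x7f) << 16) | (bytes[2] << 8) | bytes[3]; // 0
// Normalized, so the implicit leading one applies; bias is 127.
var value = (1 + mantissa * Math.pow(2, -23)) * Math.pow(2, exponent - 127);
```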
-
-/*
- * Doubles in IEEE 754 are like their brothers except for a few changes and
- * increases in size:
- *   - The exponent is now 11 bits
- *   - The mantissa is now 52 bits
- *   - The bias is now 1023
- *
- * |s|eeeeeeeeeee|mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm|
- * 63| 62 - 52   | 	51		-			0     |
- *
- * While the size has increased a fair amount, we're going to end up keeping the
- * same general formula for calculating the final value. As a reminder, this
- * formula is:
- *
- * (-1)^s * (n + m) * 2^(e-b)
- *
- * Where:
- *	s	is the sign bit
- *	n	is (exponent > 0) ? 1 : 0 -- Determines whether we're normalized
- *					     or not
- *	m	is the mantissa
- *	e	is the exponent specified
- *	b	is the bias for the exponent
- *
- */
-function rdouble(buffer, endian, offset)
-{
-	var bytes = [];
-	var sign, exponent, mantissa, val, lowmant;
-	var bias = 1023;
-	var maxexp = 0x7ff;
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 7 >= buffer.length)
-		throw (new Error('Trying to read beyond buffer length'));
-
-	/* Normalize the bytes to be in endian order */
-	if (endian == 'big') {
-		bytes[0] = buffer[offset];
-		bytes[1] = buffer[offset+1];
-		bytes[2] = buffer[offset+2];
-		bytes[3] = buffer[offset+3];
-		bytes[4] = buffer[offset+4];
-		bytes[5] = buffer[offset+5];
-		bytes[6] = buffer[offset+6];
-		bytes[7] = buffer[offset+7];
-	} else {
-		bytes[7] = buffer[offset];
-		bytes[6] = buffer[offset+1];
-		bytes[5] = buffer[offset+2];
-		bytes[4] = buffer[offset+3];
-		bytes[3] = buffer[offset+4];
-		bytes[2] = buffer[offset+5];
-		bytes[1] = buffer[offset+6];
-		bytes[0] = buffer[offset+7];
-	}
-
-	/*
-	 * We can construct the exponent and mantissa the same way as we did in
-	 * the case of a float, just increase the range of the exponent.
-	 */
-	sign = bytes[0] & 0x80;
-	exponent = (bytes[0] & 0x7f) << 4;
-	exponent |= (bytes[1] & 0xf0) >>> 4;
-
-	/*
-	 * This is going to be ugly but then again, we're dealing with IEEE 754.
-	 * This could probably be done as a node add-on in a few lines of C++,
-	 * but oh well, we've made it this far so let's stay native the rest of
-	 * the way...
-	 *
-	 * What we're going to do is break the mantissa into two parts, the
-	 * lower 24 bits and the upper 28 bits. We'll multiply the upper 28 bits
-	 * by the appropriate power and then add in the lower 24-bits. Not
-	 * really that great. It's pretty much a giant kludge to deal with
-	 * Javascript eccentricities around numbers.
-	 */
-	lowmant = bytes[7];
-	lowmant |= bytes[6] << 8;
-	lowmant |= bytes[5] << 16;
-	mantissa = bytes[4];
-	mantissa |= bytes[3] << 8;
-	mantissa |= bytes[2] << 16;
-	mantissa |= (bytes[1] & 0x0f) << 24;
-	mantissa *= Math.pow(2, 24); /* Equivalent to << 24, but JS compat */
-	mantissa += lowmant;
-
-	/* Check for special cases before we do general parsing */
-	if (!sign && exponent == maxexp && mantissa === 0)
-		return (Number.POSITIVE_INFINITY);
-
-	if (sign && exponent == maxexp && mantissa === 0)
-		return (Number.NEGATIVE_INFINITY);
-
-	if (exponent == maxexp && mantissa !== 0)
-		return (Number.NaN);
-
-	/*
-	 * Javascript really doesn't have support for positive or negative zero.
-	 * So we're not going to try and give it to you. That would be just
-	 * plain weird. Besides -0 == 0.
-	 */
-	if (exponent === 0 && mantissa === 0)
-		return (0);
-
-	/*
-	 * Now we can deal with the bias and determine whether the mantissa
-	 * has the implicit one or not.
-	 */
-	exponent -= bias;
-	if (exponent == -bias) {
-		exponent++;
-		val = 0;
-	} else {
-		val = 1;
-	}
-
-	val = (val + mantissa * Math.pow(2, -52)) * Math.pow(2, exponent);
-
-	if (sign)
-		val *= -1;
-
-	return (val);
-}
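Plugging the double 1.5 (bit pattern 0x3ff8000000000000) into the formula from the comment above:

```javascript
// s = 0, e = 0x3ff, only the mantissa's top bit set: m = 2^51 * 2^-52 = 0.5.
var exponent = 0x3ff;
var m = Math.pow(2, 51) * Math.pow(2, -52);          // 0.5
var value = (1 + m) * Math.pow(2, exponent - 1023);  // 1.5
```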
-
-/*
- * Now that we have gone through the pain of reading the individual types, we're
- * probably going to want some way to write these back. None of this is going to
- * be good. But since we have Javascript numbers this should certainly be more
- * interesting. Though we can constrain this end a little bit more in what is
- * valid. For now, let's go back to our friends the unsigned value.
- */
-
-/*
- * Unsigned numbers seem deceptively easy. Here are the general steps and rules
- * that we are going to take:
- *   -  If the number is negative, throw an Error
- *   -  Truncate any floating point portion
- *   -  Take the modulus of the number in our base
- *   -  Write it out to the buffer in the endian format requested at the offset
- */
-
-/*
- * We have to make sure that the value is a valid integer. This means that it is
- * non-negative. It has no fractional component and that it does not exceed the
- * maximum allowed value.
- *
- *	value		The number to check for validity
- *
- *	max		The maximum value
- */
-function prepuint(value, max)
-{
-	if (typeof (value) != 'number')
-		throw (new Error('cannot write a non-number as a number'));
-
-	if (value < 0)
-		throw (new Error('specified a negative value for writing an ' +
-		    'unsigned value'));
-
-	if (value > max)
-		throw (new Error('value is larger than maximum value for ' +
-		    'type'));
-
-	if (Math.floor(value) !== value)
-		throw (new Error('value has a fractional component'));
-
-	return (value);
-}
-
-/*
- * 8-bit version, classy. We can ignore endianness which is good.
- */
-function wuint8(value, endian, buffer, offset)
-{
-	var val;
-
-	if (value === undefined)
-		throw (new Error('missing value'));
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset >= buffer.length)
-		throw (new Error('Trying to write beyond buffer length'));
-
-	val = prepuint(value, 0xff);
-	buffer[offset] = val;
-}
-
-/*
- * Pretty much the same as the 8-bit version, just this time we need to worry
- * about endian related issues.
- */
-function wgint16(val, endian, buffer, offset)
-{
-	if (endian == 'big') {
-		buffer[offset] = (val & 0xff00) >>> 8;
-		buffer[offset+1] = val & 0x00ff;
-	} else {
-		buffer[offset+1] = (val & 0xff00) >>> 8;
-		buffer[offset] = val & 0x00ff;
-	}
-}
-
-function wuint16(value, endian, buffer, offset)
-{
-	var val;
-
-	if (value === undefined)
-		throw (new Error('missing value'));
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 1 >= buffer.length)
-		throw (new Error('Trying to write beyond buffer length'));
-
-	val = prepuint(value, 0xffff);
-	wgint16(val, endian, buffer, offset);
-}
-
-/*
- * The 32-bit version is going to have to be a little different unfortunately.
- * We can't quite bitshift to get the largest byte, because that would end up
- * getting us caught by the signed values.
- *
- * And yes, we do want to subtract out the lower part by default. This means
- * that when we do the division, it will be treated as a bit shift and we won't
- * end up generating a floating point value. If we did generate a floating point
- * value we'd have to truncate it intelligently, this saves us that problem and
- * may even be somewhat faster under the hood.
- */
-function wgint32(val, endian, buffer, offset)
-{
-	if (endian == 'big') {
-		buffer[offset] = (val - (val & 0x00ffffff)) / Math.pow(2, 24);
-		buffer[offset+1] = (val >>> 16) & 0xff;
-		buffer[offset+2] = (val >>> 8) & 0xff;
-		buffer[offset+3] = val & 0xff;
-	} else {
-		buffer[offset+3] = (val - (val & 0x00ffffff)) /
-		    Math.pow(2, 24);
-		buffer[offset+2] = (val >>> 16) & 0xff;
-		buffer[offset+1] = (val >>> 8) & 0xff;
-		buffer[offset] = val & 0xff;
-	}
-}
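The subtract-then-divide step explained above stays integral because the numerator is an exact multiple of 2^24:

```javascript
var val = 0xdeadbeef;
// Clear the low 24 bits, then divide: yields the top byte without
// tripping over signed 32-bit shift semantics.
var hi = (val - (val & 0x00ffffff)) / Math.pow(2, 24);  // 0xde
```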
-
-function wuint32(value, endian, buffer, offset)
-{
-	var val;
-
-	if (value === undefined)
-		throw (new Error('missing value'));
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 3 >= buffer.length)
-		throw (new Error('Trying to write beyond buffer length'));
-
-	val = prepuint(value, 0xffffffff);
-	wgint32(val, endian, buffer, offset);
-}
-
-/*
- * Unlike the other versions, we expect the value to be in the form of two
- * arrays where value[0] << 32 + value[1] would result in the value that we
- * want.
- */
-function wgint64(value, endian, buffer, offset)
-{
-	if (endian == 'big') {
-		wgint32(value[0], endian, buffer, offset);
-		wgint32(value[1], endian, buffer, offset+4);
-	} else {
-		wgint32(value[0], endian, buffer, offset+4);
-		wgint32(value[1], endian, buffer, offset);
-	}
-}
-
-function wuint64(value, endian, buffer, offset)
-{
-	if (value === undefined)
-		throw (new Error('missing value'));
-
-	if (!(value instanceof Array))
-		throw (new Error('value must be an array'));
-
-	if (value.length != 2)
-		throw (new Error('value must be an array of length 2'));
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 7 >= buffer.length)
-		throw (new Error('Trying to write beyond buffer length'));
-
-	prepuint(value[0], 0xffffffff);
-	prepuint(value[1], 0xffffffff);
-	wgint64(value, endian, buffer, offset);
-}
-
-/*
- * We now move onto our friends in the signed number category. Unlike unsigned
- * numbers, we're going to have to worry a bit more about how we put values into
- * arrays. Since we are only worrying about signed 32-bit values, we're in
- * slightly better shape. Unfortunately, we really can't do our favorite binary
- * & in this system. It really seems to do the wrong thing. For example:
- *
- * > -32 & 0xff
- * 224
- *
- * What's happening above is really: 0xe0 & 0xff = 0xe0. However, the results of
- * this aren't treated as a signed number. Ultimately a bad thing.
- *
- * What we're going to want to do is basically create the unsigned equivalent of
- * our representation and pass that off to the wuint* functions. To do that
- * we're going to do the following:
- *
- *  - if the value is positive
- *	we can pass it directly off to the equivalent wuint
- *  - if the value is negative
- *	we do the following computation:
- *	mb + val + 1, where
- *	mb	is the maximum unsigned value in that byte size
- *	val	is the Javascript negative integer
- *
- * As a concrete value, take -128. In signed 16 bits this would be 0xff80. If
- * you do out the computations:
- *
- * 0xffff - 128 + 1
- * 0xffff - 127
- * 0xff80
- *
- * You can then encode this value as the signed version. This is really rather
- * hacky, but it should work and get the job done which is our goal here.
- *
- * Thus the overall flow is:
- *   -  Truncate the floating point part of the number
- *   -  We don't have to take the modulus, because the unsigned versions will
- *   	take care of that for us. And we don't have to worry about that
- *   	potentially causing bad things to happen because of sign extension
- *   -  Pass it off to the appropriate unsigned version, potentially modifying
- *	the negative portions as necessary.
- */
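The mb + val + 1 computation above, checked for its own -128 example:

```javascript
// Maximum unsigned 16-bit value, plus the negative input, plus one:
var encoded = 0xffff + (-128) + 1;  // 0xff80, ready for the unsigned writer
```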
-
-/*
- * A series of checks to make sure we actually have a signed 32-bit number
- */
-function prepsint(value, max, min)
-{
-	if (typeof (value) != 'number')
-		throw (new Error('cannot write a non-number as a number'));
-
-	if (value > max)
-		throw (new Error('value larger than maximum allowed value'));
-
-	if (value < min)
-		throw (new Error('value smaller than minimum allowed value'));
-
-	if (Math.floor(value) !== value)
-		throw (new Error('value has a fractional component'));
-
-	return (value);
-}
-
-/*
- * The 8-bit version of the signed value. Overall, fairly straightforward.
- */
-function wsint8(value, endian, buffer, offset)
-{
-	var val;
-
-	if (value === undefined)
-		throw (new Error('missing value'));
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset >= buffer.length)
-		throw (new Error('Trying to write beyond buffer length'));
-
-	val = prepsint(value, 0x7f, -0x80);
-	if (val >= 0)
-		wuint8(val, endian, buffer, offset);
-	else
-		wuint8(0xff + val + 1, endian, buffer, offset);
-}
-
-/*
- * The 16-bit version of the signed value. Also, fairly straightforward.
- */
-function wsint16(value, endian, buffer, offset)
-{
-	var val;
-
-	if (value === undefined)
-		throw (new Error('missing value'));
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 1 >= buffer.length)
-		throw (new Error('Trying to write beyond buffer length'));
-
-	val = prepsint(value, 0x7fff, -0x8000);
-	if (val >= 0)
-		wgint16(val, endian, buffer, offset);
-	else
-		wgint16(0xffff + val + 1, endian, buffer, offset);
-
-}
-
-/*
- * We can do this relatively easily by leveraging the code used for 32-bit
- * unsigned code.
- */
-function wsint32(value, endian, buffer, offset)
-{
-	var val;
-
-	if (value === undefined)
-		throw (new Error('missing value'));
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 3 >= buffer.length)
-		throw (new Error('Trying to write beyond buffer length'));
-
-	val = prepsint(value, 0x7fffffff, -0x80000000);
-	if (val >= 0)
-		wgint32(val, endian, buffer, offset);
-	else
-		wgint32(0xffffffff + val + 1, endian, buffer, offset);
-}
-
-/*
- * The signed 64-bit integer should be in the same format as when received.
- * Namely, the value must be an array of two integers where
- * value[0] << 32 + value[1] is the desired number. Furthermore, the two
- * values must have the same sign.
- */
-function wsint64(value, endian, buffer, offset)
-{
-	var vzpos, vopos;
-	var vals = new Array(2);
-
-	if (value === undefined)
-		throw (new Error('missing value'));
-
-	if (!(value instanceof Array))
-		throw (new Error('value must be an array'));
-
-	if (value.length != 2)
-		throw (new Error('value must be an array of length 2'));
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-	if (offset + 7 >= buffer.length)
-		throw (new Error('Trying to write beyond buffer length'));
-
-	/*
-	 * We need to make sure that we have the same sign on both values. The
-	 * hokiest way to do this is to multiply the number by +inf. If we do
-	 * this, we'll get either +/-inf depending on the sign of the value.
-	 * Once we have this, we can compare it to +inf to see if the number is
-	 * positive or not.
-	 */
-	vzpos = (value[0] * Number.POSITIVE_INFINITY) ==
-	    Number.POSITIVE_INFINITY;
-	vopos = (value[1] * Number.POSITIVE_INFINITY) ==
-	    Number.POSITIVE_INFINITY;
-
-	/*
-	 * If either of these is zero, then we don't actually need this check.
-	 */
-	if (value[0] != 0 && value[1] != 0 && vzpos != vopos)
-		throw (new Error('Both entries in the array must have ' +
-		    'the same sign'));
-
-	/*
-	 * Doing verification for a signed 64-bit integer is actually a bit
-	 * trickier than it appears. We can't quite use our standard techniques
-	 * because we need to compare both sets of values. The first value is
-	 * pretty straightforward. If the first value is beyond the extremes,
-	 * then we error out. However, the valid range of the second value
-	 * varies based on the first one. If the first value is negative, and
-	 * *not* the largest negative value, then it can be any integer within
-	 * the range [ 0, 0xffffffff ]. If it is the largest negative number,
-	 * it must be zero.
-	 *
-	 * If the first number is positive, then it doesn't matter what the
-	 * value is. We simply have to make sure we have a valid positive
-	 * integer.
-	 */
-	if (vzpos) {
-		prepuint(value[0], 0x7fffffff);
-		prepuint(value[1], 0xffffffff);
-	} else {
-		prepsint(value[0], 0, -0x80000000);
-		prepsint(value[1], 0, -0xffffffff);
-		if (value[0] == -0x80000000 && value[1] != 0)
-			throw (new Error('value smaller than minimum ' +
-			    'allowed value'));
-	}
-
-	/* Fix negative numbers */
-	if (value[0] < 0 || value[1] < 0) {
-		vals[0] = 0xffffffff - Math.abs(value[0]);
-		vals[1] = 0x100000000 - Math.abs(value[1]);
-		if (vals[1] == 0x100000000) {
-			vals[1] = 0;
-			vals[0]++;
-		}
-	} else {
-		vals[0] = value[0];
-		vals[1] = value[1];
-	}
-	wgint64(vals, endian, buffer, offset);
-}
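The negative fix-up block in wsint64 above can be exercised in isolation. `splitSigned64` below is a hypothetical standalone copy of that block (sign validation omitted), not part of the module:

```javascript
// Convert a signed [hi, lo] 64-bit pair into the unsigned two-word
// representation, mirroring the "Fix negative numbers" block above.
// Both halves are assumed to carry the same sign (or be zero).
function splitSigned64(hi, lo) {
	if (hi < 0 || lo < 0) {
		var outHi = 0xffffffff - Math.abs(hi);
		var outLo = 0x100000000 - Math.abs(lo);
		// Carry when the low word wraps all the way around.
		if (outLo === 0x100000000) {
			outLo = 0;
			outHi++;
		}
		return ([outHi, outLo]);
	}
	return ([hi, lo]);
}

// -1 is passed in as [0, -1] and comes out as the all-ones pattern:
console.log(splitSigned64(0, -1)); // [ 4294967295, 4294967295 ]
```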
-
-/*
- * Now we are moving onto the weirder of these, the float and double. For this
- * we're going to just have to do something that's pretty weird. First off, we
- * have no way to get at the underlying float representation, at least not
- * easily. But that doesn't mean we can't figure it out, we just have to use our
- * heads.
- *
- * One might propose to use Number.toString(2). Of course, this is not really
- * that good, because the ECMAScript 262 v3 Standard says the following Section
- * 15.7.4.2-Number.prototype.toString (radix):
- *
- * If radix is an integer from 2 to 36, but not 10, the result is a string, the
- * choice of which is implementation-dependent.
- *
- * Well that doesn't really help us one bit now does it? We could use the
- * standard base 10 version of the string, but that's just going to create more
- * errors as we end up trying to convert it back to a binary value. So, really
- * this just means we have to be non-lazy and parse the structure intelligently.
- *
- * First off, we can do the basic checks: NaN, positive and negative infinity.
- *
- * Now that those are done we can work backwards to generate the mantissa and
- * exponent.
- *
- * The first thing we need to do is determine the sign bit, easy to do, check
- * whether the value is less than 0. And convert the number to its absolute
- * value representation. Next, we need to determine if the value is less than
- * one or greater than or equal to one and from there determine what power was
- * used to get there. What follows is now specific to floats, though the general
- * ideas behind this will hold for doubles as well, but the exact numbers
- * involved will change.
- *
- * Once we have that power we can determine the exponent and the mantissa. Call
- * the value that has the number of bits to reach the power ebits. In the
- * general case they have the following values:
- *
- *	exponent	127 + ebits
- *	mantissa	value * 2^(23 - ebits) & 0x7fffff
- *
- * In the case where ebits is <= -127 we no longer have normalized
- * numbers. In this case the values take on the
- * following values:
- *
- * 	exponent	0
- *	mantissa	value * 2^149 & 0x7fffff
- *
- * Once we have the values for the sign, mantissa, and exponent. We reconstruct
- * the four bytes as follows:
- *
- *	byte0		sign bit and seven most significant bits from the exp
- *			sign << 7 | (exponent & 0xfe) >>> 1
- *
- *	byte1		lsb from the exponent and 7 top bits from the mantissa
- *			(exponent & 0x01) << 7 | (mantissa & 0x7f0000) >>> 16
- *
- *	byte2		bits 8-15 (zero indexing) from mantissa
- *			mantissa & 0xff00 >> 8
- *
- *	byte3		bits 0-7 from mantissa
- *			mantissa & 0xff
- *
- * Once we have this we have to assign them into the buffer in proper endian
- * order.
- */
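As a quick sanity check of the byte construction described above, here is a hypothetical standalone walk-through for the value 1.0 (sign 0, ebits 0, hence exponent 127 and mantissa 0); none of these variable names belong to the module:

```javascript
// Encode 1.0 as a big-endian IEEE 754 single, following the four
// byte formulas from the comment above.
var value = 1.0;
var sign = 0;
var ebits = Math.floor(Math.log(value) / Math.log(2)); // 0
var exponent = 127 + ebits;                             // 127
var mantissa = (value * Math.pow(2, 23 - ebits)) & 0x7fffff; // 0

var bytes = [
	sign << 7 | (exponent & 0xfe) >>> 1,
	(exponent & 0x01) << 7 | (mantissa & 0x7f0000) >>> 16,
	(mantissa & 0x00ff00) >>> 8,
	mantissa & 0x0000ff
];
console.log(bytes.map(function (b) { return b.toString(16); }));
// [ '3f', '80', '0', '0' ] -- the big-endian encoding of 1.0f
```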
-
-/*
- * Compute the log base 2 of the value. Now, someone who remembers basic
- * properties of logarithms will point out that we could use the change of base
- * formula for logs, and in fact that would be astute, because that's what we'll
- * do for now. It feels cleaner, albeit it may be less efficient than just
- * iterating and dividing by 2. We may want to come back and revisit that some
- * day.
- */
-function log2(value)
-{
-	return (Math.log(value) / Math.log(2));
-}
-
-/*
- * Helper to determine the exponent of the number we're looking at.
- */
-function intexp(value)
-{
-	return (Math.floor(log2(value)));
-}
-
-/*
- * Helper to determine the exponent of the fractional part of the value.
- */
-function fracexp(value)
-{
-	return (Math.floor(log2(value)));
-}
-
-function wfloat(value, endian, buffer, offset)
-{
-	var sign, exponent, mantissa, ebits;
-	var bytes = [];
-
-	if (value === undefined)
-		throw (new Error('missing value'));
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-
-	if (offset + 3 >= buffer.length)
-		throw (new Error('Trying to write beyond buffer length'));
-
-	if (isNaN(value)) {
-		sign = 0;
-		exponent = 0xff;
-		mantissa = 23;
-	} else if (value == Number.POSITIVE_INFINITY) {
-		sign = 0;
-		exponent = 0xff;
-		mantissa = 0;
-	} else if (value == Number.NEGATIVE_INFINITY) {
-		sign = 1;
-		exponent = 0xff;
-		mantissa = 0;
-	} else {
-		/* Well we have some work to do */
-
-		/* Thankfully the sign bit is trivial */
-		if (value < 0) {
-			sign = 1;
-			value = Math.abs(value);
-		} else {
-			sign = 0;
-		}
-
-		/* Use the correct function to determine number of bits */
-		if (value < 1)
-			ebits = fracexp(value);
-		else
-			ebits = intexp(value);
-
-		/* Time to deal with the issues surrounding normalization */
-		if (ebits <= -127) {
-			exponent = 0;
-			mantissa = (value * Math.pow(2, 149)) & 0x7fffff;
-		} else {
-			exponent = 127 + ebits;
-			mantissa = value * Math.pow(2, 23 - ebits);
-			mantissa &= 0x7fffff;
-		}
-	}
-
-	bytes[0] = sign << 7 | (exponent & 0xfe) >>> 1;
-	bytes[1] = (exponent & 0x01) << 7 | (mantissa & 0x7f0000) >>> 16;
-	bytes[2] = (mantissa & 0x00ff00) >>> 8;
-	bytes[3] = mantissa & 0x0000ff;
-
-	if (endian == 'big') {
-		buffer[offset] = bytes[0];
-		buffer[offset+1] = bytes[1];
-		buffer[offset+2] = bytes[2];
-		buffer[offset+3] = bytes[3];
-	} else {
-		buffer[offset] = bytes[3];
-		buffer[offset+1] = bytes[2];
-		buffer[offset+2] = bytes[1];
-		buffer[offset+3] = bytes[0];
-	}
-}
-
-/*
- * Now we move onto doubles. Doubles are similar to floats in pretty much all
- * ways except that the processing isn't quite as straightforward because we
- * can't always use shifting, i.e. we have > 32 bit values.
- *
- * We're going to proceed in an identical fashion to floats and utilize the same
- * helper functions. All that really is changing are the specific values that we
- * use to do the calculations. Thus, to review we have to do the following.
- *
- * First get the sign bit and convert the value to its absolute value
- * representation. Next, we determine the number of bits that we used to get to
- * the value, branching whether the value is greater than or less than 1. Once
- * we have that value which we will again call ebits, we have to do the
- * following in the general case:
- *
- *	exponent	1023 + ebits
- *	mantissa	[value * 2^(52 - ebits)] % 2^52
- *
- * In the case where the value of ebits <= -1023 we no longer use normalized
- * numbers, thus like with floats we have to do slightly different processing:
- *
- *	exponent	0
- *	mantissa	[value * 2^1074] % 2^52
- *
- * Once we have determined the sign, exponent and mantissa we can construct the
- * bytes as follows:
- *
- *	byte0		sign bit and seven most significant bits from the exp
- *			sign << 7 | (exponent & 0x7f0) >>> 4
- *
- *	byte1		Remaining 4 bits from the exponent and the four most
- *			significant bits from the mantissa 48-51
- *			(exponent & 0x00f) << 4 | mantissa >>> 48
- *
- *	byte2		Bits 40-47 from the mantissa
- *			(mantissa >>> 40) & 0xff
- *
- *	byte3		Bits 32-39 from the mantissa
- *			(mantissa >>> 32) & 0xff
- *
- *	byte4		Bits 24-31 from the mantissa
- *			(mantissa >>> 24) & 0xff
- *
- *	byte5		Bits 16-23 from the Mantissa
- *			(mantissa >>> 16) & 0xff
- *
- *	byte6		Bits 8-15 from the mantissa
- *			(mantissa >>> 8) & 0xff
- *
- *	byte7		Bits 0-7 from the mantissa
- *			mantissa & 0xff
- *
- * Now we can't quite do the right shifting that we want in bytes 1 - 3, because
- * we'll have extended too far and we'll lose those values when we try and do
- * the shift. Instead we have to use an alternate approach. To try and stay out
- * of floating point, what we'll do is say that mantissa -= bytes[4-7] and then
- * divide by 2^32. Once we've done that we can use binary arithmetic. Oof,
- * that's ugly, but it seems to avoid using floating point (just based on how v8
- * seems to be optimizing for base 2 arithmetic).
- */
-function wdouble(value, endian, buffer, offset)
-{
-	var sign, exponent, mantissa, ebits;
-	var bytes = [];
-
-	if (value === undefined)
-		throw (new Error('missing value'));
-
-	if (endian === undefined)
-		throw (new Error('missing endian'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset'));
-
-
-	if (offset + 7 >= buffer.length)
-		throw (new Error('Trying to write beyond buffer length'));
-
-	if (isNaN(value)) {
-		sign = 0;
-		exponent = 0x7ff;
-		mantissa = 23;
-	} else if (value == Number.POSITIVE_INFINITY) {
-		sign = 0;
-		exponent = 0x7ff;
-		mantissa = 0;
-	} else if (value == Number.NEGATIVE_INFINITY) {
-		sign = 1;
-		exponent = 0x7ff;
-		mantissa = 0;
-	} else {
-		/* Well we have some work to do */
-
-		/* Thankfully the sign bit is trivial */
-		if (value < 0) {
-			sign = 1;
-			value = Math.abs(value);
-		} else {
-			sign = 0;
-		}
-
-		/* Use the correct function to determine number of bits */
-		if (value < 1)
-			ebits = fracexp(value);
-		else
-			ebits = intexp(value);
-
-		/*
-		 * This is a total hack to determine a denormalized value.
-		 * Unfortunately, we sometimes do not get a proper value for
-		 * ebits, i.e. we lose the values that would get rounded off.
-		 *
-		 *
-		 * The astute observer may wonder why we would be
-		 * multiplying by two Math.pows rather than just summing
-		 * them. Well, that's to get around a small bug in the
-		 * way v8 seems to implement the function. On occasion
-		 * doing:
-		 *
-		 * foo * Math.pow(2, 1023 + 51)
-		 *
-		 * Causes us to overflow to infinity, where as doing:
-		 *
-		 * foo * Math.pow(2, 1023) * Math.pow(2, 51)
-		 *
-		 * Does not cause us to overflow. Go figure.
-		 *
-		 */
-		if (value <= 2.225073858507201e-308 || ebits <= -1023) {
-			exponent = 0;
-			mantissa = value * Math.pow(2, 1023) * Math.pow(2, 51);
-			mantissa %= Math.pow(2, 52);
-		} else {
-			/*
-			 * We might have gotten fucked by our floating point
-			 * logarithm magic. This is rather crappy, but that's
-			 * our luck. If we just had a log base 2 or access to
-			 * the stupid underlying representation this would have
-			 * been much easier and we wouldn't have such stupid
-			 * kludges or hacks.
-			 */
-			if (ebits > 1023)
-				ebits = 1023;
-			exponent = 1023 + ebits;
-			mantissa = value * Math.pow(2, -ebits);
-			mantissa *= Math.pow(2, 52);
-			mantissa %= Math.pow(2, 52);
-		}
-	}
-
-	/* Fill the bytes in backwards to deal with the size issues */
-	bytes[7] = mantissa & 0xff;
-	bytes[6] = (mantissa >>> 8) & 0xff;
-	bytes[5] = (mantissa >>> 16) & 0xff;
-	mantissa = (mantissa - (mantissa & 0xffffff)) / Math.pow(2, 24);
-	bytes[4] = mantissa & 0xff;
-	bytes[3] = (mantissa >>> 8) & 0xff;
-	bytes[2] = (mantissa >>> 16) & 0xff;
-	bytes[1] = (exponent & 0x00f) << 4 | mantissa >>> 24;
-	bytes[0] = (sign << 7) | (exponent & 0x7f0) >>> 4;
-
-	if (endian == 'big') {
-		buffer[offset] = bytes[0];
-		buffer[offset+1] = bytes[1];
-		buffer[offset+2] = bytes[2];
-		buffer[offset+3] = bytes[3];
-		buffer[offset+4] = bytes[4];
-		buffer[offset+5] = bytes[5];
-		buffer[offset+6] = bytes[6];
-		buffer[offset+7] = bytes[7];
-	} else {
-		buffer[offset+7] = bytes[0];
-		buffer[offset+6] = bytes[1];
-		buffer[offset+5] = bytes[2];
-		buffer[offset+4] = bytes[3];
-		buffer[offset+3] = bytes[4];
-		buffer[offset+2] = bytes[5];
-		buffer[offset+1] = bytes[6];
-		buffer[offset] = bytes[7];
-	}
-}
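The subtract-and-divide trick described above (needed because the 52-bit mantissa exceeds the 32-bit range of JS bitwise operators) can be checked in isolation. This hypothetical standalone snippet encodes the mantissa of 1.5, which is 2^51:

```javascript
// Encode 1.5 as a big-endian IEEE 754 double: sign 0, ebits 0, so
// exponent 1023 and mantissa 1.5 * 2^52 % 2^52 === 2^51.
var sign = 0;
var exponent = 1023;
var mantissa = 1.5 * Math.pow(2, 52) % Math.pow(2, 52); // 2^51

var bytes = [];
// Low three bytes first, while the value still fits the math.
bytes[7] = mantissa & 0xff;
bytes[6] = (mantissa >>> 8) & 0xff;
bytes[5] = (mantissa >>> 16) & 0xff;
// Drop the consumed low 24 bits and divide, instead of shifting,
// so we never overflow the 32-bit bitwise range.
mantissa = (mantissa - (mantissa & 0xffffff)) / Math.pow(2, 24);
bytes[4] = mantissa & 0xff;
bytes[3] = (mantissa >>> 8) & 0xff;
bytes[2] = (mantissa >>> 16) & 0xff;
bytes[1] = (exponent & 0x00f) << 4 | mantissa >>> 24;
bytes[0] = (sign << 7) | (exponent & 0x7f0) >>> 4;
console.log(bytes[0].toString(16), bytes[1].toString(16)); // 3f f8
```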
-
-/*
- * Actually export our work above. One might argue that we shouldn't expose
- * these interfaces and just force people to use the higher level abstractions
- * around this work. However, unlike say other libraries we've come across, this
- * interface has several properties: it makes sense, it's simple, and it's
- * useful.
- */
-exports.ruint8 = ruint8;
-exports.ruint16 = ruint16;
-exports.ruint32 = ruint32;
-exports.ruint64 = ruint64;
-exports.wuint8 = wuint8;
-exports.wuint16 = wuint16;
-exports.wuint32 = wuint32;
-exports.wuint64 = wuint64;
-
-exports.rsint8 = rsint8;
-exports.rsint16 = rsint16;
-exports.rsint32 = rsint32;
-exports.rsint64 = rsint64;
-exports.wsint8 = wsint8;
-exports.wsint16 = wsint16;
-exports.wsint32 = wsint32;
-exports.wsint64 = wsint64;
-
-exports.rfloat = rfloat;
-exports.rdouble = rdouble;
-exports.wfloat = wfloat;
-exports.wdouble = wdouble;
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/ctype.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,944 +0,0 @@
-/*
- * rm - Feb 2011
- * ctype.js
- *
- * This module provides a simple abstraction towards reading and writing
- * different types of binary data. It is designed to use ctio.js and provide a
- * richer and more expressive API on top of it.
- *
- * By default we support the following as built in basic types:
- *	int8_t
- *	int16_t
- *	int32_t
- *	uint8_t
- *	uint16_t
- *	uint32_t
- *	uint64_t
- *	float
- *	double
- *	char
- *	char[]
- *
- * Each type is returned as a Number, with the exception of char and char[]
- * which are returned as Node Buffers. A char is considered a uint8_t.
- *
- * Requests to read and write data are specified as an array of JSON objects.
- * This is also the same way that one declares structs. Even if just a single
- * value is requested, it must be done as a struct. The array order determines
- * the order that we try and read values. Each entry has the following format
- * with values marked with a * being optional.
- *
- * { key: { type: /type/, value*: /value/, offset*: /offset/ } }
- *
- * If offset is defined, we lseek(offset, SEEK_SET) before reading the next
- * value. Value is defined when we're writing out data, otherwise it's ignored.
- *
- */
-
-var mod_ctf = require('./ctf.js');
-var mod_ctio = require('./ctio.js');
-var mod_assert = require('assert');
-
-/*
- * This is the set of basic types that we support.
- *
- *	read		The function to call to read in a value from a buffer
- *
- *	write		The function to call to write a value to a buffer
- *
- */
-var deftypes = {
-    'uint8_t':  { read: ctReadUint8, write: ctWriteUint8 },
-    'uint16_t': { read: ctReadUint16, write: ctWriteUint16 },
-    'uint32_t': { read: ctReadUint32, write: ctWriteUint32 },
-    'uint64_t': { read: ctReadUint64, write: ctWriteUint64 },
-    'int8_t': { read: ctReadSint8, write: ctWriteSint8 },
-    'int16_t': { read: ctReadSint16, write: ctWriteSint16 },
-    'int32_t': { read: ctReadSint32, write: ctWriteSint32 },
-    'int64_t': { read: ctReadSint64, write: ctWriteSint64 },
-    'float': { read: ctReadFloat, write: ctWriteFloat },
-    'double': { read: ctReadDouble, write: ctWriteDouble },
-    'char': { read: ctReadChar, write: ctWriteChar },
-    'char[]': { read: ctReadCharArray, write: ctWriteCharArray }
-};
-
-/*
- * The following are wrappers around the CType IO low level API. They encode
- * knowledge about the size and return something in the expected format.
- */
-function ctReadUint8(endian, buffer, offset)
-{
-	var val = mod_ctio.ruint8(buffer, endian, offset);
-	return ({ value: val, size: 1 });
-}
-
-function ctReadUint16(endian, buffer, offset)
-{
-	var val = mod_ctio.ruint16(buffer, endian, offset);
-	return ({ value: val, size: 2 });
-}
-
-function ctReadUint32(endian, buffer, offset)
-{
-	var val = mod_ctio.ruint32(buffer, endian, offset);
-	return ({ value: val, size: 4 });
-}
-
-function ctReadUint64(endian, buffer, offset)
-{
-	var val = mod_ctio.ruint64(buffer, endian, offset);
-	return ({ value: val, size: 8 });
-}
-
-function ctReadSint8(endian, buffer, offset)
-{
-	var val = mod_ctio.rsint8(buffer, endian, offset);
-	return ({ value: val, size: 1 });
-}
-
-function ctReadSint16(endian, buffer, offset)
-{
-	var val = mod_ctio.rsint16(buffer, endian, offset);
-	return ({ value: val, size: 2 });
-}
-
-function ctReadSint32(endian, buffer, offset)
-{
-	var val = mod_ctio.rsint32(buffer, endian, offset);
-	return ({ value: val, size: 4 });
-}
-
-function ctReadSint64(endian, buffer, offset)
-{
-	var val = mod_ctio.rsint64(buffer, endian, offset);
-	return ({ value: val, size: 8 });
-}
-
-function ctReadFloat(endian, buffer, offset)
-{
-	var val = mod_ctio.rfloat(buffer, endian, offset);
-	return ({ value: val, size: 4 });
-}
-
-function ctReadDouble(endian, buffer, offset)
-{
-	var val = mod_ctio.rdouble(buffer, endian, offset);
-	return ({ value: val, size: 8 });
-}
-
-/*
- * Reads a single character into a node buffer
- */
-function ctReadChar(endian, buffer, offset)
-{
-	var res = new Buffer(1);
-	res[0] = mod_ctio.ruint8(buffer, endian, offset);
-	return ({ value: res, size: 1 });
-}
-
-function ctReadCharArray(length, endian, buffer, offset)
-{
-	var ii;
-	var res = new Buffer(length);
-
-	for (ii = 0; ii < length; ii++)
-		res[ii] = mod_ctio.ruint8(buffer, endian, offset + ii);
-
-	return ({ value: res, size: length });
-}
-
-function ctWriteUint8(value, endian, buffer, offset)
-{
-	mod_ctio.wuint8(value, endian, buffer, offset);
-	return (1);
-}
-
-function ctWriteUint16(value, endian, buffer, offset)
-{
-	mod_ctio.wuint16(value, endian, buffer, offset);
-	return (2);
-}
-
-function ctWriteUint32(value, endian, buffer, offset)
-{
-	mod_ctio.wuint32(value, endian, buffer, offset);
-	return (4);
-}
-
-function ctWriteUint64(value, endian, buffer, offset)
-{
-	mod_ctio.wuint64(value, endian, buffer, offset);
-	return (8);
-}
-
-function ctWriteSint8(value, endian, buffer, offset)
-{
-	mod_ctio.wsint8(value, endian, buffer, offset);
-	return (1);
-}
-
-function ctWriteSint16(value, endian, buffer, offset)
-{
-	mod_ctio.wsint16(value, endian, buffer, offset);
-	return (2);
-}
-
-function ctWriteSint32(value, endian, buffer, offset)
-{
-	mod_ctio.wsint32(value, endian, buffer, offset);
-	return (4);
-}
-
-function ctWriteSint64(value, endian, buffer, offset)
-{
-	mod_ctio.wsint64(value, endian, buffer, offset);
-	return (8);
-}
-
-function ctWriteFloat(value, endian, buffer, offset)
-{
-	mod_ctio.wfloat(value, endian, buffer, offset);
-	return (4);
-}
-
-function ctWriteDouble(value, endian, buffer, offset)
-{
-	mod_ctio.wdouble(value, endian, buffer, offset);
-	return (8);
-}
-
-/*
- * Writes a single character into a node buffer
- */
-function ctWriteChar(value, endian, buffer, offset)
-{
-	if (!(value instanceof Buffer))
-		throw (new Error('Input must be a buffer'));
-
-	mod_ctio.wuint8(value[0], endian, buffer, offset);
-	return (1);
-}
-
-/*
- * We're going to write 0s into the buffer if the string is shorter than the
- * length of the array.
- */
-function ctWriteCharArray(value, length, endian, buffer, offset)
-{
-	var ii;
-
-	if (!(value instanceof Buffer))
-		throw (new Error('Input must be a buffer'));
-
-	if (value.length > length)
-		throw (new Error('value length greater than array length'));
-
-	for (ii = 0; ii < value.length && ii < length; ii++)
-		mod_ctio.wuint8(value[ii], endian, buffer, offset + ii);
-
-	for (; ii < length; ii++)
-		mod_ctio.wuint8(0, endian, buffer, offset + ii);
-
-
-	return (length);
-}
-
-/*
- * Each parser has their own set of types. We want to make sure that they each
- * get their own copy as they may need to modify it.
- */
-function ctGetBasicTypes()
-{
-	var ret = {};
-	var key;
-	for (key in deftypes)
-		ret[key] = deftypes[key];
-
-	return (ret);
-}
-
-/*
- * Given a string in the form of type[length] we want to split this into an
- * object that extracts that information. We want to note that we could possibly
- * have nested arrays so this should only check the furthest one. It may also be
- * the case that we have no [] pieces, in which case we just return the current
- * type.
- */
-function ctParseType(str)
-{
-	var begInd, endInd;
-	var type, len;
-	if (typeof (str) != 'string')
-		throw (new Error('type must be a Javascript string'));
-
-	endInd = str.lastIndexOf(']');
-	if (endInd == -1) {
-		if (str.lastIndexOf('[') != -1)
-			throw (new Error('found invalid type with \'[\' but ' +
-			    'no corresponding \']\''));
-
-		return ({ type: str });
-	}
-
-	begInd = str.lastIndexOf('[');
-	if (begInd == -1)
-		throw (new Error('found invalid type with \']\' but ' +
-		    'no corresponding \'[\''));
-
-	if (begInd >= endInd)
-		throw (new Error('malformed type, \']\' appears before \'[\''));
-
-	type = str.substring(0, begInd);
-	len = str.substring(begInd + 1, endInd);
-
-	return ({ type: type, len: len });
-}
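The substring logic above can be sketched standalone; `parseTypeString` is a hypothetical name for an error-checking-free copy of ctParseType:

```javascript
// Split 'type[len]' into its type and outermost length, using
// lastIndexOf so that nested arrays only peel off the final pair.
function parseTypeString(str) {
	var endInd = str.lastIndexOf(']');
	if (endInd == -1)
		return ({ type: str });
	var begInd = str.lastIndexOf('[');
	return ({ type: str.substring(0, begInd),
	    len: str.substring(begInd + 1, endInd) });
}

console.log(parseTypeString('uint8_t[4]'));      // { type: 'uint8_t', len: '4' }
console.log(parseTypeString('char[4][nelems]')); // { type: 'char[4]', len: 'nelems' }
```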
-
-/*
- * Given a request validate that all of the fields for it are valid and make
- * sense. This includes verifying the following notions:
- *  - Each type requested is present in types
- *  - Only allow a name for a field to be specified once
- *  - If an array is specified, validate that the requested field exists and
- *    comes before it.
- *  - If fields is defined, check that each entry has the occurrence of field
- */
-function ctCheckReq(def, types, fields)
-{
-	var ii, jj;
-	var req, keys, key;
-	var found = {};
-
-	if (!(def instanceof Array))
-		throw (new Error('definition is not an array'));
-
-	if (def.length === 0)
-		throw (new Error('definition must have at least one element'));
-
-	for (ii = 0; ii < def.length; ii++) {
-		req = def[ii];
-		if (!(req instanceof Object))
-			throw (new Error('definition must be an array ' +
-			    'of objects'));
-
-		keys = Object.keys(req);
-		if (keys.length != 1)
-			throw (new Error('definition entry must only have ' +
-			    'one key'));
-
-		if (keys[0] in found)
-			throw (new Error('Specified name already ' +
-			    'specified: ' + keys[0]));
-
-		if (!('type' in req[keys[0]]))
-			throw (new Error('missing required type definition'));
-
-		key = ctParseType(req[keys[0]]['type']);
-
-		/*
-		 * We may have nested arrays, we need to check the validity of
-		 * the types until the len field is undefined in key. However,
-		 * each time len is defined we need to verify it is either an
-		 * integer or corresponds to an already seen key.
-		 */
-		while (key['len'] !== undefined) {
-			if (isNaN(parseInt(key['len'], 10))) {
-				if (!(key['len'] in found))
-					throw (new Error('Given an array ' +
-					    'length without a matching type'));
-
-			}
-
-			key = ctParseType(key['type']);
-		}
-
-		/* Now we can validate if the type is valid */
-		if (!(key['type'] in types))
-			throw (new Error('type not found or typedefed: ' +
-			    key['type']));
-
-		/* Check for any required fields */
-		if (fields !== undefined) {
-			for (jj = 0; jj < fields.length; jj++) {
-				if (!(fields[jj] in req[keys[0]]))
-					throw (new Error('Missing required ' +
-					    'field: ' + fields[jj]));
-			}
-		}
-
-		found[keys[0]] = true;
-	}
-}
-
-
-/*
- * Create a new instance of the parser. Each parser has its own store of
- * typedefs and endianness. Conf is an object with the following required
- * values:
- *
- *	endian		Either 'big' or 'little' to determine the endianness we
- *			want to read from or write to.
- *
- * And the following optional values:
- *
- * 	char-type	Valid options here are uint8 and int8. If uint8 is
- * 			specified this changes the default behavior of a single
- * 			char from being a buffer of a single character to being
- * 			a uint8_t. If int8, it becomes an int8_t instead.
- */
-function CTypeParser(conf)
-{
-	if (!conf) throw (new Error('missing required argument'));
-
-	if (!('endian' in conf))
-		throw (new Error('missing required endian value'));
-
-	if (conf['endian'] != 'big' && conf['endian'] != 'little')
-		throw (new Error('Invalid endian type'));
-
-	if ('char-type' in conf && (conf['char-type'] != 'uint8' &&
-	    conf['char-type'] != 'int8'))
-		throw (new Error('invalid option for char-type: ' +
-		    conf['char-type']));
-
-	this.endian = conf['endian'];
-	this.types = ctGetBasicTypes();
-
-	/*
-	 * There may be a more graceful way to do this, but this will have to
-	 * serve.
-	 */
-	if ('char-type' in conf && conf['char-type'] == 'uint8')
-		this.types['char'] = this.types['uint8_t'];
-
-	if ('char-type' in conf && conf['char-type'] == 'int8')
-		this.types['char'] = this.types['int8_t'];
-}
-
-/*
- * Sets the current endian value for the Parser. If the value is not valid,
- * throws an Error.
- *
- *	endian		Either 'big' or 'little' to determine the endianness we
- *			want to read from or write to.
- *
- */
-CTypeParser.prototype.setEndian = function (endian)
-{
-	if (endian != 'big' && endian != 'little')
-		throw (new Error('invalid endian type, must be big or ' +
-		    'little'));
-
-	this.endian = endian;
-};
-
-/*
- * Returns the current value of the endian value for the parser.
- */
-CTypeParser.prototype.getEndian = function ()
-{
-	return (this.endian);
-};
-
-/*
- * A user has requested to add a type, let us honor their request. Yet, if their
- * request doth spurn us, send them unto the Hells which Dante describes.
- *
- * 	name		The string for the type definition we're adding
- *
- *	value		Either a string that is a type/array name or an object
- *			that describes a struct.
- */
-CTypeParser.prototype.typedef = function (name, value)
-{
-	var type;
-
-	if (name === undefined)
-		throw (new Error('missing required typedef argument: name'));
-
-	if (value === undefined)
-		throw (new Error('missing required typedef argument: value'));
-
-	if (typeof (name) != 'string')
-		throw (new Error('the name of a type must be a string'));
-
-	type = ctParseType(name);
-
-	if (type['len'] !== undefined)
-		throw (new Error('Cannot have an array in the typedef name'));
-
-	if (name in this.types)
-		throw (new Error('typedef name already present: ' + name));
-
-	if (typeof (value) != 'string' && !(value instanceof Array))
-		throw (new Error('typedef value must either be a string or ' +
-		    'struct'));
-
-	if (typeof (value) == 'string') {
-		type = ctParseType(value);
-		if (type['len'] !== undefined) {
-			if (isNaN(parseInt(type['len'], 10)))
-				throw (new Error('typedef value must use ' +
-				    'fixed size array when outside of a ' +
-				    'struct'));
-		}
-
-		this.types[name] = value;
-	} else {
-		/* We have a struct, validate it */
-		ctCheckReq(value, this.types);
-		this.types[name] = value;
-	}
-};
-
-/*
- * Include all of the typedefs, but none of the built in types. This should be
- * treated as read-only.
- */
-CTypeParser.prototype.lstypes = function ()
-{
-	var key;
-	var ret = {};
-
-	for (key in this.types) {
-		if (key in deftypes)
-			continue;
-		ret[key] = this.types[key];
-	}
-
-	return (ret);
-};
-
-/*
- * Given a type string that may have array types that aren't numbers, try and
- * fill them in from the values object. Indexing the object by the array
- * name should return a number for that length.
- *
- *	str		The type string
- *
- *	values		An object that can be used to fulfill type information
- */
-function ctResolveArray(str, values)
-{
-	var ret = '';
-	var type = ctParseType(str);
-
-	while (type['len'] !== undefined) {
-		if (isNaN(parseInt(type['len'], 10))) {
-			if (typeof (values[type['len']]) != 'number')
-				throw (new Error('cannot swap in non-number ' +
-				    'for array value'));
-			ret = '[' + values[type['len']] + ']' + ret;
-		} else {
-			ret = '[' + type['len'] + ']' + ret;
-		}
-		type = ctParseType(type['type']);
-	}
-
-	ret = type['type'] + ret;
-
-	return (ret);
-}
-
-/*
- * [private] Either the typedef resolves to another type string or to a struct.
- * If it resolves to a struct, we just pass it off to read struct. If not, we
- * can just pass it off to read entry.
- */
-CTypeParser.prototype.resolveTypedef = function (type, dispatch, buffer,
-    offset, value)
-{
-	var pt;
-
-	mod_assert.ok(type in this.types);
-	if (typeof (this.types[type]) == 'string') {
-		pt = ctParseType(this.types[type]);
-		if (dispatch == 'read')
-			return (this.readEntry(pt, buffer, offset));
-		else if (dispatch == 'write')
-			return (this.writeEntry(value, pt, buffer, offset));
-		else
-			throw (new Error('invalid dispatch type to ' +
-			    'resolveTypedef'));
-	} else {
-		if (dispatch == 'read')
-			return (this.readStruct(this.types[type], buffer,
-			    offset));
-		else if (dispatch == 'write')
-			return (this.writeStruct(value, this.types[type],
-			    buffer, offset));
-		else
-			throw (new Error('invalid dispatch type to ' +
-			    'resolveTypedef'));
-	}
-
-};
-
-/*
- * [private] Try and read in the specific entry.
- */
-CTypeParser.prototype.readEntry = function (type, buffer, offset)
-{
-	var parse, len;
-
-	/*
-	 * Because we want to special case char[]s this is unfortunately
-	 * a bit uglier than it really should be. We want to special
-	 * case char[]s so that we return a node buffer, thus they are a
-	 * first-class type, whereas all other arrays just call into a
-	 * generic array routine which calls their data-specific routine
-	 * the specified number of times.
-	 *
-	 * The valid dispatch options we have are:
-	 *  - Array and char => char[] handler
-	 *  - Generic array handler
-	 *  - Generic typedef handler
-	 *  - Basic type handler
-	 */
-	if (type['len'] !== undefined) {
-		len = parseInt(type['len'], 10);
-		if (isNaN(len))
-			throw (new Error('somehow got a non-numeric length'));
-
-		if (type['type'] == 'char')
-			parse = this.types['char[]']['read'](len,
-			    this.endian, buffer, offset);
-		else
-			parse = this.readArray(type['type'],
-			    len, buffer, offset);
-	} else {
-		if (type['type'] in deftypes)
-			parse = this.types[type['type']]['read'](this.endian,
-			    buffer, offset);
-		else
-			parse = this.resolveTypedef(type['type'], 'read',
-			    buffer, offset);
-	}
-
-	return (parse);
-};
-
-/*
- * [private] Read an array of data
- */
-CTypeParser.prototype.readArray = function (type, length, buffer, offset)
-{
-	var ii, ent, pt;
-	var baseOffset = offset;
-	var ret = new Array(length);
-	pt = ctParseType(type);
-
-	for (ii = 0; ii < length; ii++) {
-		ent = this.readEntry(pt, buffer, offset);
-		offset += ent['size'];
-		ret[ii] = ent['value'];
-	}
-
-	return ({ value: ret, size: offset - baseOffset });
-};
-
-/*
- * [private] Read a single struct in.
- */
-CTypeParser.prototype.readStruct = function (def, buffer, offset)
-{
-	var parse, ii, type, entry, key;
-	var baseOffset = offset;
-	var ret = {};
-
-	/* Walk it and handle doing what's necessary */
-	for (ii = 0; ii < def.length; ii++) {
-		key = Object.keys(def[ii])[0];
-		entry = def[ii][key];
-
-		/* Resolve all array values */
-		type = ctParseType(ctResolveArray(entry['type'], ret));
-
-		if ('offset' in entry)
-			offset = baseOffset + entry['offset'];
-
-		parse = this.readEntry(type, buffer, offset);
-
-		offset += parse['size'];
-		ret[key] = parse['value'];
-	}
-
-	return ({ value: ret, size: offset - baseOffset });
-};
-
-/*
- * This is what we were born to do. We read the data from a buffer and return it
- * in an object whose keys match the entries in the definition.
- *
- *	def		The array definition of the data to read in
- *
- *	buffer		The buffer to read data from
- *
- *	offset		The offset to start reading from
- *
- * Returns an object where each key corresponds to an entry in def and the value
- * is the read value.
- */
-CTypeParser.prototype.readData = function (def, buffer, offset)
-{
-	/* Sanity check for arguments */
-	if (def === undefined)
-		throw (new Error('missing definition for what we should be ' +
-		    'parsing'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer for what we should be ' +
-		    'parsing'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset for what we should be ' +
-		    'parsing'));
-
-	/* Sanity check the object definition */
-	ctCheckReq(def, this.types);
-
-	return (this.readStruct(def, buffer, offset)['value']);
-};
-
-/*
- * [private] Write out an array of data
- */
-CTypeParser.prototype.writeArray = function (value, type, length, buffer,
-    offset)
-{
-	var ii, pt;
-	var baseOffset = offset;
-	if (!(value instanceof Array))
-		throw (new Error('asked to write an array, but value is not ' +
-		    'an array'));
-
-	if (value.length != length)
-		throw (new Error('asked to write array of length ' + length +
-		    ' but that does not match value length: ' + value.length));
-
-	pt = ctParseType(type);
-	for (ii = 0; ii < length; ii++)
-		offset += this.writeEntry(value[ii], pt, buffer, offset);
-
-	return (offset - baseOffset);
-};
-
-/*
- * [private] Write the specific entry
- */
-CTypeParser.prototype.writeEntry = function (value, type, buffer, offset)
-{
-	var len, ret;
-
-	if (type['len'] !== undefined) {
-		len = parseInt(type['len'], 10);
-		if (isNaN(len))
-			throw (new Error('somehow got a non-numeric length'));
-
-		if (type['type'] == 'char')
-			ret = this.types['char[]']['write'](value, len,
-			    this.endian, buffer, offset);
-		else
-			ret = this.writeArray(value, type['type'],
-			    len, buffer, offset);
-	} else {
-		if (type['type'] in deftypes)
-			ret = this.types[type['type']]['write'](value,
-			    this.endian, buffer, offset);
-		else
-			ret = this.resolveTypedef(type['type'], 'write',
-			    buffer, offset, value);
-	}
-
-	return (ret);
-};
-
-/*
- * [private] Write a single struct out.
- */
-CTypeParser.prototype.writeStruct = function (value, def, buffer, offset)
-{
-	var ii, entry, type, key;
-	var baseOffset = offset;
-	var vals = {};
-
-	for (ii = 0; ii < def.length; ii++) {
-		key = Object.keys(def[ii])[0];
-		entry = def[ii][key];
-
-		type = ctParseType(ctResolveArray(entry['type'], vals));
-
-		if ('offset' in entry)
-			offset = baseOffset + entry['offset'];
-
-		offset += this.writeEntry(value[ii], type, buffer, offset);
-		/* Now that we've written it out, we can use it for arrays */
-		vals[key] = value[ii];
-	}
-
-	return (offset);
-};
-
-/*
- * Unfortunately, we're stuck with the sins of an initial poor design. Because
- * of that, we are going to have to support the old way of writing data via
- * writeData. There we insert the values that you want to write into the
- * definition. A little baroque. Internally, we use the new model. So we need to
- * just get those values out of there. But to maintain the principle of least
- * surprise, we're not going to modify the input data.
- */
-function getValues(def)
-{
-	var ii, out, key;
-	out = [];
-	for (ii = 0; ii < def.length; ii++) {
-		key = Object.keys(def[ii])[0];
-		mod_assert.ok('value' in def[ii][key]);
-		out.push(def[ii][key]['value']);
-	}
-
-	return (out);
-}
-
-/*
- * This is the second half of what we were born to do, write out the data
- * itself. Historically this function required you to put your values in the
- * definition section. This was not the smartest thing to do and a bit of an
- * oversight to be honest. As such, this function now takes a values argument.
- * If values is non-null and non-undefined, it will be used to determine the
- * values. This means that the old method is still supported, but is no longer
- * recommended.
- *
- *	def		The array definition of the data to write out with
- *			values
- *
- *	buffer		The buffer to write to
- *
- *	offset		The offset in the buffer to write to
- *
- *	values		An array of values to write.
- */
-CTypeParser.prototype.writeData = function (def, buffer, offset, values)
-{
-	var hv;
-
-	if (def === undefined)
-		throw (new Error('missing definition for what we should be ' +
-		    'parsing'));
-
-	if (buffer === undefined)
-		throw (new Error('missing buffer for what we should be ' +
-		    'parsing'));
-
-	if (offset === undefined)
-		throw (new Error('missing offset for what we should be ' +
-		    'parsing'));
-
-	hv = (values != null && values != undefined);
-	if (hv) {
-		if (!Array.isArray(values))
-			throw (new Error('missing values for writing'));
-		ctCheckReq(def, this.types);
-	} else {
-		ctCheckReq(def, this.types, [ 'value' ]);
-	}
-
-	this.writeStruct(hv ? values : getValues(def), def, buffer, offset);
-};
-
-/*
- * Functions to go to and from 64 bit numbers in a way that is compatible with
- * Javascript limitations. There are two sets. One where the user is okay with
- * an approximation and one where they are definitely not okay with an
- * approximation.
- */
-
-/*
- * Attempts to convert an array of two integers returned from rsint64 / ruint64
- * into an absolute 64 bit number. If however the value would exceed 2^52 this
- * will instead throw an error. The mantissa in a double is a 52 bit number and
- * rather than potentially give you a value that is an approximation this will
- * error. If you would rather an approximation, please see toApprox64.
- *
- *	val		An array of two 32-bit integers
- */
-function toAbs64(val)
-{
-	if (val === undefined)
-		throw (new Error('missing required arg: value'));
-
-	if (!Array.isArray(val))
-		throw (new Error('value must be an array'));
-
-	if (val.length != 2)
-		throw (new Error('value must be an array of length 2'));
-
-	/* We have 20 bits worth of precision in this range */
-	if (val[0] >= 0x100000)
-		throw (new Error('value would become approximated'));
-
-	return (val[0] * Math.pow(2, 32) + val[1]);
-}
-
-/*
- * Will return the 64 bit value as returned in an array from rsint64 / ruint64
- * to a value as close as it can. Note that Javascript stores all numbers as a
- * double and the mantissa only has 52 bits. Thus this version may approximate
- * the value.
- *
- *	val		An array of two 32-bit integers
- */
-function toApprox64(val)
-{
-	if (val === undefined)
-		throw (new Error('missing required arg: value'));
-
-	if (!Array.isArray(val))
-		throw (new Error('value must be an array'));
-
-	if (val.length != 2)
-		throw (new Error('value must be an array of length 2'));
-
-	return (Math.pow(2, 32) * val[0] + val[1]);
-}
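The hi/lo array convention these two helpers accept is small enough to sketch standalone. The following is an illustrative restatement (the function bodies mirror the deleted source above; the sample values are my own):

```javascript
/*
 * Standalone sketch of the [hi, lo] 64-bit convention used by toAbs64()
 * and toApprox64() above. Illustrative only; not the ctype module itself.
 */
function toAbs64(val)
{
	if (!Array.isArray(val) || val.length != 2)
		throw (new Error('value must be an array of length 2'));

	/* Only 20 bits of headroom remain in a double's 52-bit mantissa */
	if (val[0] >= 0x100000)
		throw (new Error('value would become approximated'));

	return (val[0] * Math.pow(2, 32) + val[1]);
}

function toApprox64(val)
{
	if (!Array.isArray(val) || val.length != 2)
		throw (new Error('value must be an array of length 2'));

	return (Math.pow(2, 32) * val[0] + val[1]);
}

console.log(toAbs64([ 1, 0 ]));			/* 4294967296, i.e. 2^32 */
console.log(toApprox64([ 0x100000, 0 ]));	/* 2^52; toAbs64 would throw */
```

Note how the same input `[ 0x100000, 0 ]` is an error for `toAbs64()` but fine for `toApprox64()`: the first refuses any value past the exact-integer range, the second knowingly approximates.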
-
-function parseCTF(json, conf)
-{
-	var ctype = new CTypeParser(conf);
-	mod_ctf.ctfParseJson(json, ctype);
-
-	return (ctype);
-}
-
-/*
- * Export the few things we actually want to. Currently this is just the CType
- * Parser and ctio.
- */
-exports.Parser = CTypeParser;
-exports.toAbs64 = toAbs64;
-exports.toApprox64 = toApprox64;
-
-exports.parseCTF = parseCTF;
-
-exports.ruint8 = mod_ctio.ruint8;
-exports.ruint16 = mod_ctio.ruint16;
-exports.ruint32 = mod_ctio.ruint32;
-exports.ruint64 = mod_ctio.ruint64;
-exports.wuint8 = mod_ctio.wuint8;
-exports.wuint16 = mod_ctio.wuint16;
-exports.wuint32 = mod_ctio.wuint32;
-exports.wuint64 = mod_ctio.wuint64;
-
-exports.rsint8 = mod_ctio.rsint8;
-exports.rsint16 = mod_ctio.rsint16;
-exports.rsint32 = mod_ctio.rsint32;
-exports.rsint64 = mod_ctio.rsint64;
-exports.wsint8 = mod_ctio.wsint8;
-exports.wsint16 = mod_ctio.wsint16;
-exports.wsint32 = mod_ctio.wsint32;
-exports.wsint64 = mod_ctio.wsint64;
-
-exports.rfloat = mod_ctio.rfloat;
-exports.rdouble = mod_ctio.rdouble;
-exports.wfloat = mod_ctio.wfloat;
-exports.wdouble = mod_ctio.wdouble;
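The definition walk that `readStruct()` performs can be illustrated with a minimal standalone sketch. `readStructSketch` is a hypothetical name, and unlike the real parser it handles only big-endian `uint16_t` entries (no typedefs, arrays, endian choice, or explicit offsets):

```javascript
/*
 * Minimal sketch of readStruct()'s walk: each def entry is of the form
 * { key: { type: ... } }, read in order while advancing the offset.
 * Hypothetical helper; only big-endian uint16_t is handled here.
 */
function readStructSketch(def, buffer, offset)
{
	var ret = {};
	def.forEach(function (entry) {
		var key = Object.keys(entry)[0];
		if (entry[key]['type'] != 'uint16_t')
			throw (new Error('sketch only handles uint16_t'));
		ret[key] = buffer.readUInt16BE(offset);
		offset += 2;
	});
	return (ret);
}

var buf = Buffer.from([ 0, 23, 0, 42 ]);
var out = readStructSketch([
	{ x: { type: 'uint16_t' } },
	{ y: { type: 'uint16_t' } }
], buf, 0);
console.log(out);	/* { x: 23, y: 42 } */
```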
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/man/man3ctype/ctio.3ctype	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,241 +0,0 @@
-'\" te
-.\" Copyright (c) 2011, Robert Mustacchi.  All Rights Reserved.
-.\" Copyright (c) 2011, Joyent, Inc.  All Rights Reserved.
-.\"
-.\" Permission is hereby granted, free of charge, to any person obtaining a copy
-.\" of this software and associated documentation files (the "Software"), to
-.\" deal in the Software without restriction, including without limitation the
-.\" rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
-.\" sell copies of the Software, and to permit persons to whom the Software is
-.\" furnished to do so, subject to the following conditions:
-.\"
-.\" The above copyright notice and this permission notice shall be included in
-.\" all copies or substantial portions of the Software.
-.\" 
-.\" THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-.\" IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-.\" FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-.\" AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-.\" LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-.\" FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
-.\" IN THE SOFTWARE.
-.TH CTIO 3CTYPE "December 12, 2011"
-.SH NAME
-ctio, ruint8, ruint16, ruint32, ruint64, wuint8, wuint16, wuint32, wuint64,
-rsint8, rsint16, rsint32, rsint64, wsint8, wsint16, wsint32, wsint64, rfloat,
-rdouble, wfloat, wdouble \- integer and float operations
-.SH SYNOPSIS
-.LP
-.nf
-var mod_ctype = require('ctype');
-
-\fBNumber\fR \fBmod_ctype.ruint8\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBNumber\fR \fBmod_ctype.ruint16\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBNumber\fR \fBmod_ctype.ruint32\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBNumber[2]\fR \fBmod_ctype.ruint64\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBNumber\fR \fBmod_ctype.rsint8\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBNumber\fR \fBmod_ctype.rsint16\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBNumber\fR \fBmod_ctype.rsint32\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBNumber[2]\fR \fBmod_ctype.rsint64\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBNumber\fR \fBmod_ctype.rfloat\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBNumber\fR \fBmod_ctype.rdouble\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBvoid\fR \fBmod_ctype.wuint8\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBvoid\fR \fBmod_ctype.wuint16\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBvoid\fR \fBmod_ctype.wuint32\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBvoid\fR \fBmod_ctype.wuint64\fR(\fBNumber[2]\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBvoid\fR \fBmod_ctype.wsint8\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBvoid\fR \fBmod_ctype.wsint16\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBvoid\fR \fBmod_ctype.wsint32\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBvoid\fR \fBmod_ctype.wsint64\fR(\fBNumber[2]\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBvoid\fR \fBmod_ctype.wfloat\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.LP
-.nf
-\fBvoid\fR \fBmod_ctype.wdouble\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR);
-.fi
-
-.SH DESCRIPTION
-.sp
-.LP
-The argument \fIbuf\fR refers to a valid buffer (from calling new Buffer()). The
-argument \fIendian\fR is either the string 'big' or 'little' and controls
-whether the data in the buffer is interpreted as big or little endian. The argument
-\fIoffset\fR indicates the starting index into the buffer to read or write. All
-functions ensure that starting at \fIoffset\fR does not overflow the end of the
-buffer. The argument \fIvalue\fR is a Number that is the valid type for the
-specific function. All functions that take \fIvalue\fR as an argument verify
-that the passed value is valid.
-
-.SS "\fBruint8()\fR, \fBruint16()\fR, \fBruint32()\fR"
-.sp
-.LP
-The \fBruint8()\fR, \fBruint16()\fR, and \fBruint32()\fR functions read an 8,
-16, and 32-bit unsigned value from \fIbuf\fR and return it. The value read is
-influenced by the values of \fIoffset\fR and \fIendian\fR.
-
-
-.SS "\fBrsint8()\fR, \fBrsint16()\fR, \fBrsint32()\fR"
-.sp
-.LP
-The \fBrsint8()\fR, \fBrsint16()\fR, and \fBrsint32()\fR functions work just as
-\fBruint8()\fR, \fBruint16()\fR, and \fBruint32()\fR, except they return signed
-integers.
-
-.SS "\fBruint64()\fR, \fBrsint64()\fR"
-.sp
-.LP
-The \fBruint64()\fR and \fBrsint64()\fR functions read unsigned and signed 64
-bit integers respectively from \fBbuf\fR. Due to the limitations of ECMAScript's
-\fBNumber\fR type, they cannot be stored as one value without a loss of
-precision. Instead of returning the values as a single \fBNumber\fR, the
-functions return an array of two numbers. The first entry always contains the
-upper 32-bits and the second value contains the lower 32-bits. The lossy
-transformation into a number would be \fIres[0]*Math.pow(2,32)+res[1]\fR.
-Note that, unless an entry is zero, both array entries are guaranteed to have
-the same sign.
-
-.SS "\fBwuint8()\fR, \fBwuint16()\fR, \fBwuint32()\fR"
-.sp
-.LP
-The functions \fBwuint8()\fR, \fBwuint16()\fR, and \fBwuint32()\fR modify the
-contents of \fBbuf\fR by writing an 8, 16, and 32-bit unsigned integer
-respectively to \fBbuf\fR. It is illegal to pass a number that is not an integer
-within the domain of the integer size, for example, for \fBwuint8()\fR the valid
-range is \fB[0, 255]\fR. The value will be written in either big or little
-endian format based upon the value of \fBendian\fR.
-
-
-.SS "\fBwsint8()\fR, \fBwsint16()\fR, \fBwsint32()\fR"
-.sp
-.LP
-The functions \fBwsint8()\fR, \fBwsint16()\fR, and \fBwsint32()\fR function
-identically to the functions \fBwuint8()\fR, \fBwuint16()\fR, and
-\fBwuint32()\fR, except that the valid domain for \fBvalue\fR is that of a
-signed number instead of an unsigned number. For example, \fBwsint8()\fR has
-a domain of \fB[-128, 127]\fR.
-
-.SS "\fBwuint64()\fR, \fBwsint64()\fR"
-.sp
-.LP
-The functions \fBwuint64()\fR and \fBwsint64()\fR write out 64-bit unsigned and
-signed integers to \fBbuf\fR. The \fBvalue\fR argument must be in the same
-format as described in \fBruint64()\fR and \fBrsint64()\fR.
-
-.SS "\fBrfloat()\fR, \fBrdouble()\fR"
-.sp
-.LP
-The \fBrfloat()\fR and \fBrdouble()\fR functions work like the other read
-functions, except that they read a single precision and double precision
-IEEE-754 floating point value instead.
-
-.SS "\fBwfloat()\fR, \fBwdouble()\fR"
-.sp
-.LP
-The \fBwfloat()\fR and \fBwdouble()\fR functions work like the other write
-functions, except that the domain for a float is that of a single precision 4
-byte value. The domain for a double is any \fBNumber\fR in ECMAScript, which is
-defined to be represented by a double.
-
-.SH ATTRIBUTES
-.sp
-.LP
-See \fBattributes\fR(5) for descriptions of the following attributes:
-.sp
-
-.sp
-.TS
-box;
-c | c
-l | l .
-ATTRIBUTE TYPE	ATTRIBUTE VALUE
-_
-Interface Stability	Committed
-_
-MT-Level	See below.
-_
-Standard	Not standardized.
-.TE
-
-.sp
-.LP
-
-All functions are MT-safe insofar as there are no shared-memory MT concerns in
-most node programs. If one were to concoct such an environment, these functions
-would not be MT-safe.
-
-.SH SEE ALSO
-.sp
-.LP
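The two-word return convention that `ruint64()` and `rsint64()` describe can be demonstrated with core Buffer reads alone. This sketch uses Node's own API rather than the ctype module:

```javascript
/*
 * Sketch of the [hi, lo] pair ruint64() returns: two 32-bit big-endian
 * reads, high word first, recombined lossily as res[0]*2^32 + res[1].
 * Only core Buffer methods are used; the ctype module is not required.
 */
var buf = Buffer.from([ 0, 0, 0, 1, 0, 0, 0, 0 ]);	/* uint64 value 2^32 */
var res = [ buf.readUInt32BE(0), buf.readUInt32BE(4) ];
var approx = res[0] * Math.pow(2, 32) + res[1];
console.log(res, approx);	/* [ 1, 0 ] 4294967296 */
```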
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-{
-  "name": "ctype",
-  "version": "0.5.2",
-  "description": "read and write binary structures and data types",
-  "homepage": "https://github.com/rmustacc/node-ctype",
-  "author": {
-    "name": "Robert Mustacchi",
-    "email": "rm@fingolfin.org"
-  },
-  "engines": {
-    "node": ">= 0.4"
-  },
-  "main": "ctype.js",
-  "readme": "Node-CType is a way to read and write binary data in a structured and easy to use\nformat. Its name comes from the C header file.\n\nTo get started, simply clone the repository or use npm to install it. Once it is\nthere, simply require it.\n\ngit clone git://github.com/rmustacc/node-ctype\nnpm install ctype\nvar mod_ctype = require('ctype')\n\n\nThere are two APIs that you can use, depending on what abstraction you'd like.\nThe low level API lets you read and write individual integers and floats from\nbuffers. The higher level API lets you read and write structures of these. To\nillustrate this, let's look at how we would read and write a binary\nencoded x,y point.\n\nIn C we would define this structure as follows:\n\ntypedef struct point {\n\tuint16_t\tp_x;\n\tuint16_t\tp_y;\n} point_t;\n\nTo read a binary encoded point from a Buffer, we first need to create a CType\nparser (where we specify the endian and other options) and add the typedef.\n\nvar parser = new mod_ctype.Parser({ endian: 'big' });\nparser.typedef('point_t', [\n\t{ x: { type: 'uint16_t' } },\n\t{ y: { type: 'uint16_t' } }\n]);\n\nFrom here, given a buffer buf and an offset into it, we can read a point.\n\nvar out = parser.readData([ { point: { type: 'point_t' } } ], buffer, 0);\nconsole.log(out);\n{ point: { x: 23, y: 42 } }\n\nAnother way to get the same information would be to use the low level methods.\nNote that these require you to manually deal with the offset. Here's how we'd\nget the same values of x and y from the buffer.\n\nvar x = mod_ctype.ruint16(buf, 'big', 0);\nvar y = mod_ctype.ruint16(buf, 'big', 2);\nconsole.log(x + ', ' + y);\n23, 42\n\nThe true power of this API comes from the ability to define and nest typedefs,\njust as you would in C. 
The following types are defined by default.\nNote that they return a Number, unless indicated otherwise.\n\n    * int8_t\n    * int16_t\n    * int32_t\n    * int64_t (returns an array where val[0] << 32 + val[1] would be the value)\n    * uint8_t\n    * uint16_t\n    * uint32_t\n    * uint64_t (returns an array where val[0] << 32 + val[1] would be the value)\n    * float\n    * double\n    * char (either returns a buffer with that character or a uint8_t)\n    * char[] (returns an object with the buffer and the number of characters read, which is either the total amount requested or until the first 0)\n\n\nctf2json integration:\n\nNode-CType supports consuming the output of ctf2json. Once you read in a JSON file,\nall you have to do to add all the definitions it contains is:\n\nvar data, parser;\ndata = JSON.parse(parsedJSONData);\nparser = mod_ctype.parseCTF(data, { endian: 'big' });\n\nFor more documentation, see the file README.old. Full documentation is in the\nprocess of being rewritten as a series of manual pages which will be available\nin the repository and online for viewing.\n\nTo read the ctio manual page, simply run from the root of the workspace:\n\nman -Mman -s 3ctype ctio\n",
-  "readmeFilename": "README",
-  "_id": "ctype@0.5.2",
-  "_from": "ctype@0.5.2"
-}
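The two low-level reads from the README embedded above (the `{ point: { x: 23, y: 42 } }` example) can be checked without installing the package, since they correspond to core Buffer accessors. The `mod_ctype` calls noted in comments are from the README; the executable part uses only Node's own API:

```javascript
/*
 * The README's low-level reads, done with core Buffer methods instead of
 * mod_ctype.ruint16() (equivalent semantics for a big-endian uint16_t
 * pair; no ctype install needed).
 */
var buf = Buffer.from([ 0, 23, 0, 42 ]);
var x = buf.readUInt16BE(0);	/* like mod_ctype.ruint16(buf, 'big', 0) */
var y = buf.readUInt16BE(2);	/* like mod_ctype.ruint16(buf, 'big', 2) */
console.log(x + ', ' + y);	/* 23, 42 */
```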
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tools/jsl.conf	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,129 +0,0 @@
-#
-# Configuration File for JavaScript Lint 0.3.0
-# Developed by Matthias Miller (http://www.JavaScriptLint.com)
-#
-# This configuration file can be used to lint a collection of scripts, or to enable
-# or disable warnings for scripts that are linted via the command line.
-#
-
-### Warnings
-# Enable or disable warnings based on requirements.
-# Use "+WarningName" to display or "-WarningName" to suppress.
-#
-+no_return_value              # function {0} does not always return a value
-+duplicate_formal             # duplicate formal argument {0}
-+equal_as_assign              # test for equality (==) mistyped as assignment (=)?{0}
-+var_hides_arg                # variable {0} hides argument
-+redeclared_var               # redeclaration of {0} {1}
-+anon_no_return_value         # anonymous function does not always return a value
-+missing_semicolon            # missing semicolon
-+meaningless_block            # meaningless block; curly braces have no impact
-+comma_separated_stmts        # multiple statements separated by commas (use semicolons?)
-+unreachable_code             # unreachable code
-+missing_break                # missing break statement
-+missing_break_for_last_case  # missing break statement for last case in switch
-+comparison_type_conv         # comparisons against null, 0, true, false, or an empty string allowing implicit type conversion (use === or !==)
--inc_dec_within_stmt          # increment (++) and decrement (--) operators used as part of greater statement
-+useless_void                 # use of the void type may be unnecessary (void is always undefined)
--useless_quotes			# quotation marks are unnecessary
-+multiple_plus_minus          # unknown order of operations for successive plus (e.g. x+++y) or minus (e.g. x---y) signs
-+use_of_label                 # use of label
--block_without_braces         # block statement without curly braces
-+leading_decimal_point        # leading decimal point may indicate a number or an object member
-+trailing_decimal_point       # trailing decimal point may indicate a number or an object member
--octal_number                 # leading zeros make an octal number
-+nested_comment               # nested comment
-+misplaced_regex              # regular expressions should be preceded by a left parenthesis, assignment, colon, or comma
-+ambiguous_newline            # unexpected end of line; it is ambiguous whether these lines are part of the same statement
-+empty_statement              # empty statement or extra semicolon
--missing_option_explicit      # the "option explicit" control comment is missing
-+partial_option_explicit      # the "option explicit" control comment, if used, must be in the first script tag
-+dup_option_explicit          # duplicate "option explicit" control comment
-+useless_assign               # useless assignment
-+ambiguous_nested_stmt        # block statements containing block statements should use curly braces to resolve ambiguity
-+ambiguous_else_stmt          # the else statement could be matched with one of multiple if statements (use curly braces to indicate intent)
-+missing_default_case         # missing default case in switch statement
-+duplicate_case_in_switch     # duplicate case in switch statements
-+default_not_at_end           # the default case is not at the end of the switch statement
-+legacy_cc_not_understood     # couldn't understand control comment using /*@keyword@*/ syntax
-+jsl_cc_not_understood        # couldn't understand control comment using /*jsl:keyword*/ syntax
-+useless_comparison           # useless comparison; comparing identical expressions
-+with_statement               # with statement hides undeclared variables; use temporary variable instead
-+trailing_comma_in_array      # extra comma is not recommended in array initializers
-+assign_to_function_call      # assignment to a function call
-+parseint_missing_radix       # parseInt missing radix parameter
--unreferenced_argument        # argument declared but never referenced: {name}
-
-### Output format
-# Customize the format of the error message.
-#    __FILE__ indicates current file path
-#    __FILENAME__ indicates current file name
-#    __LINE__ indicates current line
-#    __ERROR__ indicates error message
-#
-# Visual Studio syntax (default):
-+output-format __FILE__(__LINE__): __ERROR__
-# Alternative syntax:
-#+output-format __FILE__:__LINE__: __ERROR__
-
-
-### Context
-# Show the in-line position of the error.
-# Use "+context" to display or "-context" to suppress.
-#
-+context
-
-
-### Semicolons
-# By default, assignments of an anonymous function to a variable or
-# property (such as a function prototype) must be followed by a semicolon.
-#
-#+lambda_assign_requires_semicolon # deprecated setting
-
-
-### Control Comments
-# Both JavaScript Lint and the JScript interpreter confuse each other with the syntax for
-# the /*@keyword@*/ control comments and JScript conditional comments. (The latter is
-# enabled in JScript with @cc_on@). The /*jsl:keyword*/ syntax is preferred for this reason,
-# although legacy control comments are enabled by default for backward compatibility.
-#
-+legacy_control_comments
-
-
-### JScript Function Extensions
-# JScript allows member functions to be defined like this:
-#     function MyObj() { /*constructor*/ }
-#     function MyObj.prototype.go() { /*member function*/ }
-#
-# It also allows events to be attached like this:
-#     function window::onload() { /*init page*/ }
-#
-# This is a Microsoft-only JavaScript extension. Enable this setting to allow them.
-#
-#-jscript_function_extensions # deprecated setting
-
-
-### Defining identifiers
-# By default, "option explicit" is enabled on a per-file basis.
-# To enable this for all files, use "+always_use_option_explicit"
--always_use_option_explicit
-
-# Define certain identifiers of which the lint is not aware.
-# (Use this in conjunction with the "undeclared identifier" warning.)
-#
-# Common uses for webpages might be:
-#+define window
-#+define document
-+define require
-+define exports
-+define console
-+define Buffer
-+define JSON
-
-### Files
-# Specify which files to lint
-# Use "+recurse" to enable recursion (disabled by default).
-# To add a set of files, use "+process FileName", "+process Folder\Path\*.js",
-# or "+process Folder\Path\*.htm".
-#
-#+process jsl-test.js
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tools/jsstyle	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,839 +0,0 @@
-#!/usr/bin/env perl
-#
-# CDDL HEADER START
-#
-# The contents of this file are subject to the terms of the
-# Common Development and Distribution License (the "License").
-# You may not use this file except in compliance with the License.
-#
-# You can obtain a copy of the license at usr/src/OPENSOLARIS.LICENSE
-# or http://www.opensolaris.org/os/licensing.
-# See the License for the specific language governing permissions
-# and limitations under the License.
-#
-# When distributing Covered Code, include this CDDL HEADER in each
-# file and include the License file at usr/src/OPENSOLARIS.LICENSE.
-# If applicable, add the following below this CDDL HEADER, with the
-# fields enclosed by brackets "[]" replaced with your own identifying
-# information: Portions Copyright [yyyy] [name of copyright owner]
-#
-# CDDL HEADER END
-#
-#
-# Copyright 2008 Sun Microsystems, Inc.  All rights reserved.
-# Use is subject to license terms.
-#
-# Copyright 2011 Joyent, Inc. All rights reserved.
-#
-# jsstyle - check for some common stylistic errors.
-#
-#	jsstyle is a sort of "lint" for Javascript coding style.  This tool is
-#	derived from the cstyle tool, used to check for the style used in the
-#	Solaris kernel, sometimes known as "Bill Joy Normal Form".
-#
-#	There's a lot this can't check for, like proper indentation of code
-#	blocks.  There's also a lot more this could check for.
-#
-#	A note to the non perl literate:
-#
-#		perl regular expressions are pretty much like egrep
-#		regular expressions, with the following special symbols
-#
-#		\s	any space character
-#		\S	any non-space character
-#		\w	any "word" character [a-zA-Z0-9_]
-#		\W	any non-word character
-#		\d	a digit [0-9]
-#		\D	a non-digit
-#		\b	word boundary (between \w and \W)
-#		\B	non-word boundary
-#
-
-require 5.0;
-use IO::File;
-use Getopt::Std;
-use strict;
-
-my $usage =
-"usage: jsstyle [-chvC] [-o constructs] file ...
-	-c	check continuation indentation inside functions
-	-h	perform heuristic checks that are sometimes wrong
-	-v	verbose
-	-C	don't check anything in header block comments
-	-o constructs
-		allow a comma-separated list of optional constructs:
-		    doxygen	allow doxygen-style block comments (/** /*!)
-		    splint	allow splint-style lint comments (/*@ ... @*/)
-";
-
-my %opts;
-
-if (!getopts("cho:vC", \%opts)) {
-	print $usage;
-	exit 2;
-}
-
-my $check_continuation = $opts{'c'};
-my $heuristic = $opts{'h'};
-my $verbose = $opts{'v'};
-my $ignore_hdr_comment = $opts{'C'};
-
-my $doxygen_comments = 0;
-my $splint_comments = 0;
-
-if (defined($opts{'o'})) {
-	for my $x (split /,/, $opts{'o'}) {
-		if ($x eq "doxygen") {
-			$doxygen_comments = 1;
-		} elsif ($x eq "splint") {
-			$splint_comments = 1;
-		} else {
-			print "jsstyle: unrecognized construct \"$x\"\n";
-			print $usage;
-			exit 2;
-		}
-	}
-}
-
-my ($filename, $line, $prev);		# shared globals
-
-my $fmt;
-my $hdr_comment_start;
-
-if ($verbose) {
-	$fmt = "%s: %d: %s\n%s\n";
-} else {
-	$fmt = "%s: %d: %s\n";
-}
-
-if ($doxygen_comments) {
-	# doxygen comments look like "/*!" or "/**"; allow them.
-	$hdr_comment_start = qr/^\s*\/\*[\!\*]?$/;
-} else {
-	$hdr_comment_start = qr/^\s*\/\*$/;
-}
-
-# Note, following must be in single quotes so that \s and \w work right.
-my $lint_re = qr/\/\*(?:
-	jsl:\w+?|ARGSUSED[0-9]*|NOTREACHED|LINTLIBRARY|VARARGS[0-9]*|
-	CONSTCOND|CONSTANTCOND|CONSTANTCONDITION|EMPTY|
-	FALLTHRU|FALLTHROUGH|LINTED.*?|PRINTFLIKE[0-9]*|
-	PROTOLIB[0-9]*|SCANFLIKE[0-9]*|JSSTYLED.*?
-    )\*\//x;
-
-my $splint_re = qr/\/\*@.*?@\*\//x;
-
-my $err_stat = 0;		# exit status
-
-if ($#ARGV >= 0) {
-	foreach my $arg (@ARGV) {
-		my $fh = new IO::File $arg, "r";
-		if (!defined($fh)) {
-			printf "%s: cannot open\n", $arg;
-		} else {
-			&jsstyle($arg, $fh);
-			close $fh;
-		}
-	}
-} else {
-	&jsstyle("<stdin>", *STDIN);
-}
-exit $err_stat;
-
-my $no_errs = 0;		# set for JSSTYLED-protected lines
-
-sub err($) {
-	my ($error) = @_;
-	unless ($no_errs) {
-		printf $fmt, $filename, $., $error, $line;
-		$err_stat = 1;
-	}
-}
-
-sub err_prefix($$) {
-	my ($prevline, $error) = @_;
-	my $out = $prevline."\n".$line;
-	unless ($no_errs) {
-		printf $fmt, $filename, $., $error, $out;
-		$err_stat = 1;
-	}
-}
-
-sub err_prev($) {
-	my ($error) = @_;
-	unless ($no_errs) {
-		printf $fmt, $filename, $. - 1, $error, $prev;
-		$err_stat = 1;
-	}
-}
-
-sub jsstyle($$) {
-
-my ($fn, $filehandle) = @_;
-$filename = $fn;			# share it globally
-
-my $in_cpp = 0;
-my $next_in_cpp = 0;
-
-my $in_comment = 0;
-my $in_header_comment = 0;
-my $comment_done = 0;
-my $in_function = 0;
-my $in_function_header = 0;
-my $in_declaration = 0;
-my $note_level = 0;
-my $nextok = 0;
-my $nocheck = 0;
-
-my $in_string = 0;
-
-my ($okmsg, $comment_prefix);
-
-$line = '';
-$prev = '';
-reset_indent();
-
-line: while (<$filehandle>) {
-	s/\r?\n$//;	# strip return and newline
-
-	# save the original line, then remove all text from within
-	# double or single quotes, we do not want to check such text.
-
-	$line = $_;
-
-	#
-	# C allows strings to be continued with a backslash at the end of
-	# the line.  We translate that into a quoted string on the previous
-	# line followed by an initial quote on the next line.
-	#
-	# (we assume that no-one will use backslash-continuation with character
-	# constants)
-	#
-	$_ = '"' . $_		if ($in_string && !$nocheck && !$in_comment);
-
-	#
-	# normal strings and characters
-	#
-	s/'([^\\']|\\.)*'/\'\'/g;
-	s/"([^\\"]|\\.)*"/\"\"/g;
-
-	#
-	# detect string continuation
-	#
-	if ($nocheck || $in_comment) {
-		$in_string = 0;
-	} else {
-		#
-		# Now that all full strings are replaced with "", we check
-		# for unfinished strings continuing onto the next line.
-		#
-		$in_string =
-		    (s/([^"](?:"")*)"([^\\"]|\\.)*\\$/$1""/ ||
-		    s/^("")*"([^\\"]|\\.)*\\$/""/);
-	}
-
-	#
-	# figure out if we are in a cpp directive
-	#
-	$in_cpp = $next_in_cpp || /^\s*#/;	# continued or started
-	$next_in_cpp = $in_cpp && /\\$/;	# only if continued
-
-	# strip off trailing backslashes, which appear in long macros
-	s/\s*\\$//;
-
-	# an /* END JSSTYLED */ comment ends a no-check block.
-	if ($nocheck) {
-		if (/\/\* *END *JSSTYLED *\*\//) {
-			$nocheck = 0;
-		} else {
-			reset_indent();
-			next line;
-		}
-	}
-
-	# a /*JSSTYLED*/ comment indicates that the next line is ok.
-	if ($nextok) {
-		if ($okmsg) {
-			err($okmsg);
-		}
-		$nextok = 0;
-		$okmsg = 0;
-		if (/\/\* *JSSTYLED.*\*\//) {
-			/^.*\/\* *JSSTYLED *(.*) *\*\/.*$/;
-			$okmsg = $1;
-			$nextok = 1;
-		}
-		$no_errs = 1;
-	} elsif ($no_errs) {
-		$no_errs = 0;
-	}
-
-	# check length of line.
-	# first, a quick check to see if there is any chance of being too long.
-	if (($line =~ tr/\t/\t/) * 7 + length($line) > 80) {
-		# yes, there is a chance.
-		# replace tabs with spaces and check again.
-		my $eline = $line;
-		1 while $eline =~
-		    s/\t+/' ' x (length($&) * 8 - length($`) % 8)/e;
-		if (length($eline) > 80) {
-			err("line > 80 characters");
-		}
-	}
-
-	# ignore NOTE(...) annotations (assumes NOTE is on lines by itself).
-	if ($note_level || /\b_?NOTE\s*\(/) { # if in NOTE or this is NOTE
-		s/[^()]//g;			  # eliminate all non-parens
-		$note_level += s/\(//g - length;  # update paren nest level
-		next;
-	}
-
-	# a /* BEGIN JSSTYLED */ comment starts a no-check block.
-	if (/\/\* *BEGIN *JSSTYLED *\*\//) {
-		$nocheck = 1;
-	}
-
-	# a /*JSSTYLED*/ comment indicates that the next line is ok.
-	if (/\/\* *JSSTYLED.*\*\//) {
-		/^.*\/\* *JSSTYLED *(.*) *\*\/.*$/;
-		$okmsg = $1;
-		$nextok = 1;
-	}
-	if (/\/\/ *JSSTYLED/) {
-		/^.*\/\/ *JSSTYLED *(.*)$/;
-		$okmsg = $1;
-		$nextok = 1;
-	}
-
-	# universal checks; apply to everything
-	if (/\t +\t/) {
-		err("spaces between tabs");
-	}
-	if (/ \t+ /) {
-		err("tabs between spaces");
-	}
-	if (/\s$/) {
-		err("space or tab at end of line");
-	}
-	if (/[^ \t(]\/\*/ && !/\w\(\/\*.*\*\/\);/) {
-		err("comment preceded by non-blank");
-	}
-
-	# is this the beginning or ending of a function?
-	# (not if "struct foo\n{\n")
-	if (/^{$/ && $prev =~ /\)\s*(const\s*)?(\/\*.*\*\/\s*)?\\?$/) {
-		$in_function = 1;
-		$in_declaration = 1;
-		$in_function_header = 0;
-		$prev = $line;
-		next line;
-	}
-	if (/^}\s*(\/\*.*\*\/\s*)*$/) {
-		if ($prev =~ /^\s*return\s*;/) {
-			err_prev("unneeded return at end of function");
-		}
-		$in_function = 0;
-		reset_indent();		# we don't check between functions
-		$prev = $line;
-		next line;
-	}
-	if (/^\w*\($/) {
-		$in_function_header = 1;
-	}
-
-	# a blank line terminates the declarations within a function.
-	# XXX - but still a problem in sub-blocks.
-	if ($in_declaration && /^$/) {
-		$in_declaration = 0;
-	}
-
-	if ($comment_done) {
-		$in_comment = 0;
-		$in_header_comment = 0;
-		$comment_done = 0;
-	}
-	# does this look like the start of a block comment?
-	if (/$hdr_comment_start/) {
-		if (!/^\t*\/\*/) {
-			err("block comment not indented by tabs");
-		}
-		$in_comment = 1;
-		/^(\s*)\//;
-		$comment_prefix = $1;
-		if ($comment_prefix eq "") {
-			$in_header_comment = 1;
-		}
-		$prev = $line;
-		next line;
-	}
-	# are we still in the block comment?
-	if ($in_comment) {
-		if (/^$comment_prefix \*\/$/) {
-			$comment_done = 1;
-		} elsif (/\*\//) {
-			$comment_done = 1;
-			err("improper block comment close")
-			    unless ($ignore_hdr_comment && $in_header_comment);
-		} elsif (!/^$comment_prefix \*[ \t]/ &&
-		    !/^$comment_prefix \*$/) {
-			err("improper block comment")
-			    unless ($ignore_hdr_comment && $in_header_comment);
-		}
-	}
-
-	if ($in_header_comment && $ignore_hdr_comment) {
-		$prev = $line;
-		next line;
-	}
-
-	# check for errors that might occur in comments and in code.
-
-	# allow spaces to be used to draw pictures in header comments.
-	#if (/[^ ]     / && !/".*     .*"/ && !$in_header_comment) {
-	#	err("spaces instead of tabs");
-	#}
-	#if (/^ / && !/^ \*[ \t\/]/ && !/^ \*$/ &&
-	#    (!/^    \w/ || $in_function != 0)) {
-	#	err("indent by spaces instead of tabs");
-	#}
-	if (/^ {2,}/ && !/^    [^ ]/) {
-		err("indent by spaces instead of tabs");
-	}
-	if (/^\t+ [^ \t\*]/ || /^\t+  \S/ || /^\t+   \S/) {
-		err("continuation line not indented by 4 spaces");
-	}
-
-	if (/^\s*\/\*./ && !/^\s*\/\*.*\*\// && !/$hdr_comment_start/) {
-		err("improper first line of block comment");
-	}
-
-	if ($in_comment) {	# still in comment, don't do further checks
-		$prev = $line;
-		next line;
-	}
-
-	if ((/[^(]\/\*\S/ || /^\/\*\S/) &&
-	    !(/$lint_re/ || ($splint_comments && /$splint_re/))) {
-		err("missing blank after open comment");
-	}
-	if (/\S\*\/[^)]|\S\*\/$/ &&
-	    !(/$lint_re/ || ($splint_comments && /$splint_re/))) {
-		err("missing blank before close comment");
-	}
-	if (/\/\/\S/) {		# C++ comments
-		err("missing blank after start comment");
-	}
-	# check for unterminated single line comments, but allow them when
-	# they are used to comment out the argument list of a function
-	# declaration.
-	if (/\S.*\/\*/ && !/\S.*\/\*.*\*\// && !/\(\/\*/) {
-		err("unterminated single line comment");
-	}
-
-	if (/^(#else|#endif|#include)(.*)$/) {
-		$prev = $line;
-		next line;
-	}
-
-	#
-	# delete any comments and check everything else.  Note that
-	# ".*?" is a non-greedy match, so that we don't get confused by
-	# multiple comments on the same line.
-	#
-	s/\/\*.*?\*\///g;
-	s/\/\/.*$//;		# C++ comments
-
-	# delete any trailing whitespace; we have already checked for that.
-	s/\s*$//;
-
-	# following checks do not apply to text in comments.
-	if (/"/) {
-		err("literal string using double-quote instead of single");
-	}
-
-	if (/[^=!<>\s][!<>=]=/ || /[^<>!=][!<>=]==?[^\s,=]/ ||
-	    (/[^->]>[^,=>\s]/ && !/[^->]>$/) ||
-	    (/[^<]<[^,=<\s]/ && !/[^<]<$/) ||
-	    /[^<\s]<[^<]/ || /[^->\s]>[^>]/) {
-		err("missing space around relational operator");
-	}
-	if (/\S>>=/ || /\S<<=/ || />>=\S/ || /<<=\S/ || /\S[-+*\/&|^%]=/ ||
-	    (/[^-+*\/&|^%!<>=\s]=[^=]/ && !/[^-+*\/&|^%!<>=\s]=$/) ||
-	    (/[^!<>=]=[^=\s]/ && !/[^!<>=]=$/)) {
-		# XXX - should only check this for C++ code
-		# XXX - there are probably other forms that should be allowed
-		if (!/\soperator=/) {
-			err("missing space around assignment operator");
-		}
-	}
-	if (/[,;]\S/ && !/\bfor \(;;\)/) {
-		err("comma or semicolon followed by non-blank");
-	}
-	# allow "for" statements to have empty "while" clauses
-	if (/\s[,;]/ && !/^[\t]+;$/ && !/^\s*for \([^;]*; ;[^;]*\)/) {
-		err("comma or semicolon preceded by blank");
-	}
-	if (/^\s*(&&|\|\|)/) {
-		err("improper boolean continuation");
-	}
-	if (/\S   *(&&|\|\|)/ || /(&&|\|\|)   *\S/) {
-		err("more than one space around boolean operator");
-	}
-	if (/\b(delete|typeof|instanceOf|throw|with|catch|new|function|in|for|if|while|switch|return|case)\(/) {
-		err("missing space between keyword and paren");
-	}
-	if (/(\b(catch|for|if|with|while|switch|return)\b.*){2,}/) {
-		# multiple "case" and "sizeof" allowed
-		err("more than one keyword on line");
-	}
-	if (/\b(delete|typeof|instanceOf|with|throw|catch|new|function|in|for|if|while|switch|return|case)\s\s+\(/ &&
-	    !/^#if\s+\(/) {
-		err("extra space between keyword and paren");
-	}
-	# try to detect "func (x)" but not "if (x)" or
-	# "#define foo (x)" or "int (*func)();"
-	if (/\w\s\(/) {
-		my $s = $_;
-		# strip off all keywords on the line
-		s/\b(delete|typeof|instanceOf|throw|with|catch|new|function|in|for|if|while|switch|return|case)\s\(/XXX(/g;
-		s/#elif\s\(/XXX(/g;
-		s/^#define\s+\w+\s+\(/XXX(/;
-		# do not match things like "void (*f)();"
-		# or "typedef void (func_t)();"
-		s/\w\s\(+\*/XXX(*/g;
-		s/\b(void)\s+\(+/XXX(/og;
-		if (/\w\s\(/) {
-			err("extra space between function name and left paren");
-		}
-		$_ = $s;
-	}
-
-	if (/^\s*return\W[^;]*;/ && !/^\s*return\s*\(.*\);/) {
-		err("unparenthesized return expression");
-	}
-	if (/\btypeof\b/ && !/\btypeof\s*\(.*\)/) {
-		err("unparenthesized typeof expression");
-	}
-	if (/\(\s/) {
-		err("whitespace after left paren");
-	}
-	# allow "for" statements to have empty "continue" clauses
-	if (/\s\)/ && !/^\s*for \([^;]*;[^;]*; \)/) {
-		err("whitespace before right paren");
-	}
-	if (/^\s*\(void\)[^ ]/) {
-		err("missing space after (void) cast");
-	}
-	if (/\S{/ && !/({|\(){/) {
-		err("missing space before left brace");
-	}
-	if ($in_function && /^\s+{/ &&
-	    ($prev =~ /\)\s*$/ || $prev =~ /\bstruct\s+\w+$/)) {
-		err("left brace starting a line");
-	}
-	if (/}(else|while)/) {
-		err("missing space after right brace");
-	}
-	if (/}\s\s+(else|while)/) {
-		err("extra space after right brace");
-	}
-	if (/^\s+#/) {
-		err("preprocessor statement not in column 1");
-	}
-	if (/^#\s/) {
-		err("blank after preprocessor #");
-	}
-
-	#
-	# We completely ignore, for purposes of indentation:
-	#  * lines outside of functions
-	#  * preprocessor lines
-	#
-	if ($check_continuation && $in_function && !$in_cpp) {
-		process_indent($_);
-	}
-
-	if ($heuristic) {
-		# cannot check this everywhere due to "struct {\n...\n} foo;"
-		if ($in_function && !$in_declaration &&
-		    /}./ && !/}\s+=/ && !/{.*}[;,]$/ && !/}(\s|)*$/ &&
-		    !/} (else|while)/ && !/}}/) {
-			err("possible bad text following right brace");
-		}
-		# cannot check this because sub-blocks in
-		# the middle of code are ok
-		if ($in_function && /^\s+{/) {
-			err("possible left brace starting a line");
-		}
-	}
-	if (/^\s*else\W/) {
-		if ($prev =~ /^\s*}$/) {
-			err_prefix($prev,
-			    "else and right brace should be on same line");
-		}
-	}
-	$prev = $line;
-}
-
-if ($prev eq "") {
-	err("last line in file is blank");
-}
-
-}
-
-#
-# Continuation-line checking
-#
-# The rest of this file contains the code for the continuation checking
-# engine.  It's a pretty simple state machine which tracks the expression
-# depth (unmatched '('s and '['s).
-#
-# Keep in mind that the argument to process_indent() has already been heavily
-# processed; all comments have been replaced by control-A, and the contents of
-# strings and character constants have been elided.
-#
-
-my $cont_in;		# currently inside of a continuation
-my $cont_off;		# skipping an initializer or definition
-my $cont_noerr;		# suppress cascading errors
-my $cont_start;		# the line being continued
-my $cont_base;		# the base indentation
-my $cont_first;		# this is the first line of a statement
-my $cont_multiseg;	# this continuation has multiple segments
-
-my $cont_special;	# this is a C statement (if, for, etc.)
-my $cont_macro;		# this is a macro
-my $cont_case;		# this is a multi-line case
-
-my @cont_paren;		# the stack of unmatched ( and [s we've seen
-
-sub
-reset_indent()
-{
-	$cont_in = 0;
-	$cont_off = 0;
-}
-
-sub
-delabel($)
-{
-	#
-	# replace labels with tabs.  Note that there may be multiple
-	# labels on a line.
-	#
-	local $_ = $_[0];
-
-	while (/^(\t*)( *(?:(?:\w+\s*)|(?:case\b[^:]*)): *)(.*)$/) {
-		my ($pre_tabs, $label, $rest) = ($1, $2, $3);
-		$_ = $pre_tabs;
-		while ($label =~ s/^([^\t]*)(\t+)//) {
-			$_ .= "\t" x (length($2) + length($1) / 8);
-		}
-		$_ .= ("\t" x (length($label) / 8)).$rest;
-	}
-
-	return ($_);
-}
-
-sub
-process_indent($)
-{
-	require strict;
-	local $_ = $_[0];			# preserve the global $_
-
-	s///g;	# No comments
-	s/\s+$//;	# Strip trailing whitespace
-
-	return			if (/^$/);	# skip empty lines
-
-	# regexps used below; keywords taking (), macros, and continued cases
-	my $special = '(?:(?:\}\s*)?else\s+)?(?:if|for|while|switch)\b';
-	my $macro = '[A-Z_][A-Z_0-9]*\(';
-	my $case = 'case\b[^:]*$';
-
-	# skip over enumerations, array definitions, initializers, etc.
-	if ($cont_off <= 0 && !/^\s*$special/ &&
-	    (/(?:(?:\b(?:enum|struct|union)\s*[^\{]*)|(?:\s+=\s*)){/ ||
-	    (/^\s*{/ && $prev =~ /=\s*(?:\/\*.*\*\/\s*)*$/))) {
-		$cont_in = 0;
-		$cont_off = tr/{/{/ - tr/}/}/;
-		return;
-	}
-	if ($cont_off) {
-		$cont_off += tr/{/{/ - tr/}/}/;
-		return;
-	}
-
-	if (!$cont_in) {
-		$cont_start = $line;
-
-		if (/^\t* /) {
-			err("non-continuation indented 4 spaces");
-			$cont_noerr = 1;		# stop reporting
-		}
-		$_ = delabel($_);	# replace labels with tabs
-
-		# check if the statement is complete
-		return		if (/^\s*\}?$/);
-		return		if (/^\s*\}?\s*else\s*\{?$/);
-		return		if (/^\s*do\s*\{?$/);
-		return		if (/{$/);
-		return		if (/}[,;]?$/);
-
-		# Allow macros on their own lines
-		return		if (/^\s*[A-Z_][A-Z_0-9]*$/);
-
-		# cases we don't deal with, generally non-kosher
-		if (/{/) {
-			err("stuff after {");
-			return;
-		}
-
-		# Get the base line, and set up the state machine
-		/^(\t*)/;
-		$cont_base = $1;
-		$cont_in = 1;
-		@cont_paren = ();
-		$cont_first = 1;
-		$cont_multiseg = 0;
-
-		# certain things need special processing
-		$cont_special = /^\s*$special/? 1 : 0;
-		$cont_macro = /^\s*$macro/? 1 : 0;
-		$cont_case = /^\s*$case/? 1 : 0;
-	} else {
-		$cont_first = 0;
-
-		# Strings may be pulled back to an earlier (half-)tabstop
-		unless ($cont_noerr || /^$cont_base    / ||
-		    (/^\t*(?:    )?(?:gettext\()?\"/ && !/^$cont_base\t/)) {
-			err_prefix($cont_start,
-			    "continuation should be indented 4 spaces");
-		}
-	}
-
-	my $rest = $_;			# keeps the remainder of the line
-
-	#
-	# The split matches 0 characters, so that each 'special' character
-	# is processed separately.  Parens and brackets are pushed and
-	# popped off the @cont_paren stack.  For normal processing, we wait
-	# until a ; or { terminates the statement.  "special" processing
-	# (if/for/while/switch) is allowed to stop when the stack empties,
-	# as is macro processing.  Case statements are terminated with a :
-	# and an empty paren stack.
-	#
-	foreach $_ (split /[^\(\)\[\]\{\}\;\:]*/) {
-		next		if (length($_) == 0);
-
-		# rest contains the remainder of the line
-		my $rxp = "[^\Q$_\E]*\Q$_\E";
-		$rest =~ s/^$rxp//;
-
-		if (/\(/ || /\[/) {
-			push @cont_paren, $_;
-		} elsif (/\)/ || /\]/) {
-			my $cur = $_;
-			tr/\)\]/\(\[/;
-
-			my $old = (pop @cont_paren);
-			if (!defined($old)) {
-				err("unexpected '$cur'");
-				$cont_in = 0;
-				last;
-			} elsif ($old ne $_) {
-				err("'$cur' mismatched with '$old'");
-				$cont_in = 0;
-				last;
-			}
-
-			#
-			# If the stack is now empty, do special processing
-			# for if/for/while/switch and macro statements.
-			#
-			next		if (@cont_paren != 0);
-			if ($cont_special) {
-				if ($rest =~ /^\s*{?$/) {
-					$cont_in = 0;
-					last;
-				}
-				if ($rest =~ /^\s*;$/) {
-					err("empty if/for/while body ".
-					    "not on its own line");
-					$cont_in = 0;
-					last;
-				}
-				if (!$cont_first && $cont_multiseg == 1) {
-					err_prefix($cont_start,
-					    "multiple statements continued ".
-					    "over multiple lines");
-					$cont_multiseg = 2;
-				} elsif ($cont_multiseg == 0) {
-					$cont_multiseg = 1;
-				}
-				# We've finished this section, start
-				# processing the next.
-				goto section_ended;
-			}
-			if ($cont_macro) {
-				if ($rest =~ /^$/) {
-					$cont_in = 0;
-					last;
-				}
-			}
-		} elsif (/\;/) {
-			if ($cont_case) {
-				err("unexpected ;");
-			} elsif (!$cont_special) {
-				err("unexpected ;")	if (@cont_paren != 0);
-				if (!$cont_first && $cont_multiseg == 1) {
-					err_prefix($cont_start,
-					    "multiple statements continued ".
-					    "over multiple lines");
-					$cont_multiseg = 2;
-				} elsif ($cont_multiseg == 0) {
-					$cont_multiseg = 1;
-				}
-				if ($rest =~ /^$/) {
-					$cont_in = 0;
-					last;
-				}
-				if ($rest =~ /^\s*special/) {
-					err("if/for/while/switch not started ".
-					    "on its own line");
-				}
-				goto section_ended;
-			}
-		} elsif (/\{/) {
-			err("{ while in parens/brackets") if (@cont_paren != 0);
-			err("stuff after {")		if ($rest =~ /[^\s}]/);
-			$cont_in = 0;
-			last;
-		} elsif (/\}/) {
-			err("} while in parens/brackets") if (@cont_paren != 0);
-			if (!$cont_special && $rest !~ /^\s*(while|else)\b/) {
-				if ($rest =~ /^$/) {
-					err("unexpected }");
-				} else {
-					err("stuff after }");
-				}
-				$cont_in = 0;
-				last;
-			}
-		} elsif (/\:/ && $cont_case && @cont_paren == 0) {
-			err("stuff after multi-line case") if ($rest !~ /$^/);
-			$cont_in = 0;
-			last;
-		}
-		next;
-section_ended:
-		# End of a statement or if/while/for loop.  Reset
-		# cont_special and cont_macro based on the rest of the
-		# line.
-		$cont_special = ($rest =~ /^\s*$special/)? 1 : 0;
-		$cont_macro = ($rest =~ /^\s*$macro/)? 1 : 0;
-		$cont_case = 0;
-		next;
-	}
-	$cont_noerr = 0			if (!$cont_in);
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/float.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-{ "metadata":
-	{
-		"ctf2json_version": "1.0",
-		"created_at": 1316563626,
-		"derived_from": "/lib/libc.so",
-		"ctf_version": 2,
-		"requested_types": [ "float" ]
-	},
-"data":
-	[
-		{ "name": "float", "float": { "length": 4 } }
-	]
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/int.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-{ "metadata":
-	{
-		"ctf2json_version": "1.0",
-		"created_at": 1316563631,
-		"derived_from": "/lib/libc.so",
-		"ctf_version": 2,
-		"requested_types": [ "int" ]
-	},
-"data":
-	[
-		{ "name": "int", "integer": { "length": 4, "signed": true } }
-	]
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/psinfo.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,104 +0,0 @@
-{ "metadata":
-	{
-		"ctf2json_version": "1.0",
-		"created_at": 1316563573,
-		"derived_from": "/lib/libc.so",
-		"ctf_version": 2,
-		"requested_types": [ "psinfo_t" ]
-	},
-"data":
-	[
-		{ "name": "int", "integer": { "length": 4, "signed": true } },
-		{ "name": "char", "integer": { "length": 1, "signed": true } },
-		{ "name": "unsigned short", "integer": { "length": 2, "signed": false } },
-		{ "name": "long", "integer": { "length": 4, "signed": true } },
-		{ "name": "unsigned", "integer": { "length": 4, "signed": false } },
-		{ "name": "size_t", "typedef": "unsigned" },
-		{ "name": "unsigned long", "integer": { "length": 4, "signed": false } },
-		{ "name": "time_t", "typedef": "long" },
-		{ "name": "struct timespec", "struct": [
-			{ "name": "tv_sec", "type": "time_t" },
-			{ "name": "tv_nsec", "type": "long" }
-		] },
-		{ "name": "zoneid_t", "typedef": "long" },
-		{ "name": "taskid_t", "typedef": "long" },
-		{ "name": "dev_t", "typedef": "unsigned long" },
-		{ "name": "uid_t", "typedef": "unsigned" },
-		{ "name": "gid_t", "typedef": "unsigned" },
-		{ "name": "timestruc_t", "typedef": "struct timespec" },
-		{ "name": "short", "integer": { "length": 2, "signed": true } },
-		{ "name": "projid_t", "typedef": "long" },
-		{ "name": "ushort_t", "typedef": "unsigned short" },
-		{ "name": "poolid_t", "typedef": "long" },
-		{ "name": "uintptr_t", "typedef": "unsigned" },
-		{ "name": "id_t", "typedef": "long" },
-		{ "name": "pid_t", "typedef": "long" },
-		{ "name": "processorid_t", "typedef": "int" },
-		{ "name": "psetid_t", "typedef": "int" },
-		{ "name": "struct lwpsinfo", "struct": [
-			{ "name": "pr_flag", "type": "int" },
-			{ "name": "pr_lwpid", "type": "id_t" },
-			{ "name": "pr_addr", "type": "uintptr_t" },
-			{ "name": "pr_wchan", "type": "uintptr_t" },
-			{ "name": "pr_stype", "type": "char" },
-			{ "name": "pr_state", "type": "char" },
-			{ "name": "pr_sname", "type": "char" },
-			{ "name": "pr_nice", "type": "char" },
-			{ "name": "pr_syscall", "type": "short" },
-			{ "name": "pr_oldpri", "type": "char" },
-			{ "name": "pr_cpu", "type": "char" },
-			{ "name": "pr_pri", "type": "int" },
-			{ "name": "pr_pctcpu", "type": "ushort_t" },
-			{ "name": "pr_pad", "type": "ushort_t" },
-			{ "name": "pr_start", "type": "timestruc_t" },
-			{ "name": "pr_time", "type": "timestruc_t" },
-			{ "name": "pr_clname", "type": "char [8]" },
-			{ "name": "pr_name", "type": "char [16]" },
-			{ "name": "pr_onpro", "type": "processorid_t" },
-			{ "name": "pr_bindpro", "type": "processorid_t" },
-			{ "name": "pr_bindpset", "type": "psetid_t" },
-			{ "name": "pr_lgrp", "type": "int" },
-			{ "name": "pr_filler", "type": "int [4]" }
-		] },
-		{ "name": "lwpsinfo_t", "typedef": "struct lwpsinfo" },
-		{ "name": "struct psinfo", "struct": [
-			{ "name": "pr_flag", "type": "int" },
-			{ "name": "pr_nlwp", "type": "int" },
-			{ "name": "pr_pid", "type": "pid_t" },
-			{ "name": "pr_ppid", "type": "pid_t" },
-			{ "name": "pr_pgid", "type": "pid_t" },
-			{ "name": "pr_sid", "type": "pid_t" },
-			{ "name": "pr_uid", "type": "uid_t" },
-			{ "name": "pr_euid", "type": "uid_t" },
-			{ "name": "pr_gid", "type": "gid_t" },
-			{ "name": "pr_egid", "type": "gid_t" },
-			{ "name": "pr_addr", "type": "uintptr_t" },
-			{ "name": "pr_size", "type": "size_t" },
-			{ "name": "pr_rssize", "type": "size_t" },
-			{ "name": "pr_pad1", "type": "size_t" },
-			{ "name": "pr_ttydev", "type": "dev_t" },
-			{ "name": "pr_pctcpu", "type": "ushort_t" },
-			{ "name": "pr_pctmem", "type": "ushort_t" },
-			{ "name": "pr_start", "type": "timestruc_t" },
-			{ "name": "pr_time", "type": "timestruc_t" },
-			{ "name": "pr_ctime", "type": "timestruc_t" },
-			{ "name": "pr_fname", "type": "char [16]" },
-			{ "name": "pr_psargs", "type": "char [80]" },
-			{ "name": "pr_wstat", "type": "int" },
-			{ "name": "pr_argc", "type": "int" },
-			{ "name": "pr_argv", "type": "uintptr_t" },
-			{ "name": "pr_envp", "type": "uintptr_t" },
-			{ "name": "pr_dmodel", "type": "char" },
-			{ "name": "pr_pad2", "type": "char [3]" },
-			{ "name": "pr_taskid", "type": "taskid_t" },
-			{ "name": "pr_projid", "type": "projid_t" },
-			{ "name": "pr_nzomb", "type": "int" },
-			{ "name": "pr_poolid", "type": "poolid_t" },
-			{ "name": "pr_zoneid", "type": "zoneid_t" },
-			{ "name": "pr_contract", "type": "id_t" },
-			{ "name": "pr_filler", "type": "int [1]" },
-			{ "name": "pr_lwp", "type": "lwpsinfo_t" }
-		] },
-		{ "name": "psinfo_t", "typedef": "struct psinfo" }
-	]
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/struct.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-{ "metadata":
-	{
-		"ctf2json_version": "1.0",
-		"created_at": 1316563648,
-		"derived_from": "/lib/libc.so",
-		"ctf_version": 2,
-		"requested_types": [ "timestruc_t" ]
-	},
-"data":
-	[
-		{ "name": "long", "integer": { "length": 4, "signed": true } },
-		{ "name": "time_t", "typedef": "long" },
-		{ "name": "struct timespec", "struct": [
-			{ "name": "tv_sec", "type": "time_t" },
-			{ "name": "tv_nsec", "type": "long" }
-		] },
-		{ "name": "timestruc_t", "typedef": "struct timespec" }
-	]
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.fail.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,39 +0,0 @@
-/*
- * Test several conditions that should always cause us to throw.
- */
-var mod_assert = require('assert');
-var mod_ctype = require('../../ctype.js');
-
-var cases = [
-{ json: { }, msg: 'bad JSON - no metadata or data' },
-{ json: { metadata: {} }, msg: 'bad JSON - bad metadata section' },
-{ json: { metadata: { 'JSON version': [] } },
-    msg: 'bad JSON - bad JSON version' },
-{ json: { metadata: { 'JSON version': 2 } },
-    msg: 'bad JSON - bad JSON version' },
-{ json: { metadata: { 'JSON version': '100.20' } },
-    msg: 'bad JSON - bad JSON version' },
-{ json: { metadata: { 'JSON version': '1.0' } },
-    msg: 'missing data section' },
-{ json: { metadata: { 'JSON version': '1.0' }, data: 1 },
-    msg: 'invalid data section' },
-{ json: { metadata: { 'JSON version': '1.0' }, data: 1.1 },
-    msg: 'invalid data section' },
-{ json: { metadata: { 'JSON version': '1.0' }, data: '1.1' },
-    msg: 'invalid data section' },
-{ json: { metadata: { 'JSON version': '1.0' }, data: {} },
-    msg: 'invalid data section' }
-];
-
-function test()
-{
-	var ii;
-
-	for (ii = 0; ii < cases.length; ii++) {
-		mod_assert.throws(function () {
-		    mod_ctype.parseCTF(cases[ii].json);
-		}, Error, cases[ii].msg);
-	}
-}
-
-test();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.float.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,14 +0,0 @@
-var mod_fs = require('fs');
-var mod_ctype = require('../../ctype.js');
-var mod_assert = require('assert');
-
-function test()
-{
-	var data, parser;
-
-	data = JSON.parse(mod_fs.readFileSync('./float.json').toString());
-	parser = mod_ctype.parseCTF(data, { endian: 'big' });
-	mod_assert.deepEqual(parser.lstypes(), {});
-}
-
-test();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.int.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,14 +0,0 @@
-var mod_fs = require('fs');
-var mod_ctype = require('../../ctype.js');
-var mod_assert = require('assert');
-
-function test()
-{
-	var data, parser;
-
-	data = JSON.parse(mod_fs.readFileSync('./int.json').toString());
-	parser = mod_ctype.parseCTF(data, { endian: 'big' });
-	mod_assert.deepEqual(parser.lstypes(), { 'int': 'int32_t' });
-}
-
-test();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.psinfo.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,17 +0,0 @@
-var mod_fs = require('fs');
-var mod_ctype = require('../../ctype.js');
-var mod_assert = require('assert');
-
-/*
- * This is too unwieldy to actually write out. Just make sure we can parse it
- * without errors.
- */
-function test()
-{
-	var data;
-
-	data = JSON.parse(mod_fs.readFileSync('./psinfo.json').toString());
-	mod_ctype.parseCTF(data, { endian: 'big' });
-}
-
-test();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.struct.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-var mod_fs = require('fs');
-var mod_ctype = require('../../ctype.js');
-var mod_assert = require('assert');
-
-function test()
-{
-	var data, parser;
-
-	data = JSON.parse(mod_fs.readFileSync('./struct.json').toString());
-	parser = mod_ctype.parseCTF(data, { endian: 'big' });
-	mod_assert.deepEqual(parser.lstypes(), { 'long': 'int32_t',
-	    'time_t': 'long',
-	    'timestruc_t': 'struct timespec',
-	    'struct timespec': [ { 'tv_sec': { 'type': 'time_t' } },
-	        { 'tv_nsec': { 'type': 'long' } } ] });
-}
-
-test();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.typedef.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,15 +0,0 @@
-var mod_fs = require('fs');
-var mod_ctype = require('../../ctype.js');
-var mod_assert = require('assert');
-
-function test()
-{
-	var data, parser;
-
-	data = JSON.parse(mod_fs.readFileSync('./typedef.json').toString());
-	parser = mod_ctype.parseCTF(data, { endian: 'big' });
-	mod_assert.deepEqual(parser.lstypes(), { 'bar_t': 'int',
-	    'int': 'int32_t' });
-}
-
-test();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/typedef.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,14 +0,0 @@
-{ "metadata":
-	{
-		"ctf2json_version": "1.0",
-		"created_at": 1316302348,
-		"derived_from": "/lib/libc.so",
-		"ctf_version": 2,
-		"requested_types": [ "bar_t" ]
-	},
-"data":
-	[
-		{ "name": "int", "integer": { "length": 4, "signed": true } },
-		{ "name": "bar_t", "typedef": "int" }
-	]
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/float/tst.rfloat.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,767 +0,0 @@
-/*
- * Battery of tests to break our floating point implementation. Oh ho ho.
- *
- * There are a few useful ways to generate the expected output. The first is
- * just write a C program and write raw bytes out and inspect with xxd. Remember
- * to consider whether or not you're on a big endian or little endian machine.
- * Another useful site I found to help with some of this was:
- *
- * http://babbage.cs.qc.edu/IEEE-754/
- */
-
-var mod_ctype = require('../../../ctio.js');
-var ASSERT = require('assert');
-
-function testfloat()
-{
-	var buffer = new Buffer(4);
-	/* Start off with some of the easy ones: +zero */
-	buffer[0] = 0;
-	buffer[1] = 0;
-	buffer[2] = 0;
-	buffer[3] = 0;
-
-	ASSERT.equal(0, mod_ctype.rfloat(buffer, 'big', 0));
-	ASSERT.equal(0, mod_ctype.rfloat(buffer, 'little', 0));
-
-	/* Test -0 */
-	buffer[0] = 0x80;
-	ASSERT.equal(0, mod_ctype.rfloat(buffer, 'big', 0));
-	buffer[3] = buffer[0];
-	buffer[0] = 0;
-	ASSERT.equal(0, mod_ctype.rfloat(buffer, 'little', 0));
-
-	/* Catch +infin */
-	buffer[0] = 0x7f;
-	buffer[1] = 0x80;
-	buffer[2] = 0x00;
-	buffer[3] = 0x00;
-	ASSERT.equal(Number.POSITIVE_INFINITY,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-	buffer[3] = 0x7f;
-	buffer[2] = 0x80;
-	buffer[1] = 0x00;
-	buffer[0] = 0x00;
-	ASSERT.equal(Number.POSITIVE_INFINITY,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-
-	/* Catch -infin */
-	buffer[0] = 0xff;
-	buffer[1] = 0x80;
-	buffer[2] = 0x00;
-	buffer[3] = 0x00;
-	ASSERT.equal(Number.NEGATIVE_INFINITY,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-	buffer[3] = 0xff;
-	buffer[2] = 0x80;
-	buffer[1] = 0x00;
-	buffer[0] = 0x00;
-	ASSERT.equal(Number.NEGATIVE_INFINITY,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-
-	/* Catch NaN */
-
-	buffer[0] = 0x7f;
-	buffer[1] = 0x80;
-	buffer[2] = 0x00;
-	buffer[3] = 0x23;
-	ASSERT.ok(isNaN(mod_ctype.rfloat(buffer, 'big', 0)));
-	buffer[3] = 0x7f;
-	buffer[2] = 0x80;
-	buffer[1] = 0x00;
-	buffer[0] = 0x23;
-	ASSERT.ok(isNaN(mod_ctype.rfloat(buffer, 'little', 0)));
-
-	/* Catch -infin */
-	buffer[0] = 0xff;
-	buffer[1] = 0x80;
-	buffer[2] = 0x00;
-	buffer[3] = 0x23;
-	ASSERT.ok(isNaN(mod_ctype.rfloat(buffer, 'big', 0)));
-	buffer[3] = 0xff;
-	buffer[2] = 0x80;
-	buffer[1] = 0x00;
-	buffer[0] = 0x23;
-	ASSERT.ok(isNaN(mod_ctype.rfloat(buffer, 'little', 0)));
-
-	/* On to some basic tests */
-	/* 1.125 */
-	buffer[0] = 0x3f;
-	buffer[1] = 0x90;
-	buffer[2] = 0x00;
-	buffer[3] = 0x00;
-	ASSERT.equal(1.125, mod_ctype.rfloat(buffer, 'big', 0));
-
-	buffer[3] = 0x3f;
-	buffer[2] = 0x90;
-	buffer[1] = 0x00;
-	buffer[0] = 0x00;
-	ASSERT.equal(1.125, mod_ctype.rfloat(buffer, 'little', 0));
-
-	/* ff34a2b0 -2.4010576103645774e+38 */
-	buffer[0] = 0xff;
-	buffer[1] = 0x34;
-	buffer[2] = 0xa2;
-	buffer[3] = 0xb0;
-	ASSERT.equal(-2.4010576103645774e+38,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-
-	buffer[3] = 0xff;
-	buffer[2] = 0x34;
-	buffer[1] = 0xa2;
-	buffer[0] = 0xb0;
-	ASSERT.equal(-2.4010576103645774e+38,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-
-	/* Denormalized tests */
-
-	/* 0003f89a +/- 3.6468792534053364e-40 */
-	buffer[0] = 0x00;
-	buffer[1] = 0x03;
-	buffer[2] = 0xf8;
-	buffer[3] = 0x9a;
-	ASSERT.equal(3.6468792534053364e-40,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-
-	buffer[3] = 0x00;
-	buffer[2] = 0x03;
-	buffer[1] = 0xf8;
-	buffer[0] = 0x9a;
-	ASSERT.equal(3.6468792534053364e-40,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-
-	buffer[0] = 0x80;
-	buffer[1] = 0x03;
-	buffer[2] = 0xf8;
-	buffer[3] = 0x9a;
-	ASSERT.equal(-3.6468792534053364e-40,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-
-	buffer[3] = 0x80;
-	buffer[2] = 0x03;
-	buffer[1] = 0xf8;
-	buffer[0] = 0x9a;
-	ASSERT.equal(-3.6468792534053364e-40,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-
-
-	/* Maximum and minimum normalized and denormalized values */
-
-	/* Largest normalized number +/- 3.4028234663852886e+38 */
-
-	buffer[0] = 0x7f;
-	buffer[1] = 0x7f;
-	buffer[2] = 0xff;
-	buffer[3] = 0xff;
-	ASSERT.equal(3.4028234663852886e+38,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-
-	buffer[3] = 0x7f;
-	buffer[2] = 0x7f;
-	buffer[1] = 0xff;
-	buffer[0] = 0xff;
-	ASSERT.equal(3.4028234663852886e+38,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-
-	buffer[0] = 0xff;
-	buffer[1] = 0x7f;
-	buffer[2] = 0xff;
-	buffer[3] = 0xff;
-	ASSERT.equal(-3.4028234663852886e+38,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-
-	buffer[3] = 0xff;
-	buffer[2] = 0x7f;
-	buffer[1] = 0xff;
-	buffer[0] = 0xff;
-	ASSERT.equal(-3.4028234663852886e+38,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-
-	/* Smallest normalized number +/- 1.1754943508222875e-38 */
-	buffer[0] = 0x00;
-	buffer[1] = 0x80;
-	buffer[2] = 0x00;
-	buffer[3] = 0x00;
-	ASSERT.equal(1.1754943508222875e-38,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-
-	buffer[3] = 0x00;
-	buffer[2] = 0x80;
-	buffer[1] = 0x00;
-	buffer[0] = 0x00;
-	ASSERT.equal(1.1754943508222875e-38,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-
-	buffer[0] = 0x80;
-	buffer[1] = 0x80;
-	buffer[2] = 0x00;
-	buffer[3] = 0x00;
-	ASSERT.equal(-1.1754943508222875e-38,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-
-	buffer[3] = 0x80;
-	buffer[2] = 0x80;
-	buffer[1] = 0x00;
-	buffer[0] = 0x00;
-	ASSERT.equal(-1.1754943508222875e-38,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-
-
-	/* Smallest denormalized number 1.401298464324817e-45 */
-	buffer[0] = 0x00;
-	buffer[1] = 0x00;
-	buffer[2] = 0x00;
-	buffer[3] = 0x01;
-	ASSERT.equal(1.401298464324817e-45,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-
-	buffer[3] = 0x00;
-	buffer[2] = 0x00;
-	buffer[1] = 0x00;
-	buffer[0] = 0x01;
-	ASSERT.equal(1.401298464324817e-45,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-
-	buffer[0] = 0x80;
-	buffer[1] = 0x00;
-	buffer[2] = 0x00;
-	buffer[3] = 0x01;
-	ASSERT.equal(-1.401298464324817e-45,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-
-	buffer[3] = 0x80;
-	buffer[2] = 0x00;
-	buffer[1] = 0x00;
-	buffer[0] = 0x01;
-	ASSERT.equal(-1.401298464324817e-45,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-
-	/* Largest denormalized value +/- 1.1754942106924411e-38 */
-	buffer[0] = 0x00;
-	buffer[1] = 0x7f;
-	buffer[2] = 0xff;
-	buffer[3] = 0xff;
-	ASSERT.equal(1.1754942106924411e-38,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-
-	buffer[3] = 0x00;
-	buffer[2] = 0x7f;
-	buffer[1] = 0xff;
-	buffer[0] = 0xff;
-	ASSERT.equal(1.1754942106924411e-38,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-
-	buffer[0] = 0x80;
-	buffer[1] = 0x7f;
-	buffer[2] = 0xff;
-	buffer[3] = 0xff;
-	ASSERT.equal(-1.1754942106924411e-38,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-
-	buffer[3] = 0x80;
-	buffer[2] = 0x7f;
-	buffer[1] = 0xff;
-	buffer[0] = 0xff;
-	ASSERT.equal(-1.1754942106924411e-38,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-
-	/* Do some quick offset testing */
-	buffer = new Buffer(6);
-	buffer[0] = 0x7f;
-	buffer[1] = 0x4e;
-	buffer[2] = 0x8a;
-	buffer[3] = 0x79;
-	buffer[4] = 0xcd;
-	buffer[5] = 0x3f;
-
-	ASSERT.equal(2.745399582697325e+38,
-	    mod_ctype.rfloat(buffer, 'big', 0));
-	ASSERT.equal(1161619072,
-	    mod_ctype.rfloat(buffer, 'big', 1));
-	ASSERT.equal(-1.2027516403607578e-32,
-	    mod_ctype.rfloat(buffer, 'big', 2));
-
-	ASSERT.equal(8.97661320504413e+34,
-	    mod_ctype.rfloat(buffer, 'little', 0));
-	ASSERT.equal(-261661920,
-	    mod_ctype.rfloat(buffer, 'little', 1));
-	ASSERT.equal(1.605271577835083,
-	    mod_ctype.rfloat(buffer, 'little', 2));
-
-}
-
-function testdouble()
-{
-	var buffer = new Buffer(10);
-
-	/* Check 0 */
-	buffer[0] = 0;
-	buffer[1] = 0;
-	buffer[2] = 0;
-	buffer[3] = 0;
-	buffer[4] = 0;
-	buffer[5] = 0;
-	buffer[6] = 0;
-	buffer[7] = 0;
-	ASSERT.equal(0,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-	ASSERT.equal(0,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	buffer[0] = 0x80;
-	buffer[1] = 0;
-	buffer[2] = 0;
-	buffer[3] = 0;
-	buffer[4] = 0;
-	buffer[5] = 0;
-	buffer[6] = 0;
-	buffer[7] = 0;
-	ASSERT.equal(0,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-	buffer[7] = 0x80;
-	buffer[6] = 0;
-	buffer[5] = 0;
-	buffer[4] = 0;
-	buffer[3] = 0;
-	buffer[2] = 0;
-	buffer[1] = 0;
-	buffer[0] = 0;
-	ASSERT.equal(0,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	/* Check NaN */
-	buffer[0] = 0x7f;
-	buffer[1] = 0xf0;
-	buffer[2] = 0;
-	buffer[3] = 0;
-	buffer[4] = 0;
-	buffer[5] = 0;
-	buffer[6] = 0;
-	buffer[7] = 23;
-	ASSERT.ok(isNaN(mod_ctype.rdouble(buffer, 'big', 0)));
-
-	buffer[7] = 0x7f;
-	buffer[6] = 0xf0;
-	buffer[5] = 0;
-	buffer[4] = 0;
-	buffer[3] = 0;
-	buffer[2] = 0;
-	buffer[1] = 0;
-	buffer[0] = 23;
-	ASSERT.ok(isNaN(mod_ctype.rdouble(buffer, 'little', 0)));
-
-	buffer[0] = 0xff;
-	buffer[1] = 0xf0;
-	buffer[2] = 0;
-	buffer[3] = 0;
-	buffer[4] = 0;
-	buffer[5] = 0;
-	buffer[6] = 0;
-	buffer[7] = 23;
-	ASSERT.ok(isNaN(mod_ctype.rdouble(buffer, 'big', 0)));
-
-	buffer[7] = 0xff;
-	buffer[6] = 0xf0;
-	buffer[5] = 0;
-	buffer[4] = 0;
-	buffer[3] = 0;
-	buffer[2] = 0;
-	buffer[1] = 0;
-	buffer[0] = 23;
-	ASSERT.ok(isNaN(mod_ctype.rdouble(buffer, 'little', 0)));
-
-	/* pos inf */
-	buffer[0] = 0x7f;
-	buffer[1] = 0xf0;
-	buffer[2] = 0;
-	buffer[3] = 0;
-	buffer[4] = 0;
-	buffer[5] = 0;
-	buffer[6] = 0;
-	buffer[7] = 0;
-	ASSERT.equal(Number.POSITIVE_INFINITY,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0x7f;
-	buffer[6] = 0xf0;
-	buffer[5] = 0;
-	buffer[4] = 0;
-	buffer[3] = 0;
-	buffer[2] = 0;
-	buffer[1] = 0;
-	buffer[0] = 0;
-	ASSERT.equal(Number.POSITIVE_INFINITY,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	/* neg inf */
-	buffer[0] = 0xff;
-	buffer[1] = 0xf0;
-	buffer[2] = 0;
-	buffer[3] = 0;
-	buffer[4] = 0;
-	buffer[5] = 0;
-	buffer[6] = 0;
-	buffer[7] = 0;
-	ASSERT.equal(Number.NEGATIVE_INFINITY,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0xff;
-	buffer[6] = 0xf0;
-	buffer[5] = 0;
-	buffer[4] = 0;
-	buffer[3] = 0;
-	buffer[2] = 0;
-	buffer[1] = 0;
-	buffer[0] = 0;
-	ASSERT.equal(Number.NEGATIVE_INFINITY,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	/* Simple normalized values */
-
-	/* +/- 1.125 */
-	buffer[0] = 0x3f;
-	buffer[1] = 0xf2;
-	buffer[2] = 0;
-	buffer[3] = 0;
-	buffer[4] = 0;
-	buffer[5] = 0;
-	buffer[6] = 0;
-	buffer[7] = 0;
-	ASSERT.equal(1.125,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0x3f;
-	buffer[6] = 0xf2;
-	buffer[5] = 0;
-	buffer[4] = 0;
-	buffer[3] = 0;
-	buffer[2] = 0;
-	buffer[1] = 0;
-	buffer[0] = 0;
-	ASSERT.equal(1.125,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	buffer[0] = 0xbf;
-	buffer[1] = 0xf2;
-	buffer[2] = 0;
-	buffer[3] = 0;
-	buffer[4] = 0;
-	buffer[5] = 0;
-	buffer[6] = 0;
-	buffer[7] = 0;
-	ASSERT.equal(-1.125,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0xbf;
-	buffer[6] = 0xf2;
-	buffer[5] = 0;
-	buffer[4] = 0;
-	buffer[3] = 0;
-	buffer[2] = 0;
-	buffer[1] = 0;
-	buffer[0] = 0;
-	ASSERT.equal(-1.125,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	/* +/- 1.4397318913736026e+283 */
-	buffer[0] = 0x7a;
-	buffer[1] = 0xb8;
-	buffer[2] = 0xc9;
-	buffer[3] = 0x34;
-	buffer[4] = 0x72;
-	buffer[5] = 0x16;
-	buffer[6] = 0xf9;
-	buffer[7] = 0x0e;
-	ASSERT.equal(1.4397318913736026e+283,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0x7a;
-	buffer[6] = 0xb8;
-	buffer[5] = 0xc9;
-	buffer[4] = 0x34;
-	buffer[3] = 0x72;
-	buffer[2] = 0x16;
-	buffer[1] = 0xf9;
-	buffer[0] = 0x0e;
-	ASSERT.equal(1.4397318913736026e+283,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	buffer[0] = 0xfa;
-	buffer[1] = 0xb8;
-	buffer[2] = 0xc9;
-	buffer[3] = 0x34;
-	buffer[4] = 0x72;
-	buffer[5] = 0x16;
-	buffer[6] = 0xf9;
-	buffer[7] = 0x0e;
-	ASSERT.equal(-1.4397318913736026e+283,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0xfa;
-	buffer[6] = 0xb8;
-	buffer[5] = 0xc9;
-	buffer[4] = 0x34;
-	buffer[3] = 0x72;
-	buffer[2] = 0x16;
-	buffer[1] = 0xf9;
-	buffer[0] = 0x0e;
-	ASSERT.equal(-1.4397318913736026e+283,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	/* Denormalized values */
-	/* +/- 8.82521232268344e-309 */
-	buffer[0] = 0x00;
-	buffer[1] = 0x06;
-	buffer[2] = 0x58;
-	buffer[3] = 0x94;
-	buffer[4] = 0x13;
-	buffer[5] = 0x27;
-	buffer[6] = 0x8a;
-	buffer[7] = 0xcd;
-	ASSERT.equal(8.82521232268344e-309,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0x00;
-	buffer[6] = 0x06;
-	buffer[5] = 0x58;
-	buffer[4] = 0x94;
-	buffer[3] = 0x13;
-	buffer[2] = 0x27;
-	buffer[1] = 0x8a;
-	buffer[0] = 0xcd;
-	ASSERT.equal(8.82521232268344e-309,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	buffer[0] = 0x80;
-	buffer[1] = 0x06;
-	buffer[2] = 0x58;
-	buffer[3] = 0x94;
-	buffer[4] = 0x13;
-	buffer[5] = 0x27;
-	buffer[6] = 0x8a;
-	buffer[7] = 0xcd;
-	ASSERT.equal(-8.82521232268344e-309,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0x80;
-	buffer[6] = 0x06;
-	buffer[5] = 0x58;
-	buffer[4] = 0x94;
-	buffer[3] = 0x13;
-	buffer[2] = 0x27;
-	buffer[1] = 0x8a;
-	buffer[0] = 0xcd;
-	ASSERT.equal(-8.82521232268344e-309,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	/* Edge cases, maximum and minimum values */
-
-	/* Smallest denormalized value 5e-324 */
-	buffer[0] = 0x00;
-	buffer[1] = 0x00;
-	buffer[2] = 0x00;
-	buffer[3] = 0x00;
-	buffer[4] = 0x00;
-	buffer[5] = 0x00;
-	buffer[6] = 0x00;
-	buffer[7] = 0x01;
-	ASSERT.equal(5e-324,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0x00;
-	buffer[6] = 0x00;
-	buffer[5] = 0x00;
-	buffer[4] = 0x00;
-	buffer[3] = 0x00;
-	buffer[2] = 0x00;
-	buffer[1] = 0x00;
-	buffer[0] = 0x01;
-	ASSERT.equal(5e-324,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	buffer[0] = 0x80;
-	buffer[1] = 0x00;
-	buffer[2] = 0x00;
-	buffer[3] = 0x00;
-	buffer[4] = 0x00;
-	buffer[5] = 0x00;
-	buffer[6] = 0x00;
-	buffer[7] = 0x01;
-	ASSERT.equal(-5e-324,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0x80;
-	buffer[6] = 0x00;
-	buffer[5] = 0x00;
-	buffer[4] = 0x00;
-	buffer[3] = 0x00;
-	buffer[2] = 0x00;
-	buffer[1] = 0x00;
-	buffer[0] = 0x01;
-	ASSERT.equal(-5e-324,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	/* Largest denormalized value 2.225073858507201e-308 */
-	buffer[0] = 0x00;
-	buffer[1] = 0x0f;
-	buffer[2] = 0xff;
-	buffer[3] = 0xff;
-	buffer[4] = 0xff;
-	buffer[5] = 0xff;
-	buffer[6] = 0xff;
-	buffer[7] = 0xff;
-	ASSERT.equal(2.225073858507201e-308,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0x00;
-	buffer[6] = 0x0f;
-	buffer[5] = 0xff;
-	buffer[4] = 0xff;
-	buffer[3] = 0xff;
-	buffer[2] = 0xff;
-	buffer[1] = 0xff;
-	buffer[0] = 0xff;
-	ASSERT.equal(2.225073858507201e-308,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	buffer[0] = 0x80;
-	buffer[1] = 0x0f;
-	buffer[2] = 0xff;
-	buffer[3] = 0xff;
-	buffer[4] = 0xff;
-	buffer[5] = 0xff;
-	buffer[6] = 0xff;
-	buffer[7] = 0xff;
-	ASSERT.equal(-2.225073858507201e-308,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0x80;
-	buffer[6] = 0x0f;
-	buffer[5] = 0xff;
-	buffer[4] = 0xff;
-	buffer[3] = 0xff;
-	buffer[2] = 0xff;
-	buffer[1] = 0xff;
-	buffer[0] = 0xff;
-	ASSERT.equal(-2.225073858507201e-308,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	/* Smallest normalized value 2.2250738585072014e-308 */
-	buffer[0] = 0x00;
-	buffer[1] = 0x10;
-	buffer[2] = 0x00;
-	buffer[3] = 0x00;
-	buffer[4] = 0x00;
-	buffer[5] = 0x00;
-	buffer[6] = 0x00;
-	buffer[7] = 0x00;
-	ASSERT.equal(2.2250738585072014e-308,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0x00;
-	buffer[6] = 0x10;
-	buffer[5] = 0x00;
-	buffer[4] = 0x00;
-	buffer[3] = 0x00;
-	buffer[2] = 0x00;
-	buffer[1] = 0x00;
-	buffer[0] = 0x00;
-	ASSERT.equal(2.2250738585072014e-308,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	buffer[0] = 0x80;
-	buffer[1] = 0x10;
-	buffer[2] = 0x00;
-	buffer[3] = 0x00;
-	buffer[4] = 0x00;
-	buffer[5] = 0x00;
-	buffer[6] = 0x00;
-	buffer[7] = 0x00;
-	ASSERT.equal(-2.2250738585072014e-308,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0x80;
-	buffer[6] = 0x10;
-	buffer[5] = 0x00;
-	buffer[4] = 0x00;
-	buffer[3] = 0x00;
-	buffer[2] = 0x00;
-	buffer[1] = 0x00;
-	buffer[0] = 0x00;
-	ASSERT.equal(-2.2250738585072014e-308,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	/* Largest normalized value 1.7976931348623157e+308 */
-	buffer[0] = 0x7f;
-	buffer[1] = 0xef;
-	buffer[2] = 0xff;
-	buffer[3] = 0xff;
-	buffer[4] = 0xff;
-	buffer[5] = 0xff;
-	buffer[6] = 0xff;
-	buffer[7] = 0xff;
-	ASSERT.equal(1.7976931348623157e+308,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0x7f;
-	buffer[6] = 0xef;
-	buffer[5] = 0xff;
-	buffer[4] = 0xff;
-	buffer[3] = 0xff;
-	buffer[2] = 0xff;
-	buffer[1] = 0xff;
-	buffer[0] = 0xff;
-	ASSERT.equal(1.7976931348623157e+308,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	buffer[0] = 0xff;
-	buffer[1] = 0xef;
-	buffer[2] = 0xff;
-	buffer[3] = 0xff;
-	buffer[4] = 0xff;
-	buffer[5] = 0xff;
-	buffer[6] = 0xff;
-	buffer[7] = 0xff;
-	ASSERT.equal(-1.7976931348623157e+308,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-
-	buffer[7] = 0xff;
-	buffer[6] = 0xef;
-	buffer[5] = 0xff;
-	buffer[4] = 0xff;
-	buffer[3] = 0xff;
-	buffer[2] = 0xff;
-	buffer[1] = 0xff;
-	buffer[0] = 0xff;
-	ASSERT.equal(-1.7976931348623157e+308,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-
-	/* Try offsets */
-	buffer[0] = 0xde;
-	buffer[1] = 0xad;
-	buffer[2] = 0xbe;
-	buffer[3] = 0xef;
-	buffer[4] = 0xba;
-	buffer[5] = 0xdd;
-	buffer[6] = 0xca;
-	buffer[7] = 0xfe;
-	buffer[8] = 0x16;
-	buffer[9] = 0x79;
-
-	ASSERT.equal(-1.1885958404126936e+148,
-	    mod_ctype.rdouble(buffer, 'big', 0));
-	ASSERT.equal(-2.4299184080448593e-88,
-	    mod_ctype.rdouble(buffer, 'big', 1));
-	ASSERT.equal(-0.000015130017658081283,
-	    mod_ctype.rdouble(buffer, 'big', 2));
-
-	ASSERT.equal(-5.757458694845505e+302,
-	    mod_ctype.rdouble(buffer, 'little', 0));
-	ASSERT.equal(6.436459604192476e-198,
-	    mod_ctype.rdouble(buffer, 'little', 1));
-	ASSERT.equal(1.9903745632417286e+275,
-	    mod_ctype.rdouble(buffer, 'little', 2));
-}
-
-testfloat();
-testdouble();
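Every case in tst.rfloat.js is hand-decoding an IEEE-754 bit pattern: sign bit, 8-bit biased exponent, 23-bit mantissa, with all-ones exponent reserved for infinities and NaN and a zero exponent marking denormalized values. A compact sketch of that decoding for big-endian input (a hypothetical `decodeFloatBE` helper, not the library's actual `rfloat` implementation) reproduces the expected values in the tests above:

```javascript
/*
 * Decode 4 big-endian bytes as an IEEE-754 single-precision float.
 * Sketch only; mod_ctype.rfloat handles both endians and offsets.
 */
function decodeFloatBE(bytes) {
	var bits = ((bytes[0] << 24) >>> 0) + (bytes[1] << 16) +
	    (bytes[2] << 8) + bytes[3];
	var sign = (bits >>> 31) ? -1 : 1;
	var exp = (bits >>> 23) & 0xff;
	var frac = bits & 0x7fffff;

	if (exp === 0xff)
		return (frac !== 0 ? NaN : sign * Infinity);
	if (exp === 0)
		return (sign * frac * Math.pow(2, -149)); /* denormalized */
	return (sign * (1 + frac * Math.pow(2, -23)) *
	    Math.pow(2, exp - 127));
}

console.log(decodeFloatBE([ 0x3f, 0x90, 0x00, 0x00 ])); // prints: 1.125
```

With this helper, `[0x00, 0x00, 0x00, 0x01]` yields the smallest denormal 1.401298464324817e-45 and `[0x7f, 0x80, 0x00, 0x23]` yields NaN, matching the assertions in the test file.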
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/float/tst.wfloat.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,753 +0,0 @@
-/*
- * Another place to find bugs that may yet plague us, this time with writing
- * floats out to arrays. These tests are essentially the inverse of the ones
- * that read values back in.
- */
-
-var mod_ctype = require('../../../ctio.js');
-var ASSERT = require('assert');
-
-
-/*
- *	A useful thing to keep around for debugging
- *	console.log('buffer[0]: ' + buffer[0].toString(16));
- *	console.log('buffer[1]: ' + buffer[1].toString(16));
- *	console.log('buffer[2]: ' + buffer[2].toString(16));
- *	console.log('buffer[3]: ' + buffer[3].toString(16));
- *	console.log('buffer[4]: ' + buffer[4].toString(16));
- *	console.log('buffer[5]: ' + buffer[5].toString(16));
- *	console.log('buffer[6]: ' + buffer[6].toString(16));
- *	console.log('buffer[7]: ' + buffer[7].toString(16));
- */
-
-function testfloat()
-{
-	var buffer = new Buffer(4);
-	mod_ctype.wfloat(0, 'big', buffer, 0);
-	/* Start off with some of the easy ones: +zero */
-	ASSERT.equal(0, buffer[0]);
-	ASSERT.equal(0, buffer[1]);
-	ASSERT.equal(0, buffer[2]);
-	ASSERT.equal(0, buffer[3]);
-	mod_ctype.wfloat(0, 'little', buffer, 0);
-	ASSERT.equal(0, buffer[0]);
-	ASSERT.equal(0, buffer[1]);
-	ASSERT.equal(0, buffer[2]);
-	ASSERT.equal(0, buffer[3]);
-
-	/* Catch +infin */
-	mod_ctype.wfloat(Number.POSITIVE_INFINITY, 'big', buffer, 0);
-	ASSERT.equal(0x7f, buffer[0]);
-	ASSERT.equal(0x80, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	mod_ctype.wfloat(Number.POSITIVE_INFINITY, 'little', buffer, 0);
-	ASSERT.equal(0x7f, buffer[3]);
-	ASSERT.equal(0x80, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[0]);
-
-	/* Catch -infin */
-	mod_ctype.wfloat(Number.NEGATIVE_INFINITY, 'big', buffer, 0);
-	ASSERT.equal(0xff, buffer[0]);
-	ASSERT.equal(0x80, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	mod_ctype.wfloat(Number.NEGATIVE_INFINITY, 'little', buffer, 0);
-	ASSERT.equal(0xff, buffer[3]);
-	ASSERT.equal(0x80, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[0]);
-
-	/* Catch NaN */
-
-	/*
-	 * NaN is a little weird in its requirements, so we're going to encode a
-	 * bit of how we actually implement it into this test. Probably not the
-	 * best, since technically the sign is a don't care and the mantissa
-	 * needs to just be non-zero.
-	 */
-	mod_ctype.wfloat(NaN, 'big', buffer, 0);
-	ASSERT.equal(0x7f, buffer[0]);
-	ASSERT.equal(0x80, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x17, buffer[3]);
-	mod_ctype.wfloat(NaN, 'little', buffer, 0);
-	ASSERT.equal(0x7f, buffer[3]);
-	ASSERT.equal(0x80, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x17, buffer[0]);
-
-	/* On to some basic tests */
-	/* 1.125 */
-	mod_ctype.wfloat(1.125, 'big', buffer, 0);
-	ASSERT.equal(0x3f, buffer[0]);
-	ASSERT.equal(0x90, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	mod_ctype.wfloat(1.125, 'little', buffer, 0);
-	ASSERT.equal(0x3f, buffer[3]);
-	ASSERT.equal(0x90, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[0]);
-
-	mod_ctype.wfloat(1.0000001192092896, 'big', buffer, 0);
-	ASSERT.equal(0x3f, buffer[0]);
-	ASSERT.equal(0x80, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x01, buffer[3]);
-	mod_ctype.wfloat(1.0000001192092896, 'little', buffer, 0);
-	ASSERT.equal(0x3f, buffer[3]);
-	ASSERT.equal(0x80, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x01, buffer[0]);
-
-	mod_ctype.wfloat(2.3283067140944524e-10, 'big', buffer, 0);
-	ASSERT.equal(0x2f, buffer[0]);
-	ASSERT.equal(0x80, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x01, buffer[3]);
-	mod_ctype.wfloat(2.3283067140944524e-10, 'little', buffer, 0);
-	ASSERT.equal(0x2f, buffer[3]);
-	ASSERT.equal(0x80, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x01, buffer[0]);
-
-	/* ff34a2b0 -2.4010576103645774e+38 */
-	mod_ctype.wfloat(-2.4010576103645774e+38,
-	    'big', buffer, 0);
-	ASSERT.equal(0xff, buffer[0]);
-	ASSERT.equal(0x34, buffer[1]);
-	ASSERT.equal(0xa2, buffer[2]);
-	ASSERT.equal(0xb0, buffer[3]);
-	mod_ctype.wfloat(-2.4010576103645774e+38,
-	    'little', buffer, 0);
-	ASSERT.equal(0xff, buffer[3]);
-	ASSERT.equal(0x34, buffer[2]);
-	ASSERT.equal(0xa2, buffer[1]);
-	ASSERT.equal(0xb0, buffer[0]);
-
-	/* Denormalized tests */
-
-	/* 0003f89a +/- 3.6468792534053364e-40 */
-	mod_ctype.wfloat(3.6468792534053364e-40,
-	    'big', buffer, 0);
-	ASSERT.equal(0x00, buffer[0]);
-	ASSERT.equal(0x03, buffer[1]);
-	ASSERT.equal(0xf8, buffer[2]);
-	ASSERT.equal(0x9a, buffer[3]);
-	mod_ctype.wfloat(3.6468792534053364e-40,
-	    'little', buffer, 0);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x03, buffer[2]);
-	ASSERT.equal(0xf8, buffer[1]);
-	ASSERT.equal(0x9a, buffer[0]);
-
-	mod_ctype.wfloat(-3.6468792534053364e-40,
-	    'big', buffer, 0);
-	ASSERT.equal(0x80, buffer[0]);
-	ASSERT.equal(0x03, buffer[1]);
-	ASSERT.equal(0xf8, buffer[2]);
-	ASSERT.equal(0x9a, buffer[3]);
-	mod_ctype.wfloat(-3.6468792534053364e-40,
-	    'little', buffer, 0);
-	ASSERT.equal(0x80, buffer[3]);
-	ASSERT.equal(0x03, buffer[2]);
-	ASSERT.equal(0xf8, buffer[1]);
-	ASSERT.equal(0x9a, buffer[0]);
-
-	/* Maximum and minimum normalized and denormalized values */
-
-	/* Largest normalized number +/- 3.4028234663852886e+38 */
-
-	mod_ctype.wfloat(3.4028234663852886e+38,
-	    'big', buffer, 0);
-	ASSERT.equal(0x7f, buffer[0]);
-	ASSERT.equal(0x7f, buffer[1]);
-	ASSERT.equal(0xff, buffer[2]);
-	ASSERT.equal(0xff, buffer[3]);
-	mod_ctype.wfloat(3.4028234663852886e+38,
-	    'little', buffer, 0);
-	ASSERT.equal(0x7f, buffer[3]);
-	ASSERT.equal(0x7f, buffer[2]);
-	ASSERT.equal(0xff, buffer[1]);
-	ASSERT.equal(0xff, buffer[0]);
-
-	mod_ctype.wfloat(-3.4028234663852886e+38,
-	    'big', buffer, 0);
-	ASSERT.equal(0xff, buffer[0]);
-	ASSERT.equal(0x7f, buffer[1]);
-	ASSERT.equal(0xff, buffer[2]);
-	ASSERT.equal(0xff, buffer[3]);
-	mod_ctype.wfloat(-3.4028234663852886e+38,
-	    'little', buffer, 0);
-	ASSERT.equal(0xff, buffer[3]);
-	ASSERT.equal(0x7f, buffer[2]);
-	ASSERT.equal(0xff, buffer[1]);
-	ASSERT.equal(0xff, buffer[0]);
-
-	/* Smallest normalized number +/- 1.1754943508222875e-38 */
-
-	mod_ctype.wfloat(1.1754943508222875e-38,
-	    'big', buffer, 0);
-	ASSERT.equal(0x00, buffer[0]);
-	ASSERT.equal(0x80, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	mod_ctype.wfloat(1.1754943508222875e-38,
-	    'little', buffer, 0);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x80, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[0]);
-
-	mod_ctype.wfloat(-1.1754943508222875e-38,
-	    'big', buffer, 0);
-	ASSERT.equal(0x80, buffer[0]);
-	ASSERT.equal(0x80, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	mod_ctype.wfloat(-1.1754943508222875e-38,
-	    'little', buffer, 0);
-	ASSERT.equal(0x80, buffer[3]);
-	ASSERT.equal(0x80, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[0]);
-
-	/* Smallest denormalized number 1.401298464324817e-45 */
-	mod_ctype.wfloat(1.401298464324817e-45,
-	    'big', buffer, 0);
-	ASSERT.equal(0x00, buffer[0]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x01, buffer[3]);
-	mod_ctype.wfloat(1.401298464324817e-45,
-	    'little', buffer, 0);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x01, buffer[0]);
-
-	mod_ctype.wfloat(-1.401298464324817e-45,
-	    'big', buffer, 0);
-	ASSERT.equal(0x80, buffer[0]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x01, buffer[3]);
-	mod_ctype.wfloat(-1.401298464324817e-45,
-	    'little', buffer, 0);
-	ASSERT.equal(0x80, buffer[3]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x01, buffer[0]);
-
-	/* Largest denormalized value +/- 1.1754942106924411e-38 */
-
-	mod_ctype.wfloat(1.1754942106924411e-38,
-	    'big', buffer, 0);
-	ASSERT.equal(0x00, buffer[0]);
-	ASSERT.equal(0x7f, buffer[1]);
-	ASSERT.equal(0xff, buffer[2]);
-	ASSERT.equal(0xff, buffer[3]);
-	mod_ctype.wfloat(1.1754942106924411e-38,
-	    'little', buffer, 0);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x7f, buffer[2]);
-	ASSERT.equal(0xff, buffer[1]);
-	ASSERT.equal(0xff, buffer[0]);
-
-	mod_ctype.wfloat(-1.1754942106924411e-38,
-	    'big', buffer, 0);
-	ASSERT.equal(0x80, buffer[0]);
-	ASSERT.equal(0x7f, buffer[1]);
-	ASSERT.equal(0xff, buffer[2]);
-	ASSERT.equal(0xff, buffer[3]);
-	mod_ctype.wfloat(-1.1754942106924411e-38,
-	    'little', buffer, 0);
-	ASSERT.equal(0x80, buffer[3]);
-	ASSERT.equal(0x7f, buffer[2]);
-	ASSERT.equal(0xff, buffer[1]);
-	ASSERT.equal(0xff, buffer[0]);
-
-	/* Do some quick offset testing */
-	buffer = new Buffer(6);
-	mod_ctype.wfloat(-1.2027516403607578e-32,
-	    'big', buffer, 2);
-	ASSERT.equal(0x8a, buffer[2]);
-	ASSERT.equal(0x79, buffer[3]);
-	ASSERT.equal(0xcd, buffer[4]);
-	ASSERT.equal(0x3f, buffer[5]);
-
-	mod_ctype.wfloat(-1.2027516403607578e-32,
-	    'little', buffer, 2);
-	ASSERT.equal(0x8a, buffer[5]);
-	ASSERT.equal(0x79, buffer[4]);
-	ASSERT.equal(0xcd, buffer[3]);
-	ASSERT.equal(0x3f, buffer[2]);
-
-}
-
-function testdouble()
-{
-	var buffer = new Buffer(10);
-
-	/* Check 0 */
-	mod_ctype.wdouble(0, 'big', buffer, 0);
-	ASSERT.equal(0x00, buffer[0]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x00, buffer[7]);
-	mod_ctype.wdouble(0, 'little', buffer, 0);
-	ASSERT.equal(0x00, buffer[7]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[0]);
-
-	/* Check NaN */
-	/* Similar to floats, we are only generating a subset of values */
-	mod_ctype.wdouble(NaN, 'big', buffer, 0);
-	ASSERT.equal(0x7f, buffer[0]);
-	ASSERT.equal(0xf0, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x17, buffer[7]);
-	mod_ctype.wdouble(NaN, 'little', buffer, 0);
-	ASSERT.equal(0x7f, buffer[7]);
-	ASSERT.equal(0xf0, buffer[6]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x17, buffer[0]);
-
-	/* pos inf */
-	mod_ctype.wdouble(Number.POSITIVE_INFINITY,
-	    'big', buffer, 0);
-	ASSERT.equal(0x7f, buffer[0]);
-	ASSERT.equal(0xf0, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x00, buffer[7]);
-	mod_ctype.wdouble(Number.POSITIVE_INFINITY,
-	    'little', buffer, 0);
-	ASSERT.equal(0x7f, buffer[7]);
-	ASSERT.equal(0xf0, buffer[6]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[0]);
-
-	/* neg inf */
-	mod_ctype.wdouble(Number.NEGATIVE_INFINITY,
-	    'big', buffer, 0);
-	ASSERT.equal(0xff, buffer[0]);
-	ASSERT.equal(0xf0, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x00, buffer[7]);
-	mod_ctype.wdouble(Number.NEGATIVE_INFINITY,
-	    'little', buffer, 0);
-	ASSERT.equal(0xff, buffer[7]);
-	ASSERT.equal(0xf0, buffer[6]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[0]);
-
-	/* Simple normalized values */
-
-	/* +/- 1.125 */
-	mod_ctype.wdouble(1.125,
-	    'big', buffer, 0);
-	ASSERT.equal(0x3f, buffer[0]);
-	ASSERT.equal(0xf2, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x00, buffer[7]);
-
-	mod_ctype.wdouble(1.125,
-	    'little', buffer, 0);
-	ASSERT.equal(0x3f, buffer[7]);
-	ASSERT.equal(0xf2, buffer[6]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[0]);
-
-	mod_ctype.wdouble(-1.125,
-	    'big', buffer, 0);
-	ASSERT.equal(0xbf, buffer[0]);
-	ASSERT.equal(0xf2, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x00, buffer[7]);
-
-	mod_ctype.wdouble(-1.125,
-	    'little', buffer, 0);
-	ASSERT.equal(0xbf, buffer[7]);
-	ASSERT.equal(0xf2, buffer[6]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[0]);
-
-
-	/* +/- 1.4397318913736026e+283 */
-	mod_ctype.wdouble(1.4397318913736026e+283,
-	    'big', buffer, 0);
-	ASSERT.equal(0x7a, buffer[0]);
-	ASSERT.equal(0xb8, buffer[1]);
-	ASSERT.equal(0xc9, buffer[2]);
-	ASSERT.equal(0x34, buffer[3]);
-	ASSERT.equal(0x72, buffer[4]);
-	ASSERT.equal(0x16, buffer[5]);
-	ASSERT.equal(0xf9, buffer[6]);
-	ASSERT.equal(0x0e, buffer[7]);
-
-	mod_ctype.wdouble(1.4397318913736026e+283,
-	    'little', buffer, 0);
-	ASSERT.equal(0x7a, buffer[7]);
-	ASSERT.equal(0xb8, buffer[6]);
-	ASSERT.equal(0xc9, buffer[5]);
-	ASSERT.equal(0x34, buffer[4]);
-	ASSERT.equal(0x72, buffer[3]);
-	ASSERT.equal(0x16, buffer[2]);
-	ASSERT.equal(0xf9, buffer[1]);
-	ASSERT.equal(0x0e, buffer[0]);
-
-	mod_ctype.wdouble(-1.4397318913736026e+283,
-	    'big', buffer, 0);
-	ASSERT.equal(0xfa, buffer[0]);
-	ASSERT.equal(0xb8, buffer[1]);
-	ASSERT.equal(0xc9, buffer[2]);
-	ASSERT.equal(0x34, buffer[3]);
-	ASSERT.equal(0x72, buffer[4]);
-	ASSERT.equal(0x16, buffer[5]);
-	ASSERT.equal(0xf9, buffer[6]);
-	ASSERT.equal(0x0e, buffer[7]);
-
-	mod_ctype.wdouble(-1.4397318913736026e+283,
-	    'little', buffer, 0);
-	ASSERT.equal(0xfa, buffer[7]);
-	ASSERT.equal(0xb8, buffer[6]);
-	ASSERT.equal(0xc9, buffer[5]);
-	ASSERT.equal(0x34, buffer[4]);
-	ASSERT.equal(0x72, buffer[3]);
-	ASSERT.equal(0x16, buffer[2]);
-	ASSERT.equal(0xf9, buffer[1]);
-	ASSERT.equal(0x0e, buffer[0]);
-
-	/* Denormalized values */
-	/* +/- 8.82521232268344e-309 */
-	mod_ctype.wdouble(8.82521232268344e-309,
-	    'big', buffer, 0);
-	ASSERT.equal(0x00, buffer[0]);
-	ASSERT.equal(0x06, buffer[1]);
-	ASSERT.equal(0x58, buffer[2]);
-	ASSERT.equal(0x94, buffer[3]);
-	ASSERT.equal(0x13, buffer[4]);
-	ASSERT.equal(0x27, buffer[5]);
-	ASSERT.equal(0x8a, buffer[6]);
-	ASSERT.equal(0xcd, buffer[7]);
-
-	mod_ctype.wdouble(8.82521232268344e-309,
-	    'little', buffer, 0);
-	ASSERT.equal(0x00, buffer[7]);
-	ASSERT.equal(0x06, buffer[6]);
-	ASSERT.equal(0x58, buffer[5]);
-	ASSERT.equal(0x94, buffer[4]);
-	ASSERT.equal(0x13, buffer[3]);
-	ASSERT.equal(0x27, buffer[2]);
-	ASSERT.equal(0x8a, buffer[1]);
-	ASSERT.equal(0xcd, buffer[0]);
-
-	mod_ctype.wdouble(-8.82521232268344e-309,
-	    'big', buffer, 0);
-	ASSERT.equal(0x80, buffer[0]);
-	ASSERT.equal(0x06, buffer[1]);
-	ASSERT.equal(0x58, buffer[2]);
-	ASSERT.equal(0x94, buffer[3]);
-	ASSERT.equal(0x13, buffer[4]);
-	ASSERT.equal(0x27, buffer[5]);
-	ASSERT.equal(0x8a, buffer[6]);
-	ASSERT.equal(0xcd, buffer[7]);
-
-	mod_ctype.wdouble(-8.82521232268344e-309,
-	    'little', buffer, 0);
-	ASSERT.equal(0x80, buffer[7]);
-	ASSERT.equal(0x06, buffer[6]);
-	ASSERT.equal(0x58, buffer[5]);
-	ASSERT.equal(0x94, buffer[4]);
-	ASSERT.equal(0x13, buffer[3]);
-	ASSERT.equal(0x27, buffer[2]);
-	ASSERT.equal(0x8a, buffer[1]);
-	ASSERT.equal(0xcd, buffer[0]);
-
-
-	/* Edge cases, maximum and minimum values */
-
-	/* Smallest denormalized value 5e-324 */
-	mod_ctype.wdouble(5e-324,
-	    'big', buffer, 0);
-	ASSERT.equal(0x00, buffer[0]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x01, buffer[7]);
-
-	mod_ctype.wdouble(5e-324,
-	    'little', buffer, 0);
-	ASSERT.equal(0x00, buffer[7]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x01, buffer[0]);
-
-	mod_ctype.wdouble(-5e-324,
-	    'big', buffer, 0);
-	ASSERT.equal(0x80, buffer[0]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x01, buffer[7]);
-
-	mod_ctype.wdouble(-5e-324,
-	    'little', buffer, 0);
-	ASSERT.equal(0x80, buffer[7]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x01, buffer[0]);
-
-
-
-	/* Largest denormalized value 2.225073858507201e-308 */
-	mod_ctype.wdouble(2.225073858507201e-308,
-	    'big', buffer, 0);
-	ASSERT.equal(0x00, buffer[0]);
-	ASSERT.equal(0x0f, buffer[1]);
-	ASSERT.equal(0xff, buffer[2]);
-	ASSERT.equal(0xff, buffer[3]);
-	ASSERT.equal(0xff, buffer[4]);
-	ASSERT.equal(0xff, buffer[5]);
-	ASSERT.equal(0xff, buffer[6]);
-	ASSERT.equal(0xff, buffer[7]);
-
-	mod_ctype.wdouble(2.225073858507201e-308,
-	    'little', buffer, 0);
-	ASSERT.equal(0x00, buffer[7]);
-	ASSERT.equal(0x0f, buffer[6]);
-	ASSERT.equal(0xff, buffer[5]);
-	ASSERT.equal(0xff, buffer[4]);
-	ASSERT.equal(0xff, buffer[3]);
-	ASSERT.equal(0xff, buffer[2]);
-	ASSERT.equal(0xff, buffer[1]);
-	ASSERT.equal(0xff, buffer[0]);
-
-	mod_ctype.wdouble(-2.225073858507201e-308,
-	    'big', buffer, 0);
-	ASSERT.equal(0x80, buffer[0]);
-	ASSERT.equal(0x0f, buffer[1]);
-	ASSERT.equal(0xff, buffer[2]);
-	ASSERT.equal(0xff, buffer[3]);
-	ASSERT.equal(0xff, buffer[4]);
-	ASSERT.equal(0xff, buffer[5]);
-	ASSERT.equal(0xff, buffer[6]);
-	ASSERT.equal(0xff, buffer[7]);
-
-	mod_ctype.wdouble(-2.225073858507201e-308,
-	    'little', buffer, 0);
-	ASSERT.equal(0x80, buffer[7]);
-	ASSERT.equal(0x0f, buffer[6]);
-	ASSERT.equal(0xff, buffer[5]);
-	ASSERT.equal(0xff, buffer[4]);
-	ASSERT.equal(0xff, buffer[3]);
-	ASSERT.equal(0xff, buffer[2]);
-	ASSERT.equal(0xff, buffer[1]);
-	ASSERT.equal(0xff, buffer[0]);
-
-
-	/* Smallest normalized value 2.2250738585072014e-308 */
-	mod_ctype.wdouble(2.2250738585072014e-308,
-	    'big', buffer, 0);
-	ASSERT.equal(0x00, buffer[0]);
-	ASSERT.equal(0x10, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x00, buffer[7]);
-
-	mod_ctype.wdouble(2.2250738585072014e-308,
-	    'little', buffer, 0);
-	ASSERT.equal(0x00, buffer[7]);
-	ASSERT.equal(0x10, buffer[6]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[0]);
-
-	mod_ctype.wdouble(-2.2250738585072014e-308,
-	    'big', buffer, 0);
-	ASSERT.equal(0x80, buffer[0]);
-	ASSERT.equal(0x10, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x00, buffer[7]);
-
-	mod_ctype.wdouble(-2.2250738585072014e-308,
-	    'little', buffer, 0);
-	ASSERT.equal(0x80, buffer[7]);
-	ASSERT.equal(0x10, buffer[6]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[4]);
-	ASSERT.equal(0x00, buffer[3]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[0]);
-
-
-	/* Largest normalized value 1.7976931348623157e+308 */
-	mod_ctype.wdouble(1.7976931348623157e+308,
-	    'big', buffer, 0);
-	ASSERT.equal(0x7f, buffer[0]);
-	ASSERT.equal(0xef, buffer[1]);
-	ASSERT.equal(0xff, buffer[2]);
-	ASSERT.equal(0xff, buffer[3]);
-	ASSERT.equal(0xff, buffer[4]);
-	ASSERT.equal(0xff, buffer[5]);
-	ASSERT.equal(0xff, buffer[6]);
-	ASSERT.equal(0xff, buffer[7]);
-
-	mod_ctype.wdouble(1.7976931348623157e+308,
-	    'little', buffer, 0);
-	ASSERT.equal(0x7f, buffer[7]);
-	ASSERT.equal(0xef, buffer[6]);
-	ASSERT.equal(0xff, buffer[5]);
-	ASSERT.equal(0xff, buffer[4]);
-	ASSERT.equal(0xff, buffer[3]);
-	ASSERT.equal(0xff, buffer[2]);
-	ASSERT.equal(0xff, buffer[1]);
-	ASSERT.equal(0xff, buffer[0]);
-
-	mod_ctype.wdouble(-1.7976931348623157e+308,
-	    'big', buffer, 0);
-	ASSERT.equal(0xff, buffer[0]);
-	ASSERT.equal(0xef, buffer[1]);
-	ASSERT.equal(0xff, buffer[2]);
-	ASSERT.equal(0xff, buffer[3]);
-	ASSERT.equal(0xff, buffer[4]);
-	ASSERT.equal(0xff, buffer[5]);
-	ASSERT.equal(0xff, buffer[6]);
-	ASSERT.equal(0xff, buffer[7]);
-
-	mod_ctype.wdouble(-1.7976931348623157e+308,
-	    'little', buffer, 0);
-	ASSERT.equal(0xff, buffer[7]);
-	ASSERT.equal(0xef, buffer[6]);
-	ASSERT.equal(0xff, buffer[5]);
-	ASSERT.equal(0xff, buffer[4]);
-	ASSERT.equal(0xff, buffer[3]);
-	ASSERT.equal(0xff, buffer[2]);
-	ASSERT.equal(0xff, buffer[1]);
-	ASSERT.equal(0xff, buffer[0]);
-
-
-	/* Try offsets */
-	buffer[0] = 0xde;
-	buffer[1] = 0xad;
-	buffer[2] = 0xbe;
-	buffer[3] = 0xef;
-	buffer[4] = 0xba;
-	buffer[5] = 0xdd;
-	buffer[6] = 0xca;
-	buffer[7] = 0xfe;
-	buffer[8] = 0x16;
-	buffer[9] = 0x79;
-
-	mod_ctype.wdouble(-0.000015130017658081283,
-	    'big', buffer, 2);
-	ASSERT.equal(0xbe, buffer[2]);
-	ASSERT.equal(0xef, buffer[3]);
-	ASSERT.equal(0xba, buffer[4]);
-	ASSERT.equal(0xdd, buffer[5]);
-	ASSERT.equal(0xca, buffer[6]);
-	ASSERT.equal(0xfe, buffer[7]);
-	ASSERT.equal(0x16, buffer[8]);
-	ASSERT.equal(0x79, buffer[9]);
-
-	mod_ctype.wdouble(-0.000015130017658081283,
-	    'little', buffer, 2);
-	ASSERT.equal(0xbe, buffer[9]);
-	ASSERT.equal(0xef, buffer[8]);
-	ASSERT.equal(0xba, buffer[7]);
-	ASSERT.equal(0xdd, buffer[6]);
-	ASSERT.equal(0xca, buffer[5]);
-	ASSERT.equal(0xfe, buffer[4]);
-	ASSERT.equal(0x16, buffer[3]);
-	ASSERT.equal(0x79, buffer[2]);
-}
-
-testfloat();
-testdouble();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/int/tst.64.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,638 +0,0 @@
-/*
- * Test our ability to read and write signed 64-bit integers.
- */
-
-var mod_ctype = require('../../../ctio.js');
-var ASSERT = require('assert');
-
-function testRead()
-{
-	var res, data;
-	data = new Buffer(10);
-
-	data[0] = 0x32;
-	data[1] = 0x65;
-	data[2] = 0x42;
-	data[3] = 0x56;
-	data[4] = 0x23;
-	data[5] = 0xff;
-	data[6] = 0xff;
-	data[7] = 0xff;
-	data[8] = 0x89;
-	data[9] = 0x11;
-	res = mod_ctype.rsint64(data, 'big', 0);
-	ASSERT.equal(0x32654256, res[0]);
-	ASSERT.equal(0x23ffffff, res[1]);
-	res = mod_ctype.rsint64(data, 'big', 1);
-	ASSERT.equal(0x65425623, res[0]);
-	ASSERT.equal(0xffffff89, res[1]);
-	res = mod_ctype.rsint64(data, 'big', 2);
-	ASSERT.equal(0x425623ff, res[0]);
-	ASSERT.equal(0xffff8911, res[1]);
-	res = mod_ctype.rsint64(data, 'little', 0);
-	ASSERT.equal(-0x000000dc, res[0]);
-	ASSERT.equal(-0xa9bd9ace, res[1]);
-	res = mod_ctype.rsint64(data, 'little', 1);
-	ASSERT.equal(-0x76000000, res[0]);
-	ASSERT.equal(-0xdca9bd9b, res[1]);
-	res = mod_ctype.rsint64(data, 'little', 2);
-	ASSERT.equal(0x1189ffff, res[0]);
-	ASSERT.equal(0xff235642, res[1]);
-
-	data.fill(0x00);
-	res = mod_ctype.rsint64(data, 'big', 0);
-	ASSERT.equal(0x00000000, res[0]);
-	ASSERT.equal(0x00000000, res[1]);
-	res = mod_ctype.rsint64(data, 'big', 1);
-	ASSERT.equal(0x00000000, res[0]);
-	ASSERT.equal(0x00000000, res[1]);
-	res = mod_ctype.rsint64(data, 'big', 2);
-	ASSERT.equal(0x00000000, res[0]);
-	ASSERT.equal(0x00000000, res[1]);
-	res = mod_ctype.rsint64(data, 'little', 0);
-	ASSERT.equal(0x00000000, res[0]);
-	ASSERT.equal(0x00000000, res[1]);
-	res = mod_ctype.rsint64(data, 'little', 1);
-	ASSERT.equal(0x00000000, res[0]);
-	ASSERT.equal(0x00000000, res[1]);
-	res = mod_ctype.rsint64(data, 'little', 2);
-	ASSERT.equal(0x00000000, res[0]);
-	ASSERT.equal(0x00000000, res[1]);
-
-	data.fill(0xff);
-	res = mod_ctype.rsint64(data, 'big', 0);
-	ASSERT.equal(0x00000000, res[0]);
-	ASSERT.equal(-1, res[1]);
-	res = mod_ctype.rsint64(data, 'big', 1);
-	ASSERT.equal(0x00000000, res[0]);
-	ASSERT.equal(-1, res[1]);
-	res = mod_ctype.rsint64(data, 'big', 2);
-	ASSERT.equal(0x00000000, res[0]);
-	ASSERT.equal(-1, res[1]);
-	res = mod_ctype.rsint64(data, 'little', 0);
-	ASSERT.equal(0x00000000, res[0]);
-	ASSERT.equal(-1, res[1]);
-	res = mod_ctype.rsint64(data, 'little', 1);
-	ASSERT.equal(0x00000000, res[0]);
-	ASSERT.equal(-1, res[1]);
-	res = mod_ctype.rsint64(data, 'little', 2);
-	ASSERT.equal(0x00000000, res[0]);
-	ASSERT.equal(-1, res[1]);
-
-	data[0] = 0x80;
-	data[1] = 0x00;
-	data[2] = 0x00;
-	data[3] = 0x00;
-	data[4] = 0x00;
-	data[5] = 0x00;
-	data[6] = 0x00;
-	data[7] = 0x00;
-	res = mod_ctype.rsint64(data, 'big', 0);
-	ASSERT.equal(-0x80000000, res[0]);
-	ASSERT.equal(0, res[1]);
-
-
-	data[7] = 0x80;
-	data[6] = 0x00;
-	data[5] = 0x00;
-	data[4] = 0x00;
-	data[3] = 0x00;
-	data[2] = 0x00;
-	data[1] = 0x00;
-	data[0] = 0x00;
-	res = mod_ctype.rsint64(data, 'little', 0);
-	ASSERT.equal(-0x80000000, res[0]);
-	ASSERT.equal(0, res[1]);
-
-	data[0] = 0x80;
-	data[1] = 0x00;
-	data[2] = 0x00;
-	data[3] = 0x00;
-	data[4] = 0x00;
-	data[5] = 0x00;
-	data[6] = 0x00;
-	data[7] = 0x01;
-	res = mod_ctype.rsint64(data, 'big', 0);
-	ASSERT.equal(-0x7fffffff, res[0]);
-	ASSERT.equal(-0xffffffff, res[1]);
-
-
-}
-
-function testWriteZero()
-{
-	var data, buf;
-	buf = new Buffer(10);
-
-	buf.fill(0x66);
-	data = [0, 0];
-	mod_ctype.wsint64(data, 'big', buf, 0);
-	ASSERT.equal(0, buf[0]);
-	ASSERT.equal(0, buf[1]);
-	ASSERT.equal(0, buf[2]);
-	ASSERT.equal(0, buf[3]);
-	ASSERT.equal(0, buf[4]);
-	ASSERT.equal(0, buf[5]);
-	ASSERT.equal(0, buf[6]);
-	ASSERT.equal(0, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	data = [0, 0];
-	mod_ctype.wsint64(data, 'big', buf, 1);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0, buf[1]);
-	ASSERT.equal(0, buf[2]);
-	ASSERT.equal(0, buf[3]);
-	ASSERT.equal(0, buf[4]);
-	ASSERT.equal(0, buf[5]);
-	ASSERT.equal(0, buf[6]);
-	ASSERT.equal(0, buf[7]);
-	ASSERT.equal(0, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	data = [0, 0];
-	mod_ctype.wsint64(data, 'big', buf, 2);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x66, buf[1]);
-	ASSERT.equal(0, buf[2]);
-	ASSERT.equal(0, buf[3]);
-	ASSERT.equal(0, buf[4]);
-	ASSERT.equal(0, buf[5]);
-	ASSERT.equal(0, buf[6]);
-	ASSERT.equal(0, buf[7]);
-	ASSERT.equal(0, buf[8]);
-	ASSERT.equal(0, buf[9]);
-
-
-	buf.fill(0x66);
-	data = [0, 0];
-	mod_ctype.wsint64(data, 'little', buf, 0);
-	ASSERT.equal(0, buf[0]);
-	ASSERT.equal(0, buf[1]);
-	ASSERT.equal(0, buf[2]);
-	ASSERT.equal(0, buf[3]);
-	ASSERT.equal(0, buf[4]);
-	ASSERT.equal(0, buf[5]);
-	ASSERT.equal(0, buf[6]);
-	ASSERT.equal(0, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	data = [0, 0];
-	mod_ctype.wsint64(data, 'little', buf, 1);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0, buf[1]);
-	ASSERT.equal(0, buf[2]);
-	ASSERT.equal(0, buf[3]);
-	ASSERT.equal(0, buf[4]);
-	ASSERT.equal(0, buf[5]);
-	ASSERT.equal(0, buf[6]);
-	ASSERT.equal(0, buf[7]);
-	ASSERT.equal(0, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	data = [0, 0];
-	mod_ctype.wsint64(data, 'little', buf, 2);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x66, buf[1]);
-	ASSERT.equal(0, buf[2]);
-	ASSERT.equal(0, buf[3]);
-	ASSERT.equal(0, buf[4]);
-	ASSERT.equal(0, buf[5]);
-	ASSERT.equal(0, buf[6]);
-	ASSERT.equal(0, buf[7]);
-	ASSERT.equal(0, buf[8]);
-	ASSERT.equal(0, buf[9]);
-}
-
-/*
- * Also include tests that force the value negative and ensure that it is
- * written correctly.
- */
-function testWrite()
-{
-	var data, buf;
-
-	buf = new Buffer(10);
-	data = [ 0x234456, 0x87 ];
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'big', buf, 0);
-	ASSERT.equal(0x00, buf[0]);
-	ASSERT.equal(0x23, buf[1]);
-	ASSERT.equal(0x44, buf[2]);
-	ASSERT.equal(0x56, buf[3]);
-	ASSERT.equal(0x00, buf[4]);
-	ASSERT.equal(0x00, buf[5]);
-	ASSERT.equal(0x00, buf[6]);
-	ASSERT.equal(0x87, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'big', buf, 1);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x00, buf[1]);
-	ASSERT.equal(0x23, buf[2]);
-	ASSERT.equal(0x44, buf[3]);
-	ASSERT.equal(0x56, buf[4]);
-	ASSERT.equal(0x00, buf[5]);
-	ASSERT.equal(0x00, buf[6]);
-	ASSERT.equal(0x00, buf[7]);
-	ASSERT.equal(0x87, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'big', buf, 2);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x66, buf[1]);
-	ASSERT.equal(0x00, buf[2]);
-	ASSERT.equal(0x23, buf[3]);
-	ASSERT.equal(0x44, buf[4]);
-	ASSERT.equal(0x56, buf[5]);
-	ASSERT.equal(0x00, buf[6]);
-	ASSERT.equal(0x00, buf[7]);
-	ASSERT.equal(0x00, buf[8]);
-	ASSERT.equal(0x87, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'little', buf, 0);
-	ASSERT.equal(0x87, buf[0]);
-	ASSERT.equal(0x00, buf[1]);
-	ASSERT.equal(0x00, buf[2]);
-	ASSERT.equal(0x00, buf[3]);
-	ASSERT.equal(0x56, buf[4]);
-	ASSERT.equal(0x44, buf[5]);
-	ASSERT.equal(0x23, buf[6]);
-	ASSERT.equal(0x00, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'little', buf, 1);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x87, buf[1]);
-	ASSERT.equal(0x00, buf[2]);
-	ASSERT.equal(0x00, buf[3]);
-	ASSERT.equal(0x00, buf[4]);
-	ASSERT.equal(0x56, buf[5]);
-	ASSERT.equal(0x44, buf[6]);
-	ASSERT.equal(0x23, buf[7]);
-	ASSERT.equal(0x00, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'little', buf, 2);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x66, buf[1]);
-	ASSERT.equal(0x87, buf[2]);
-	ASSERT.equal(0x00, buf[3]);
-	ASSERT.equal(0x00, buf[4]);
-	ASSERT.equal(0x00, buf[5]);
-	ASSERT.equal(0x56, buf[6]);
-	ASSERT.equal(0x44, buf[7]);
-	ASSERT.equal(0x23, buf[8]);
-	ASSERT.equal(0x00, buf[9]);
-
-	data = [0x3421, 0x34abcdba];
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'big', buf, 0);
-	ASSERT.equal(0x00, buf[0]);
-	ASSERT.equal(0x00, buf[1]);
-	ASSERT.equal(0x34, buf[2]);
-	ASSERT.equal(0x21, buf[3]);
-	ASSERT.equal(0x34, buf[4]);
-	ASSERT.equal(0xab, buf[5]);
-	ASSERT.equal(0xcd, buf[6]);
-	ASSERT.equal(0xba, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'big', buf, 1);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x00, buf[1]);
-	ASSERT.equal(0x00, buf[2]);
-	ASSERT.equal(0x34, buf[3]);
-	ASSERT.equal(0x21, buf[4]);
-	ASSERT.equal(0x34, buf[5]);
-	ASSERT.equal(0xab, buf[6]);
-	ASSERT.equal(0xcd, buf[7]);
-	ASSERT.equal(0xba, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'big', buf, 2);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x66, buf[1]);
-	ASSERT.equal(0x00, buf[2]);
-	ASSERT.equal(0x00, buf[3]);
-	ASSERT.equal(0x34, buf[4]);
-	ASSERT.equal(0x21, buf[5]);
-	ASSERT.equal(0x34, buf[6]);
-	ASSERT.equal(0xab, buf[7]);
-	ASSERT.equal(0xcd, buf[8]);
-	ASSERT.equal(0xba, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'little', buf, 0);
-	ASSERT.equal(0xba, buf[0]);
-	ASSERT.equal(0xcd, buf[1]);
-	ASSERT.equal(0xab, buf[2]);
-	ASSERT.equal(0x34, buf[3]);
-	ASSERT.equal(0x21, buf[4]);
-	ASSERT.equal(0x34, buf[5]);
-	ASSERT.equal(0x00, buf[6]);
-	ASSERT.equal(0x00, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'little', buf, 1);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0xba, buf[1]);
-	ASSERT.equal(0xcd, buf[2]);
-	ASSERT.equal(0xab, buf[3]);
-	ASSERT.equal(0x34, buf[4]);
-	ASSERT.equal(0x21, buf[5]);
-	ASSERT.equal(0x34, buf[6]);
-	ASSERT.equal(0x00, buf[7]);
-	ASSERT.equal(0x00, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'little', buf, 2);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x66, buf[1]);
-	ASSERT.equal(0xba, buf[2]);
-	ASSERT.equal(0xcd, buf[3]);
-	ASSERT.equal(0xab, buf[4]);
-	ASSERT.equal(0x34, buf[5]);
-	ASSERT.equal(0x21, buf[6]);
-	ASSERT.equal(0x34, buf[7]);
-	ASSERT.equal(0x00, buf[8]);
-	ASSERT.equal(0x00, buf[9]);
-
-
-	data = [ -0x80000000, 0 ];
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'big', buf, 0);
-	ASSERT.equal(0x80, buf[0]);
-	ASSERT.equal(0x00, buf[1]);
-	ASSERT.equal(0x00, buf[2]);
-	ASSERT.equal(0x00, buf[3]);
-	ASSERT.equal(0x00, buf[4]);
-	ASSERT.equal(0x00, buf[5]);
-	ASSERT.equal(0x00, buf[6]);
-	ASSERT.equal(0x00, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'little', buf, 0);
-	ASSERT.equal(0x00, buf[0]);
-	ASSERT.equal(0x00, buf[1]);
-	ASSERT.equal(0x00, buf[2]);
-	ASSERT.equal(0x00, buf[3]);
-	ASSERT.equal(0x00, buf[4]);
-	ASSERT.equal(0x00, buf[5]);
-	ASSERT.equal(0x00, buf[6]);
-	ASSERT.equal(0x80, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	data = [ -0x7fffffff, -0xffffffff ];
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'big', buf, 0);
-	ASSERT.equal(0x80, buf[0]);
-	ASSERT.equal(0x00, buf[1]);
-	ASSERT.equal(0x00, buf[2]);
-	ASSERT.equal(0x00, buf[3]);
-	ASSERT.equal(0x00, buf[4]);
-	ASSERT.equal(0x00, buf[5]);
-	ASSERT.equal(0x00, buf[6]);
-	ASSERT.equal(0x01, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'little', buf, 0);
-	ASSERT.equal(0x01, buf[0]);
-	ASSERT.equal(0x00, buf[1]);
-	ASSERT.equal(0x00, buf[2]);
-	ASSERT.equal(0x00, buf[3]);
-	ASSERT.equal(0x00, buf[4]);
-	ASSERT.equal(0x00, buf[5]);
-	ASSERT.equal(0x00, buf[6]);
-	ASSERT.equal(0x80, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	data = [ 0x0, -0x1];
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'big', buf, 0);
-	ASSERT.equal(0xff, buf[0]);
-	ASSERT.equal(0xff, buf[1]);
-	ASSERT.equal(0xff, buf[2]);
-	ASSERT.equal(0xff, buf[3]);
-	ASSERT.equal(0xff, buf[4]);
-	ASSERT.equal(0xff, buf[5]);
-	ASSERT.equal(0xff, buf[6]);
-	ASSERT.equal(0xff, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wsint64(data, 'little', buf, 0);
-	ASSERT.equal(0xff, buf[0]);
-	ASSERT.equal(0xff, buf[1]);
-	ASSERT.equal(0xff, buf[2]);
-	ASSERT.equal(0xff, buf[3]);
-	ASSERT.equal(0xff, buf[4]);
-	ASSERT.equal(0xff, buf[5]);
-	ASSERT.equal(0xff, buf[6]);
-	ASSERT.equal(0xff, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-}
-
-/*
- * Make sure we catch invalid writes.
- */
-function testWriteInvalid()
-{
-	var data, buf;
-
-	/* Buffer too small */
-	buf = new Buffer(4);
-	data = [ 0, 0];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 0);
-	}, Error, 'buffer too small');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 0);
-	}, Error, 'buffer too small');
-
-	/* Beyond the end of the buffer */
-	buf = new Buffer(12);
-	data = [ 0, 0];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 11);
-	}, Error, 'write beyond end of buffer');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 11);
-	}, Error, 'write beyond end of buffer');
-
-	/* Write fractional values */
-	buf = new Buffer(12);
-	data = [ 3.33, 0 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write fractions');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write fractions');
-
-	data = [ 0, 3.3 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write fractions');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write fractions');
-
-	data = [ -3.33, 0 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write fractions');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write fractions');
-
-	data = [ 0, -3.3 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write fractions');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write fractions');
-
-	data = [ 3.33, 2.42 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write fractions');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write fractions');
-
-	data = [ 3.33, -2.42 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write fractions');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write fractions');
-
-	data = [ -3.33, -2.42 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write fractions');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write fractions');
-
-	data = [ -3.33, 2.42 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write fractions');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write fractions');
-
-	/* Signs don't match */
-	buf = new Buffer(12);
-	data = [ 0x800000, -0x32 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write too large');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write too large');
-
-	data = [ -0x800000, 0x32 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write too large');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write too large');
-
-	/* Write values that are too large */
-	buf = new Buffer(12);
-	data = [ 0x80000000, 0 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write too large');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write too large');
-
-	data = [ 0x7fffffff, 0x100000000 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write too large');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write too large');
-
-	data = [ 0x00, 0x800000000 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write too large');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write too large');
-
-	data = [ 0xffffffffff, 0xffffff238 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write too large');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write too large');
-
-	data = [ 0x23, 0xffffff238 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write too large');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write too large');
-
-	data = [ -0x80000000, -0xfff238 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write too large');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write too large');
-
-	data = [ -0x80000004, -0xfff238 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'big', buf, 1);
-	}, Error, 'write too large');
-	ASSERT.throws(function () {
-	    mod_ctype.wsint64(data, 'little', buf, 1);
-	}, Error, 'write too large');
-}
-
-
-testRead();
-testWrite();
-testWriteZero();
-testWriteInvalid();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/int/tst.rint.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,101 +0,0 @@
-/*
- * Tests to verify we're reading in signed integers correctly
- */
-var mod_ctype = require('../../../ctio.js');
-var ASSERT = require('assert');
-
-/*
- * Test 8 bit signed integers
- */
-function test8()
-{
-	var data = new Buffer(4);
-
-	data[0] = 0x23;
-	ASSERT.equal(0x23, mod_ctype.rsint8(data, 'big', 0));
-	ASSERT.equal(0x23, mod_ctype.rsint8(data, 'little', 0));
-
-	data[0] = 0xff;
-	ASSERT.equal(-1, mod_ctype.rsint8(data, 'big', 0));
-	ASSERT.equal(-1, mod_ctype.rsint8(data, 'little', 0));
-
-	data[0] = 0x87;
-	data[1] = 0xab;
-	data[2] = 0x7c;
-	data[3] = 0xef;
-	ASSERT.equal(-121, mod_ctype.rsint8(data, 'big', 0));
-	ASSERT.equal(-85, mod_ctype.rsint8(data, 'big', 1));
-	ASSERT.equal(124, mod_ctype.rsint8(data, 'big', 2));
-	ASSERT.equal(-17, mod_ctype.rsint8(data, 'big', 3));
-	ASSERT.equal(-121, mod_ctype.rsint8(data, 'little', 0));
-	ASSERT.equal(-85, mod_ctype.rsint8(data, 'little', 1));
-	ASSERT.equal(124, mod_ctype.rsint8(data, 'little', 2));
-	ASSERT.equal(-17, mod_ctype.rsint8(data, 'little', 3));
-}
-
-function test16()
-{
-	var buffer = new Buffer(6);
-	buffer[0] = 0x16;
-	buffer[1] = 0x79;
-	ASSERT.equal(0x1679, mod_ctype.rsint16(buffer, 'big', 0));
-	ASSERT.equal(0x7916, mod_ctype.rsint16(buffer, 'little', 0));
-
-	buffer[0] = 0xff;
-	buffer[1] = 0x80;
-	ASSERT.equal(-128, mod_ctype.rsint16(buffer, 'big', 0));
-	ASSERT.equal(-32513, mod_ctype.rsint16(buffer, 'little', 0));
-
-	/* test offset with weenix */
-	buffer[0] = 0x77;
-	buffer[1] = 0x65;
-	buffer[2] = 0x65;
-	buffer[3] = 0x6e;
-	buffer[4] = 0x69;
-	buffer[5] = 0x78;
-	ASSERT.equal(0x7765, mod_ctype.rsint16(buffer, 'big', 0));
-	ASSERT.equal(0x6565, mod_ctype.rsint16(buffer, 'big', 1));
-	ASSERT.equal(0x656e, mod_ctype.rsint16(buffer, 'big', 2));
-	ASSERT.equal(0x6e69, mod_ctype.rsint16(buffer, 'big', 3));
-	ASSERT.equal(0x6978, mod_ctype.rsint16(buffer, 'big', 4));
-	ASSERT.equal(0x6577, mod_ctype.rsint16(buffer, 'little', 0));
-	ASSERT.equal(0x6565, mod_ctype.rsint16(buffer, 'little', 1));
-	ASSERT.equal(0x6e65, mod_ctype.rsint16(buffer, 'little', 2));
-	ASSERT.equal(0x696e, mod_ctype.rsint16(buffer, 'little', 3));
-	ASSERT.equal(0x7869, mod_ctype.rsint16(buffer, 'little', 4));
-}
-
-function test32()
-{
-	var buffer = new Buffer(6);
-	buffer[0] = 0x43;
-	buffer[1] = 0x53;
-	buffer[2] = 0x16;
-	buffer[3] = 0x79;
-	ASSERT.equal(0x43531679, mod_ctype.rsint32(buffer, 'big', 0));
-	ASSERT.equal(0x79165343, mod_ctype.rsint32(buffer, 'little', 0));
-
-	buffer[0] = 0xff;
-	buffer[1] = 0xfe;
-	buffer[2] = 0xef;
-	buffer[3] = 0xfa;
-	ASSERT.equal(-69638, mod_ctype.rsint32(buffer, 'big', 0));
-	ASSERT.equal(-84934913, mod_ctype.rsint32(buffer, 'little', 0));
-
-	buffer[0] = 0x42;
-	buffer[1] = 0xc3;
-	buffer[2] = 0x95;
-	buffer[3] = 0xa9;
-	buffer[4] = 0x36;
-	buffer[5] = 0x17;
-	ASSERT.equal(0x42c395a9, mod_ctype.rsint32(buffer, 'big', 0));
-	ASSERT.equal(-1013601994, mod_ctype.rsint32(buffer, 'big', 1));
-	ASSERT.equal(-1784072681, mod_ctype.rsint32(buffer, 'big', 2));
-	ASSERT.equal(-1449802942, mod_ctype.rsint32(buffer, 'little', 0));
-	ASSERT.equal(917083587, mod_ctype.rsint32(buffer, 'little', 1));
-	ASSERT.equal(389458325, mod_ctype.rsint32(buffer, 'little', 2));
-}
-
-test8();
-test16();
-test32();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/int/tst.wbounds.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-/*
- * Test to make sure that we properly raise an error whenever we try to
- * write a value beyond the size of the integer.
- */
-
-var mod_ctio = require('../../../ctio.js');
-var mod_assert = require('assert');
-var tb = new Buffer(16); /* Largest buffer we'll need */
-
-var cases = [
-	{ func:
-	function () {
-		mod_ctio.wsint8(0x80, 'big', tb, 0);
-	}, test: '+int8_t' },
-	{ func:
-	function () {
-		mod_ctio.wsint8(-0x81, 'big', tb, 0);
-	}, test: '-int8_t' },
-
-	{ func:
-	function () {
-		mod_ctio.wsint16(0x8000, 'big', tb, 0);
-	}, test: '+int16_t' },
-	{ func:
-	function () {
-		mod_ctio.wsint16(-0x8001, 'big', tb, 0);
-	}, test: '-int16_t' },
-	{ func:
-	function () {
-		mod_ctio.wsint32(0x80000000, 'big', tb, 0);
-	}, test: '+int32_t' },
-	{ func:
-	function () {
-		mod_ctio.wsint32(-0x80000001, 'big', tb, 0);
-	}, test: '-int32_t' },
-	{ func:
-	function () {
-		mod_ctio.wsint64([ 0x80000000, 0 ], 'big', tb, 0);
-	}, test: '+int64_t' },
-	{ func:
-	function () {
-		mod_ctio.wsint64([ -0x80000000, -1 ], 'big', tb, 0);
-	}, test: '-int64_t' }
-];
-
-function test()
-{
-	var ii;
-	for (ii = 0; ii < cases.length; ii++)
-		mod_assert.throws(cases[ii]['func'], Error, cases[ii]['test']);
-}
-
-test();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/int/tst.wint.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,92 +0,0 @@
-/*
- * Tests to verify we're writing signed integers correctly
- */
-var mod_ctype = require('../../../ctio.js');
-var ASSERT = require('assert');
-
-function test8()
-{
-	var buffer = new Buffer(4);
-	mod_ctype.wsint8(0x23, 'big', buffer, 0);
-	mod_ctype.wsint8(0x23, 'little', buffer, 1);
-	mod_ctype.wsint8(-5, 'big', buffer, 2);
-	mod_ctype.wsint8(-5, 'little', buffer, 3);
-
-	ASSERT.equal(0x23, buffer[0]);
-	ASSERT.equal(0x23, buffer[1]);
-	ASSERT.equal(0xfb, buffer[2]);
-	ASSERT.equal(0xfb, buffer[3]);
-
-	/* Make sure we handle truncation correctly */
-	ASSERT.throws(function () {
-	     mod_ctype.wsint8(0xabc, 'big', buffer, 0);
-	});
-	ASSERT.throws(function () {
-	     mod_ctype.wsint8(0xabc, 'little', buffer, 0);
-	});
-}
-
-function test16()
-{
-	var buffer = new Buffer(6);
-	mod_ctype.wsint16(0x0023, 'big', buffer, 0);
-	mod_ctype.wsint16(0x0023, 'little', buffer, 2);
-	ASSERT.equal(0x00, buffer[0]);
-	ASSERT.equal(0x23, buffer[1]);
-	ASSERT.equal(0x23, buffer[2]);
-	ASSERT.equal(0x00, buffer[3]);
-	mod_ctype.wsint16(-5, 'big', buffer, 0);
-	mod_ctype.wsint16(-5, 'little', buffer, 2);
-	ASSERT.equal(0xff, buffer[0]);
-	ASSERT.equal(0xfb, buffer[1]);
-	ASSERT.equal(0xfb, buffer[2]);
-	ASSERT.equal(0xff, buffer[3]);
-
-	mod_ctype.wsint16(-1679, 'big', buffer, 1);
-	mod_ctype.wsint16(-1679, 'little', buffer, 3);
-	ASSERT.equal(0xf9, buffer[1]);
-	ASSERT.equal(0x71, buffer[2]);
-	ASSERT.equal(0x71, buffer[3]);
-	ASSERT.equal(0xf9, buffer[4]);
-}
-
-function test32()
-{
-	var buffer = new Buffer(8);
-	mod_ctype.wsint32(0x23, 'big', buffer, 0);
-	mod_ctype.wsint32(0x23, 'little', buffer, 4);
-	ASSERT.equal(0x00, buffer[0]);
-	ASSERT.equal(0x00, buffer[1]);
-	ASSERT.equal(0x00, buffer[2]);
-	ASSERT.equal(0x23, buffer[3]);
-	ASSERT.equal(0x23, buffer[4]);
-	ASSERT.equal(0x00, buffer[5]);
-	ASSERT.equal(0x00, buffer[6]);
-	ASSERT.equal(0x00, buffer[7]);
-
-	mod_ctype.wsint32(-5, 'big', buffer, 0);
-	mod_ctype.wsint32(-5, 'little', buffer, 4);
-	ASSERT.equal(0xff, buffer[0]);
-	ASSERT.equal(0xff, buffer[1]);
-	ASSERT.equal(0xff, buffer[2]);
-	ASSERT.equal(0xfb, buffer[3]);
-	ASSERT.equal(0xfb, buffer[4]);
-	ASSERT.equal(0xff, buffer[5]);
-	ASSERT.equal(0xff, buffer[6]);
-	ASSERT.equal(0xff, buffer[7]);
-
-	mod_ctype.wsint32(-805306713, 'big', buffer, 0);
-	mod_ctype.wsint32(-805306713, 'little', buffer, 4);
-	ASSERT.equal(0xcf, buffer[0]);
-	ASSERT.equal(0xff, buffer[1]);
-	ASSERT.equal(0xfe, buffer[2]);
-	ASSERT.equal(0xa7, buffer[3]);
-	ASSERT.equal(0xa7, buffer[4]);
-	ASSERT.equal(0xfe, buffer[5]);
-	ASSERT.equal(0xff, buffer[6]);
-	ASSERT.equal(0xcf, buffer[7]);
-}
-
-test8();
-test16();
-test32();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/uint/tst.64.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,451 +0,0 @@
-/*
- * Test our ability to read and write unsigned 64-bit integers.
- */
-
-var mod_ctype = require('../../../ctio.js');
-var ASSERT = require('assert');
-
-function testRead()
-{
-	var res, data;
-	data = new Buffer(10);
-
-	data[0] = 0x32;
-	data[1] = 0x65;
-	data[2] = 0x42;
-	data[3] = 0x56;
-	data[4] = 0x23;
-	data[5] = 0xff;
-	data[6] = 0xff;
-	data[7] = 0xff;
-	data[8] = 0x89;
-	data[9] = 0x11;
-	res = mod_ctype.ruint64(data, 'big', 0);
-	ASSERT.equal(0x32654256, res[0]);
-	ASSERT.equal(0x23ffffff, res[1]);
-	res = mod_ctype.ruint64(data, 'big', 1);
-	ASSERT.equal(0x65425623, res[0]);
-	ASSERT.equal(0xffffff89, res[1]);
-	res = mod_ctype.ruint64(data, 'big', 2);
-	ASSERT.equal(0x425623ff, res[0]);
-	ASSERT.equal(0xffff8911, res[1]);
-	res = mod_ctype.ruint64(data, 'little', 0);
-	ASSERT.equal(0xffffff23, res[0]);
-	ASSERT.equal(0x56426532, res[1]);
-	res = mod_ctype.ruint64(data, 'little', 1);
-	ASSERT.equal(0x89ffffff, res[0]);
-	ASSERT.equal(0x23564265, res[1]);
-	res = mod_ctype.ruint64(data, 'little', 2);
-	ASSERT.equal(0x1189ffff, res[0]);
-	ASSERT.equal(0xff235642, res[1]);
-
-}
-
-function testReadOver()
-{
-	var res, data;
-	data = new Buffer(10);
-
-	data[0] = 0x80;
-	data[1] = 0xff;
-	data[2] = 0x80;
-	data[3] = 0xff;
-	data[4] = 0x80;
-	data[5] = 0xff;
-	data[6] = 0x80;
-	data[7] = 0xff;
-	data[8] = 0x80;
-	data[9] = 0xff;
-	res = mod_ctype.ruint64(data, 'big', 0);
-	ASSERT.equal(0x80ff80ff, res[0]);
-	ASSERT.equal(0x80ff80ff, res[1]);
-	res = mod_ctype.ruint64(data, 'big', 1);
-	ASSERT.equal(0xff80ff80, res[0]);
-	ASSERT.equal(0xff80ff80, res[1]);
-	res = mod_ctype.ruint64(data, 'big', 2);
-	ASSERT.equal(0x80ff80ff, res[0]);
-	ASSERT.equal(0x80ff80ff, res[1]);
-	res = mod_ctype.ruint64(data, 'little', 0);
-	ASSERT.equal(0xff80ff80, res[0]);
-	ASSERT.equal(0xff80ff80, res[1]);
-	res = mod_ctype.ruint64(data, 'little', 1);
-	ASSERT.equal(0x80ff80ff, res[0]);
-	ASSERT.equal(0x80ff80ff, res[1]);
-	res = mod_ctype.ruint64(data, 'little', 2);
-	ASSERT.equal(0xff80ff80, res[0]);
-	ASSERT.equal(0xff80ff80, res[1]);
-}
-
-function testWriteZero()
-{
-	var data, buf;
-	buf = new Buffer(10);
-
-	buf.fill(0x66);
-	data = [0, 0];
-	mod_ctype.wuint64(data, 'big', buf, 0);
-	ASSERT.equal(0, buf[0]);
-	ASSERT.equal(0, buf[1]);
-	ASSERT.equal(0, buf[2]);
-	ASSERT.equal(0, buf[3]);
-	ASSERT.equal(0, buf[4]);
-	ASSERT.equal(0, buf[5]);
-	ASSERT.equal(0, buf[6]);
-	ASSERT.equal(0, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	data = [0, 0];
-	mod_ctype.wuint64(data, 'big', buf, 1);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0, buf[1]);
-	ASSERT.equal(0, buf[2]);
-	ASSERT.equal(0, buf[3]);
-	ASSERT.equal(0, buf[4]);
-	ASSERT.equal(0, buf[5]);
-	ASSERT.equal(0, buf[6]);
-	ASSERT.equal(0, buf[7]);
-	ASSERT.equal(0, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	data = [0, 0];
-	mod_ctype.wuint64(data, 'big', buf, 2);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x66, buf[1]);
-	ASSERT.equal(0, buf[2]);
-	ASSERT.equal(0, buf[3]);
-	ASSERT.equal(0, buf[4]);
-	ASSERT.equal(0, buf[5]);
-	ASSERT.equal(0, buf[6]);
-	ASSERT.equal(0, buf[7]);
-	ASSERT.equal(0, buf[8]);
-	ASSERT.equal(0, buf[9]);
-
-
-	buf.fill(0x66);
-	data = [0, 0];
-	mod_ctype.wuint64(data, 'little', buf, 0);
-	ASSERT.equal(0, buf[0]);
-	ASSERT.equal(0, buf[1]);
-	ASSERT.equal(0, buf[2]);
-	ASSERT.equal(0, buf[3]);
-	ASSERT.equal(0, buf[4]);
-	ASSERT.equal(0, buf[5]);
-	ASSERT.equal(0, buf[6]);
-	ASSERT.equal(0, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	data = [0, 0];
-	mod_ctype.wuint64(data, 'little', buf, 1);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0, buf[1]);
-	ASSERT.equal(0, buf[2]);
-	ASSERT.equal(0, buf[3]);
-	ASSERT.equal(0, buf[4]);
-	ASSERT.equal(0, buf[5]);
-	ASSERT.equal(0, buf[6]);
-	ASSERT.equal(0, buf[7]);
-	ASSERT.equal(0, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	data = [0, 0];
-	mod_ctype.wuint64(data, 'little', buf, 2);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x66, buf[1]);
-	ASSERT.equal(0, buf[2]);
-	ASSERT.equal(0, buf[3]);
-	ASSERT.equal(0, buf[4]);
-	ASSERT.equal(0, buf[5]);
-	ASSERT.equal(0, buf[6]);
-	ASSERT.equal(0, buf[7]);
-	ASSERT.equal(0, buf[8]);
-	ASSERT.equal(0, buf[9]);
-}
-
-/*
- * Also include tests that force us to go into a negative value and
- * ensure that it's written correctly.
- */
-function testWrite()
-{
-	var data, buf;
-
-	buf = new Buffer(10);
-	data = [ 0x234456, 0x87 ];
-	buf.fill(0x66);
-	mod_ctype.wuint64(data, 'big', buf, 0);
-	ASSERT.equal(0x00, buf[0]);
-	ASSERT.equal(0x23, buf[1]);
-	ASSERT.equal(0x44, buf[2]);
-	ASSERT.equal(0x56, buf[3]);
-	ASSERT.equal(0x00, buf[4]);
-	ASSERT.equal(0x00, buf[5]);
-	ASSERT.equal(0x00, buf[6]);
-	ASSERT.equal(0x87, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wuint64(data, 'big', buf, 1);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x00, buf[1]);
-	ASSERT.equal(0x23, buf[2]);
-	ASSERT.equal(0x44, buf[3]);
-	ASSERT.equal(0x56, buf[4]);
-	ASSERT.equal(0x00, buf[5]);
-	ASSERT.equal(0x00, buf[6]);
-	ASSERT.equal(0x00, buf[7]);
-	ASSERT.equal(0x87, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wuint64(data, 'big', buf, 2);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x66, buf[1]);
-	ASSERT.equal(0x00, buf[2]);
-	ASSERT.equal(0x23, buf[3]);
-	ASSERT.equal(0x44, buf[4]);
-	ASSERT.equal(0x56, buf[5]);
-	ASSERT.equal(0x00, buf[6]);
-	ASSERT.equal(0x00, buf[7]);
-	ASSERT.equal(0x00, buf[8]);
-	ASSERT.equal(0x87, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wuint64(data, 'little', buf, 0);
-	ASSERT.equal(0x87, buf[0]);
-	ASSERT.equal(0x00, buf[1]);
-	ASSERT.equal(0x00, buf[2]);
-	ASSERT.equal(0x00, buf[3]);
-	ASSERT.equal(0x56, buf[4]);
-	ASSERT.equal(0x44, buf[5]);
-	ASSERT.equal(0x23, buf[6]);
-	ASSERT.equal(0x00, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wuint64(data, 'little', buf, 1);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x87, buf[1]);
-	ASSERT.equal(0x00, buf[2]);
-	ASSERT.equal(0x00, buf[3]);
-	ASSERT.equal(0x00, buf[4]);
-	ASSERT.equal(0x56, buf[5]);
-	ASSERT.equal(0x44, buf[6]);
-	ASSERT.equal(0x23, buf[7]);
-	ASSERT.equal(0x00, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-
-	buf.fill(0x66);
-	mod_ctype.wuint64(data, 'little', buf, 2);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x66, buf[1]);
-	ASSERT.equal(0x87, buf[2]);
-	ASSERT.equal(0x00, buf[3]);
-	ASSERT.equal(0x00, buf[4]);
-	ASSERT.equal(0x00, buf[5]);
-	ASSERT.equal(0x56, buf[6]);
-	ASSERT.equal(0x44, buf[7]);
-	ASSERT.equal(0x23, buf[8]);
-	ASSERT.equal(0x00, buf[9]);
-
-	data = [0xffff3421, 0x34abcdba];
-	buf.fill(0x66);
-	mod_ctype.wuint64(data, 'big', buf, 0);
-	ASSERT.equal(0xff, buf[0]);
-	ASSERT.equal(0xff, buf[1]);
-	ASSERT.equal(0x34, buf[2]);
-	ASSERT.equal(0x21, buf[3]);
-	ASSERT.equal(0x34, buf[4]);
-	ASSERT.equal(0xab, buf[5]);
-	ASSERT.equal(0xcd, buf[6]);
-	ASSERT.equal(0xba, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wuint64(data, 'big', buf, 1);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0xff, buf[1]);
-	ASSERT.equal(0xff, buf[2]);
-	ASSERT.equal(0x34, buf[3]);
-	ASSERT.equal(0x21, buf[4]);
-	ASSERT.equal(0x34, buf[5]);
-	ASSERT.equal(0xab, buf[6]);
-	ASSERT.equal(0xcd, buf[7]);
-	ASSERT.equal(0xba, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wuint64(data, 'big', buf, 2);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x66, buf[1]);
-	ASSERT.equal(0xff, buf[2]);
-	ASSERT.equal(0xff, buf[3]);
-	ASSERT.equal(0x34, buf[4]);
-	ASSERT.equal(0x21, buf[5]);
-	ASSERT.equal(0x34, buf[6]);
-	ASSERT.equal(0xab, buf[7]);
-	ASSERT.equal(0xcd, buf[8]);
-	ASSERT.equal(0xba, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wuint64(data, 'little', buf, 0);
-	ASSERT.equal(0xba, buf[0]);
-	ASSERT.equal(0xcd, buf[1]);
-	ASSERT.equal(0xab, buf[2]);
-	ASSERT.equal(0x34, buf[3]);
-	ASSERT.equal(0x21, buf[4]);
-	ASSERT.equal(0x34, buf[5]);
-	ASSERT.equal(0xff, buf[6]);
-	ASSERT.equal(0xff, buf[7]);
-	ASSERT.equal(0x66, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wuint64(data, 'little', buf, 1);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0xba, buf[1]);
-	ASSERT.equal(0xcd, buf[2]);
-	ASSERT.equal(0xab, buf[3]);
-	ASSERT.equal(0x34, buf[4]);
-	ASSERT.equal(0x21, buf[5]);
-	ASSERT.equal(0x34, buf[6]);
-	ASSERT.equal(0xff, buf[7]);
-	ASSERT.equal(0xff, buf[8]);
-	ASSERT.equal(0x66, buf[9]);
-
-	buf.fill(0x66);
-	mod_ctype.wuint64(data, 'little', buf, 2);
-	ASSERT.equal(0x66, buf[0]);
-	ASSERT.equal(0x66, buf[1]);
-	ASSERT.equal(0xba, buf[2]);
-	ASSERT.equal(0xcd, buf[3]);
-	ASSERT.equal(0xab, buf[4]);
-	ASSERT.equal(0x34, buf[5]);
-	ASSERT.equal(0x21, buf[6]);
-	ASSERT.equal(0x34, buf[7]);
-	ASSERT.equal(0xff, buf[8]);
-	ASSERT.equal(0xff, buf[9]);
-}
-
-/*
- * Make sure we catch invalid writes.
- */
-function testWriteInvalid()
-{
-	var data, buf;
-
-	/* Buffer too small */
-	buf = new Buffer(4);
-	data = [ 0, 0];
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'big', buf, 0);
-	}, Error, 'buffer too small');
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'little', buf, 0);
-	}, Error, 'buffer too small');
-
-	/* Beyond the end of the buffer */
-	buf = new Buffer(12);
-	data = [ 0, 0];
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'little', buf, 11);
-	}, Error, 'write beyond end of buffer');
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'big', buf, 11);
-	}, Error, 'write beyond end of buffer');
-
-	/* Write negative values */
-	buf = new Buffer(12);
-	data = [ -3, 0 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'big', buf, 1);
-	}, Error, 'write negative number');
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'little', buf, 1);
-	}, Error, 'write negative number');
-
-	data = [ 0, -3 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'big', buf, 1);
-	}, Error, 'write negative number');
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'little', buf, 1);
-	}, Error, 'write negative number');
-
-	data = [ -3, -3 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'big', buf, 1);
-	}, Error, 'write negative number');
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'little', buf, 1);
-	}, Error, 'write negative number');
-
-
-	/* Write fractional values */
-	buf = new Buffer(12);
-	data = [ 3.33, 0 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'big', buf, 1);
-	}, Error, 'write fractions');
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'little', buf, 1);
-	}, Error, 'write fractions');
-
-	data = [ 0, 3.3 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'big', buf, 1);
-	}, Error, 'write fractions');
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'little', buf, 1);
-	}, Error, 'write fractions');
-
-	data = [ 3.33, 2.42 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'big', buf, 1);
-	}, Error, 'write fractions');
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'little', buf, 1);
-	}, Error, 'write fractions');
-
-	/* Write values that are too large */
-	buf = new Buffer(12);
-	data = [ 0xffffffffff, 23 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'big', buf, 1);
-	}, Error, 'write too large');
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'little', buf, 1);
-	}, Error, 'write too large');
-
-	data = [ 0xffffffffff, 0xffffff238 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'big', buf, 1);
-	}, Error, 'write too large');
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'little', buf, 1);
-	}, Error, 'write too large');
-
-	data = [ 0x23, 0xffffff238 ];
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'big', buf, 1);
-	}, Error, 'write too large');
-	ASSERT.throws(function () {
-	    mod_ctype.wuint64(data, 'little', buf, 1);
-	}, Error, 'write too large');
-}
-
-
-testRead();
-testReadOver();
-testWriteZero();
-testWrite();
-testWriteInvalid();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/uint/tst.roundtrip.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,81 +0,0 @@
-/*
- * A battery of tests for successful round-trips between writes and reads
- */
-
-var mod_ctype = require('../../../ctio.js');
-var ASSERT = require('assert');
-
-
-/*
- * What the heck, let's just test every value for 8-bits.
- */
-
-function test8() {
-	var data = new Buffer(1);
-	var i;
-	for (i = 0; i < 256; i++) {
-		mod_ctype.wuint8(i, 'big', data, 0);
-		ASSERT.equal(i, mod_ctype.ruint8(data, 'big', 0));
-		mod_ctype.wuint8(i, 'little', data, 0);
-		ASSERT.equal(i, mod_ctype.ruint8(data, 'little', 0));
-	}
-	ASSERT.ok(true);
-}
-
-/*
- * Test a random sample of 256 values in the 16-bit unsigned range
- */
-
-function test16() {
-	var data = new Buffer(2);
-	var i = 0;
-	for (i = 0; i < 256; i++) {
-		var value = Math.round(Math.random() * Math.pow(2, 16));
-		mod_ctype.wuint16(value, 'big', data, 0);
-		ASSERT.equal(value, mod_ctype.ruint16(data, 'big', 0));
-		mod_ctype.wuint16(value, 'little', data, 0);
-		ASSERT.equal(value, mod_ctype.ruint16(data, 'little', 0));
-	}
-}
-
-/*
- * Test a random sample of 256 values in the 32-bit unsigned range
- */
-
-function test32() {
-	var data = new Buffer(4);
-	var i = 0;
-	for (i = 0; i < 256; i++) {
-		var value = Math.round(Math.random() * Math.pow(2, 32));
-		mod_ctype.wuint32(value, 'big', data, 0);
-		ASSERT.equal(value, mod_ctype.ruint32(data, 'big', 0));
-		mod_ctype.wuint32(value, 'little', data, 0);
-		ASSERT.equal(value, mod_ctype.ruint32(data, 'little', 0));
-	}
-}
-
-/*
- * Test a random sample of 256 values in the 64-bit unsigned range
- */
-
-function test64() {
-	var data = new Buffer(8);
-	var i = 0;
-	for (i = 0; i < 256; i++) {
-		var low = Math.round(Math.random() * Math.pow(2, 32));
-		var high = Math.round(Math.random() * Math.pow(2, 32));
-		mod_ctype.wuint64([high, low], 'big', data, 0);
-		var result = mod_ctype.ruint64(data, 'big', 0);
-		ASSERT.equal(high, result[0]);
-		ASSERT.equal(low, result[1]);
-		mod_ctype.wuint64([high, low], 'little', data, 0);
-		result = mod_ctype.ruint64(data, 'little', 0);
-		ASSERT.equal(high, result[0]);
-		ASSERT.equal(low, result[1]);
-	}
-}
-
-exports.test8 = test8;
-exports.test16 = test16;
-exports.test32 = test32;
-exports.test64 = test64;
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/uint/tst.ruint.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,95 +0,0 @@
-/*
- * A battery of tests to help us read a series of uints
- */
-
-var mod_ctype = require('../../../ctio.js');
-var ASSERT = require('assert');
-
-/*
- * We need to check the following things:
- *  - We are correctly resolving big endian (doesn't mean anything for 8 bit)
- *  - Correctly resolving little endian (doesn't mean anything for 8 bit)
- *  - Correctly using the offsets
- *  - Correctly interpreting values that are beyond the signed range as unsigned
- */
-function test8()
-{
-	var data = new Buffer(4);
-	data[0] = 23;
-	data[1] = 23;
-	data[2] = 23;
-	data[3] = 23;
-	ASSERT.equal(23, mod_ctype.ruint8(data, 'big', 0));
-	ASSERT.equal(23, mod_ctype.ruint8(data, 'little', 0));
-	ASSERT.equal(23, mod_ctype.ruint8(data, 'big', 1));
-	ASSERT.equal(23, mod_ctype.ruint8(data, 'little', 1));
-	ASSERT.equal(23, mod_ctype.ruint8(data, 'big', 2));
-	ASSERT.equal(23, mod_ctype.ruint8(data, 'little', 2));
-	ASSERT.equal(23, mod_ctype.ruint8(data, 'big', 3));
-	ASSERT.equal(23, mod_ctype.ruint8(data, 'little', 3));
-	data[0] = 255; /* If it became a signed int, would be -1 */
-	ASSERT.equal(255, mod_ctype.ruint8(data, 'big', 0));
-	ASSERT.equal(255, mod_ctype.ruint8(data, 'little', 0));
-}
-
-/*
- * Test 16 bit unsigned integers. We need to verify the same set as 8 bit, only
- * now some of the issues actually matter:
- *  - We are correctly resolving big endian
- *  - Correctly resolving little endian
- *  - Correctly using the offsets
- *  - Correctly interpreting values that are beyond the signed range as unsigned
- */
-function test16()
-{
-	var data = new Buffer(4);
-	/* Test signed values first */
-	data[0] = 0;
-	data[1] = 0x23;
-	data[2] = 0x42;
-	data[3] = 0x3f;
-
-	ASSERT.equal(0x23, mod_ctype.ruint16(data, 'big', 0));
-	ASSERT.equal(0x2342, mod_ctype.ruint16(data, 'big', 1));
-	ASSERT.equal(0x423f, mod_ctype.ruint16(data, 'big', 2));
-
-	ASSERT.equal(0x2300, mod_ctype.ruint16(data, 'little', 0));
-	ASSERT.equal(0x4223, mod_ctype.ruint16(data, 'little', 1));
-	ASSERT.equal(0x3f42, mod_ctype.ruint16(data, 'little', 2));
-
-	data[0] = 0xfe;
-	data[1] = 0xfe;
-
-	ASSERT.equal(0xfefe, mod_ctype.ruint16(data, 'big', 0));
-	ASSERT.equal(0xfefe, mod_ctype.ruint16(data, 'little', 0));
-}
-
-/*
- * Test 32 bit unsigned integers. We need to verify the same set as 8 bit, only
- * now some of the issues actually matter:
- *  - We are correctly resolving big endian
- *  - Correctly using the offsets
- *  - Correctly interpreting values that are beyond the signed range as unsigned
- */
-function test32()
-{
-	var data = new Buffer(8);
-	data[0] = 0x32;
-	data[1] = 0x65;
-	data[2] = 0x42;
-	data[3] = 0x56;
-	data[4] = 0x23;
-	data[5] = 0xff;
-
-	ASSERT.equal(0x32654256, mod_ctype.ruint32(data, 'big', 0));
-	ASSERT.equal(0x65425623, mod_ctype.ruint32(data, 'big', 1));
-	ASSERT.equal(0x425623ff, mod_ctype.ruint32(data, 'big', 2));
-
-	ASSERT.equal(0x56426532, mod_ctype.ruint32(data, 'little', 0));
-	ASSERT.equal(0x23564265, mod_ctype.ruint32(data, 'little', 1));
-	ASSERT.equal(0xff235642, mod_ctype.ruint32(data, 'little', 2));
-}
-
-test8();
-test16();
-test32();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/uint/tst.wuint.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,156 +0,0 @@
-/*
- * A battery of tests to help us write a series of uints
- */
-
-var mod_ctype = require('../../../ctio.js');
-var ASSERT = require('assert');
-
-/*
- * We need to check the following things:
- *  - We are correctly resolving big endian (doesn't mean anything for 8 bit)
- *  - Correctly resolving little endian (doesn't mean anything for 8 bit)
- *  - Correctly using the offsets
- *  - Correctly interpreting values that are beyond the signed range as unsigned
- */
-function test8()
-{
-	var data = new Buffer(4);
-	mod_ctype.wuint8(23, 'big', data, 0);
-	mod_ctype.wuint8(23, 'big', data, 1);
-	mod_ctype.wuint8(23, 'big', data, 2);
-	mod_ctype.wuint8(23, 'big', data, 3);
-	ASSERT.equal(23, data[0]);
-	ASSERT.equal(23, data[1]);
-	ASSERT.equal(23, data[2]);
-	ASSERT.equal(23, data[3]);
-	mod_ctype.wuint8(23, 'little', data, 0);
-	mod_ctype.wuint8(23, 'little', data, 1);
-	mod_ctype.wuint8(23, 'little', data, 2);
-	mod_ctype.wuint8(23, 'little', data, 3);
-	ASSERT.equal(23, data[0]);
-	ASSERT.equal(23, data[1]);
-	ASSERT.equal(23, data[2]);
-	ASSERT.equal(23, data[3]);
-	mod_ctype.wuint8(255, 'big', data, 0);
-	ASSERT.equal(255, data[0]);
-	mod_ctype.wuint8(255, 'little', data, 0);
-	ASSERT.equal(255, data[0]);
-}
-
-function test16()
-{
-	var value = 0x2343;
-	var data = new Buffer(4);
-	mod_ctype.wuint16(value, 'big', data, 0);
-	ASSERT.equal(0x23, data[0]);
-	ASSERT.equal(0x43, data[1]);
-	mod_ctype.wuint16(value, 'big', data, 1);
-	ASSERT.equal(0x23, data[1]);
-	ASSERT.equal(0x43, data[2]);
-	mod_ctype.wuint16(value, 'big', data, 2);
-	ASSERT.equal(0x23, data[2]);
-	ASSERT.equal(0x43, data[3]);
-
-	mod_ctype.wuint16(value, 'little', data, 0);
-	ASSERT.equal(0x23, data[1]);
-	ASSERT.equal(0x43, data[0]);
-
-	mod_ctype.wuint16(value, 'little', data, 1);
-	ASSERT.equal(0x23, data[2]);
-	ASSERT.equal(0x43, data[1]);
-
-	mod_ctype.wuint16(value, 'little', data, 2);
-	ASSERT.equal(0x23, data[3]);
-	ASSERT.equal(0x43, data[2]);
-
-	value = 0xff80;
-	mod_ctype.wuint16(value, 'little', data, 0);
-	ASSERT.equal(0xff, data[1]);
-	ASSERT.equal(0x80, data[0]);
-
-	mod_ctype.wuint16(value, 'big', data, 0);
-	ASSERT.equal(0xff, data[0]);
-	ASSERT.equal(0x80, data[1]);
-}
-
-function test32()
-{
-	var data = new Buffer(6);
-	var value = 0xe7f90a6d;
-
-	mod_ctype.wuint32(value, 'big', data, 0);
-	ASSERT.equal(0xe7, data[0]);
-	ASSERT.equal(0xf9, data[1]);
-	ASSERT.equal(0x0a, data[2]);
-	ASSERT.equal(0x6d, data[3]);
-
-	mod_ctype.wuint32(value, 'big', data, 1);
-	ASSERT.equal(0xe7, data[1]);
-	ASSERT.equal(0xf9, data[2]);
-	ASSERT.equal(0x0a, data[3]);
-	ASSERT.equal(0x6d, data[4]);
-
-	mod_ctype.wuint32(value, 'big', data, 2);
-	ASSERT.equal(0xe7, data[2]);
-	ASSERT.equal(0xf9, data[3]);
-	ASSERT.equal(0x0a, data[4]);
-	ASSERT.equal(0x6d, data[5]);
-
-	mod_ctype.wuint32(value, 'little', data, 0);
-	ASSERT.equal(0xe7, data[3]);
-	ASSERT.equal(0xf9, data[2]);
-	ASSERT.equal(0x0a, data[1]);
-	ASSERT.equal(0x6d, data[0]);
-
-	mod_ctype.wuint32(value, 'little', data, 1);
-	ASSERT.equal(0xe7, data[4]);
-	ASSERT.equal(0xf9, data[3]);
-	ASSERT.equal(0x0a, data[2]);
-	ASSERT.equal(0x6d, data[1]);
-
-	mod_ctype.wuint32(value, 'little', data, 2);
-	ASSERT.equal(0xe7, data[5]);
-	ASSERT.equal(0xf9, data[4]);
-	ASSERT.equal(0x0a, data[3]);
-	ASSERT.equal(0x6d, data[2]);
-}
-
-function test64()
-{
-	var data = new Buffer(10);
-	var value = 0x0007cda8e7f90a6d;
-	var high = Math.floor(value / Math.pow(2, 32));
-	var low = value - (high * Math.pow(2, 32));
-	ASSERT.equal(0x0007cda8, high);
-	ASSERT.equal(0xe7f90a6d, low);
-
-	mod_ctype.wuint64([high, low], 'big', data, 0);
-	ASSERT.equal(0x00, data[0]);
-	ASSERT.equal(0x07, data[1]);
-	ASSERT.equal(0xcd, data[2]);
-	ASSERT.equal(0xa8, data[3]);
-	ASSERT.equal(0xe7, data[4]);
-	ASSERT.equal(0xf9, data[5]);
-	ASSERT.equal(0x0a, data[6]);
-	ASSERT.equal(0x6d, data[7]);
-
-	mod_ctype.wuint64([high, low], 'little', data, 0);
-	ASSERT.equal(0x6d, data[0]);
-	ASSERT.equal(0x0a, data[1]);
-	ASSERT.equal(0xf9, data[2]);
-	ASSERT.equal(0xe7, data[3]);
-	ASSERT.equal(0xa8, data[4]);
-	ASSERT.equal(0xcd, data[5]);
-	ASSERT.equal(0x07, data[6]);
-	ASSERT.equal(0x00, data[7]);
-}
-
-test8();
-test16();
-test32();
-test64();
-
-exports.test8 = test8;
-exports.test16 = test16;
-exports.test32 = test32;
-exports.test64 = test64;
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.basicr.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,50 +0,0 @@
-/*
- * Simple test to see if it works at all
- */
-var mod_ctype = require('../../ctype');
-var ASSERT = require('assert');
-var mod_sys = require('sys');
-
-function test()
-{
-	var ii, p, result, buffer;
-
-	p = new mod_ctype.Parser({ endian: 'little' });
-	buffer = new Buffer(4);
-	buffer[0] = 23;
-	buffer[3] = 42;
-	result = p.readData([ { x: { type: 'uint8_t' }},
-	    { y: { type: 'uint8_t', offset: 3 }}
-	], buffer, 0);
-	ASSERT.equal(23, result['x']);
-	ASSERT.equal(42, result['y']);
-
-	buffer = new Buffer(23);
-	for (ii = 0; ii < 23; ii++)
-		buffer[ii] = 0;
-
-	buffer.write('Hello, world!');
-	result = p.readData([ { x: { type: 'char[20]' }} ], buffer, 0);
-
-	/*
-	 * This is currently broken behavior; the check needs to be redesigned:
-	 * ASSERT.equal('Hello, world!', result['x'].toString('utf-8', 0,
-	 *  result['x'].length));
-	 */
-
-	buffer = new Buffer(4);
-	buffer[0] = 0x03;
-	buffer[1] = 0x24;
-	buffer[2] = 0x25;
-	buffer[3] = 0x26;
-	result = p.readData([ { y: { type: 'uint8_t' }},
-	    { x: { type: 'uint8_t[y]' }}], buffer, 0);
-	console.log(mod_sys.inspect(result, true));
-
-	p.typedef('ssize_t', 'int32_t');
-	ASSERT.deepEqual({ 'ssize_t': 'int32_t' }, p.lstypes());
-	result = p.readData([ { x: { type: 'ssize_t' } } ], buffer, 0);
-	ASSERT.equal(0x26252403, result['x']);
-}
-
-test();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.basicw.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,44 +0,0 @@
-/*
- * Simple test to see if it works at all
- */
-
-var mod_ctype = require('../../ctype');
-var ASSERT = require('assert');
-var mod_sys = require('sys');
-
-function test()
-{
-	var ii, p, buffer, buf2;
-
-	p = new mod_ctype.Parser({ endian: 'big' });
-	buffer = new Buffer(4);
-	p.writeData([ { x: { type: 'uint8_t', value: 23 }},
-	    { y: { type: 'uint8_t', offset: 3, value: 42 }}
-	], buffer, 0);
-	ASSERT.equal(23, buffer[0]);
-	ASSERT.equal(42, buffer[3]);
-
-	buffer = new Buffer(20);
-	for (ii = 0; ii < 20; ii++)
-		buffer[ii] = 0;
-
-	buffer.write('Hello, world!');
-	buf2 = new Buffer(22);
-	p.writeData([ { x: { type: 'char[20]', value: buffer }} ], buf2, 0);
-	for (ii = 0; ii < 20; ii++)
-		ASSERT.equal(buffer[ii], buf2[ii]);
-	/*
-	 * This is currently broken behavior; the check needs to be redesigned:
-	 * ASSERT.equal('Hello, world!', result['x'].toString('utf-8', 0,
-	 *   result['x'].length));
-	 */
-
-	buffer = new Buffer(4);
-	p.writeData([ { y: { type: 'uint8_t', value: 3 }},
-	    { x: { type: 'uint8_t[y]', value: [ 0x24, 0x25, 0x26] }}],
-	    buffer, 0);
-	console.log(mod_sys.inspect(buffer));
-
-	p.typedef('ssize_t', 'int32_t');
-	ASSERT.deepEqual({ 'ssize_t': 'int32_t' }, p.lstypes());
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.char.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-/*
- * Test the different forms of reading characters:
- *
- *  - the default, a single element buffer
- *  - uint8, values are uint8_ts
- *  - int8, values are int8_ts
- */
-var mod_ctype = require('../../ctype');
-var mod_assert = require('assert');
-
-function test()
-{
-	var p, buf, res;
-
-	buf = new Buffer(1);
-	buf[0] = 255;
-
-	p = new mod_ctype.Parser({ endian: 'little'});
-	res = p.readData([ { c: { type: 'char' }} ], buf, 0);
-	res = res['c'];
-	mod_assert.ok(res instanceof Buffer);
-	mod_assert.equal(255, res[0]);
-
-	p = new mod_ctype.Parser({ endian: 'little',
-	    'char-type': 'int8' });
-	res = p.readData([ { c: { type: 'char' }} ], buf, 0);
-	res = res['c'];
-	mod_assert.ok(typeof (res) == 'number', 'got typeof (res): ' +
-	    typeof (res));
-	mod_assert.equal(-1, res);
-
-	p = new mod_ctype.Parser({ endian: 'little',
-	    'char-type': 'uint8' });
-	res = p.readData([ { c: { type: 'char' }} ], buf, 0);
-	res = res['c'];
-	mod_assert.ok(typeof (res) == 'number', 'got typeof (res): ' +
-	    typeof (res));
-	mod_assert.equal(255, res);
-
-}
-
-test();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.endian.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,45 +0,0 @@
-/*
- * Simple test to make sure that the endian setting works.
- */
-
-var mod_ctype = require('../../ctype.js');
-var mod_assert = require('assert');
-
-function test()
-{
-	var parser, buf;
-
-	parser = new mod_ctype.Parser({
-	    endian: 'little'
-	});
-
-	buf = new Buffer(2);
-	parser.writeData([ { key: { type: 'uint16_t' } } ], buf, 0, [ 0x1234 ]);
-	mod_assert.equal(buf[0], 0x34);
-	mod_assert.equal(buf[1], 0x12);
-	parser.setEndian('big');
-
-	parser.writeData([ { key: { type: 'uint16_t' } } ], buf, 0, [ 0x1234 ]);
-	mod_assert.equal(buf[0], 0x12);
-	mod_assert.equal(buf[1], 0x34);
-
-	parser.setEndian('little');
-	parser.writeData([ { key: { type: 'uint16_t' } } ], buf, 0, [ 0x1234 ]);
-	mod_assert.equal(buf[0], 0x34);
-	mod_assert.equal(buf[1], 0x12);
-}
-
-function fail()
-{
-	var parser;
-
-	parser = new mod_ctype.Parser({
-	    endian: 'little'
-	});
-	mod_assert.throws(function () {
-		parser.setEndian('littlebigwrong');
-	});
-}
-
-test();
-fail();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.oldwrite.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-/*
- * A long overdue test to go through and verify that we can read and write
- * structures as well as nested structures.
- */
-
-var mod_ctype = require('../../ctype.js');
-var mod_assert = require('assert');
-
-function test()
-{
-	var parser, buf, data;
-	parser = new mod_ctype.Parser({
-	    endian: 'little'
-	});
-	parser.typedef('point_t', [
-	    { x: { type: 'uint8_t' } },
-	    { y: { type: 'uint8_t' } }
-	]);
-	buf = new Buffer(2);
-	data = [
-	    { point: { type: 'point_t', value: [ 23, 42 ] } }
-	];
-	parser.writeData(data, buf, 0);
-	mod_assert.ok(buf[0] == 23);
-	mod_assert.ok(buf[1] == 42);
-}
-
-test();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.readSize.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,128 +0,0 @@
-/*
 - * Testing to ensure we're reading the expected number of bytes
- */
-var mod_ctype = require('../../ctype');
-var ASSERT = require('assert');
-
-function testUint8()
-{
-	var parser, result, buffer;
-	parser = new mod_ctype.Parser({ endian: 'little' });
-	buffer = new Buffer('80', 'hex');
-	result = parser.readStruct([ { item: { type: 'uint8_t' } } ], buffer,
-	    0);
-	ASSERT.equal(result['size'], 1);
-}
-
-function testSint8()
-{
-	var parser, result, buffer;
-	parser = new mod_ctype.Parser({ endian: 'little' });
-	buffer = new Buffer('80', 'hex');
-	result = parser.readStruct([ { item: { type: 'int8_t' } } ], buffer, 0);
-	ASSERT.equal(result['size'], 1);
-}
-
-function testUint16()
-{
-	var parser, result, buffer;
-	parser = new mod_ctype.Parser({ endian: 'little' });
-	buffer = new Buffer('8000', 'hex');
-	result = parser.readStruct([ { item: { type: 'uint16_t' } } ], buffer,
-	    0);
-	ASSERT.equal(result['size'], 2);
-}
-
-function testSint16()
-{
-	var parser, result, buffer;
-	parser = new mod_ctype.Parser({ endian: 'little' });
-	buffer = new Buffer('8000', 'hex');
-	result = parser.readStruct([ { item: { type: 'int16_t' } } ], buffer,
-	    0);
-	ASSERT.equal(result['size'], 2);
-}
-
-function testUint32()
-{
-	var parser, result, buffer;
-	parser = new mod_ctype.Parser({ endian: 'little' });
-	buffer = new Buffer('80000000', 'hex');
-	result = parser.readStruct([ { item: { type: 'uint32_t' } } ], buffer,
-	    0);
-	ASSERT.equal(result['size'], 4);
-}
-
-function testSint32()
-{
-	var parser, result, buffer;
-	parser = new mod_ctype.Parser({ endian: 'little' });
-	buffer = new Buffer('80000000', 'hex');
-	result = parser.readStruct([ { item: { type: 'int32_t' } } ], buffer,
-	    0);
-	ASSERT.equal(result['size'], 4);
-}
-
-function testUint64()
-{
-	var parser, result, buffer;
-	parser = new mod_ctype.Parser({ endian: 'little' });
-	buffer = new Buffer('8000000000000000', 'hex');
-	result = parser.readStruct([ { item: { type: 'uint64_t' } } ], buffer,
-	    0);
-	ASSERT.equal(result['size'], 8);
-}
-
-function testSint64()
-{
-	var parser, result, buffer;
-	parser = new mod_ctype.Parser({ endian: 'little' });
-	buffer = new Buffer('8000000000000000', 'hex');
-	result = parser.readStruct([ { item: { type: 'int64_t' } } ], buffer,
-	    0);
-	ASSERT.equal(result['size'], 8);
-}
-
-function testFloat()
-{
-	var parser, result, buffer;
-	parser = new mod_ctype.Parser({ endian: 'little' });
-	buffer = new Buffer('ABAAAA3E', 'hex');
-	result = parser.readStruct([ { item: { type: 'float' } } ], buffer, 0);
-	ASSERT.equal(result['size'], 4);
-}
-
-function testDouble()
-{
-	var parser, result, buffer;
-	parser = new mod_ctype.Parser({ endian: 'little' });
-	buffer = new Buffer('000000000000F03F', 'hex');
-	result = parser.readStruct([ { item: { type: 'double' } } ], buffer, 0);
-	ASSERT.equal(result['size'], 8);
-}
-
-function testChar()
-{
-	var parser, result, buffer;
-	parser = new mod_ctype.Parser({ endian: 'little' });
-	buffer = new Buffer('t');
-	result = parser.readStruct([ { item: { type: 'char' } } ], buffer, 0);
-	ASSERT.equal(result['size'], 1);
-}
-
-function test()
-{
-	testSint8();
-	testUint8();
-	testSint16();
-	testUint16();
-	testSint32();
-	testUint32();
-	testSint64();
-	testUint64();
-	testFloat();
-	testDouble();
-	testChar();
-}
-
-test();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.structw.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-/*
- * A long overdue test to go through and verify that we can read and write
- * structures as well as nested structures.
- */
-
-var mod_ctype = require('../../ctype.js');
-var mod_assert = require('assert');
-
-function test()
-{
-	var parser, buf, data;
-	parser = new mod_ctype.Parser({
-	    endian: 'little'
-	});
-	parser.typedef('point_t', [
-	    { x: { type: 'uint8_t' } },
-	    { y: { type: 'uint8_t' } }
-	]);
-	buf = new Buffer(2);
-	data = [
-	    { point: { type: 'point_t' } }
-	];
-	parser.writeData(data, buf, 0, [ [ 23, 42 ] ]);
-	mod_assert.ok(buf[0] == 23);
-	mod_assert.ok(buf[1] == 42);
-}
-
-test();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.writeStruct.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,31 +0,0 @@
-/*
- * Test to verify that the offset is incremented when structures are written to.
- * Hopefully we will not regress issue #41
- */
-
-var mod_ctype = require('../../ctype.js');
-var mod_assert = require('assert');
-
-function test()
-{
-	var parser, buf, data;
-	parser = new mod_ctype.Parser({
-	    endian: 'little'
-	});
-	parser.typedef('point_t', [
-	    { x: { type: 'uint8_t' } },
-	    { y: { type: 'uint8_t' } }
-	]);
-	buf = new Buffer(4);
-	data = [
-	    { point1: { type: 'point_t' } },
-	    { point2: { type: 'point_t' } }
-	];
-	parser.writeData(data, buf, 0, [ [ 23, 42 ], [ 91, 18 ] ]);
-	mod_assert.ok(buf[0] == 23);
-	mod_assert.ok(buf[1] == 42);
-	mod_assert.ok(buf[2] == 91);
-	mod_assert.ok(buf[3] == 18);
-}
-
-test();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/http-signature/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,36 +0,0 @@
-{
-  "author": {
-    "name": "Joyent, Inc"
-  },
-  "name": "http-signature",
-  "description": "Reference implementation of Joyent's HTTP Signature Scheme",
-  "version": "0.10.0",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/joyent/node-http-signature.git"
-  },
-  "engines": {
-    "node": ">=0.8"
-  },
-  "main": "lib/index.js",
-  "scripts": {
-    "test": "./node_modules/.bin/tap tst/*.js"
-  },
-  "dependencies": {
-    "assert-plus": "0.1.2",
-    "asn1": "0.1.11",
-    "ctype": "0.5.2"
-  },
-  "devDependencies": {
-    "node-uuid": "1.4.0",
-    "tap": "0.4.2"
-  },
-  "readme": "# node-http-signature\n\nnode-http-signature is a node.js library that has client and server components\nfor Joyent's [HTTP Signature Scheme](http_signing.md).\n\n## Usage\n\nNote the example below signs a request with the same key/cert used to start an\nHTTP server. This is almost certainly not what you actaully want, but is just\nused to illustrate the API calls; you will need to provide your own key\nmanagement in addition to this library.\n\n### Client\n\n    var fs = require('fs');\n    var https = require('https');\n    var httpSignature = require('http-signature');\n\n    var key = fs.readFileSync('./key.pem', 'ascii');\n\n    var options = {\n      host: 'localhost',\n      port: 8443,\n      path: '/',\n      method: 'GET',\n      headers: {}\n    };\n\n    // Adds a 'Date' header in, signs it, and adds the\n    // 'Authorization' header in.\n    var req = https.request(options, function(res) {\n      console.log(res.statusCode);\n    });\n\n\n    httpSignature.sign(req, {\n      key: key,\n      keyId: './cert.pem'\n    });\n\n    req.end();\n\n### Server\n\n    var fs = require('fs');\n    var https = require('https');\n    var httpSignature = require('http-signature');\n\n    var options = {\n      key: fs.readFileSync('./key.pem'),\n      cert: fs.readFileSync('./cert.pem')\n    };\n\n    https.createServer(options, function (req, res) {\n      var rc = 200;\n      var parsed = httpSignature.parseRequest(req);\n      var pub = fs.readFileSync(parsed.keyId, 'ascii');\n      if (!httpSignature.verifySignature(parsed, pub))\n        rc = 401;\n\n      res.writeHead(rc);\n      res.end();\n    }).listen(8443);\n\n## Installation\n\n    npm install http-signature\n\n## License\n\nMIT.\n\n## Bugs\n\nSee <https://github.com/joyent/node-http-signature/issues>.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/joyent/node-http-signature/issues"
-  },
-  "homepage": "https://github.com/joyent/node-http-signature",
-  "_id": "http-signature@0.10.0",
-  "_from": "http-signature@~0.10.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,49 +0,0 @@
-# json-stringify-safe
-
-Like JSON.stringify, but doesn't throw on circular references.
-
-## Usage
-
-Takes the same arguments as `JSON.stringify`.
-
-```javascript
-var stringify = require('json-stringify-safe');
-var circularObj = {};
-circularObj.circularRef = circularObj;
-circularObj.list = [ circularObj, circularObj ];
-console.log(stringify(circularObj, null, 2));
-```
-
-Output:
-
-```json
-{
-  "circularRef": "[Circular]",
-  "list": [
-    "[Circular]",
-    "[Circular]"
-  ]
-}
-```
-
-## Details
-
-```
-stringify(obj, serializer, indent, decycler)
-```
-
-The first three arguments are the same as for JSON.stringify.  The last,
-`decycler`, is a function invoked only when an object has already been seen.
-
-The default `decycler` function returns the string `'[Circular]'`.
-If, for example, you pass in `function(k,v){}` (return nothing) then it
-will prune cycles.  If you pass in `function(k,v){ return {foo: 'bar'}}`,
-then cyclical objects will always be represented as `{"foo":"bar"}` in
-the result.
-
-```
-stringify.getSerialize(serializer, decycler)
-```
-
-Returns a serializer that can be used elsewhere.  This is the actual
-function that's passed to JSON.stringify.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,33 +0,0 @@
-{
-  "name": "json-stringify-safe",
-  "version": "5.0.0",
-  "description": "Like JSON.stringify, but doesn't blow up on circular refs",
-  "main": "stringify.js",
-  "scripts": {
-    "test": "node test.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/json-stringify-safe"
-  },
-  "keywords": [
-    "json",
-    "stringify",
-    "circular",
-    "safe"
-  ],
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me"
-  },
-  "license": "BSD",
-  "readmeFilename": "README.md",
-  "readme": "# json-stringify-safe\n\nLike JSON.stringify, but doesn't throw on circular references.\n\n## Usage\n\nTakes the same arguments as `JSON.stringify`.\n\n```javascript\nvar stringify = require('json-stringify-safe');\nvar circularObj = {};\ncircularObj.circularRef = circularObj;\ncircularObj.list = [ circularObj, circularObj ];\nconsole.log(stringify(circularObj, null, 2));\n```\n\nOutput:\n\n```json\n{\n  \"circularRef\": \"[Circular]\",\n  \"list\": [\n    \"[Circular]\",\n    \"[Circular]\"\n  ]\n}\n```\n\n## Details\n\n```\nstringify(obj, serializer, indent, decycler)\n```\n\nThe first three arguments are the same as to JSON.stringify.  The last\nis an argument that's only used when the object has been seen already.\n\nThe default `decycler` function returns the string `'[Circular]'`.\nIf, for example, you pass in `function(k,v){}` (return nothing) then it\nwill prune cycles.  If you pass in `function(k,v){ return {foo: 'bar'}}`,\nthen cyclical objects will always be represented as `{\"foo\":\"bar\"}` in\nthe result.\n\n```\nstringify.getSerialize(serializer, decycler)\n```\n\nReturns a serializer that can be used elsewhere.  This is the actual\nfunction that's passed to JSON.stringify.\n",
-  "bugs": {
-    "url": "https://github.com/isaacs/json-stringify-safe/issues"
-  },
-  "homepage": "https://github.com/isaacs/json-stringify-safe",
-  "_id": "json-stringify-safe@5.0.0",
-  "_from": "json-stringify-safe@~5.0.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/stringify.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,39 +0,0 @@
-module.exports = stringify;
-
-function getSerialize (fn, decycle) {
-  var seen = [], keys = [];
-  decycle = decycle || function(key, value) {
-    return '[Circular ' + getPath(value, seen, keys) + ']'
-  };
-  return function(key, value) {
-    var ret = value;
-    if (typeof value === 'object' && value) {
-      if (seen.indexOf(value) !== -1)
-        ret = decycle(key, value);
-      else {
-        seen.push(value);
-        keys.push(key);
-      }
-    }
-    if (fn) ret = fn(key, ret);
-    return ret;
-  }
-}
-
-function getPath (value, seen, keys) {
-  var index = seen.indexOf(value);
-  var path = [ keys[index] ];
-  for (index--; index >= 0; index--) {
-    if (seen[index][ path[0] ] === value) {
-      value = seen[index];
-      path.unshift(keys[index]);
-    }
-  }
-  return '~' + path.join('.');
-}
-
-function stringify(obj, fn, spaces, decycle) {
-  return JSON.stringify(obj, getSerialize(fn, decycle), spaces);
-}
-
-stringify.getSerialize = getSerialize;
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/test.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,128 +0,0 @@
-var stringify = require('./stringify.js');
-
-var circularObj = { a: 'b' };
-circularObj.circularRef = circularObj;
-circularObj.list = [ circularObj, circularObj ];
-
-//////////
-// default
-var testObj = {
-  "a": "b",
-  "circularRef": "[Circular ~]",
-  "list": [
-    "[Circular ~]",
-    "[Circular ~]"
-  ]
-};
-
-var assert = require('assert');
-assert.equal(JSON.stringify(testObj, null, 2),
-             stringify(circularObj, null, 2));
-
-assert.equal(JSON.stringify(testObj, null, 2),
-            JSON.stringify(circularObj, stringify.getSerialize(), 2));
-
-
-////////
-// prune
-testObj = {
-  "a": "b",
-  "list": [
-    null,
-    null
-  ]
-};
-
-function prune(k, v) {}
-
-assert.equal(JSON.stringify(testObj, null, 2),
-             stringify(circularObj, null, 2, prune));
-
-///////////
-// re-cycle
-// (throws)
-function recycle(k, v) {
-  return v;
-}
-
-assert.throws(function() {
-  stringify(circularObj, null, 2, recycle);
-});
-
-////////
-// fancy
-testObj = {
-  "a": "b",
-  "circularRef": "circularRef{a:string,circularRef:Object,list:Array}",
-  "list": [
-    "0{a:string,circularRef:Object,list:Array}",
-    "1{a:string,circularRef:Object,list:Array}"
-  ]
-};
-
-function signer(key, value) {
-  var ret = key + '{';
-  var f = false;
-  for (var i in value) {
-    if (f)
-      ret += ',';
-    f = true;
-    ret += i + ':';
-    var v = value[i];
-    switch (typeof v) {
-      case 'object':
-        if (!v)
-          ret += 'null';
-        else if (Array.isArray(v))
-          ret += 'Array'
-        else
-          ret += v.constructor && v.constructor.name || 'Object';
-        break;
-      default:
-        ret += typeof v;
-        break;
-    }
-  }
-  ret += '}';
-  return ret;
-}
-
-assert.equal(JSON.stringify(testObj, null, 2),
-             stringify(circularObj, null, 2, signer));
-
-
-///////
-//multi
-var a = { x: 1 };
-a.a = a;
-var b = { x: 2 };
-b.a = a;
-
-var c = { a: a, b: b };
-var d = { list: [ a, b, c ] };
-d.d = d;
-
-var multi = {
-  "list": [
-    {
-      "x": 1,
-      "a": "[Circular ~.list.0]"
-    },
-    {
-      "x": 2,
-      "a": "[Circular ~.list.0]"
-    },
-    {
-      "a": "[Circular ~.list.0]",
-      "b": "[Circular ~.list.1]"
-    }
-  ],
-  "d": "[Circular ~]"
-};
-
-assert.equal(JSON.stringify(multi, null, 2),
-             stringify(d, null, 2));
-
-////////
-// pass!
-console.log('ok');
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-Copyright (c) 2010 Benjamin Thomas, Robert Kieffer
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,66 +0,0 @@
-# mime
-
-Comprehensive MIME type mapping API. Includes all 600+ types and 800+ extensions defined by the Apache project, plus additional types submitted by the node.js community.
-
-## Install
-
-Install with [npm](http://github.com/isaacs/npm):
-
-    npm install mime
-
-## API - Queries
-
-### mime.lookup(path)
-Get the mime type associated with a file; if no mime type is found, `application/octet-stream` is returned. Performs a case-insensitive lookup using the extension in `path` (the substring after the last '/' or '.').  E.g.
-
-    var mime = require('mime');
-
-    mime.lookup('/path/to/file.txt');         // => 'text/plain'
-    mime.lookup('file.txt');                  // => 'text/plain'
-    mime.lookup('.TXT');                      // => 'text/plain'
-    mime.lookup('htm');                       // => 'text/html'
-
-### mime.default_type
-Sets the mime type returned when `mime.lookup` fails to find the extension searched for. (Default is `application/octet-stream`.)
-
-### mime.extension(type)
-Get the default extension for `type`
-
-    mime.extension('text/html');                 // => 'html'
-    mime.extension('application/octet-stream');  // => 'bin'
-
-### mime.charsets.lookup()
-
-Map mime-type to charset
-
-    mime.charsets.lookup('text/plain');        // => 'UTF-8'
-
-(The logic for charset lookups is pretty rudimentary.  Feel free to suggest improvements.)
-
-## API - Defining Custom Types
-
-The following APIs allow you to add your own type mappings within your project.  If you feel a type should be included as part of node-mime, see [requesting new types](https://github.com/broofa/node-mime/wiki/Requesting-New-Types).
-
-### mime.define()
-
-Add custom mime/extension mappings
-
-    mime.define({
-        'text/x-some-format': ['x-sf', 'x-sft', 'x-sfml'],
-        'application/x-my-type': ['x-mt', 'x-mtt'],
-        // etc ...
-    });
-
-    mime.lookup('x-sft');                 // => 'text/x-some-format'
-
-The first entry in the extensions array is returned by `mime.extension()`. E.g.
-
-    mime.extension('text/x-some-format'); // => 'x-sf'
-
-### mime.load(filepath)
-
-Load mappings from an Apache ".types" format file
-
-    mime.load('./my_project.types');
-
-The .types file format is simple; see the `types` dir for examples.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/mime.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,114 +0,0 @@
-var path = require('path');
-var fs = require('fs');
-
-function Mime() {
-  // Map of extension -> mime type
-  this.types = Object.create(null);
-
-  // Map of mime type -> extension
-  this.extensions = Object.create(null);
-}
-
-/**
- * Define mimetype -> extension mappings.  Each key is a mime-type that maps
- * to an array of extensions associated with the type.  The first extension is
- * used as the default extension for the type.
- *
 - * e.g. mime.define({'audio/ogg': ['oga', 'ogg', 'spx']});
- *
- * @param map (Object) type definitions
- */
-Mime.prototype.define = function (map) {
-  for (var type in map) {
-    var exts = map[type];
-
-    for (var i = 0; i < exts.length; i++) {
-      if (process.env.DEBUG_MIME && this.types[exts[i]]) {
-        console.warn(this._loading.replace(/.*\//, ''), 'changes "' + exts[i] + '" extension type from ' +
-          this.types[exts[i]] + ' to ' + type);
-      }
-
-      this.types[exts[i]] = type;
-    }
-
-    // Default extension is the first one we encounter
-    if (!this.extensions[type]) {
-      this.extensions[type] = exts[0];
-    }
-  }
-};
-
-/**
- * Load an Apache2-style ".types" file
- *
- * This may be called multiple times (it's expected).  Where files declare
- * overlapping types/extensions, the last file wins.
- *
- * @param file (String) path of file to load.
- */
-Mime.prototype.load = function(file) {
-
-  this._loading = file;
-  // Read file and split into lines
-  var map = {},
-      content = fs.readFileSync(file, 'ascii'),
-      lines = content.split(/[\r\n]+/);
-
-  lines.forEach(function(line) {
-    // Clean up whitespace/comments, and split into fields
-    var fields = line.replace(/\s*#.*|^\s*|\s*$/g, '').split(/\s+/);
-    map[fields.shift()] = fields;
-  });
-
-  this.define(map);
-
-  this._loading = null;
-};
-
-/**
- * Lookup a mime type based on extension
- */
-Mime.prototype.lookup = function(path, fallback) {
-  var ext = path.replace(/.*[\.\/\\]/, '').toLowerCase();
-
-  return this.types[ext] || fallback || this.default_type;
-};
-
-/**
- * Return file extension associated with a mime type
- */
-Mime.prototype.extension = function(mimeType) {
-  var type = mimeType.match(/^\s*([^;\s]*)(?:;|\s|$)/)[1].toLowerCase();
-  return this.extensions[type];
-};
-
-// Default instance
-var mime = new Mime();
-
-// Load local copy of
-// http://svn.apache.org/repos/asf/httpd/httpd/trunk/docs/conf/mime.types
-mime.load(path.join(__dirname, 'types/mime.types'));
-
-// Load additional types from node.js community
-mime.load(path.join(__dirname, 'types/node.types'));
-
-// Default type
-mime.default_type = mime.lookup('bin');
-
-//
-// Additional API specific to the default instance
-//
-
-mime.Mime = Mime;
-
-/**
- * Lookup a charset based on mime type.
- */
-mime.charsets = {
-  lookup: function(mimeType, fallback) {
-    // Assume text types are utf8
-    return (/^text\//).test(mimeType) ? 'UTF-8' : fallback;
-  }
-};
-
-module.exports = mime;
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,36 +0,0 @@
-{
-  "author": {
-    "name": "Robert Kieffer",
-    "email": "robert@broofa.com",
-    "url": "http://github.com/broofa"
-  },
-  "contributors": [
-    {
-      "name": "Benjamin Thomas",
-      "email": "benjamin@benjaminthomas.org",
-      "url": "http://github.com/bentomas"
-    }
-  ],
-  "dependencies": {},
-  "description": "A comprehensive library for mime-type mapping",
-  "devDependencies": {},
-  "keywords": [
-    "util",
-    "mime"
-  ],
-  "main": "mime.js",
-  "name": "mime",
-  "repository": {
-    "url": "https://github.com/broofa/node-mime",
-    "type": "git"
-  },
-  "version": "1.2.11",
-  "readme": "# mime\n\nComprehensive MIME type mapping API. Includes all 600+ types and 800+ extensions defined by the Apache project, plus additional types submitted by the node.js community.\n\n## Install\n\nInstall with [npm](http://github.com/isaacs/npm):\n\n    npm install mime\n\n## API - Queries\n\n### mime.lookup(path)\nGet the mime type associated with a file, if no mime type is found `application/octet-stream` is returned. Performs a case-insensitive lookup using the extension in `path` (the substring after the last '/' or '.').  E.g.\n\n    var mime = require('mime');\n\n    mime.lookup('/path/to/file.txt');         // => 'text/plain'\n    mime.lookup('file.txt');                  // => 'text/plain'\n    mime.lookup('.TXT');                      // => 'text/plain'\n    mime.lookup('htm');                       // => 'text/html'\n\n### mime.default_type\nSets the mime type returned when `mime.lookup` fails to find the extension searched for. (Default is `application/octet-stream`.)\n\n### mime.extension(type)\nGet the default extension for `type`\n\n    mime.extension('text/html');                 // => 'html'\n    mime.extension('application/octet-stream');  // => 'bin'\n\n### mime.charsets.lookup()\n\nMap mime-type to charset\n\n    mime.charsets.lookup('text/plain');        // => 'UTF-8'\n\n(The logic for charset lookups is pretty rudimentary.  Feel free to suggest improvements.)\n\n## API - Defining Custom Types\n\nThe following APIs allow you to add your own type mappings within your project.  If you feel a type should be included as part of node-mime, see [requesting new types](https://github.com/broofa/node-mime/wiki/Requesting-New-Types).\n\n### mime.define()\n\nAdd custom mime/extension mappings\n\n    mime.define({\n        'text/x-some-format': ['x-sf', 'x-sft', 'x-sfml'],\n        'application/x-my-type': ['x-mt', 'x-mtt'],\n        // etc ...\n    });\n\n    mime.lookup('x-sft');                 // => 'text/x-some-format'\n\nThe first entry in the extensions array is returned by `mime.extension()`. E.g.\n\n    mime.extension('text/x-some-format'); // => 'x-sf'\n\n### mime.load(filepath)\n\nLoad mappings from an Apache \".types\" format file\n\n    mime.load('./my_project.types');\n\nThe .types file format is simple -  See the `types` dir for examples.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/broofa/node-mime/issues"
-  },
-  "homepage": "https://github.com/broofa/node-mime",
-  "_id": "mime@1.2.11",
-  "_from": "mime@~1.2.9"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/test.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,84 +0,0 @@
-/**
- * Usage: node test.js
- */
-
-var mime = require('./mime');
-var assert = require('assert');
-var path = require('path');
-
-function eq(a, b) {
-  console.log('Test: ' + a + ' === ' + b);
-  assert.strictEqual.apply(null, arguments);
-}
-
-console.log(Object.keys(mime.extensions).length + ' types');
-console.log(Object.keys(mime.types).length + ' extensions\n');
-
-//
-// Test mime lookups
-//
-
-eq('text/plain', mime.lookup('text.txt'));     // normal file
-eq('text/plain', mime.lookup('TEXT.TXT'));     // uppercase
-eq('text/plain', mime.lookup('dir/text.txt')); // dir + file
-eq('text/plain', mime.lookup('.text.txt'));    // hidden file
-eq('text/plain', mime.lookup('.txt'));         // nameless
-eq('text/plain', mime.lookup('txt'));          // extension-only
-eq('text/plain', mime.lookup('/txt'));         // extension-less
-eq('text/plain', mime.lookup('\\txt'));        // Windows, extension-less
-eq('application/octet-stream', mime.lookup('text.nope')); // unrecognized
-eq('fallback', mime.lookup('text.fallback', 'fallback')); // alternate default
-
-//
-// Test extensions
-//
-
-eq('txt', mime.extension(mime.types.text));
-eq('html', mime.extension(mime.types.htm));
-eq('bin', mime.extension('application/octet-stream'));
-eq('bin', mime.extension('application/octet-stream '));
-eq('html', mime.extension(' text/html; charset=UTF-8'));
-eq('html', mime.extension('text/html; charset=UTF-8 '));
-eq('html', mime.extension('text/html; charset=UTF-8'));
-eq('html', mime.extension('text/html ; charset=UTF-8'));
-eq('html', mime.extension('text/html;charset=UTF-8'));
-eq('html', mime.extension('text/Html;charset=UTF-8'));
-eq(undefined, mime.extension('unrecognized'));
-
-//
-// Test node.types lookups
-//
-
-eq('application/font-woff', mime.lookup('file.woff'));
-eq('application/octet-stream', mime.lookup('file.buffer'));
-eq('audio/mp4', mime.lookup('file.m4a'));
-eq('font/opentype', mime.lookup('file.otf'));
-
-//
-// Test charsets
-//
-
-eq('UTF-8', mime.charsets.lookup('text/plain'));
-eq(undefined, mime.charsets.lookup(mime.types.js));
-eq('fallback', mime.charsets.lookup('application/octet-stream', 'fallback'));
-
-//
-// Test for overlaps between mime.types and node.types
-//
-
-var apacheTypes = new mime.Mime(), nodeTypes = new mime.Mime();
-apacheTypes.load(path.join(__dirname, 'types/mime.types'));
-nodeTypes.load(path.join(__dirname, 'types/node.types'));
-
-var keys = [].concat(Object.keys(apacheTypes.types))
-             .concat(Object.keys(nodeTypes.types));
-keys.sort();
-for (var i = 1; i < keys.length; i++) {
-  if (keys[i] == keys[i-1]) {
-    console.warn('Warning: ' +
-      'node.types defines ' + keys[i] + '->' + nodeTypes.types[keys[i]] +
-      ', mime.types defines ' + keys[i] + '->' + apacheTypes.types[keys[i]]);
-  }
-}
-
-console.log('\nOK');
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/types/mime.types	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1588 +0,0 @@
-# This file maps Internet media types to unique file extension(s).
-# Although created for httpd, this file is used by many software systems
-# and has been placed in the public domain for unlimited redistribution.
-#
-# The table below contains both registered and (common) unregistered types.
-# A type that has no unique extension can be ignored -- they are listed
-# here to guide configurations toward known types and to make it easier to
-# identify "new" types.  File extensions are also commonly used to indicate
-# content languages and encodings, so choose them carefully.
-#
-# Internet media types should be registered as described in RFC 4288.
-# The registry is at <http://www.iana.org/assignments/media-types/>.
-#
-# MIME type (lowercased)			Extensions
-# ============================================	==========
-# application/1d-interleaved-parityfec
-# application/3gpp-ims+xml
-# application/activemessage
-application/andrew-inset			ez
-# application/applefile
-application/applixware				aw
-application/atom+xml				atom
-application/atomcat+xml				atomcat
-# application/atomicmail
-application/atomsvc+xml				atomsvc
-# application/auth-policy+xml
-# application/batch-smtp
-# application/beep+xml
-# application/calendar+xml
-# application/cals-1840
-# application/ccmp+xml
-application/ccxml+xml				ccxml
-application/cdmi-capability			cdmia
-application/cdmi-container			cdmic
-application/cdmi-domain				cdmid
-application/cdmi-object				cdmio
-application/cdmi-queue				cdmiq
-# application/cea-2018+xml
-# application/cellml+xml
-# application/cfw
-# application/cnrp+xml
-# application/commonground
-# application/conference-info+xml
-# application/cpl+xml
-# application/csta+xml
-# application/cstadata+xml
-application/cu-seeme				cu
-# application/cybercash
-application/davmount+xml			davmount
-# application/dca-rft
-# application/dec-dx
-# application/dialog-info+xml
-# application/dicom
-# application/dns
-application/docbook+xml				dbk
-# application/dskpp+xml
-application/dssc+der				dssc
-application/dssc+xml				xdssc
-# application/dvcs
-application/ecmascript				ecma
-# application/edi-consent
-# application/edi-x12
-# application/edifact
-application/emma+xml				emma
-# application/epp+xml
-application/epub+zip				epub
-# application/eshop
-# application/example
-application/exi					exi
-# application/fastinfoset
-# application/fastsoap
-# application/fits
-application/font-tdpfr				pfr
-# application/framework-attributes+xml
-application/gml+xml				gml
-application/gpx+xml				gpx
-application/gxf					gxf
-# application/h224
-# application/held+xml
-# application/http
-application/hyperstudio				stk
-# application/ibe-key-request+xml
-# application/ibe-pkg-reply+xml
-# application/ibe-pp-data
-# application/iges
-# application/im-iscomposing+xml
-# application/index
-# application/index.cmd
-# application/index.obj
-# application/index.response
-# application/index.vnd
-application/inkml+xml				ink inkml
-# application/iotp
-application/ipfix				ipfix
-# application/ipp
-# application/isup
-application/java-archive			jar
-application/java-serialized-object		ser
-application/java-vm				class
-application/javascript				js
-application/json				json
-application/jsonml+json				jsonml
-# application/kpml-request+xml
-# application/kpml-response+xml
-application/lost+xml				lostxml
-application/mac-binhex40			hqx
-application/mac-compactpro			cpt
-# application/macwriteii
-application/mads+xml				mads
-application/marc				mrc
-application/marcxml+xml				mrcx
-application/mathematica				ma nb mb
-# application/mathml-content+xml
-# application/mathml-presentation+xml
-application/mathml+xml				mathml
-# application/mbms-associated-procedure-description+xml
-# application/mbms-deregister+xml
-# application/mbms-envelope+xml
-# application/mbms-msk+xml
-# application/mbms-msk-response+xml
-# application/mbms-protection-description+xml
-# application/mbms-reception-report+xml
-# application/mbms-register+xml
-# application/mbms-register-response+xml
-# application/mbms-user-service-description+xml
-application/mbox				mbox
-# application/media_control+xml
-application/mediaservercontrol+xml		mscml
-application/metalink+xml			metalink
-application/metalink4+xml			meta4
-application/mets+xml				mets
-# application/mikey
-application/mods+xml				mods
-# application/moss-keys
-# application/moss-signature
-# application/mosskey-data
-# application/mosskey-request
-application/mp21				m21 mp21
-application/mp4					mp4s
-# application/mpeg4-generic
-# application/mpeg4-iod
-# application/mpeg4-iod-xmt
-# application/msc-ivr+xml
-# application/msc-mixer+xml
-application/msword				doc dot
-application/mxf					mxf
-# application/nasdata
-# application/news-checkgroups
-# application/news-groupinfo
-# application/news-transmission
-# application/nss
-# application/ocsp-request
-# application/ocsp-response
-application/octet-stream	bin dms lrf mar so dist distz pkg bpk dump elc deploy
-application/oda					oda
-application/oebps-package+xml			opf
-application/ogg					ogx
-application/omdoc+xml				omdoc
-application/onenote				onetoc onetoc2 onetmp onepkg
-application/oxps				oxps
-# application/parityfec
-application/patch-ops-error+xml			xer
-application/pdf					pdf
-application/pgp-encrypted			pgp
-# application/pgp-keys
-application/pgp-signature			asc sig
-application/pics-rules				prf
-# application/pidf+xml
-# application/pidf-diff+xml
-application/pkcs10				p10
-application/pkcs7-mime				p7m p7c
-application/pkcs7-signature			p7s
-application/pkcs8				p8
-application/pkix-attr-cert			ac
-application/pkix-cert				cer
-application/pkix-crl				crl
-application/pkix-pkipath			pkipath
-application/pkixcmp				pki
-application/pls+xml				pls
-# application/poc-settings+xml
-application/postscript				ai eps ps
-# application/prs.alvestrand.titrax-sheet
-application/prs.cww				cww
-# application/prs.nprend
-# application/prs.plucker
-# application/prs.rdf-xml-crypt
-# application/prs.xsf+xml
-application/pskc+xml				pskcxml
-# application/qsig
-application/rdf+xml				rdf
-application/reginfo+xml				rif
-application/relax-ng-compact-syntax		rnc
-# application/remote-printing
-application/resource-lists+xml			rl
-application/resource-lists-diff+xml		rld
-# application/riscos
-# application/rlmi+xml
-application/rls-services+xml			rs
-application/rpki-ghostbusters			gbr
-application/rpki-manifest			mft
-application/rpki-roa				roa
-# application/rpki-updown
-application/rsd+xml				rsd
-application/rss+xml				rss
-application/rtf					rtf
-# application/rtx
-# application/samlassertion+xml
-# application/samlmetadata+xml
-application/sbml+xml				sbml
-application/scvp-cv-request			scq
-application/scvp-cv-response			scs
-application/scvp-vp-request			spq
-application/scvp-vp-response			spp
-application/sdp					sdp
-# application/set-payment
-application/set-payment-initiation		setpay
-# application/set-registration
-application/set-registration-initiation		setreg
-# application/sgml
-# application/sgml-open-catalog
-application/shf+xml				shf
-# application/sieve
-# application/simple-filter+xml
-# application/simple-message-summary
-# application/simplesymbolcontainer
-# application/slate
-# application/smil
-application/smil+xml				smi smil
-# application/soap+fastinfoset
-# application/soap+xml
-application/sparql-query			rq
-application/sparql-results+xml			srx
-# application/spirits-event+xml
-application/srgs				gram
-application/srgs+xml				grxml
-application/sru+xml				sru
-application/ssdl+xml				ssdl
-application/ssml+xml				ssml
-# application/tamp-apex-update
-# application/tamp-apex-update-confirm
-# application/tamp-community-update
-# application/tamp-community-update-confirm
-# application/tamp-error
-# application/tamp-sequence-adjust
-# application/tamp-sequence-adjust-confirm
-# application/tamp-status-query
-# application/tamp-status-response
-# application/tamp-update
-# application/tamp-update-confirm
-application/tei+xml				tei teicorpus
-application/thraud+xml				tfi
-# application/timestamp-query
-# application/timestamp-reply
-application/timestamped-data			tsd
-# application/tve-trigger
-# application/ulpfec
-# application/vcard+xml
-# application/vemmi
-# application/vividence.scriptfile
-# application/vnd.3gpp.bsf+xml
-application/vnd.3gpp.pic-bw-large		plb
-application/vnd.3gpp.pic-bw-small		psb
-application/vnd.3gpp.pic-bw-var			pvb
-# application/vnd.3gpp.sms
-# application/vnd.3gpp2.bcmcsinfo+xml
-# application/vnd.3gpp2.sms
-application/vnd.3gpp2.tcap			tcap
-application/vnd.3m.post-it-notes		pwn
-application/vnd.accpac.simply.aso		aso
-application/vnd.accpac.simply.imp		imp
-application/vnd.acucobol			acu
-application/vnd.acucorp				atc acutc
-application/vnd.adobe.air-application-installer-package+zip	air
-application/vnd.adobe.formscentral.fcdt		fcdt
-application/vnd.adobe.fxp			fxp fxpl
-# application/vnd.adobe.partial-upload
-application/vnd.adobe.xdp+xml			xdp
-application/vnd.adobe.xfdf			xfdf
-# application/vnd.aether.imp
-# application/vnd.ah-barcode
-application/vnd.ahead.space			ahead
-application/vnd.airzip.filesecure.azf		azf
-application/vnd.airzip.filesecure.azs		azs
-application/vnd.amazon.ebook			azw
-application/vnd.americandynamics.acc		acc
-application/vnd.amiga.ami			ami
-# application/vnd.amundsen.maze+xml
-application/vnd.android.package-archive		apk
-application/vnd.anser-web-certificate-issue-initiation	cii
-application/vnd.anser-web-funds-transfer-initiation	fti
-application/vnd.antix.game-component		atx
-application/vnd.apple.installer+xml		mpkg
-application/vnd.apple.mpegurl			m3u8
-# application/vnd.arastra.swi
-application/vnd.aristanetworks.swi		swi
-application/vnd.astraea-software.iota		iota
-application/vnd.audiograph			aep
-# application/vnd.autopackage
-# application/vnd.avistar+xml
-application/vnd.blueice.multipass		mpm
-# application/vnd.bluetooth.ep.oob
-application/vnd.bmi				bmi
-application/vnd.businessobjects			rep
-# application/vnd.cab-jscript
-# application/vnd.canon-cpdl
-# application/vnd.canon-lips
-# application/vnd.cendio.thinlinc.clientconf
-application/vnd.chemdraw+xml			cdxml
-application/vnd.chipnuts.karaoke-mmd		mmd
-application/vnd.cinderella			cdy
-# application/vnd.cirpack.isdn-ext
-application/vnd.claymore			cla
-application/vnd.cloanto.rp9			rp9
-application/vnd.clonk.c4group			c4g c4d c4f c4p c4u
-application/vnd.cluetrust.cartomobile-config		c11amc
-application/vnd.cluetrust.cartomobile-config-pkg	c11amz
-# application/vnd.collection+json
-# application/vnd.commerce-battelle
-application/vnd.commonspace			csp
-application/vnd.contact.cmsg			cdbcmsg
-application/vnd.cosmocaller			cmc
-application/vnd.crick.clicker			clkx
-application/vnd.crick.clicker.keyboard		clkk
-application/vnd.crick.clicker.palette		clkp
-application/vnd.crick.clicker.template		clkt
-application/vnd.crick.clicker.wordbank		clkw
-application/vnd.criticaltools.wbs+xml		wbs
-application/vnd.ctc-posml			pml
-# application/vnd.ctct.ws+xml
-# application/vnd.cups-pdf
-# application/vnd.cups-postscript
-application/vnd.cups-ppd			ppd
-# application/vnd.cups-raster
-# application/vnd.cups-raw
-# application/vnd.curl
-application/vnd.curl.car			car
-application/vnd.curl.pcurl			pcurl
-# application/vnd.cybank
-application/vnd.dart				dart
-application/vnd.data-vision.rdz			rdz
-application/vnd.dece.data			uvf uvvf uvd uvvd
-application/vnd.dece.ttml+xml			uvt uvvt
-application/vnd.dece.unspecified		uvx uvvx
-application/vnd.dece.zip			uvz uvvz
-application/vnd.denovo.fcselayout-link		fe_launch
-# application/vnd.dir-bi.plate-dl-nosuffix
-application/vnd.dna				dna
-application/vnd.dolby.mlp			mlp
-# application/vnd.dolby.mobile.1
-# application/vnd.dolby.mobile.2
-application/vnd.dpgraph				dpg
-application/vnd.dreamfactory			dfac
-application/vnd.ds-keypoint			kpxx
-application/vnd.dvb.ait				ait
-# application/vnd.dvb.dvbj
-# application/vnd.dvb.esgcontainer
-# application/vnd.dvb.ipdcdftnotifaccess
-# application/vnd.dvb.ipdcesgaccess
-# application/vnd.dvb.ipdcesgaccess2
-# application/vnd.dvb.ipdcesgpdd
-# application/vnd.dvb.ipdcroaming
-# application/vnd.dvb.iptv.alfec-base
-# application/vnd.dvb.iptv.alfec-enhancement
-# application/vnd.dvb.notif-aggregate-root+xml
-# application/vnd.dvb.notif-container+xml
-# application/vnd.dvb.notif-generic+xml
-# application/vnd.dvb.notif-ia-msglist+xml
-# application/vnd.dvb.notif-ia-registration-request+xml
-# application/vnd.dvb.notif-ia-registration-response+xml
-# application/vnd.dvb.notif-init+xml
-# application/vnd.dvb.pfr
-application/vnd.dvb.service			svc
-# application/vnd.dxr
-application/vnd.dynageo				geo
-# application/vnd.easykaraoke.cdgdownload
-# application/vnd.ecdis-update
-application/vnd.ecowin.chart			mag
-# application/vnd.ecowin.filerequest
-# application/vnd.ecowin.fileupdate
-# application/vnd.ecowin.series
-# application/vnd.ecowin.seriesrequest
-# application/vnd.ecowin.seriesupdate
-# application/vnd.emclient.accessrequest+xml
-application/vnd.enliven				nml
-# application/vnd.eprints.data+xml
-application/vnd.epson.esf			esf
-application/vnd.epson.msf			msf
-application/vnd.epson.quickanime		qam
-application/vnd.epson.salt			slt
-application/vnd.epson.ssf			ssf
-# application/vnd.ericsson.quickcall
-application/vnd.eszigno3+xml			es3 et3
-# application/vnd.etsi.aoc+xml
-# application/vnd.etsi.cug+xml
-# application/vnd.etsi.iptvcommand+xml
-# application/vnd.etsi.iptvdiscovery+xml
-# application/vnd.etsi.iptvprofile+xml
-# application/vnd.etsi.iptvsad-bc+xml
-# application/vnd.etsi.iptvsad-cod+xml
-# application/vnd.etsi.iptvsad-npvr+xml
-# application/vnd.etsi.iptvservice+xml
-# application/vnd.etsi.iptvsync+xml
-# application/vnd.etsi.iptvueprofile+xml
-# application/vnd.etsi.mcid+xml
-# application/vnd.etsi.overload-control-policy-dataset+xml
-# application/vnd.etsi.sci+xml
-# application/vnd.etsi.simservs+xml
-# application/vnd.etsi.tsl+xml
-# application/vnd.etsi.tsl.der
-# application/vnd.eudora.data
-application/vnd.ezpix-album			ez2
-application/vnd.ezpix-package			ez3
-# application/vnd.f-secure.mobile
-application/vnd.fdf				fdf
-application/vnd.fdsn.mseed			mseed
-application/vnd.fdsn.seed			seed dataless
-# application/vnd.ffsns
-# application/vnd.fints
-application/vnd.flographit			gph
-application/vnd.fluxtime.clip			ftc
-# application/vnd.font-fontforge-sfd
-application/vnd.framemaker			fm frame maker book
-application/vnd.frogans.fnc			fnc
-application/vnd.frogans.ltf			ltf
-application/vnd.fsc.weblaunch			fsc
-application/vnd.fujitsu.oasys			oas
-application/vnd.fujitsu.oasys2			oa2
-application/vnd.fujitsu.oasys3			oa3
-application/vnd.fujitsu.oasysgp			fg5
-application/vnd.fujitsu.oasysprs		bh2
-# application/vnd.fujixerox.art-ex
-# application/vnd.fujixerox.art4
-# application/vnd.fujixerox.hbpl
-application/vnd.fujixerox.ddd			ddd
-application/vnd.fujixerox.docuworks		xdw
-application/vnd.fujixerox.docuworks.binder	xbd
-# application/vnd.fut-misnet
-application/vnd.fuzzysheet			fzs
-application/vnd.genomatix.tuxedo		txd
-# application/vnd.geocube+xml
-application/vnd.geogebra.file			ggb
-application/vnd.geogebra.tool			ggt
-application/vnd.geometry-explorer		gex gre
-application/vnd.geonext				gxt
-application/vnd.geoplan				g2w
-application/vnd.geospace			g3w
-# application/vnd.globalplatform.card-content-mgt
-# application/vnd.globalplatform.card-content-mgt-response
-application/vnd.gmx				gmx
-application/vnd.google-earth.kml+xml		kml
-application/vnd.google-earth.kmz		kmz
-application/vnd.grafeq				gqf gqs
-# application/vnd.gridmp
-application/vnd.groove-account			gac
-application/vnd.groove-help			ghf
-application/vnd.groove-identity-message		gim
-application/vnd.groove-injector			grv
-application/vnd.groove-tool-message		gtm
-application/vnd.groove-tool-template		tpl
-application/vnd.groove-vcard			vcg
-# application/vnd.hal+json
-application/vnd.hal+xml				hal
-application/vnd.handheld-entertainment+xml	zmm
-application/vnd.hbci				hbci
-# application/vnd.hcl-bireports
-application/vnd.hhe.lesson-player		les
-application/vnd.hp-hpgl				hpgl
-application/vnd.hp-hpid				hpid
-application/vnd.hp-hps				hps
-application/vnd.hp-jlyt				jlt
-application/vnd.hp-pcl				pcl
-application/vnd.hp-pclxl			pclxl
-# application/vnd.httphone
-application/vnd.hydrostatix.sof-data		sfd-hdstx
-# application/vnd.hzn-3d-crossword
-# application/vnd.ibm.afplinedata
-# application/vnd.ibm.electronic-media
-application/vnd.ibm.minipay			mpy
-application/vnd.ibm.modcap			afp listafp list3820
-application/vnd.ibm.rights-management		irm
-application/vnd.ibm.secure-container		sc
-application/vnd.iccprofile			icc icm
-application/vnd.igloader			igl
-application/vnd.immervision-ivp			ivp
-application/vnd.immervision-ivu			ivu
-# application/vnd.informedcontrol.rms+xml
-# application/vnd.informix-visionary
-# application/vnd.infotech.project
-# application/vnd.infotech.project+xml
-# application/vnd.innopath.wamp.notification
-application/vnd.insors.igm			igm
-application/vnd.intercon.formnet		xpw xpx
-application/vnd.intergeo			i2g
-# application/vnd.intertrust.digibox
-# application/vnd.intertrust.nncp
-application/vnd.intu.qbo			qbo
-application/vnd.intu.qfx			qfx
-# application/vnd.iptc.g2.conceptitem+xml
-# application/vnd.iptc.g2.knowledgeitem+xml
-# application/vnd.iptc.g2.newsitem+xml
-# application/vnd.iptc.g2.newsmessage+xml
-# application/vnd.iptc.g2.packageitem+xml
-# application/vnd.iptc.g2.planningitem+xml
-application/vnd.ipunplugged.rcprofile		rcprofile
-application/vnd.irepository.package+xml		irp
-application/vnd.is-xpr				xpr
-application/vnd.isac.fcs			fcs
-application/vnd.jam				jam
-# application/vnd.japannet-directory-service
-# application/vnd.japannet-jpnstore-wakeup
-# application/vnd.japannet-payment-wakeup
-# application/vnd.japannet-registration
-# application/vnd.japannet-registration-wakeup
-# application/vnd.japannet-setstore-wakeup
-# application/vnd.japannet-verification
-# application/vnd.japannet-verification-wakeup
-application/vnd.jcp.javame.midlet-rms		rms
-application/vnd.jisp				jisp
-application/vnd.joost.joda-archive		joda
-application/vnd.kahootz				ktz ktr
-application/vnd.kde.karbon			karbon
-application/vnd.kde.kchart			chrt
-application/vnd.kde.kformula			kfo
-application/vnd.kde.kivio			flw
-application/vnd.kde.kontour			kon
-application/vnd.kde.kpresenter			kpr kpt
-application/vnd.kde.kspread			ksp
-application/vnd.kde.kword			kwd kwt
-application/vnd.kenameaapp			htke
-application/vnd.kidspiration			kia
-application/vnd.kinar				kne knp
-application/vnd.koan				skp skd skt skm
-application/vnd.kodak-descriptor		sse
-application/vnd.las.las+xml			lasxml
-# application/vnd.liberty-request+xml
-application/vnd.llamagraphics.life-balance.desktop	lbd
-application/vnd.llamagraphics.life-balance.exchange+xml	lbe
-application/vnd.lotus-1-2-3			123
-application/vnd.lotus-approach			apr
-application/vnd.lotus-freelance			pre
-application/vnd.lotus-notes			nsf
-application/vnd.lotus-organizer			org
-application/vnd.lotus-screencam			scm
-application/vnd.lotus-wordpro			lwp
-application/vnd.macports.portpkg		portpkg
-# application/vnd.marlin.drm.actiontoken+xml
-# application/vnd.marlin.drm.conftoken+xml
-# application/vnd.marlin.drm.license+xml
-# application/vnd.marlin.drm.mdcf
-application/vnd.mcd				mcd
-application/vnd.medcalcdata			mc1
-application/vnd.mediastation.cdkey		cdkey
-# application/vnd.meridian-slingshot
-application/vnd.mfer				mwf
-application/vnd.mfmp				mfm
-application/vnd.micrografx.flo			flo
-application/vnd.micrografx.igx			igx
-application/vnd.mif				mif
-# application/vnd.minisoft-hp3000-save
-# application/vnd.mitsubishi.misty-guard.trustweb
-application/vnd.mobius.daf			daf
-application/vnd.mobius.dis			dis
-application/vnd.mobius.mbk			mbk
-application/vnd.mobius.mqy			mqy
-application/vnd.mobius.msl			msl
-application/vnd.mobius.plc			plc
-application/vnd.mobius.txf			txf
-application/vnd.mophun.application		mpn
-application/vnd.mophun.certificate		mpc
-# application/vnd.motorola.flexsuite
-# application/vnd.motorola.flexsuite.adsi
-# application/vnd.motorola.flexsuite.fis
-# application/vnd.motorola.flexsuite.gotap
-# application/vnd.motorola.flexsuite.kmr
-# application/vnd.motorola.flexsuite.ttc
-# application/vnd.motorola.flexsuite.wem
-# application/vnd.motorola.iprm
-application/vnd.mozilla.xul+xml			xul
-application/vnd.ms-artgalry			cil
-# application/vnd.ms-asf
-application/vnd.ms-cab-compressed		cab
-# application/vnd.ms-color.iccprofile
-application/vnd.ms-excel			xls xlm xla xlc xlt xlw
-application/vnd.ms-excel.addin.macroenabled.12		xlam
-application/vnd.ms-excel.sheet.binary.macroenabled.12	xlsb
-application/vnd.ms-excel.sheet.macroenabled.12		xlsm
-application/vnd.ms-excel.template.macroenabled.12	xltm
-application/vnd.ms-fontobject			eot
-application/vnd.ms-htmlhelp			chm
-application/vnd.ms-ims				ims
-application/vnd.ms-lrm				lrm
-# application/vnd.ms-office.activex+xml
-application/vnd.ms-officetheme			thmx
-# application/vnd.ms-opentype
-# application/vnd.ms-package.obfuscated-opentype
-application/vnd.ms-pki.seccat			cat
-application/vnd.ms-pki.stl			stl
-# application/vnd.ms-playready.initiator+xml
-application/vnd.ms-powerpoint			ppt pps pot
-application/vnd.ms-powerpoint.addin.macroenabled.12		ppam
-application/vnd.ms-powerpoint.presentation.macroenabled.12	pptm
-application/vnd.ms-powerpoint.slide.macroenabled.12		sldm
-application/vnd.ms-powerpoint.slideshow.macroenabled.12		ppsm
-application/vnd.ms-powerpoint.template.macroenabled.12		potm
-# application/vnd.ms-printing.printticket+xml
-application/vnd.ms-project			mpp mpt
-# application/vnd.ms-tnef
-# application/vnd.ms-wmdrm.lic-chlg-req
-# application/vnd.ms-wmdrm.lic-resp
-# application/vnd.ms-wmdrm.meter-chlg-req
-# application/vnd.ms-wmdrm.meter-resp
-application/vnd.ms-word.document.macroenabled.12	docm
-application/vnd.ms-word.template.macroenabled.12	dotm
-application/vnd.ms-works			wps wks wcm wdb
-application/vnd.ms-wpl				wpl
-application/vnd.ms-xpsdocument			xps
-application/vnd.mseq				mseq
-# application/vnd.msign
-# application/vnd.multiad.creator
-# application/vnd.multiad.creator.cif
-# application/vnd.music-niff
-application/vnd.musician			mus
-application/vnd.muvee.style			msty
-application/vnd.mynfc				taglet
-# application/vnd.ncd.control
-# application/vnd.ncd.reference
-# application/vnd.nervana
-# application/vnd.netfpx
-application/vnd.neurolanguage.nlu		nlu
-application/vnd.nitf				ntf nitf
-application/vnd.noblenet-directory		nnd
-application/vnd.noblenet-sealer			nns
-application/vnd.noblenet-web			nnw
-# application/vnd.nokia.catalogs
-# application/vnd.nokia.conml+wbxml
-# application/vnd.nokia.conml+xml
-# application/vnd.nokia.isds-radio-presets
-# application/vnd.nokia.iptv.config+xml
-# application/vnd.nokia.landmark+wbxml
-# application/vnd.nokia.landmark+xml
-# application/vnd.nokia.landmarkcollection+xml
-# application/vnd.nokia.n-gage.ac+xml
-application/vnd.nokia.n-gage.data		ngdat
-application/vnd.nokia.n-gage.symbian.install	n-gage
-# application/vnd.nokia.ncd
-# application/vnd.nokia.pcd+wbxml
-# application/vnd.nokia.pcd+xml
-application/vnd.nokia.radio-preset		rpst
-application/vnd.nokia.radio-presets		rpss
-application/vnd.novadigm.edm			edm
-application/vnd.novadigm.edx			edx
-application/vnd.novadigm.ext			ext
-# application/vnd.ntt-local.file-transfer
-# application/vnd.ntt-local.sip-ta_remote
-# application/vnd.ntt-local.sip-ta_tcp_stream
-application/vnd.oasis.opendocument.chart		odc
-application/vnd.oasis.opendocument.chart-template	otc
-application/vnd.oasis.opendocument.database		odb
-application/vnd.oasis.opendocument.formula		odf
-application/vnd.oasis.opendocument.formula-template	odft
-application/vnd.oasis.opendocument.graphics		odg
-application/vnd.oasis.opendocument.graphics-template	otg
-application/vnd.oasis.opendocument.image		odi
-application/vnd.oasis.opendocument.image-template	oti
-application/vnd.oasis.opendocument.presentation		odp
-application/vnd.oasis.opendocument.presentation-template	otp
-application/vnd.oasis.opendocument.spreadsheet		ods
-application/vnd.oasis.opendocument.spreadsheet-template	ots
-application/vnd.oasis.opendocument.text			odt
-application/vnd.oasis.opendocument.text-master		odm
-application/vnd.oasis.opendocument.text-template	ott
-application/vnd.oasis.opendocument.text-web		oth
-# application/vnd.obn
-# application/vnd.oftn.l10n+json
-# application/vnd.oipf.contentaccessdownload+xml
-# application/vnd.oipf.contentaccessstreaming+xml
-# application/vnd.oipf.cspg-hexbinary
-# application/vnd.oipf.dae.svg+xml
-# application/vnd.oipf.dae.xhtml+xml
-# application/vnd.oipf.mippvcontrolmessage+xml
-# application/vnd.oipf.pae.gem
-# application/vnd.oipf.spdiscovery+xml
-# application/vnd.oipf.spdlist+xml
-# application/vnd.oipf.ueprofile+xml
-# application/vnd.oipf.userprofile+xml
-application/vnd.olpc-sugar			xo
-# application/vnd.oma-scws-config
-# application/vnd.oma-scws-http-request
-# application/vnd.oma-scws-http-response
-# application/vnd.oma.bcast.associated-procedure-parameter+xml
-# application/vnd.oma.bcast.drm-trigger+xml
-# application/vnd.oma.bcast.imd+xml
-# application/vnd.oma.bcast.ltkm
-# application/vnd.oma.bcast.notification+xml
-# application/vnd.oma.bcast.provisioningtrigger
-# application/vnd.oma.bcast.sgboot
-# application/vnd.oma.bcast.sgdd+xml
-# application/vnd.oma.bcast.sgdu
-# application/vnd.oma.bcast.simple-symbol-container
-# application/vnd.oma.bcast.smartcard-trigger+xml
-# application/vnd.oma.bcast.sprov+xml
-# application/vnd.oma.bcast.stkm
-# application/vnd.oma.cab-address-book+xml
-# application/vnd.oma.cab-feature-handler+xml
-# application/vnd.oma.cab-pcc+xml
-# application/vnd.oma.cab-user-prefs+xml
-# application/vnd.oma.dcd
-# application/vnd.oma.dcdc
-application/vnd.oma.dd2+xml			dd2
-# application/vnd.oma.drm.risd+xml
-# application/vnd.oma.group-usage-list+xml
-# application/vnd.oma.pal+xml
-# application/vnd.oma.poc.detailed-progress-report+xml
-# application/vnd.oma.poc.final-report+xml
-# application/vnd.oma.poc.groups+xml
-# application/vnd.oma.poc.invocation-descriptor+xml
-# application/vnd.oma.poc.optimized-progress-report+xml
-# application/vnd.oma.push
-# application/vnd.oma.scidm.messages+xml
-# application/vnd.oma.xcap-directory+xml
-# application/vnd.omads-email+xml
-# application/vnd.omads-file+xml
-# application/vnd.omads-folder+xml
-# application/vnd.omaloc-supl-init
-application/vnd.openofficeorg.extension		oxt
-# application/vnd.openxmlformats-officedocument.custom-properties+xml
-# application/vnd.openxmlformats-officedocument.customxmlproperties+xml
-# application/vnd.openxmlformats-officedocument.drawing+xml
-# application/vnd.openxmlformats-officedocument.drawingml.chart+xml
-# application/vnd.openxmlformats-officedocument.drawingml.chartshapes+xml
-# application/vnd.openxmlformats-officedocument.drawingml.diagramcolors+xml
-# application/vnd.openxmlformats-officedocument.drawingml.diagramdata+xml
-# application/vnd.openxmlformats-officedocument.drawingml.diagramlayout+xml
-# application/vnd.openxmlformats-officedocument.drawingml.diagramstyle+xml
-# application/vnd.openxmlformats-officedocument.extended-properties+xml
-# application/vnd.openxmlformats-officedocument.presentationml.commentauthors+xml
-# application/vnd.openxmlformats-officedocument.presentationml.comments+xml
-# application/vnd.openxmlformats-officedocument.presentationml.handoutmaster+xml
-# application/vnd.openxmlformats-officedocument.presentationml.notesmaster+xml
-# application/vnd.openxmlformats-officedocument.presentationml.notesslide+xml
-application/vnd.openxmlformats-officedocument.presentationml.presentation	pptx
-# application/vnd.openxmlformats-officedocument.presentationml.presentation.main+xml
-# application/vnd.openxmlformats-officedocument.presentationml.presprops+xml
-application/vnd.openxmlformats-officedocument.presentationml.slide	sldx
-# application/vnd.openxmlformats-officedocument.presentationml.slide+xml
-# application/vnd.openxmlformats-officedocument.presentationml.slidelayout+xml
-# application/vnd.openxmlformats-officedocument.presentationml.slidemaster+xml
-application/vnd.openxmlformats-officedocument.presentationml.slideshow	ppsx
-# application/vnd.openxmlformats-officedocument.presentationml.slideshow.main+xml
-# application/vnd.openxmlformats-officedocument.presentationml.slideupdateinfo+xml
-# application/vnd.openxmlformats-officedocument.presentationml.tablestyles+xml
-# application/vnd.openxmlformats-officedocument.presentationml.tags+xml
-application/vnd.openxmlformats-officedocument.presentationml.template	potx
-# application/vnd.openxmlformats-officedocument.presentationml.template.main+xml
-# application/vnd.openxmlformats-officedocument.presentationml.viewprops+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.calcchain+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.chartsheet+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.comments+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.connections+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.dialogsheet+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.externallink+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.pivotcachedefinition+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.pivotcacherecords+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.pivottable+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.querytable+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.revisionheaders+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.revisionlog+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.sharedstrings+xml
-application/vnd.openxmlformats-officedocument.spreadsheetml.sheet	xlsx
-# application/vnd.openxmlformats-officedocument.spreadsheetml.sheet.main+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.sheetmetadata+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.styles+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.table+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.tablesinglecells+xml
-application/vnd.openxmlformats-officedocument.spreadsheetml.template	xltx
-# application/vnd.openxmlformats-officedocument.spreadsheetml.template.main+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.usernames+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.volatiledependencies+xml
-# application/vnd.openxmlformats-officedocument.spreadsheetml.worksheet+xml
-# application/vnd.openxmlformats-officedocument.theme+xml
-# application/vnd.openxmlformats-officedocument.themeoverride+xml
-# application/vnd.openxmlformats-officedocument.vmldrawing
-# application/vnd.openxmlformats-officedocument.wordprocessingml.comments+xml
-application/vnd.openxmlformats-officedocument.wordprocessingml.document	docx
-# application/vnd.openxmlformats-officedocument.wordprocessingml.document.glossary+xml
-# application/vnd.openxmlformats-officedocument.wordprocessingml.document.main+xml
-# application/vnd.openxmlformats-officedocument.wordprocessingml.endnotes+xml
-# application/vnd.openxmlformats-officedocument.wordprocessingml.fonttable+xml
-# application/vnd.openxmlformats-officedocument.wordprocessingml.footer+xml
-# application/vnd.openxmlformats-officedocument.wordprocessingml.footnotes+xml
-# application/vnd.openxmlformats-officedocument.wordprocessingml.numbering+xml
-# application/vnd.openxmlformats-officedocument.wordprocessingml.settings+xml
-# application/vnd.openxmlformats-officedocument.wordprocessingml.styles+xml
-application/vnd.openxmlformats-officedocument.wordprocessingml.template	dotx
-# application/vnd.openxmlformats-officedocument.wordprocessingml.template.main+xml
-# application/vnd.openxmlformats-officedocument.wordprocessingml.websettings+xml
-# application/vnd.openxmlformats-package.core-properties+xml
-# application/vnd.openxmlformats-package.digital-signature-xmlsignature+xml
-# application/vnd.openxmlformats-package.relationships+xml
-# application/vnd.quobject-quoxdocument
-# application/vnd.osa.netdeploy
-application/vnd.osgeo.mapguide.package		mgp
-# application/vnd.osgi.bundle
-application/vnd.osgi.dp				dp
-application/vnd.osgi.subsystem			esa
-# application/vnd.otps.ct-kip+xml
-application/vnd.palm				pdb pqa oprc
-# application/vnd.paos.xml
-application/vnd.pawaafile			paw
-application/vnd.pg.format			str
-application/vnd.pg.osasli			ei6
-# application/vnd.piaccess.application-licence
-application/vnd.picsel				efif
-application/vnd.pmi.widget			wg
-# application/vnd.poc.group-advertisement+xml
-application/vnd.pocketlearn			plf
-application/vnd.powerbuilder6			pbd
-# application/vnd.powerbuilder6-s
-# application/vnd.powerbuilder7
-# application/vnd.powerbuilder7-s
-# application/vnd.powerbuilder75
-# application/vnd.powerbuilder75-s
-# application/vnd.preminet
-application/vnd.previewsystems.box		box
-application/vnd.proteus.magazine		mgz
-application/vnd.publishare-delta-tree		qps
-application/vnd.pvi.ptid1			ptid
-# application/vnd.pwg-multiplexed
-# application/vnd.pwg-xhtml-print+xml
-# application/vnd.qualcomm.brew-app-res
-application/vnd.quark.quarkxpress		qxd qxt qwd qwt qxl qxb
-# application/vnd.radisys.moml+xml
-# application/vnd.radisys.msml+xml
-# application/vnd.radisys.msml-audit+xml
-# application/vnd.radisys.msml-audit-conf+xml
-# application/vnd.radisys.msml-audit-conn+xml
-# application/vnd.radisys.msml-audit-dialog+xml
-# application/vnd.radisys.msml-audit-stream+xml
-# application/vnd.radisys.msml-conf+xml
-# application/vnd.radisys.msml-dialog+xml
-# application/vnd.radisys.msml-dialog-base+xml
-# application/vnd.radisys.msml-dialog-fax-detect+xml
-# application/vnd.radisys.msml-dialog-fax-sendrecv+xml
-# application/vnd.radisys.msml-dialog-group+xml
-# application/vnd.radisys.msml-dialog-speech+xml
-# application/vnd.radisys.msml-dialog-transform+xml
-# application/vnd.rainstor.data
-# application/vnd.rapid
-application/vnd.realvnc.bed			bed
-application/vnd.recordare.musicxml		mxl
-application/vnd.recordare.musicxml+xml		musicxml
-# application/vnd.renlearn.rlprint
-application/vnd.rig.cryptonote			cryptonote
-application/vnd.rim.cod				cod
-application/vnd.rn-realmedia			rm
-application/vnd.rn-realmedia-vbr		rmvb
-application/vnd.route66.link66+xml		link66
-# application/vnd.rs-274x
-# application/vnd.ruckus.download
-# application/vnd.s3sms
-application/vnd.sailingtracker.track		st
-# application/vnd.sbm.cid
-# application/vnd.sbm.mid2
-# application/vnd.scribus
-# application/vnd.sealed.3df
-# application/vnd.sealed.csf
-# application/vnd.sealed.doc
-# application/vnd.sealed.eml
-# application/vnd.sealed.mht
-# application/vnd.sealed.net
-# application/vnd.sealed.ppt
-# application/vnd.sealed.tiff
-# application/vnd.sealed.xls
-# application/vnd.sealedmedia.softseal.html
-# application/vnd.sealedmedia.softseal.pdf
-application/vnd.seemail				see
-application/vnd.sema				sema
-application/vnd.semd				semd
-application/vnd.semf				semf
-application/vnd.shana.informed.formdata		ifm
-application/vnd.shana.informed.formtemplate	itp
-application/vnd.shana.informed.interchange	iif
-application/vnd.shana.informed.package		ipk
-application/vnd.simtech-mindmapper		twd twds
-application/vnd.smaf				mmf
-# application/vnd.smart.notebook
-application/vnd.smart.teacher			teacher
-# application/vnd.software602.filler.form+xml
-# application/vnd.software602.filler.form-xml-zip
-application/vnd.solent.sdkm+xml			sdkm sdkd
-application/vnd.spotfire.dxp			dxp
-application/vnd.spotfire.sfs			sfs
-# application/vnd.sss-cod
-# application/vnd.sss-dtf
-# application/vnd.sss-ntf
-application/vnd.stardivision.calc		sdc
-application/vnd.stardivision.draw		sda
-application/vnd.stardivision.impress		sdd
-application/vnd.stardivision.math		smf
-application/vnd.stardivision.writer		sdw vor
-application/vnd.stardivision.writer-global	sgl
-application/vnd.stepmania.package		smzip
-application/vnd.stepmania.stepchart		sm
-# application/vnd.street-stream
-application/vnd.sun.xml.calc			sxc
-application/vnd.sun.xml.calc.template		stc
-application/vnd.sun.xml.draw			sxd
-application/vnd.sun.xml.draw.template		std
-application/vnd.sun.xml.impress			sxi
-application/vnd.sun.xml.impress.template	sti
-application/vnd.sun.xml.math			sxm
-application/vnd.sun.xml.writer			sxw
-application/vnd.sun.xml.writer.global		sxg
-application/vnd.sun.xml.writer.template		stw
-# application/vnd.sun.wadl+xml
-application/vnd.sus-calendar			sus susp
-application/vnd.svd				svd
-# application/vnd.swiftview-ics
-application/vnd.symbian.install			sis sisx
-application/vnd.syncml+xml			xsm
-application/vnd.syncml.dm+wbxml			bdm
-application/vnd.syncml.dm+xml			xdm
-# application/vnd.syncml.dm.notification
-# application/vnd.syncml.ds.notification
-application/vnd.tao.intent-module-archive	tao
-application/vnd.tcpdump.pcap			pcap cap dmp
-application/vnd.tmobile-livetv			tmo
-application/vnd.trid.tpt			tpt
-application/vnd.triscape.mxs			mxs
-application/vnd.trueapp				tra
-# application/vnd.truedoc
-# application/vnd.ubisoft.webplayer
-application/vnd.ufdl				ufd ufdl
-application/vnd.uiq.theme			utz
-application/vnd.umajin				umj
-application/vnd.unity				unityweb
-application/vnd.uoml+xml			uoml
-# application/vnd.uplanet.alert
-# application/vnd.uplanet.alert-wbxml
-# application/vnd.uplanet.bearer-choice
-# application/vnd.uplanet.bearer-choice-wbxml
-# application/vnd.uplanet.cacheop
-# application/vnd.uplanet.cacheop-wbxml
-# application/vnd.uplanet.channel
-# application/vnd.uplanet.channel-wbxml
-# application/vnd.uplanet.list
-# application/vnd.uplanet.list-wbxml
-# application/vnd.uplanet.listcmd
-# application/vnd.uplanet.listcmd-wbxml
-# application/vnd.uplanet.signal
-application/vnd.vcx				vcx
-# application/vnd.vd-study
-# application/vnd.vectorworks
-# application/vnd.verimatrix.vcas
-# application/vnd.vidsoft.vidconference
-application/vnd.visio				vsd vst vss vsw
-application/vnd.visionary			vis
-# application/vnd.vividence.scriptfile
-application/vnd.vsf				vsf
-# application/vnd.wap.sic
-# application/vnd.wap.slc
-application/vnd.wap.wbxml			wbxml
-application/vnd.wap.wmlc			wmlc
-application/vnd.wap.wmlscriptc			wmlsc
-application/vnd.webturbo			wtb
-# application/vnd.wfa.wsc
-# application/vnd.wmc
-# application/vnd.wmf.bootstrap
-# application/vnd.wolfram.mathematica
-# application/vnd.wolfram.mathematica.package
-application/vnd.wolfram.player			nbp
-application/vnd.wordperfect			wpd
-application/vnd.wqd				wqd
-# application/vnd.wrq-hp3000-labelled
-application/vnd.wt.stf				stf
-# application/vnd.wv.csp+wbxml
-# application/vnd.wv.csp+xml
-# application/vnd.wv.ssp+xml
-application/vnd.xara				xar
-application/vnd.xfdl				xfdl
-# application/vnd.xfdl.webform
-# application/vnd.xmi+xml
-# application/vnd.xmpie.cpkg
-# application/vnd.xmpie.dpkg
-# application/vnd.xmpie.plan
-# application/vnd.xmpie.ppkg
-# application/vnd.xmpie.xlim
-application/vnd.yamaha.hv-dic			hvd
-application/vnd.yamaha.hv-script		hvs
-application/vnd.yamaha.hv-voice			hvp
-application/vnd.yamaha.openscoreformat			osf
-application/vnd.yamaha.openscoreformat.osfpvg+xml	osfpvg
-# application/vnd.yamaha.remote-setup
-application/vnd.yamaha.smaf-audio		saf
-application/vnd.yamaha.smaf-phrase		spf
-# application/vnd.yamaha.through-ngn
-# application/vnd.yamaha.tunnel-udpencap
-application/vnd.yellowriver-custom-menu		cmp
-application/vnd.zul				zir zirz
-application/vnd.zzazz.deck+xml			zaz
-application/voicexml+xml			vxml
-# application/vq-rtcpxr
-# application/watcherinfo+xml
-# application/whoispp-query
-# application/whoispp-response
-application/widget				wgt
-application/winhlp				hlp
-# application/wita
-# application/wordperfect5.1
-application/wsdl+xml				wsdl
-application/wspolicy+xml			wspolicy
-application/x-7z-compressed			7z
-application/x-abiword				abw
-application/x-ace-compressed			ace
-# application/x-amf
-application/x-apple-diskimage			dmg
-application/x-authorware-bin			aab x32 u32 vox
-application/x-authorware-map			aam
-application/x-authorware-seg			aas
-application/x-bcpio				bcpio
-application/x-bittorrent			torrent
-application/x-blorb				blb blorb
-application/x-bzip				bz
-application/x-bzip2				bz2 boz
-application/x-cbr				cbr cba cbt cbz cb7
-application/x-cdlink				vcd
-application/x-cfs-compressed			cfs
-application/x-chat				chat
-application/x-chess-pgn				pgn
-application/x-conference			nsc
-# application/x-compress
-application/x-cpio				cpio
-application/x-csh				csh
-application/x-debian-package			deb udeb
-application/x-dgc-compressed			dgc
-application/x-director			dir dcr dxr cst cct cxt w3d fgd swa
-application/x-doom				wad
-application/x-dtbncx+xml			ncx
-application/x-dtbook+xml			dtb
-application/x-dtbresource+xml			res
-application/x-dvi				dvi
-application/x-envoy				evy
-application/x-eva				eva
-application/x-font-bdf				bdf
-# application/x-font-dos
-# application/x-font-framemaker
-application/x-font-ghostscript			gsf
-# application/x-font-libgrx
-application/x-font-linux-psf			psf
-application/x-font-otf				otf
-application/x-font-pcf				pcf
-application/x-font-snf				snf
-# application/x-font-speedo
-# application/x-font-sunos-news
-application/x-font-ttf				ttf ttc
-application/x-font-type1			pfa pfb pfm afm
-application/font-woff				woff
-# application/x-font-vfont
-application/x-freearc				arc
-application/x-futuresplash			spl
-application/x-gca-compressed			gca
-application/x-glulx				ulx
-application/x-gnumeric				gnumeric
-application/x-gramps-xml			gramps
-application/x-gtar				gtar
-# application/x-gzip
-application/x-hdf				hdf
-application/x-install-instructions		install
-application/x-iso9660-image			iso
-application/x-java-jnlp-file			jnlp
-application/x-latex				latex
-application/x-lzh-compressed			lzh lha
-application/x-mie				mie
-application/x-mobipocket-ebook			prc mobi
-application/x-ms-application			application
-application/x-ms-shortcut			lnk
-application/x-ms-wmd				wmd
-application/x-ms-wmz				wmz
-application/x-ms-xbap				xbap
-application/x-msaccess				mdb
-application/x-msbinder				obd
-application/x-mscardfile			crd
-application/x-msclip				clp
-application/x-msdownload			exe dll com bat msi
-application/x-msmediaview			mvb m13 m14
-application/x-msmetafile			wmf wmz emf emz
-application/x-msmoney				mny
-application/x-mspublisher			pub
-application/x-msschedule			scd
-application/x-msterminal			trm
-application/x-mswrite				wri
-application/x-netcdf				nc cdf
-application/x-nzb				nzb
-application/x-pkcs12				p12 pfx
-application/x-pkcs7-certificates		p7b spc
-application/x-pkcs7-certreqresp			p7r
-application/x-rar-compressed			rar
-application/x-research-info-systems		ris
-application/x-sh				sh
-application/x-shar				shar
-application/x-shockwave-flash			swf
-application/x-silverlight-app			xap
-application/x-sql				sql
-application/x-stuffit				sit
-application/x-stuffitx				sitx
-application/x-subrip				srt
-application/x-sv4cpio				sv4cpio
-application/x-sv4crc				sv4crc
-application/x-t3vm-image			t3
-application/x-tads				gam
-application/x-tar				tar
-application/x-tcl				tcl
-application/x-tex				tex
-application/x-tex-tfm				tfm
-application/x-texinfo				texinfo texi
-application/x-tgif				obj
-application/x-ustar				ustar
-application/x-wais-source			src
-application/x-x509-ca-cert			der crt
-application/x-xfig				fig
-application/x-xliff+xml				xlf
-application/x-xpinstall				xpi
-application/x-xz				xz
-application/x-zmachine				z1 z2 z3 z4 z5 z6 z7 z8
-# application/x400-bp
-application/xaml+xml				xaml
-# application/xcap-att+xml
-# application/xcap-caps+xml
-application/xcap-diff+xml			xdf
-# application/xcap-el+xml
-# application/xcap-error+xml
-# application/xcap-ns+xml
-# application/xcon-conference-info-diff+xml
-# application/xcon-conference-info+xml
-application/xenc+xml				xenc
-application/xhtml+xml				xhtml xht
-# application/xhtml-voice+xml
-application/xml					xml xsl
-application/xml-dtd				dtd
-# application/xml-external-parsed-entity
-# application/xmpp+xml
-application/xop+xml				xop
-application/xproc+xml				xpl
-application/xslt+xml				xslt
-application/xspf+xml				xspf
-application/xv+xml				mxml xhvml xvml xvm
-application/yang				yang
-application/yin+xml				yin
-application/zip					zip
-# audio/1d-interleaved-parityfec
-# audio/32kadpcm
-# audio/3gpp
-# audio/3gpp2
-# audio/ac3
-audio/adpcm					adp
-# audio/amr
-# audio/amr-wb
-# audio/amr-wb+
-# audio/asc
-# audio/atrac-advanced-lossless
-# audio/atrac-x
-# audio/atrac3
-audio/basic					au snd
-# audio/bv16
-# audio/bv32
-# audio/clearmode
-# audio/cn
-# audio/dat12
-# audio/dls
-# audio/dsr-es201108
-# audio/dsr-es202050
-# audio/dsr-es202211
-# audio/dsr-es202212
-# audio/dv
-# audio/dvi4
-# audio/eac3
-# audio/evrc
-# audio/evrc-qcp
-# audio/evrc0
-# audio/evrc1
-# audio/evrcb
-# audio/evrcb0
-# audio/evrcb1
-# audio/evrcwb
-# audio/evrcwb0
-# audio/evrcwb1
-# audio/example
-# audio/fwdred
-# audio/g719
-# audio/g722
-# audio/g7221
-# audio/g723
-# audio/g726-16
-# audio/g726-24
-# audio/g726-32
-# audio/g726-40
-# audio/g728
-# audio/g729
-# audio/g7291
-# audio/g729d
-# audio/g729e
-# audio/gsm
-# audio/gsm-efr
-# audio/gsm-hr-08
-# audio/ilbc
-# audio/ip-mr_v2.5
-# audio/isac
-# audio/l16
-# audio/l20
-# audio/l24
-# audio/l8
-# audio/lpc
-audio/midi					mid midi kar rmi
-# audio/mobile-xmf
-audio/mp4					mp4a
-# audio/mp4a-latm
-# audio/mpa
-# audio/mpa-robust
-audio/mpeg					mpga mp2 mp2a mp3 m2a m3a
-# audio/mpeg4-generic
-# audio/musepack
-audio/ogg					oga ogg spx
-# audio/opus
-# audio/parityfec
-# audio/pcma
-# audio/pcma-wb
-# audio/pcmu-wb
-# audio/pcmu
-# audio/prs.sid
-# audio/qcelp
-# audio/red
-# audio/rtp-enc-aescm128
-# audio/rtp-midi
-# audio/rtx
-audio/s3m					s3m
-audio/silk					sil
-# audio/smv
-# audio/smv0
-# audio/smv-qcp
-# audio/sp-midi
-# audio/speex
-# audio/t140c
-# audio/t38
-# audio/telephone-event
-# audio/tone
-# audio/uemclip
-# audio/ulpfec
-# audio/vdvi
-# audio/vmr-wb
-# audio/vnd.3gpp.iufp
-# audio/vnd.4sb
-# audio/vnd.audiokoz
-# audio/vnd.celp
-# audio/vnd.cisco.nse
-# audio/vnd.cmles.radio-events
-# audio/vnd.cns.anp1
-# audio/vnd.cns.inf1
-audio/vnd.dece.audio				uva uvva
-audio/vnd.digital-winds				eol
-# audio/vnd.dlna.adts
-# audio/vnd.dolby.heaac.1
-# audio/vnd.dolby.heaac.2
-# audio/vnd.dolby.mlp
-# audio/vnd.dolby.mps
-# audio/vnd.dolby.pl2
-# audio/vnd.dolby.pl2x
-# audio/vnd.dolby.pl2z
-# audio/vnd.dolby.pulse.1
-audio/vnd.dra					dra
-audio/vnd.dts					dts
-audio/vnd.dts.hd				dtshd
-# audio/vnd.dvb.file
-# audio/vnd.everad.plj
-# audio/vnd.hns.audio
-audio/vnd.lucent.voice				lvp
-audio/vnd.ms-playready.media.pya		pya
-# audio/vnd.nokia.mobile-xmf
-# audio/vnd.nortel.vbk
-audio/vnd.nuera.ecelp4800			ecelp4800
-audio/vnd.nuera.ecelp7470			ecelp7470
-audio/vnd.nuera.ecelp9600			ecelp9600
-# audio/vnd.octel.sbc
-# audio/vnd.qcelp
-# audio/vnd.rhetorex.32kadpcm
-audio/vnd.rip					rip
-# audio/vnd.sealedmedia.softseal.mpeg
-# audio/vnd.vmx.cvsd
-# audio/vorbis
-# audio/vorbis-config
-audio/webm					weba
-audio/x-aac					aac
-audio/x-aiff					aif aiff aifc
-audio/x-caf					caf
-audio/x-flac					flac
-audio/x-matroska				mka
-audio/x-mpegurl					m3u
-audio/x-ms-wax					wax
-audio/x-ms-wma					wma
-audio/x-pn-realaudio				ram ra
-audio/x-pn-realaudio-plugin			rmp
-# audio/x-tta
-audio/x-wav					wav
-audio/xm					xm
-chemical/x-cdx					cdx
-chemical/x-cif					cif
-chemical/x-cmdf					cmdf
-chemical/x-cml					cml
-chemical/x-csml					csml
-# chemical/x-pdb
-chemical/x-xyz					xyz
-image/bmp					bmp
-image/cgm					cgm
-# image/example
-# image/fits
-image/g3fax					g3
-image/gif					gif
-image/ief					ief
-# image/jp2
-image/jpeg					jpeg jpg jpe
-# image/jpm
-# image/jpx
-image/ktx					ktx
-# image/naplps
-image/png					png
-image/prs.btif					btif
-# image/prs.pti
-image/sgi					sgi
-image/svg+xml					svg svgz
-# image/t38
-image/tiff					tiff tif
-# image/tiff-fx
-image/vnd.adobe.photoshop			psd
-# image/vnd.cns.inf2
-image/vnd.dece.graphic				uvi uvvi uvg uvvg
-image/vnd.dvb.subtitle				sub
-image/vnd.djvu					djvu djv
-image/vnd.dwg					dwg
-image/vnd.dxf					dxf
-image/vnd.fastbidsheet				fbs
-image/vnd.fpx					fpx
-image/vnd.fst					fst
-image/vnd.fujixerox.edmics-mmr			mmr
-image/vnd.fujixerox.edmics-rlc			rlc
-# image/vnd.globalgraphics.pgb
-# image/vnd.microsoft.icon
-# image/vnd.mix
-image/vnd.ms-modi				mdi
-image/vnd.ms-photo				wdp
-image/vnd.net-fpx				npx
-# image/vnd.radiance
-# image/vnd.sealed.png
-# image/vnd.sealedmedia.softseal.gif
-# image/vnd.sealedmedia.softseal.jpg
-# image/vnd.svf
-image/vnd.wap.wbmp				wbmp
-image/vnd.xiff					xif
-image/webp					webp
-image/x-3ds					3ds
-image/x-cmu-raster				ras
-image/x-cmx					cmx
-image/x-freehand				fh fhc fh4 fh5 fh7
-image/x-icon					ico
-image/x-mrsid-image				sid
-image/x-pcx					pcx
-image/x-pict					pic pct
-image/x-portable-anymap				pnm
-image/x-portable-bitmap				pbm
-image/x-portable-graymap			pgm
-image/x-portable-pixmap				ppm
-image/x-rgb					rgb
-image/x-tga					tga
-image/x-xbitmap					xbm
-image/x-xpixmap					xpm
-image/x-xwindowdump				xwd
-# message/cpim
-# message/delivery-status
-# message/disposition-notification
-# message/example
-# message/external-body
-# message/feedback-report
-# message/global
-# message/global-delivery-status
-# message/global-disposition-notification
-# message/global-headers
-# message/http
-# message/imdn+xml
-# message/news
-# message/partial
-message/rfc822					eml mime
-# message/s-http
-# message/sip
-# message/sipfrag
-# message/tracking-status
-# message/vnd.si.simp
-# model/example
-model/iges					igs iges
-model/mesh					msh mesh silo
-model/vnd.collada+xml				dae
-model/vnd.dwf					dwf
-# model/vnd.flatland.3dml
-model/vnd.gdl					gdl
-# model/vnd.gs-gdl
-# model/vnd.gs.gdl
-model/vnd.gtw					gtw
-# model/vnd.moml+xml
-model/vnd.mts					mts
-# model/vnd.parasolid.transmit.binary
-# model/vnd.parasolid.transmit.text
-model/vnd.vtu					vtu
-model/vrml					wrl vrml
-model/x3d+binary				x3db x3dbz
-model/x3d+vrml					x3dv x3dvz
-model/x3d+xml					x3d x3dz
-# multipart/alternative
-# multipart/appledouble
-# multipart/byteranges
-# multipart/digest
-# multipart/encrypted
-# multipart/example
-# multipart/form-data
-# multipart/header-set
-# multipart/mixed
-# multipart/parallel
-# multipart/related
-# multipart/report
-# multipart/signed
-# multipart/voice-message
-# text/1d-interleaved-parityfec
-text/cache-manifest				appcache
-text/calendar					ics ifb
-text/css					css
-text/csv					csv
-# text/directory
-# text/dns
-# text/ecmascript
-# text/enriched
-# text/example
-# text/fwdred
-text/html					html htm
-# text/javascript
-text/n3						n3
-# text/parityfec
-text/plain					txt text conf def list log in
-# text/prs.fallenstein.rst
-text/prs.lines.tag				dsc
-# text/vnd.radisys.msml-basic-layout
-# text/red
-# text/rfc822-headers
-text/richtext					rtx
-# text/rtf
-# text/rtp-enc-aescm128
-# text/rtx
-text/sgml					sgml sgm
-# text/t140
-text/tab-separated-values			tsv
-text/troff					t tr roff man me ms
-text/turtle					ttl
-# text/ulpfec
-text/uri-list					uri uris urls
-text/vcard					vcard
-# text/vnd.abc
-text/vnd.curl					curl
-text/vnd.curl.dcurl				dcurl
-text/vnd.curl.scurl				scurl
-text/vnd.curl.mcurl				mcurl
-# text/vnd.dmclientscript
-text/vnd.dvb.subtitle				sub
-# text/vnd.esmertec.theme-descriptor
-text/vnd.fly					fly
-text/vnd.fmi.flexstor				flx
-text/vnd.graphviz				gv
-text/vnd.in3d.3dml				3dml
-text/vnd.in3d.spot				spot
-# text/vnd.iptc.newsml
-# text/vnd.iptc.nitf
-# text/vnd.latex-z
-# text/vnd.motorola.reflex
-# text/vnd.ms-mediapackage
-# text/vnd.net2phone.commcenter.command
-# text/vnd.si.uricatalogue
-text/vnd.sun.j2me.app-descriptor		jad
-# text/vnd.trolltech.linguist
-# text/vnd.wap.si
-# text/vnd.wap.sl
-text/vnd.wap.wml				wml
-text/vnd.wap.wmlscript				wmls
-text/x-asm					s asm
-text/x-c					c cc cxx cpp h hh dic
-text/x-fortran					f for f77 f90
-text/x-java-source				java
-text/x-opml					opml
-text/x-pascal					p pas
-text/x-nfo					nfo
-text/x-setext					etx
-text/x-sfv					sfv
-text/x-uuencode					uu
-text/x-vcalendar				vcs
-text/x-vcard					vcf
-# text/xml
-# text/xml-external-parsed-entity
-# video/1d-interleaved-parityfec
-video/3gpp					3gp
-# video/3gpp-tt
-video/3gpp2					3g2
-# video/bmpeg
-# video/bt656
-# video/celb
-# video/dv
-# video/example
-video/h261					h261
-video/h263					h263
-# video/h263-1998
-# video/h263-2000
-video/h264					h264
-# video/h264-rcdo
-# video/h264-svc
-video/jpeg					jpgv
-# video/jpeg2000
-video/jpm					jpm jpgm
-video/mj2					mj2 mjp2
-# video/mp1s
-# video/mp2p
-# video/mp2t
-video/mp4					mp4 mp4v mpg4
-# video/mp4v-es
-video/mpeg					mpeg mpg mpe m1v m2v
-# video/mpeg4-generic
-# video/mpv
-# video/nv
-video/ogg					ogv
-# video/parityfec
-# video/pointer
-video/quicktime					qt mov
-# video/raw
-# video/rtp-enc-aescm128
-# video/rtx
-# video/smpte292m
-# video/ulpfec
-# video/vc1
-# video/vnd.cctv
-video/vnd.dece.hd				uvh uvvh
-video/vnd.dece.mobile				uvm uvvm
-# video/vnd.dece.mp4
-video/vnd.dece.pd				uvp uvvp
-video/vnd.dece.sd				uvs uvvs
-video/vnd.dece.video				uvv uvvv
-# video/vnd.directv.mpeg
-# video/vnd.directv.mpeg-tts
-# video/vnd.dlna.mpeg-tts
-video/vnd.dvb.file				dvb
-video/vnd.fvt					fvt
-# video/vnd.hns.video
-# video/vnd.iptvforum.1dparityfec-1010
-# video/vnd.iptvforum.1dparityfec-2005
-# video/vnd.iptvforum.2dparityfec-1010
-# video/vnd.iptvforum.2dparityfec-2005
-# video/vnd.iptvforum.ttsavc
-# video/vnd.iptvforum.ttsmpeg2
-# video/vnd.motorola.video
-# video/vnd.motorola.videop
-video/vnd.mpegurl				mxu m4u
-video/vnd.ms-playready.media.pyv		pyv
-# video/vnd.nokia.interleaved-multimedia
-# video/vnd.nokia.videovoip
-# video/vnd.objectvideo
-# video/vnd.sealed.mpeg1
-# video/vnd.sealed.mpeg4
-# video/vnd.sealed.swf
-# video/vnd.sealedmedia.softseal.mov
-video/vnd.uvvu.mp4				uvu uvvu
-video/vnd.vivo					viv
-video/webm					webm
-video/x-f4v					f4v
-video/x-fli					fli
-video/x-flv					flv
-video/x-m4v					m4v
-video/x-matroska				mkv mk3d mks
-video/x-mng					mng
-video/x-ms-asf					asf asx
-video/x-ms-vob					vob
-video/x-ms-wm					wm
-video/x-ms-wmv					wmv
-video/x-ms-wmx					wmx
-video/x-ms-wvx					wvx
-video/x-msvideo					avi
-video/x-sgi-movie				movie
-video/x-smv					smv
-x-conference/x-cooltalk				ice
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/mime/types/node.types	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,77 +0,0 @@
-# What: WebVTT
-# Why: To allow formats intended for marking up external text track resources.
-# http://dev.w3.org/html5/webvtt/
-# Added by: niftylettuce
-text/vtt  vtt
-
-# What: Google Chrome Extension
-# Why: To allow apps to be served with the right content type header.
-# http://codereview.chromium.org/2830017
-# Added by: niftylettuce
-application/x-chrome-extension  crx
-
-# What: HTC support
-# Why: To properly render .htc files such as CSS3PIE
-# Added by: niftylettuce
-text/x-component  htc
-
-# What: HTML5 application cache manifest ('.manifest' extension)
-# Why: De-facto standard. Required by Mozilla browsers when serving HTML5 apps
-# per https://developer.mozilla.org/en/offline_resources_in_firefox
-# Added by: louisremi
-text/cache-manifest  manifest
-
-# What: node binary buffer format
-# Why: semi-standard extension w/in the node community
-# Added by: tootallnate
-application/octet-stream  buffer
-
-# What: The "protected" MP-4 formats used by iTunes.
-# Why: Required for streaming music to browsers (?)
-# Added by: broofa
-application/mp4  m4p
-audio/mp4  m4a
-
-# What: Video format, part of RFC 1890
-# Why: See https://github.com/bentomas/node-mime/pull/6
-# Added by: mjrusso
-video/MP2T  ts
-
-# What: EventSource mime type
-# Why: mime type of Server-Sent Events stream
-# http://www.w3.org/TR/eventsource/#text-event-stream
-# Added by: francois2metz
-text/event-stream  event-stream
-
-# What: Mozilla App manifest mime type
-# Why: https://developer.mozilla.org/en/Apps/Manifest#Serving_manifests
-# Added by: ednapiranha
-application/x-web-app-manifest+json   webapp
-
-# What: Lua file types
-# Why: Googling around shows de-facto consensus on these
-# Added by: creationix (Issue #45)
-text/x-lua  lua
-application/x-lua-bytecode  luac
-
-# What: Markdown files, as per http://daringfireball.net/projects/markdown/syntax
-# Why: http://stackoverflow.com/questions/10701983/what-is-the-mime-type-for-markdown
-# Added by: avoidwork
-text/x-markdown  markdown md mkd
-
-# What: ini files
-# Why: because they're just text files
-# Added by: Matthew Kastor
-text/plain  ini
-
-# What: DASH Adaptive Streaming manifest
-# Why: https://developer.mozilla.org/en-US/docs/DASH_Adaptive_Streaming_for_HTML_5_Video
-# Added by: eelcocramer
-application/dash+xml  mpd
-
-# What: OpenType font files - http://www.microsoft.com/typography/otspec/
-# Why:  Browsers usually ignore the font MIME types and sniff the content,
-#       but Chrome shows a warning if OpenType fonts aren't served with
-#       the `font/opentype` MIME type: http://i.imgur.com/8c5RN8M.png.
-# Added by: alrra
-font/opentype  otf
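The entries in this file all follow mime's simple line format: a MIME type, whitespace, then one or more extensions, with `#` starting a comment. A minimal reader for that layout can be sketched as follows (an illustrative helper only; `parseTypes` is not part of the mime module's actual API):

```javascript
// Build an extension -> MIME type map from "type ext1 ext2 ..." lines.
// Hypothetical sketch, not mime's real loader.
function parseTypes(text) {
  var map = {};
  text.split(/\r?\n/).forEach(function (line) {
    line = line.trim();
    if (!line || line[0] === '#') return;   // skip blanks and comments
    var fields = line.split(/\s+/);
    var type = fields.shift();              // first field is the MIME type
    fields.forEach(function (ext) { map[ext] = type; });
  });
  return map;
}
```

With the Lua entries above, for example, `parseTypes('text/x-lua  lua')` would map `lua` to `text/x-lua`.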
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2 +0,0 @@
-node_modules
-.DS_Store
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/LICENSE.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2 +0,0 @@
-Copyright (c) 2010-2012 Robert Kieffer
-MIT License - http://opensource.org/licenses/mit-license.php
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,207 +0,0 @@
-# node-uuid
-
-Simple, fast generation of [RFC4122](http://www.ietf.org/rfc/rfc4122.txt) UUIDs.
-
-Features:
-
-* Generate RFC4122 version 1 or version 4 UUIDs
-* Runs in node.js and all browsers.
-* Registered as a [ComponentJS](https://github.com/component/component) [component](https://github.com/component/component/wiki/Components) ('broofa/node-uuid').
-* Cryptographically strong random # generation on supporting platforms
-* 1.1K minified and gzip'ed (want something smaller? Check this [crazy shit](https://gist.github.com/982883) out!)
-* [Annotated source code](http://broofa.github.com/node-uuid/docs/uuid.html)
-
-## Getting Started
-
-Install it in your browser:
-
-```html
-<script src="uuid.js"></script>
-```
-
-Or in node.js:
-
-```
-npm install node-uuid
-```
-
-```javascript
-var uuid = require('node-uuid');
-```
-
-Then create some ids ...
-
-```javascript
-// Generate a v1 (time-based) id
-uuid.v1(); // -> '6c84fb90-12c4-11e1-840d-7b25c5ee775a'
-
-// Generate a v4 (random) id
-uuid.v4(); // -> '110ec58a-a0f2-4ac4-8393-c866d813b8d1'
-```
-
-## API
-
-### uuid.v1([`options` [, `buffer` [, `offset`]]])
-
-Generate and return an RFC4122 v1 (timestamp-based) UUID.
-
-* `options` - (Object) Optional uuid state to apply. Properties may include:
-
-  * `node` - (Array) Node id as Array of 6 bytes (per 4.1.6). Default: Randomly generated ID.  See note 1.
-  * `clockseq` - (Number between 0 - 0x3fff) RFC clock sequence.  Default: An internally maintained clockseq is used.
-  * `msecs` - (Number | Date) Time in milliseconds since unix Epoch.  Default: The current time is used.
-  * `nsecs` - (Number between 0-9999) additional time, in 100-nanosecond units. Ignored if `msecs` is unspecified. Default: internal uuid counter is used, as per 4.2.1.2.
-
-* `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written.
-* `offset` - (Number) Starting index in `buffer` at which to begin writing.
-
-Returns `buffer`, if specified; otherwise the string form of the UUID.
-
-Notes:
-
-1. The randomly generated node id is only guaranteed to stay constant for the lifetime of the current JS runtime. (Future versions of this module may use persistent storage mechanisms to extend this guarantee.)
-
-Example: Generate string UUID with fully-specified options
-
-```javascript
-uuid.v1({
-  node: [0x01, 0x23, 0x45, 0x67, 0x89, 0xab],
-  clockseq: 0x1234,
-  msecs: new Date('2011-11-01').getTime(),
-  nsecs: 5678
-});   // -> "710b962e-041c-11e1-9234-0123456789ab"
-```
-
-Example: In-place generation of two binary IDs
-
-```javascript
-// Generate two ids in an array
-var arr = new Array(32); // room for two 16-byte ids
-uuid.v1(null, arr, 0);   // -> [02 a2 ce 90 14 32 11 e1 85 58 0b 48 8e 4f c1 15]
-uuid.v1(null, arr, 16);  // -> [02 a2 ce 90 14 32 11 e1 85 58 0b 48 8e 4f c1 15 02 a3 1c b0 14 32 11 e1 85 58 0b 48 8e 4f c1 15]
-
-// Optionally use uuid.unparse() to stringify the ids
-uuid.unparse(arr);     // -> '02a2ce90-1432-11e1-8558-0b488e4fc115'
-uuid.unparse(arr, 16); // -> '02a31cb0-1432-11e1-8558-0b488e4fc115'
-```
-
-### uuid.v4([`options` [, `buffer` [, `offset`]]])
-
-Generate and return an RFC4122 v4 UUID.
-
-* `options` - (Object) Optional uuid state to apply. Properties may include:
-
-  * `random` - (Number[16]) Array of 16 numbers (0-255) to use in place of randomly generated values
-  * `rng` - (Function) Random # generator to use.  Set to one of the built-in generators - `uuid.mathRNG` (all platforms), `uuid.nodeRNG` (node.js only), `uuid.whatwgRNG` (WebKit only) - or a custom function that returns an array[16] of byte values.
-
-* `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written.
-* `offset` - (Number) Starting index in `buffer` at which to begin writing.
-
-Returns `buffer`, if specified; otherwise the string form of the UUID.
-
-Example: Generate string UUID with fully-specified options
-
-```javascript
-uuid.v4({
-  random: [
-    0x10, 0x91, 0x56, 0xbe, 0xc4, 0xfb, 0xc1, 0xea,
-    0x71, 0xb4, 0xef, 0xe1, 0x67, 0x1c, 0x58, 0x36
-  ]
-});
-// -> "109156be-c4fb-41ea-b1b4-efe1671c5836"
-```
-
-Example: Generate two IDs in a single buffer
-
-```javascript
-var buffer = new Array(32); // (or 'new Buffer' in node.js)
-uuid.v4(null, buffer, 0);
-uuid.v4(null, buffer, 16);
-```
-
-### uuid.parse(id[, buffer[, offset]])
-### uuid.unparse(buffer[, offset])
-
-Parse and unparse UUIDs
-
-  * `id` - (String) UUID(-like) string
-  * `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written. Default: A new Array or Buffer is used
-  * `offset` - (Number) Starting index in `buffer` at which to begin writing. Default: 0
-
-Example parsing and unparsing a UUID string
-
-```javascript
-var bytes = uuid.parse('797ff043-11eb-11e1-80d6-510998755d10'); // -> <Buffer 79 7f f0 43 11 eb 11 e1 80 d6 51 09 98 75 5d 10>
-var string = uuid.unparse(bytes); // -> '797ff043-11eb-11e1-80d6-510998755d10'
-```
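Under the hood, parse and unparse are essentially hex conversions; the round trip can be sketched in a few lines of plain JavaScript (a simplified stand-in for illustration, not node-uuid's actual implementation):

```javascript
// Turn a UUID string into an array of 16 byte values by reading hex pairs;
// dashes simply don't match and are skipped. Illustrative sketch only.
function parseUuid(id) {
  var bytes = [];
  id.replace(/[0-9a-f]{2}/gi, function (hex) {
    bytes.push(parseInt(hex, 16));
  });
  return bytes;
}

// Turn 16 bytes back into the canonical 8-4-4-4-12 string form.
function unparseUuid(bytes) {
  var hex = bytes.map(function (b) {
    return (b + 0x100).toString(16).substr(1); // zero-padded two-digit hex
  }).join('');
  return [hex.substr(0, 8), hex.substr(8, 4), hex.substr(12, 4),
          hex.substr(16, 4), hex.substr(20, 12)].join('-');
}
```

Round-tripping a valid UUID string through these two helpers returns the original string.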
-
-### uuid.noConflict()
-
-(Browsers only) Set `uuid` property back to its previous value.
-
-Returns the node-uuid object.
-
-Example:
-
-```javascript
-var myUuid = uuid.noConflict();
-myUuid.v1(); // -> '6c84fb90-12c4-11e1-840d-7b25c5ee775a'
-```
-
-## Deprecated APIs
-
-Support for the following v1.2 APIs is available in v1.3, but is deprecated and will be removed in the next major version.
-
-### uuid([format [, buffer [, offset]]])
-
-uuid() has become uuid.v4(), and the `format` argument is now implicit in the `buffer` argument. (i.e. if you specify a buffer, the format is assumed to be binary).
-
-### uuid.BufferClass
-
-The class of container created when generating binary uuid data if no buffer argument is specified.  This is expected to go away, with no replacement API.
-
-## Testing
-
-In node.js
-
-```
-> cd test
-> node test.js
-```
-
-In Browser
-
-```
-open test/test.html
-```
-
-### Benchmarking
-
-Requires node.js
-
-```
-npm install uuid uuid-js
-node benchmark/benchmark.js
-```
-
-For a more complete discussion of node-uuid performance, please see the `benchmark/README.md` file and the [benchmark wiki](https://github.com/broofa/node-uuid/wiki/Benchmark).
-
-For browser performance, [check out the JSPerf tests](http://jsperf.com/node-uuid-performance).
-
-## Release notes
-
-### 1.4.0
-
-* Improved module context detection
-* Removed public RNG functions
-
-### 1.3.2
-
-* Improve tests and handling of v1() options (Issue #24)
-* Expose RNG option to allow for perf testing with different generators
-
-### 1.3.0
-
-* Support for version 1 ids, thanks to [@ctavan](https://github.com/ctavan)!
-* Support for node.js crypto API
-* De-emphasizing performance in favor of a) cryptographic quality PRNGs where available and b) more manageable code
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-# node-uuid Benchmarks
-
-### Results
-
-To see the results of our benchmarks, visit https://github.com/broofa/node-uuid/wiki/Benchmark
-
-### Run them yourself
-
-node-uuid comes with some benchmarks to measure the performance of generating UUIDs. These can be run using node.js. node-uuid is benchmarked against some other uuid modules available through npm, namely `uuid` and `uuid-js`.
-
-To prepare and run the benchmark, issue:
-
-```
-npm install uuid uuid-js
-node benchmark/benchmark.js
-```
-
-You'll see an output like this one:
-
-```
-# v4
-nodeuuid.v4(): 854700 uuids/second
-nodeuuid.v4('binary'): 788643 uuids/second
-nodeuuid.v4('binary', buffer): 1336898 uuids/second
-uuid(): 479386 uuids/second
-uuid('binary'): 582072 uuids/second
-uuidjs.create(4): 312304 uuids/second
-
-# v1
-nodeuuid.v1(): 938086 uuids/second
-nodeuuid.v1('binary'): 683060 uuids/second
-nodeuuid.v1('binary', buffer): 1644736 uuids/second
-uuidjs.create(1): 190621 uuids/second
-```
-
-* The `uuid()` entries are for Nikhil Marathe's [uuid module](https://bitbucket.org/nikhilm/uuidjs) which is a wrapper around the native libuuid library.
-* The `uuidjs()` entries are for Patrick Negri's [uuid-js module](https://github.com/pnegri/uuid-js) which is a pure javascript implementation based on [UUID.js](https://github.com/LiosK/UUID.js) by LiosK.
-
-If you want more reliable results, you can run the benchmark multiple times and write the output to a log file:
-
-```
-for i in {0..9}; do node benchmark/benchmark.js >> benchmark/bench_0.4.12.log; done;
-```
-
-If you're interested in how performance varies between different node versions, you can issue the above command multiple times.
-
-You can then use the shell script `bench.sh` provided in this directory to calculate the averages over all benchmark runs and draw a nice plot:
-
-```
-(cd benchmark/ && ./bench.sh)
-```
-
-This assumes you have [gnuplot](http://www.gnuplot.info/) and [ImageMagick](http://www.imagemagick.org/) installed. You'll then find a nice `bench.png` graph in the `benchmark/` directory.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/bench.gnu	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,174 +0,0 @@
-#!/opt/local/bin/gnuplot -persist
-#
-#    
-#    	G N U P L O T
-#    	Version 4.4 patchlevel 3
-#    	last modified March 2011
-#    	System: Darwin 10.8.0
-#    
-#    	Copyright (C) 1986-1993, 1998, 2004, 2007-2010
-#    	Thomas Williams, Colin Kelley and many others
-#    
-#    	gnuplot home:     http://www.gnuplot.info
-#    	faq, bugs, etc:   type "help seeking-assistance"
-#    	immediate help:   type "help"
-#    	plot window:      hit 'h'
-set terminal postscript eps noenhanced defaultplex \
- leveldefault color colortext \
- solid linewidth 1.2 butt noclip \
- palfuncparam 2000,0.003 \
- "Helvetica" 14 
-set output 'bench.eps'
-unset clip points
-set clip one
-unset clip two
-set bar 1.000000 front
-set border 31 front linetype -1 linewidth 1.000
-set xdata
-set ydata
-set zdata
-set x2data
-set y2data
-set timefmt x "%d/%m/%y,%H:%M"
-set timefmt y "%d/%m/%y,%H:%M"
-set timefmt z "%d/%m/%y,%H:%M"
-set timefmt x2 "%d/%m/%y,%H:%M"
-set timefmt y2 "%d/%m/%y,%H:%M"
-set timefmt cb "%d/%m/%y,%H:%M"
-set boxwidth
-set style fill  empty border
-set style rectangle back fc lt -3 fillstyle   solid 1.00 border lt -1
-set style circle radius graph 0.02, first 0, 0 
-set dummy x,y
-set format x "% g"
-set format y "% g"
-set format x2 "% g"
-set format y2 "% g"
-set format z "% g"
-set format cb "% g"
-set angles radians
-unset grid
-set key title ""
-set key outside left top horizontal Right noreverse enhanced autotitles columnhead nobox
-set key noinvert samplen 4 spacing 1 width 0 height 0 
-set key maxcolumns 2 maxrows 0
-unset label
-unset arrow
-set style increment default
-unset style line
-set style line 1  linetype 1 linewidth 2.000 pointtype 1 pointsize default pointinterval 0
-unset style arrow
-set style histogram clustered gap 2 title  offset character 0, 0, 0
-unset logscale
-set offsets graph 0.05, 0.15, 0, 0
-set pointsize 1.5
-set pointintervalbox 1
-set encoding default
-unset polar
-unset parametric
-unset decimalsign
-set view 60, 30, 1, 1
-set samples 100, 100
-set isosamples 10, 10
-set surface
-unset contour
-set clabel '%8.3g'
-set mapping cartesian
-set datafile separator whitespace
-unset hidden3d
-set cntrparam order 4
-set cntrparam linear
-set cntrparam levels auto 5
-set cntrparam points 5
-set size ratio 0 1,1
-set origin 0,0
-set style data points
-set style function lines
-set xzeroaxis linetype -2 linewidth 1.000
-set yzeroaxis linetype -2 linewidth 1.000
-set zzeroaxis linetype -2 linewidth 1.000
-set x2zeroaxis linetype -2 linewidth 1.000
-set y2zeroaxis linetype -2 linewidth 1.000
-set ticslevel 0.5
-set mxtics default
-set mytics default
-set mztics default
-set mx2tics default
-set my2tics default
-set mcbtics default
-set xtics border in scale 1,0.5 mirror norotate  offset character 0, 0, 0
-set xtics  norangelimit
-set xtics   ()
-set ytics border in scale 1,0.5 mirror norotate  offset character 0, 0, 0
-set ytics autofreq  norangelimit
-set ztics border in scale 1,0.5 nomirror norotate  offset character 0, 0, 0
-set ztics autofreq  norangelimit
-set nox2tics
-set noy2tics
-set cbtics border in scale 1,0.5 mirror norotate  offset character 0, 0, 0
-set cbtics autofreq  norangelimit
-set title "" 
-set title  offset character 0, 0, 0 font "" norotate
-set timestamp bottom 
-set timestamp "" 
-set timestamp  offset character 0, 0, 0 font "" norotate
-set rrange [ * : * ] noreverse nowriteback  # (currently [8.98847e+307:-8.98847e+307] )
-set autoscale rfixmin
-set autoscale rfixmax
-set trange [ * : * ] noreverse nowriteback  # (currently [-5.00000:5.00000] )
-set autoscale tfixmin
-set autoscale tfixmax
-set urange [ * : * ] noreverse nowriteback  # (currently [-10.0000:10.0000] )
-set autoscale ufixmin
-set autoscale ufixmax
-set vrange [ * : * ] noreverse nowriteback  # (currently [-10.0000:10.0000] )
-set autoscale vfixmin
-set autoscale vfixmax
-set xlabel "" 
-set xlabel  offset character 0, 0, 0 font "" textcolor lt -1 norotate
-set x2label "" 
-set x2label  offset character 0, 0, 0 font "" textcolor lt -1 norotate
-set xrange [ * : * ] noreverse nowriteback  # (currently [-0.150000:3.15000] )
-set autoscale xfixmin
-set autoscale xfixmax
-set x2range [ * : * ] noreverse nowriteback  # (currently [0.00000:3.00000] )
-set autoscale x2fixmin
-set autoscale x2fixmax
-set ylabel "" 
-set ylabel  offset character 0, 0, 0 font "" textcolor lt -1 rotate by -270
-set y2label "" 
-set y2label  offset character 0, 0, 0 font "" textcolor lt -1 rotate by -270
-set yrange [ 0.00000 : 1.90000e+06 ] noreverse nowriteback  # (currently [:] )
-set autoscale yfixmin
-set autoscale yfixmax
-set y2range [ * : * ] noreverse nowriteback  # (currently [0.00000:1.90000e+06] )
-set autoscale y2fixmin
-set autoscale y2fixmax
-set zlabel "" 
-set zlabel  offset character 0, 0, 0 font "" textcolor lt -1 norotate
-set zrange [ * : * ] noreverse nowriteback  # (currently [-10.0000:10.0000] )
-set autoscale zfixmin
-set autoscale zfixmax
-set cblabel "" 
-set cblabel  offset character 0, 0, 0 font "" textcolor lt -1 rotate by -270
-set cbrange [ * : * ] noreverse nowriteback  # (currently [8.98847e+307:-8.98847e+307] )
-set autoscale cbfixmin
-set autoscale cbfixmax
-set zero 1e-08
-set lmargin  -1
-set bmargin  -1
-set rmargin  -1
-set tmargin  -1
-set pm3d explicit at s
-set pm3d scansautomatic
-set pm3d interpolate 1,1 flush begin noftriangles nohidden3d corners2color mean
-set palette positive nops_allcF maxcolors 0 gamma 1.5 color model RGB 
-set palette rgbformulae 7, 5, 15
-set colorbox default
-set colorbox vertical origin screen 0.9, 0.2, 0 size screen 0.05, 0.6, 0 front bdefault
-set loadpath 
-set fontpath 
-set fit noerrorvariables
-GNUTERM = "aqua"
-plot 'bench_results.txt' using 2:xticlabel(1) w lp lw 2, '' using 3:xticlabel(1) w lp lw 2, '' using 4:xticlabel(1) w lp lw 2, '' using 5:xticlabel(1) w lp lw 2, '' using 6:xticlabel(1) w lp lw 2, '' using 7:xticlabel(1) w lp lw 2, '' using 8:xticlabel(1) w lp lw 2, '' using 9:xticlabel(1) w lp lw 2
-#    EOF
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/bench.sh	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,34 +0,0 @@
-#!/bin/bash
-
-# for a given node version run:
-# for i in {0..9}; do node benchmark.js >> bench_0.6.2.log; done;
-
-PATTERNS=('nodeuuid.v1()' "nodeuuid.v1('binary'," 'nodeuuid.v4()' "nodeuuid.v4('binary'," "uuid()" "uuid('binary')" 'uuidjs.create(1)' 'uuidjs.create(4)' '140byte')
-FILES=(node_uuid_v1_string node_uuid_v1_buf node_uuid_v4_string node_uuid_v4_buf libuuid_v4_string libuuid_v4_binary uuidjs_v1_string uuidjs_v4_string 140byte_es)
-INDICES=(2 3 2 3 2 2 2 2 2)
-VERSIONS=$( ls bench_*.log | sed -e 's/^bench_\([0-9\.]*\)\.log/\1/' | tr "\\n" " " )
-TMPJOIN="tmp_join"
-OUTPUT="bench_results.txt"
-
-for I in ${!FILES[*]}; do
-  F=${FILES[$I]}
-  P=${PATTERNS[$I]}
-  INDEX=${INDICES[$I]}
-  echo "version $F" > $F
-  for V in $VERSIONS; do
-    (VAL=$( grep "$P" bench_$V.log | LC_ALL=en_US awk '{ sum += $'$INDEX' } END { print sum/NR }' ); echo $V $VAL) >> $F
-  done
-  if [ $I == 0 ]; then
-    cat $F > $TMPJOIN
-  else
-    join $TMPJOIN $F > $OUTPUT
-    cp $OUTPUT $TMPJOIN
-  fi
-  rm $F
-done
-
-rm $TMPJOIN
-
-gnuplot bench.gnu
-convert -density 200 -resize 800x560 -flatten bench.eps bench.png
-rm bench.eps
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/benchmark-native.c	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,34 +0,0 @@
-/*
-Test performance of native C UUID generation
-
-To Compile: cc -luuid benchmark-native.c -o benchmark-native
-*/
-
-#include <stdio.h>
-#include <unistd.h>
-#include <sys/time.h>
-#include <uuid/uuid.h>
-
-int main() {
-  uuid_t myid;
-  char buf[36+1];
-  int i;
-  struct timeval t;
-  double start, finish;
-
-  gettimeofday(&t, NULL);
-  start = t.tv_sec + t.tv_usec/1e6;
-
-  int n = 2e5;
-  for (i = 0; i < n; i++) {
-    uuid_generate(myid);
-    uuid_unparse(myid, buf);
-  }
-
-  gettimeofday(&t, NULL);
-  finish = t.tv_sec + t.tv_usec/1e6;
-  double dur = finish - start;
-
-  printf("%d uuids/sec", (int)(n/dur));
-  return 0;
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/benchmark.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,84 +0,0 @@
-try {
-  var nodeuuid = require('../uuid');
-} catch (e) {
-  console.error('node-uuid require failed - skipping tests');
-}
-
-try {
-  var uuid = require('uuid');
-} catch (e) {
-  console.error('uuid require failed - skipping tests');
-}
-
-try {
-  var uuidjs = require('uuid-js');
-} catch (e) {
-  console.error('uuid-js require failed - skipping tests');
-}
-
-var N = 5e5;
-
-function rate(msg, t) {
-  console.log(msg + ': ' +
-    (N / (Date.now() - t) * 1e3 | 0) +
-    ' uuids/second');
-}
-
-console.log('# v4');
-
-// node-uuid - string form
-if (nodeuuid) {
-  for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v4();
-  rate('nodeuuid.v4() - using node.js crypto RNG', t);
-
-  for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v4({rng: nodeuuid.mathRNG});
-  rate('nodeuuid.v4() - using Math.random() RNG', t);
-
-  for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v4('binary');
-  rate('nodeuuid.v4(\'binary\')', t);
-
-  var buffer = new nodeuuid.BufferClass(16);
-  for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v4('binary', buffer);
-  rate('nodeuuid.v4(\'binary\', buffer)', t);
-}
-
-// libuuid - string form
-if (uuid) {
-  for (var i = 0, t = Date.now(); i < N; i++) uuid();
-  rate('uuid()', t);
-
-  for (var i = 0, t = Date.now(); i < N; i++) uuid('binary');
-  rate('uuid(\'binary\')', t);
-}
-
-// uuid-js - string form
-if (uuidjs) {
-  for (var i = 0, t = Date.now(); i < N; i++) uuidjs.create(4);
-  rate('uuidjs.create(4)', t);
-}
-
-// 140byte.es
-for (var i = 0, t = Date.now(); i < N; i++) 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g,function(s,r){r=Math.random()*16|0;return (s=='x'?r:r&0x3|0x8).toString(16)});
-rate('140byte.es_v4', t);
-
-console.log('');
-console.log('# v1');
-
-// node-uuid - v1 string form
-if (nodeuuid) {
-  for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v1();
-  rate('nodeuuid.v1()', t);
-
-  for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v1('binary');
-  rate('nodeuuid.v1(\'binary\')', t);
-
-  var buffer = new nodeuuid.BufferClass(16);
-  for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v1('binary', buffer);
-  rate('nodeuuid.v1(\'binary\', buffer)', t);
-}
-
-// uuid-js - v1 string form
-if (uuidjs) {
-  for (var i = 0, t = Date.now(); i < N; i++) uuidjs.create(1);
-  rate('uuidjs.create(1)', t);
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/component.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-{
-  "name": "node-uuid",
-  "repo": "broofa/node-uuid",
-  "description": "Rigorous implementation of RFC4122 (v1 and v4) UUIDs.",
-  "version": "1.4.0",
-  "author": "Robert Kieffer <robert@broofa.com>",
-  "contributors": [
-    {"name": "Christoph Tavan <dev@tavan.de>", "github": "https://github.com/ctavan"}
-  ],
-  "keywords": ["uuid", "guid", "rfc4122"],
-  "dependencies": {},
-  "development": {},
-  "main": "uuid.js",
-  "scripts": [
-    "uuid.js"
-  ],
-  "license": "MIT"
-}
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-{
-  "name": "node-uuid",
-  "description": "Rigorous implementation of RFC4122 (v1 and v4) UUIDs.",
-  "url": "http://github.com/broofa/node-uuid",
-  "keywords": [
-    "uuid",
-    "guid",
-    "rfc4122"
-  ],
-  "author": {
-    "name": "Robert Kieffer",
-    "email": "robert@broofa.com"
-  },
-  "contributors": [
-    {
-      "name": "Christoph Tavan",
-      "email": "dev@tavan.de"
-    }
-  ],
-  "lib": ".",
-  "main": "./uuid.js",
-  "repository": {
-    "type": "git",
-    "url": "https://github.com/broofa/node-uuid.git"
-  },
-  "version": "1.4.1",
-  "readme": "# node-uuid\n\nSimple, fast generation of [RFC4122](http://www.ietf.org/rfc/rfc4122.txt) UUIDS.\n\nFeatures:\n\n* Generate RFC4122 version 1 or version 4 UUIDs\n* Runs in node.js and all browsers.\n* Registered as a [ComponentJS](https://github.com/component/component) [component](https://github.com/component/component/wiki/Components) ('broofa/node-uuid').\n* Cryptographically strong random # generation on supporting platforms\n* 1.1K minified and gzip'ed  (Want something smaller?  Check this [crazy shit](https://gist.github.com/982883) out! )\n* [Annotated source code](http://broofa.github.com/node-uuid/docs/uuid.html)\n\n## Getting Started\n\nInstall it in your browser:\n\n```html\n<script src=\"uuid.js\"></script>\n```\n\nOr in node.js:\n\n```\nnpm install node-uuid\n```\n\n```javascript\nvar uuid = require('node-uuid');\n```\n\nThen create some ids ...\n\n```javascript\n// Generate a v1 (time-based) id\nuuid.v1(); // -> '6c84fb90-12c4-11e1-840d-7b25c5ee775a'\n\n// Generate a v4 (random) id\nuuid.v4(); // -> '110ec58a-a0f2-4ac4-8393-c866d813b8d1'\n```\n\n## API\n\n### uuid.v1([`options` [, `buffer` [, `offset`]]])\n\nGenerate and return a RFC4122 v1 (timestamp-based) UUID.\n\n* `options` - (Object) Optional uuid state to apply. Properties may include:\n\n  * `node` - (Array) Node id as Array of 6 bytes (per 4.1.6). Default: Randomly generated ID.  See note 1.\n  * `clockseq` - (Number between 0 - 0x3fff) RFC clock sequence.  Default: An internally maintained clockseq is used.\n  * `msecs` - (Number | Date) Time in milliseconds since unix Epoch.  Default: The current time is used.\n  * `nsecs` - (Number between 0-9999) additional time, in 100-nanosecond units. Ignored if `msecs` is unspecified. Default: internal uuid counter is used, as per 4.2.1.2.\n\n* `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written.\n* `offset` - (Number) Starting index in `buffer` at which to begin writing.\n\nReturns `buffer`, if specified, otherwise the string form of the UUID\n\nNotes:\n\n1. The randomly generated node id is only guaranteed to stay constant for the lifetime of the current JS runtime. (Future versions of this module may use persistent storage mechanisms to extend this guarantee.)\n\nExample: Generate string UUID with fully-specified options\n\n```javascript\nuuid.v1({\n  node: [0x01, 0x23, 0x45, 0x67, 0x89, 0xab],\n  clockseq: 0x1234,\n  msecs: new Date('2011-11-01').getTime(),\n  nsecs: 5678\n});   // -> \"710b962e-041c-11e1-9234-0123456789ab\"\n```\n\nExample: In-place generation of two binary IDs\n\n```javascript\n// Generate two ids in an array\nvar arr = new Array(32); // -> []\nuuid.v1(null, arr, 0);   // -> [02 a2 ce 90 14 32 11 e1 85 58 0b 48 8e 4f c1 15]\nuuid.v1(null, arr, 16);  // -> [02 a2 ce 90 14 32 11 e1 85 58 0b 48 8e 4f c1 15 02 a3 1c b0 14 32 11 e1 85 58 0b 48 8e 4f c1 15]\n\n// Optionally use uuid.unparse() to get stringify the ids\nuuid.unparse(buffer);    // -> '02a2ce90-1432-11e1-8558-0b488e4fc115'\nuuid.unparse(buffer, 16) // -> '02a31cb0-1432-11e1-8558-0b488e4fc115'\n```\n\n### uuid.v4([`options` [, `buffer` [, `offset`]]])\n\nGenerate and return a RFC4122 v4 UUID.\n\n* `options` - (Object) Optional uuid state to apply. Properties may include:\n\n  * `random` - (Number[16]) Array of 16 numbers (0-255) to use in place of randomly generated values\n  * `rng` - (Function) Random # generator to use.  Set to one of the built-in generators - `uuid.mathRNG` (all platforms), `uuid.nodeRNG` (node.js only), `uuid.whatwgRNG` (WebKit only) - or a custom function that returns an array[16] of byte values.\n\n* `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written.\n* `offset` - (Number) Starting index in `buffer` at which to begin writing.\n\nReturns `buffer`, if specified, otherwise the string form of the UUID\n\nExample: Generate string UUID with fully-specified options\n\n```javascript\nuuid.v4({\n  random: [\n    0x10, 0x91, 0x56, 0xbe, 0xc4, 0xfb, 0xc1, 0xea,\n    0x71, 0xb4, 0xef, 0xe1, 0x67, 0x1c, 0x58, 0x36\n  ]\n});\n// -> \"109156be-c4fb-41ea-b1b4-efe1671c5836\"\n```\n\nExample: Generate two IDs in a single buffer\n\n```javascript\nvar buffer = new Array(32); // (or 'new Buffer' in node.js)\nuuid.v4(null, buffer, 0);\nuuid.v4(null, buffer, 16);\n```\n\n### uuid.parse(id[, buffer[, offset]])\n### uuid.unparse(buffer[, offset])\n\nParse and unparse UUIDs\n\n  * `id` - (String) UUID(-like) string\n  * `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written. Default: A new Array or Buffer is used\n  * `offset` - (Number) Starting index in `buffer` at which to begin writing. Default: 0\n\nExample parsing and unparsing a UUID string\n\n```javascript\nvar bytes = uuid.parse('797ff043-11eb-11e1-80d6-510998755d10'); // -> <Buffer 79 7f f0 43 11 eb 11 e1 80 d6 51 09 98 75 5d 10>\nvar string = uuid.unparse(bytes); // -> '797ff043-11eb-11e1-80d6-510998755d10'\n```\n\n### uuid.noConflict()\n\n(Browsers only) Set `uuid` property back to it's previous value.\n\nReturns the node-uuid object.\n\nExample:\n\n```javascript\nvar myUuid = uuid.noConflict();\nmyUuid.v1(); // -> '6c84fb90-12c4-11e1-840d-7b25c5ee775a'\n```\n\n## Deprecated APIs\n\nSupport for the following v1.2 APIs is available in v1.3, but is deprecated and will be removed in the next major version.\n\n### uuid([format [, buffer [, offset]]])\n\nuuid() has become uuid.v4(), and the `format` argument is now implicit in the `buffer` argument. (i.e. if you specify a buffer, the format is assumed to be binary).\n\n### uuid.BufferClass\n\nThe class of container created when generating binary uuid data if no buffer argument is specified.  This is expected to go away, with no replacement API.\n\n## Testing\n\nIn node.js\n\n```\n> cd test\n> node test.js\n```\n\nIn Browser\n\n```\nopen test/test.html\n```\n\n### Benchmarking\n\nRequires node.js\n\n```\nnpm install uuid uuid-js\nnode benchmark/benchmark.js\n```\n\nFor a more complete discussion of node-uuid performance, please see the `benchmark/README.md` file, and the [benchmark wiki](https://github.com/broofa/node-uuid/wiki/Benchmark)\n\nFor browser performance [checkout the JSPerf tests](http://jsperf.com/node-uuid-performance).\n\n## Release notes\n\n### 1.4.0\n\n* Improved module context detection\n* Removed public RNG functions\n\n### 1.3.2\n\n* Improve tests and handling of v1() options (Issue #24)\n* Expose RNG option to allow for perf testing with different generators\n\n### 1.3.0\n\n* Support for version 1 ids, thanks to [@ctavan](https://github.com/ctavan)!\n* Support for node.js crypto API\n* De-emphasizing performance in favor of a) cryptographic quality PRNGs where available and b) more manageable code\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/broofa/node-uuid/issues"
-  },
-  "homepage": "https://github.com/broofa/node-uuid",
-  "_id": "node-uuid@1.4.1",
-  "_from": "node-uuid@~1.4.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/uuid.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,245 +0,0 @@
-//     uuid.js
-//
-//     Copyright (c) 2010-2012 Robert Kieffer
-//     MIT License - http://opensource.org/licenses/mit-license.php
-
-(function() {
-  var _global = this;
-
-  // Unique ID creation requires a high quality random # generator.  We feature
-  // detect to determine the best RNG source, normalizing to a function that
-  // returns 128-bits of randomness, since that's what's usually required
-  var _rng;
-
-  // Node.js crypto-based RNG - http://nodejs.org/docs/v0.6.2/api/crypto.html
-  //
-  // Moderately fast, high quality
-  if (typeof(require) == 'function') {
-    try {
-      var _rb = require('crypto').randomBytes;
-      _rng = _rb && function() {return _rb(16);};
-    } catch(e) {}
-  }
-
-  if (!_rng && _global.crypto && crypto.getRandomValues) {
-    // WHATWG crypto-based RNG - http://wiki.whatwg.org/wiki/Crypto
-    //
-    // Moderately fast, high quality
-    var _rnds8 = new Uint8Array(16);
-    _rng = function whatwgRNG() {
-      crypto.getRandomValues(_rnds8);
-      return _rnds8;
-    };
-  }
-
-  if (!_rng) {
-    // Math.random()-based (RNG)
-    //
-    // If all else fails, use Math.random().  It's fast, but is of unspecified
-    // quality.
-    var  _rnds = new Array(16);
-    _rng = function() {
-      for (var i = 0, r; i < 16; i++) {
-        if ((i & 0x03) === 0) r = Math.random() * 0x100000000;
-        _rnds[i] = r >>> ((i & 0x03) << 3) & 0xff;
-      }
-
-      return _rnds;
-    };
-  }
-
-  // Buffer class to use
-  var BufferClass = typeof(Buffer) == 'function' ? Buffer : Array;
-
-  // Maps for number <-> hex string conversion
-  var _byteToHex = [];
-  var _hexToByte = {};
-  for (var i = 0; i < 256; i++) {
-    _byteToHex[i] = (i + 0x100).toString(16).substr(1);
-    _hexToByte[_byteToHex[i]] = i;
-  }
-
-  // **`parse()` - Parse a UUID into its component bytes**
-  function parse(s, buf, offset) {
-    var i = (buf && offset) || 0, ii = 0;
-
-    buf = buf || [];
-    s.toLowerCase().replace(/[0-9a-f]{2}/g, function(oct) {
-      if (ii < 16) { // Don't overflow!
-        buf[i + ii++] = _hexToByte[oct];
-      }
-    });
-
-    // Zero out remaining bytes if string was short
-    while (ii < 16) {
-      buf[i + ii++] = 0;
-    }
-
-    return buf;
-  }
-
-  // **`unparse()` - Convert UUID byte array (ala parse()) into a string**
-  function unparse(buf, offset) {
-    var i = offset || 0, bth = _byteToHex;
-    return  bth[buf[i++]] + bth[buf[i++]] +
-            bth[buf[i++]] + bth[buf[i++]] + '-' +
-            bth[buf[i++]] + bth[buf[i++]] + '-' +
-            bth[buf[i++]] + bth[buf[i++]] + '-' +
-            bth[buf[i++]] + bth[buf[i++]] + '-' +
-            bth[buf[i++]] + bth[buf[i++]] +
-            bth[buf[i++]] + bth[buf[i++]] +
-            bth[buf[i++]] + bth[buf[i++]];
-  }
-
-  // **`v1()` - Generate time-based UUID**
-  //
-  // Inspired by https://github.com/LiosK/UUID.js
-  // and http://docs.python.org/library/uuid.html
-
-  // random #'s we need to init node and clockseq
-  var _seedBytes = _rng();
-
-  // Per 4.5, create a 48-bit node id (47 random bits + multicast bit = 1)
-  var _nodeId = [
-    _seedBytes[0] | 0x01,
-    _seedBytes[1], _seedBytes[2], _seedBytes[3], _seedBytes[4], _seedBytes[5]
-  ];
-
-  // Per 4.2.2, randomize (14 bit) clockseq
-  var _clockseq = (_seedBytes[6] << 8 | _seedBytes[7]) & 0x3fff;
-
-  // Previous uuid creation time
-  var _lastMSecs = 0, _lastNSecs = 0;
-
-  // See https://github.com/broofa/node-uuid for API details
-  function v1(options, buf, offset) {
-    var i = buf && offset || 0;
-    var b = buf || [];
-
-    options = options || {};
-
-    var clockseq = options.clockseq != null ? options.clockseq : _clockseq;
-
-    // UUID timestamps are 100 nano-second units since the Gregorian epoch,
-    // (1582-10-15 00:00).  JSNumbers aren't precise enough for this, so
-    // time is handled internally as 'msecs' (integer milliseconds) and 'nsecs'
-    // (100-nanoseconds offset from msecs) since unix epoch, 1970-01-01 00:00.
-    var msecs = options.msecs != null ? options.msecs : new Date().getTime();
-
-    // Per 4.2.1.2, use count of uuid's generated during the current clock
-    // cycle to simulate higher resolution clock
-    var nsecs = options.nsecs != null ? options.nsecs : _lastNSecs + 1;
-
-    // Time since last uuid creation (in msecs)
-    var dt = (msecs - _lastMSecs) + (nsecs - _lastNSecs)/10000;
-
-    // Per 4.2.1.2, Bump clockseq on clock regression
-    if (dt < 0 && options.clockseq == null) {
-      clockseq = clockseq + 1 & 0x3fff;
-    }
-
-    // Reset nsecs if clock regresses (new clockseq) or we've moved onto a new
-    // time interval
-    if ((dt < 0 || msecs > _lastMSecs) && options.nsecs == null) {
-      nsecs = 0;
-    }
-
-    // Per 4.2.1.2 Throw error if too many uuids are requested
-    if (nsecs >= 10000) {
-      throw new Error('uuid.v1(): Can\'t create more than 10M uuids/sec');
-    }
-
-    _lastMSecs = msecs;
-    _lastNSecs = nsecs;
-    _clockseq = clockseq;
-
-    // Per 4.1.4 - Convert from unix epoch to Gregorian epoch
-    msecs += 12219292800000;
-
-    // `time_low`
-    var tl = ((msecs & 0xfffffff) * 10000 + nsecs) % 0x100000000;
-    b[i++] = tl >>> 24 & 0xff;
-    b[i++] = tl >>> 16 & 0xff;
-    b[i++] = tl >>> 8 & 0xff;
-    b[i++] = tl & 0xff;
-
-    // `time_mid`
-    var tmh = (msecs / 0x100000000 * 10000) & 0xfffffff;
-    b[i++] = tmh >>> 8 & 0xff;
-    b[i++] = tmh & 0xff;
-
-    // `time_high_and_version`
-    b[i++] = tmh >>> 24 & 0xf | 0x10; // include version
-    b[i++] = tmh >>> 16 & 0xff;
-
-    // `clock_seq_hi_and_reserved` (Per 4.2.2 - include variant)
-    b[i++] = clockseq >>> 8 | 0x80;
-
-    // `clock_seq_low`
-    b[i++] = clockseq & 0xff;
-
-    // `node`
-    var node = options.node || _nodeId;
-    for (var n = 0; n < 6; n++) {
-      b[i + n] = node[n];
-    }
-
-    return buf ? buf : unparse(b);
-  }
-
-  // **`v4()` - Generate random UUID**
-
-  // See https://github.com/broofa/node-uuid for API details
-  function v4(options, buf, offset) {
-    // Deprecated - 'format' argument, as supported in v1.2
-    var i = buf && offset || 0;
-
-    if (typeof(options) == 'string') {
-      buf = options == 'binary' ? new BufferClass(16) : null;
-      options = null;
-    }
-    options = options || {};
-
-    var rnds = options.random || (options.rng || _rng)();
-
-    // Per 4.4, set bits for version and `clock_seq_hi_and_reserved`
-    rnds[6] = (rnds[6] & 0x0f) | 0x40;
-    rnds[8] = (rnds[8] & 0x3f) | 0x80;
-
-    // Copy bytes to buffer, if provided
-    if (buf) {
-      for (var ii = 0; ii < 16; ii++) {
-        buf[i + ii] = rnds[ii];
-      }
-    }
-
-    return buf || unparse(rnds);
-  }
-
-  // Export public API
-  var uuid = v4;
-  uuid.v1 = v1;
-  uuid.v4 = v4;
-  uuid.parse = parse;
-  uuid.unparse = unparse;
-  uuid.BufferClass = BufferClass;
-
-  if (typeof define === 'function' && define.amd) {
-    // Publish as AMD module
-    define(function() {return uuid;});
-  } else if (typeof(module) != 'undefined' && module.exports) {
-    // Publish as node.js module
-    module.exports = uuid;
-  } else {
-    // Publish as global (in browsers)
-    var _previousRoot = _global.uuid;
-
-    // **`noConflict()` - (browser only) to reset global 'uuid' var**
-    uuid.noConflict = function() {
-      _global.uuid = _previousRoot;
-      return uuid;
-    };
-
-    _global.uuid = uuid;
-  }
-}).call(this);
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,55 +0,0 @@
-Apache License
-
-Version 2.0, January 2004
-
-http://www.apache.org/licenses/
-
-TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-1. Definitions.
-
-"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
-
-"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
-
-"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
-
-"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.
-
-"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
-
-"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
-
-"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
-
-"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
-
-"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."
-
-"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
-
-2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
-
-3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
-
-4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
-
-You must give any other recipients of the Work or Derivative Works a copy of this License; and
-
-You must cause any modified files to carry prominent notices stating that You changed the files; and
-
-You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
-
-If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
-
-5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
-
-6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
-
-7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
-
-8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
-
-9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
-
-END OF TERMS AND CONDITIONS
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-oauth-sign
-==========
-
-OAuth 1 signing. Formerly a vendor lib in mikeal/request, now a standalone module. 
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,43 +0,0 @@
-var crypto = require('crypto')
-  , qs = require('querystring')
-  ;
-
-function sha1 (key, body) {
-  return crypto.createHmac('sha1', key).update(body).digest('base64')
-}
-
-function rfc3986 (str) {
-  return encodeURIComponent(str)
-    .replace(/!/g,'%21')
-    .replace(/\*/g,'%2A')
-    .replace(/\(/g,'%28')
-    .replace(/\)/g,'%29')
-    .replace(/'/g,'%27')
-    ;
-}
-
-function hmacsign (httpMethod, base_uri, params, consumer_secret, token_secret) {
-  // adapted from https://dev.twitter.com/docs/auth/oauth and 
-  // https://dev.twitter.com/docs/auth/creating-signature
-
-  var querystring = Object.keys(params).sort().map(function(key){
-    // big WTF here with the escape + encoding but it's what twitter wants
-    return escape(rfc3986(key)) + "%3D" + escape(rfc3986(params[key]))
-  }).join('%26')
-
-  var base = [
-    httpMethod ? httpMethod.toUpperCase() : 'GET',
-    rfc3986(base_uri),
-    querystring
-  ].join('&')
-
-  var key = [
-    consumer_secret,
-    token_secret || ''
-  ].map(rfc3986).join('&')
-
-  return sha1(key, base)
-}
-
-exports.hmacsign = hmacsign
-exports.rfc3986 = rfc3986
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,31 +0,0 @@
-{
-  "author": {
-    "name": "Mikeal Rogers",
-    "email": "mikeal.rogers@gmail.com",
-    "url": "http://www.futurealoof.com"
-  },
-  "name": "oauth-sign",
-  "description": "OAuth 1 signing. Formerly a vendor lib in mikeal/request, now a standalone module.",
-  "version": "0.3.0",
-  "repository": {
-    "url": "https://github.com/mikeal/oauth-sign"
-  },
-  "main": "index.js",
-  "dependencies": {},
-  "devDependencies": {},
-  "optionalDependencies": {},
-  "engines": {
-    "node": "*"
-  },
-  "scripts": {
-    "test": "node test.js"
-  },
-  "readme": "oauth-sign\n==========\n\nOAuth 1 signing. Formerly a vendor lib in mikeal/request, now a standalone module. \n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/mikeal/oauth-sign/issues"
-  },
-  "homepage": "https://github.com/mikeal/oauth-sign",
-  "_id": "oauth-sign@0.3.0",
-  "_from": "oauth-sign@~0.3.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/test.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,49 +0,0 @@
-var hmacsign = require('./index').hmacsign
-  , assert = require('assert')
-  , qs = require('querystring')
-  ;
-
-// Tests from Twitter documentation https://dev.twitter.com/docs/auth/oauth
-
-var reqsign = hmacsign('POST', 'https://api.twitter.com/oauth/request_token', 
-  { oauth_callback: 'http://localhost:3005/the_dance/process_callback?service_provider_id=11'
-  , oauth_consumer_key: 'GDdmIQH6jhtmLUypg82g'
-  , oauth_nonce: 'QP70eNmVz8jvdPevU3oJD2AfF7R7odC2XJcn4XlZJqk'
-  , oauth_signature_method: 'HMAC-SHA1'
-  , oauth_timestamp: '1272323042'
-  , oauth_version: '1.0'
-  }, "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98")
-
-console.log(reqsign)
-console.log('8wUi7m5HFQy76nowoCThusfgB+Q=')
-assert.equal(reqsign, '8wUi7m5HFQy76nowoCThusfgB+Q=')
-
-var accsign = hmacsign('POST', 'https://api.twitter.com/oauth/access_token',
-  { oauth_consumer_key: 'GDdmIQH6jhtmLUypg82g'
-  , oauth_nonce: '9zWH6qe0qG7Lc1telCn7FhUbLyVdjEaL3MO5uHxn8'
-  , oauth_signature_method: 'HMAC-SHA1'
-  , oauth_token: '8ldIZyxQeVrFZXFOZH5tAwj6vzJYuLQpl0WUEYtWc'
-  , oauth_timestamp: '1272323047'
-  , oauth_verifier: 'pDNg57prOHapMbhv25RNf75lVRd6JDsni1AJJIDYoTY'
-  , oauth_version: '1.0'
-  }, "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98", "x6qpRnlEmW9JbQn4PQVVeVG8ZLPEx6A0TOebgwcuA")
-  
-console.log(accsign)
-console.log('PUw/dHA4fnlJYM6RhXk5IU/0fCc=')
-assert.equal(accsign, 'PUw/dHA4fnlJYM6RhXk5IU/0fCc=')
-
-var upsign = hmacsign('POST', 'http://api.twitter.com/1/statuses/update.json', 
-  { oauth_consumer_key: "GDdmIQH6jhtmLUypg82g"
-  , oauth_nonce: "oElnnMTQIZvqvlfXM56aBLAf5noGD0AQR3Fmi7Q6Y"
-  , oauth_signature_method: "HMAC-SHA1"
-  , oauth_token: "819797-Jxq8aYUDRmykzVKrgoLhXSq67TEa5ruc4GJC2rWimw"
-  , oauth_timestamp: "1272325550"
-  , oauth_version: "1.0"
-  , status: 'setting up my twitter 私のさえずりを設定する'
-  }, "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98", "J6zix3FfA9LofH0awS24M3HcBYXO5nI1iYe8EfBA")
-
-console.log(upsign)
-console.log('yOahq5m0YjDDjfjxHaXEsW9D+X0=')
-assert.equal(upsign, 'yOahq5m0YjDDjfjxHaXEsW9D+X0=')
-
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/qs/.gitmodules	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-[submodule "support/expresso"]
-	path = support/expresso
-	url = git://github.com/visionmedia/expresso.git
-[submodule "support/should"]
-	path = support/should
-	url = git://github.com/visionmedia/should.js.git
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/qs/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,7 +0,0 @@
-test
-.travis.yml
-benchmark.js
-component.json
-examples.js
-History.md
-Makefile
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/qs/Readme.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,58 +0,0 @@
-# node-querystring
-
-  query string parser for node and the browser supporting nesting, as it was removed from `0.3.x`, so this library provides the previous and commonly desired behaviour (and twice as fast). Used by [express](http://expressjs.com), [connect](http://senchalabs.github.com/connect) and others.
-
-## Installation
-
-    $ npm install qs
-
-## Examples
-
-```js
-var qs = require('qs');
-
-qs.parse('user[name][first]=Tobi&user[email]=tobi@learnboost.com');
-// => { user: { name: { first: 'Tobi' }, email: 'tobi@learnboost.com' } }
-
-qs.stringify({ user: { name: 'Tobi', email: 'tobi@learnboost.com' }})
-// => user[name]=Tobi&user[email]=tobi%40learnboost.com
-```
-
-## Testing
-
-Install dev dependencies:
-
-    $ npm install -d
-
-and execute:
-
-    $ make test
-
-browser:
-
-    $ open test/browser/index.html
-
-## License 
-
-(The MIT License)
-
-Copyright (c) 2010 TJ Holowaychuk &lt;tj@vision-media.ca&gt;
-
-Permission is hereby granted, free of charge, to any person obtaining
-a copy of this software and associated documentation files (the
-'Software'), to deal in the Software without restriction, including
-without limitation the rights to use, copy, modify, merge, publish,
-distribute, sublicense, and/or sell copies of the Software, and to
-permit persons to whom the Software is furnished to do so, subject to
-the following conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
-IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
-CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
-TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
-SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/qs/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,387 +0,0 @@
-/**
- * Object#toString() ref for stringify().
- */
-
-var toString = Object.prototype.toString;
-
-/**
- * Object#hasOwnProperty ref
- */
-
-var hasOwnProperty = Object.prototype.hasOwnProperty;
-
-/**
- * Array#indexOf shim.
- */
-
-var indexOf = typeof Array.prototype.indexOf === 'function'
-  ? function(arr, el) { return arr.indexOf(el); }
-  : function(arr, el) {
-      for (var i = 0; i < arr.length; i++) {
-        if (arr[i] === el) return i;
-      }
-      return -1;
-    };
-
-/**
- * Array.isArray shim.
- */
-
-var isArray = Array.isArray || function(arr) {
-  return toString.call(arr) == '[object Array]';
-};
-
-/**
- * Object.keys shim.
- */
-
-var objectKeys = Object.keys || function(obj) {
-  var ret = [];
-  for (var key in obj) ret.push(key);
-  return ret;
-};
-
-/**
- * Array#forEach shim.
- */
-
-var forEach = typeof Array.prototype.forEach === 'function'
-  ? function(arr, fn) { return arr.forEach(fn); }
-  : function(arr, fn) {
-      for (var i = 0; i < arr.length; i++) fn(arr[i]);
-    };
-
-/**
- * Array#reduce shim.
- */
-
-var reduce = function(arr, fn, initial) {
-  if (typeof arr.reduce === 'function') return arr.reduce(fn, initial);
-  var res = initial;
-  for (var i = 0; i < arr.length; i++) res = fn(res, arr[i]);
-  return res;
-};
-
-/**
- * Create a nullary object if possible
- */
-
-function createObject() {
-  return Object.create
-    ? Object.create(null)
-    : {};
-}
-
-/**
- * Cache non-integer test regexp.
- */
-
-var isint = /^[0-9]+$/;
-
-function promote(parent, key) {
-  if (parent[key].length == 0) return parent[key] = createObject();
-  var t = createObject();
-  for (var i in parent[key]) {
-    if (hasOwnProperty.call(parent[key], i)) {
-      t[i] = parent[key][i];
-    }
-  }
-  parent[key] = t;
-  return t;
-}
-
-function parse(parts, parent, key, val) {
-  var part = parts.shift();
-  // end
-  if (!part) {
-    if (isArray(parent[key])) {
-      parent[key].push(val);
-    } else if ('object' == typeof parent[key]) {
-      parent[key] = val;
-    } else if ('undefined' == typeof parent[key]) {
-      parent[key] = val;
-    } else {
-      parent[key] = [parent[key], val];
-    }
-    // array
-  } else {
-    var obj = parent[key] = parent[key] || [];
-    if (']' == part) {
-      if (isArray(obj)) {
-        if ('' != val) obj.push(val);
-      } else if ('object' == typeof obj) {
-        obj[objectKeys(obj).length] = val;
-      } else {
-        obj = parent[key] = [parent[key], val];
-      }
-      // prop
-    } else if (~indexOf(part, ']')) {
-      part = part.substr(0, part.length - 1);
-      if (!isint.test(part) && isArray(obj)) obj = promote(parent, key);
-      parse(parts, obj, part, val);
-      // key
-    } else {
-      if (!isint.test(part) && isArray(obj)) obj = promote(parent, key);
-      parse(parts, obj, part, val);
-    }
-  }
-}
-
-/**
- * Merge parent key/val pair.
- */
-
-function merge(parent, key, val){
-  if (~indexOf(key, ']')) {
-    var parts = key.split('[')
-      , len = parts.length
-      , last = len - 1;
-    parse(parts, parent, 'base', val);
-    // optimize
-  } else {
-    if (!isint.test(key) && isArray(parent.base)) {
-      var t = createObject();
-      for (var k in parent.base) t[k] = parent.base[k];
-      parent.base = t;
-    }
-    set(parent.base, key, val);
-  }
-
-  return parent;
-}
-
-/**
- * Compact sparse arrays.
- */
-
-function compact(obj) {
-  if ('object' != typeof obj) return obj;
-
-  if (isArray(obj)) {
-    var ret = [];
-
-    for (var i in obj) {
-      if (hasOwnProperty.call(obj, i)) {
-        ret.push(obj[i]);
-      }
-    }
-
-    return ret;
-  }
-
-  for (var key in obj) {
-    obj[key] = compact(obj[key]);
-  }
-
-  return obj;
-}
-
-/**
- * Restore Object.prototype.
- * see pull-request #58
- */
-
-function restoreProto(obj) {
-  if (!Object.create) return obj;
-  if (isArray(obj)) return obj;
-  if (obj && 'object' != typeof obj) return obj;
-
-  for (var key in obj) {
-    if (hasOwnProperty.call(obj, key)) {
-      obj[key] = restoreProto(obj[key]);
-    }
-  }
-
-  obj.__proto__ = Object.prototype;
-  return obj;
-}
-
-/**
- * Parse the given obj.
- */
-
-function parseObject(obj){
-  var ret = { base: {} };
-
-  forEach(objectKeys(obj), function(name){
-    merge(ret, name, obj[name]);
-  });
-
-  return compact(ret.base);
-}
-
-/**
- * Parse the given str.
- */
-
-function parseString(str){
-  var ret = reduce(String(str).split('&'), function(ret, pair){
-    var eql = indexOf(pair, '=')
-      , brace = lastBraceInKey(pair)
-      , key = pair.substr(0, brace || eql)
-      , val = pair.substr(brace || eql, pair.length)
-      , val = val.substr(indexOf(val, '=') + 1, val.length);
-
-    // ?foo
-    if ('' == key) key = pair, val = '';
-    if ('' == key) return ret;
-
-    return merge(ret, decode(key), decode(val));
-  }, { base: createObject() }).base;
-
-  return restoreProto(compact(ret));
-}
-
-/**
- * Parse the given query `str` or `obj`, returning an object.
- *
- * @param {String} str | {Object} obj
- * @return {Object}
- * @api public
- */
-
-exports.parse = function(str){
-  if (null == str || '' == str) return {};
-  return 'object' == typeof str
-    ? parseObject(str)
-    : parseString(str);
-};
-
-/**
- * Turn the given `obj` into a query string
- *
- * @param {Object} obj
- * @return {String}
- * @api public
- */
-
-var stringify = exports.stringify = function(obj, prefix) {
-  if (isArray(obj)) {
-    return stringifyArray(obj, prefix);
-  } else if ('[object Object]' == toString.call(obj)) {
-    return stringifyObject(obj, prefix);
-  } else if ('string' == typeof obj) {
-    return stringifyString(obj, prefix);
-  } else {
-    return prefix + '=' + encodeURIComponent(String(obj));
-  }
-};
-
-/**
- * Stringify the given `str`.
- *
- * @param {String} str
- * @param {String} prefix
- * @return {String}
- * @api private
- */
-
-function stringifyString(str, prefix) {
-  if (!prefix) throw new TypeError('stringify expects an object');
-  return prefix + '=' + encodeURIComponent(str);
-}
-
-/**
- * Stringify the given `arr`.
- *
- * @param {Array} arr
- * @param {String} prefix
- * @return {String}
- * @api private
- */
-
-function stringifyArray(arr, prefix) {
-  var ret = [];
-  if (!prefix) throw new TypeError('stringify expects an object');
-  for (var i = 0; i < arr.length; i++) {
-    ret.push(stringify(arr[i], prefix + '[' + i + ']'));
-  }
-  return ret.join('&');
-}
-
-/**
- * Stringify the given `obj`.
- *
- * @param {Object} obj
- * @param {String} prefix
- * @return {String}
- * @api private
- */
-
-function stringifyObject(obj, prefix) {
-  var ret = []
-    , keys = objectKeys(obj)
-    , key;
-
-  for (var i = 0, len = keys.length; i < len; ++i) {
-    key = keys[i];
-    if ('' == key) continue;
-    if (null == obj[key]) {
-      ret.push(encodeURIComponent(key) + '=');
-    } else {
-      ret.push(stringify(obj[key], prefix
-        ? prefix + '[' + encodeURIComponent(key) + ']'
-        : encodeURIComponent(key)));
-    }
-  }
-
-  return ret.join('&');
-}
-
-/**
- * Set `obj`'s `key` to `val` respecting
- * the weird and wonderful syntax of a qs,
- * where "foo=bar&foo=baz" becomes an array.
- *
- * @param {Object} obj
- * @param {String} key
- * @param {String} val
- * @api private
- */
-
-function set(obj, key, val) {
-  var v = obj[key];
-  if (undefined === v) {
-    obj[key] = val;
-  } else if (isArray(v)) {
-    v.push(val);
-  } else {
-    obj[key] = [v, val];
-  }
-}
-
-/**
- * Locate last brace in `str` within the key.
- *
- * @param {String} str
- * @return {Number}
- * @api private
- */
-
-function lastBraceInKey(str) {
-  var len = str.length
-    , brace
-    , c;
-  for (var i = 0; i < len; ++i) {
-    c = str[i];
-    if (']' == c) brace = false;
-    if ('[' == c) brace = true;
-    if ('=' == c && !brace) return i;
-  }
-}
-
-/**
- * Decode `str`.
- *
- * @param {String} str
- * @return {String}
- * @api private
- */
-
-function decode(str) {
-  try {
-    return decodeURIComponent(str.replace(/\+/g, ' '));
-  } catch (err) {
-    return str;
-  }
-}
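The core behaviour of the deleted `qs/index.js` above — parsing bracketed keys like `user[name][first]` into nested objects — can be sketched in a few lines. This is a minimal illustration only: it omits the module's array handling, sparse-array compaction, prototype restoration, and pre-ES5 shims.

```javascript
// Minimal sketch of the nested-key parsing the removed qs module provided.
// Not the full implementation: no array syntax (a[]=1), no __proto__ repair.
function parseNested(str) {
  var out = {};
  String(str).split('&').forEach(function (pair) {
    var eq = pair.indexOf('=');
    if (eq < 0) return;
    var key = decodeURIComponent(pair.slice(0, eq).replace(/\+/g, ' '));
    var val = decodeURIComponent(pair.slice(eq + 1).replace(/\+/g, ' '));
    // Split "user[name][first]" into the path ['user', 'name', 'first'].
    var path = key.split(/[\[\]]+/).filter(Boolean);
    var node = out;
    for (var i = 0; i < path.length - 1; i++) {
      node = node[path[i]] = node[path[i]] || {};
    }
    node[path[path.length - 1]] = val;
  });
  return out;
}
```

This reproduces the example from the deleted `Readme.md`: `user[name][first]=Tobi&user[email]=tobi@learnboost.com` parses to `{ user: { name: { first: 'Tobi' }, email: 'tobi@learnboost.com' } }`.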
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/qs/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,38 +0,0 @@
-{
-  "name": "qs",
-  "description": "querystring parser",
-  "version": "0.6.5",
-  "keywords": [
-    "query string",
-    "parser",
-    "component"
-  ],
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/visionmedia/node-querystring.git"
-  },
-  "devDependencies": {
-    "mocha": "*",
-    "expect.js": "*"
-  },
-  "scripts": {
-    "test": "make test"
-  },
-  "author": {
-    "name": "TJ Holowaychuk",
-    "email": "tj@vision-media.ca",
-    "url": "http://tjholowaychuk.com"
-  },
-  "main": "index",
-  "engines": {
-    "node": "*"
-  },
-  "readme": "# node-querystring\n\n  query string parser for node and the browser supporting nesting, as it was removed from `0.3.x`, so this library provides the previous and commonly desired behaviour (and twice as fast). Used by [express](http://expressjs.com), [connect](http://senchalabs.github.com/connect) and others.\n\n## Installation\n\n    $ npm install qs\n\n## Examples\n\n```js\nvar qs = require('qs');\n\nqs.parse('user[name][first]=Tobi&user[email]=tobi@learnboost.com');\n// => { user: { name: { first: 'Tobi' }, email: 'tobi@learnboost.com' } }\n\nqs.stringify({ user: { name: 'Tobi', email: 'tobi@learnboost.com' }})\n// => user[name]=Tobi&user[email]=tobi%40learnboost.com\n```\n\n## Testing\n\nInstall dev dependencies:\n\n    $ npm install -d\n\nand execute:\n\n    $ make test\n\nbrowser:\n\n    $ open test/browser/index.html\n\n## License \n\n(The MIT License)\n\nCopyright (c) 2010 TJ Holowaychuk &lt;tj@vision-media.ca&gt;\n\nPermission is hereby granted, free of charge, to any person obtaining\na copy of this software and associated documentation files (the\n'Software'), to deal in the Software without restriction, including\nwithout limitation the rights to use, copy, modify, merge, publish,\ndistribute, sublicense, and/or sell copies of the Software, and to\npermit persons to whom the Software is furnished to do so, subject to\nthe following conditions:\n\nThe above copyright notice and this permission notice shall be\nincluded in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,\nEXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\nMERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.\nIN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY\nCLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,\nTORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\nSOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.",
-  "readmeFilename": "Readme.md",
-  "bugs": {
-    "url": "https://github.com/visionmedia/node-querystring/issues"
-  },
-  "homepage": "https://github.com/visionmedia/node-querystring",
-  "_id": "qs@0.6.5",
-  "_from": "qs@~0.6.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/tunnel-agent/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,55 +0,0 @@
-Apache License
-
-Version 2.0, January 2004
-
-http://www.apache.org/licenses/
-
-TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-1. Definitions.
-
-"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
-
-"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
-
-"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
-
-"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.
-
-"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
-
-"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
-
-"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
-
-"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
-
-"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."
-
-"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
-
-2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
-
-3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
-
-4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
-
-You must give any other recipients of the Work or Derivative Works a copy of this License; and
-
-You must cause any modified files to carry prominent notices stating that You changed the files; and
-
-You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
-
-If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
-
-5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
-
-6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
-
-7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
-
-8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
-
-9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
-
-END OF TERMS AND CONDITIONS
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/tunnel-agent/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-tunnel-agent
-============
-
-HTTP proxy tunneling agent. Formerly part of mikeal/request, now a standalone module.
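The deleted README above is terse, so here is a standalone sketch of the CONNECT preamble such a tunneling agent sends to an HTTP proxy before relaying traffic. Nothing below is part of tunnel-agent itself; the function name and sample values are illustrative only:

```javascript
// Minimal sketch (NOT part of tunnel-agent) of the request preamble a
// tunneling agent sends to an HTTP proxy before relaying traffic.
// The proxyAuth credentials are encoded with the same Basic scheme
// the deleted module uses for its Proxy-Authorization header.
function connectPreamble(host, port, proxyAuth) {
  var lines = ['CONNECT ' + host + ':' + port + ' HTTP/1.1',
               'Host: ' + host + ':' + port]
  if (proxyAuth) {
    lines.push('Proxy-Authorization: Basic ' +
               Buffer.from(proxyAuth).toString('base64'))
  }
  // A blank line terminates the header block.
  return lines.join('\r\n') + '\r\n\r\n'
}

console.log(connectPreamble('example.com', 443, 'user:pass'))
```

The module itself wires this handshake into `http.request`/`https.request` via its four `httpOverHttp`-style factories; this sketch only shows the wire format.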
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/tunnel-agent/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,227 +0,0 @@
-'use strict'
-
-var net = require('net')
-  , tls = require('tls')
-  , http = require('http')
-  , https = require('https')
-  , events = require('events')
-  , assert = require('assert')
-  , util = require('util')
-  ;
-
-exports.httpOverHttp = httpOverHttp
-exports.httpsOverHttp = httpsOverHttp
-exports.httpOverHttps = httpOverHttps
-exports.httpsOverHttps = httpsOverHttps
-
-
-function httpOverHttp(options) {
-  var agent = new TunnelingAgent(options)
-  agent.request = http.request
-  return agent
-}
-
-function httpsOverHttp(options) {
-  var agent = new TunnelingAgent(options)
-  agent.request = http.request
-  agent.createSocket = createSecureSocket
-  return agent
-}
-
-function httpOverHttps(options) {
-  var agent = new TunnelingAgent(options)
-  agent.request = https.request
-  return agent
-}
-
-function httpsOverHttps(options) {
-  var agent = new TunnelingAgent(options)
-  agent.request = https.request
-  agent.createSocket = createSecureSocket
-  return agent
-}
-
-
-function TunnelingAgent(options) {
-  var self = this
-  self.options = options || {}
-  self.proxyOptions = self.options.proxy || {}
-  self.maxSockets = self.options.maxSockets || http.Agent.defaultMaxSockets
-  self.requests = []
-  self.sockets = []
-
-  self.on('free', function onFree(socket, host, port) {
-    for (var i = 0, len = self.requests.length; i < len; ++i) {
-      var pending = self.requests[i]
-      if (pending.host === host && pending.port === port) {
-        // Detect a request to the same origin server and
-        // reuse the existing connection.
-        self.requests.splice(i, 1)
-        pending.request.onSocket(socket)
-        return
-      }
-    }
-    socket.destroy()
-    self.removeSocket(socket)
-  })
-}
-util.inherits(TunnelingAgent, events.EventEmitter)
-
-TunnelingAgent.prototype.addRequest = function addRequest(req, host, port) {
-  var self = this
-
-  if (self.sockets.length >= this.maxSockets) {
-    // We are over limit so we'll add it to the queue.
-    self.requests.push({host: host, port: port, request: req})
-    return
-  }
-
-  // If we are under maxSockets create a new one.
-  self.createSocket({host: host, port: port, request: req}, function(socket) {
-    socket.on('free', onFree)
-    socket.on('close', onCloseOrRemove)
-    socket.on('agentRemove', onCloseOrRemove)
-    req.onSocket(socket)
-
-    function onFree() {
-      self.emit('free', socket, host, port)
-    }
-
-    function onCloseOrRemove(err) {
-      self.removeSocket(socket)
-      socket.removeListener('free', onFree)
-      socket.removeListener('close', onCloseOrRemove)
-      socket.removeListener('agentRemove', onCloseOrRemove)
-    }
-  })
-}
-
-TunnelingAgent.prototype.createSocket = function createSocket(options, cb) {
-  var self = this
-  var placeholder = {}
-  self.sockets.push(placeholder)
-
-  var connectOptions = mergeOptions({}, self.proxyOptions, 
-    { method: 'CONNECT'
-    , path: options.host + ':' + options.port
-    , agent: false
-    }
-  )
-  if (connectOptions.proxyAuth) {
-    connectOptions.headers = connectOptions.headers || {}
-    connectOptions.headers['Proxy-Authorization'] = 'Basic ' +
-        new Buffer(connectOptions.proxyAuth).toString('base64')
-  }
-
-  debug('making CONNECT request')
-  var connectReq = self.request(connectOptions)
-  connectReq.useChunkedEncodingByDefault = false // for v0.6
-  connectReq.once('response', onResponse) // for v0.6
-  connectReq.once('upgrade', onUpgrade)   // for v0.6
-  connectReq.once('connect', onConnect)   // for v0.7 or later
-  connectReq.once('error', onError)
-  connectReq.end()
-
-  function onResponse(res) {
-    // Very hacky. This is necessary to avoid http-parser leaks.
-    res.upgrade = true
-  }
-
-  function onUpgrade(res, socket, head) {
-    // Hacky.
-    process.nextTick(function() {
-      onConnect(res, socket, head)
-    })
-  }
-
-  function onConnect(res, socket, head) {
-    connectReq.removeAllListeners()
-    socket.removeAllListeners()
-
-    if (res.statusCode === 200) {
-      assert.equal(head.length, 0)
-      debug('tunneling connection has been established')
-      self.sockets[self.sockets.indexOf(placeholder)] = socket
-      cb(socket)
-    } else {
-      debug('tunneling socket could not be established, statusCode=%d', res.statusCode)
-      var error = new Error('tunneling socket could not be established, ' + 'statusCode=' + res.statusCode)
-      error.code = 'ECONNRESET'
-      options.request.emit('error', error)
-      self.removeSocket(placeholder)
-    }
-  }
-
-  function onError(cause) {
-    connectReq.removeAllListeners()
-
-    debug('tunneling socket could not be established, cause=%s\n', cause.message, cause.stack)
-    var error = new Error('tunneling socket could not be established, ' + 'cause=' + cause.message)
-    error.code = 'ECONNRESET'
-    options.request.emit('error', error)
-    self.removeSocket(placeholder)
-  }
-}
-
-TunnelingAgent.prototype.removeSocket = function removeSocket(socket) {
-  var pos = this.sockets.indexOf(socket)
-  if (pos === -1) return
-  
-  this.sockets.splice(pos, 1)
-
-  var pending = this.requests.shift()
-  if (pending) {
-    // If there are pending requests and a socket gets closed, create a new
-    // socket to take its place in the pool.
-    this.createSocket(pending, function(socket) {
-      pending.request.onSocket(socket)
-    })
-  }
-}
-
-function createSecureSocket(options, cb) {
-  var self = this
-  TunnelingAgent.prototype.createSocket.call(self, options, function(socket) {
-    // 0 is dummy port for v0.6
-    var secureSocket = tls.connect(0, mergeOptions({}, self.options, 
-      { servername: options.host
-      , socket: socket
-      }
-    ))
-    cb(secureSocket)
-  })
-}
-
-
-function mergeOptions(target) {
-  for (var i = 1, len = arguments.length; i < len; ++i) {
-    var overrides = arguments[i]
-    if (typeof overrides === 'object') {
-      var keys = Object.keys(overrides)
-      for (var j = 0, keyLen = keys.length; j < keyLen; ++j) {
-        var k = keys[j]
-        if (overrides[k] !== undefined) {
-          target[k] = overrides[k]
-        }
-      }
-    }
-  }
-  return target
-}
-
-
-var debug
-if (process.env.NODE_DEBUG && /\btunnel\b/.test(process.env.NODE_DEBUG)) {
-  debug = function() {
-    var args = Array.prototype.slice.call(arguments)
-    if (typeof args[0] === 'string') {
-      args[0] = 'TUNNEL: ' + args[0]
-    } else {
-      args.unshift('TUNNEL:')
-    }
-    console.error.apply(console, args)
-  }
-} else {
-  debug = function() {}
-}
-exports.debug = debug // for test
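The `mergeOptions` helper in the file above copies keys from later arguments over earlier ones but skips `undefined` values, so an explicit `undefined` cannot clobber an existing setting. A standalone restatement of that behaviour (the sample option objects are illustrative, not from the module):

```javascript
// Standalone copy of the merge semantics used above: later arguments
// win, but keys whose value is undefined are ignored. A null check is
// added here since typeof null === 'object'.
function mergeOptions(target) {
  for (var i = 1; i < arguments.length; ++i) {
    var overrides = arguments[i]
    if (typeof overrides === 'object' && overrides !== null) {
      Object.keys(overrides).forEach(function (k) {
        if (overrides[k] !== undefined) target[k] = overrides[k]
      })
    }
  }
  return target
}

var merged = mergeOptions({}, { host: 'proxy.local', port: 8080 },
                              { port: 3128, agent: undefined })
// port is overridden by the later argument; agent is never set.
console.log(merged)
```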
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/node_modules/tunnel-agent/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-{
-  "author": {
-    "name": "Mikeal Rogers",
-    "email": "mikeal.rogers@gmail.com",
-    "url": "http://www.futurealoof.com"
-  },
-  "name": "tunnel-agent",
-  "description": "HTTP proxy tunneling agent. Formerly part of mikeal/request, now a standalone module.",
-  "version": "0.3.0",
-  "repository": {
-    "url": "https://github.com/mikeal/tunnel-agent"
-  },
-  "main": "index.js",
-  "dependencies": {},
-  "devDependencies": {},
-  "optionalDependencies": {},
-  "engines": {
-    "node": "*"
-  },
-  "readme": "tunnel-agent\n============\n\nHTTP proxy tunneling agent. Formerly part of mikeal/request, now a standalone module.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/mikeal/tunnel-agent/issues"
-  },
-  "homepage": "https://github.com/mikeal/tunnel-agent",
-  "_id": "tunnel-agent@0.3.0",
-  "_from": "tunnel-agent@~0.3.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,48 +0,0 @@
-{
-  "name": "request",
-  "description": "Simplified HTTP request client.",
-  "tags": [
-    "http",
-    "simple",
-    "util",
-    "utility"
-  ],
-  "version": "2.27.0",
-  "author": {
-    "name": "Mikeal Rogers",
-    "email": "mikeal.rogers@gmail.com"
-  },
-  "repository": {
-    "type": "git",
-    "url": "http://github.com/mikeal/request.git"
-  },
-  "bugs": {
-    "url": "http://github.com/mikeal/request/issues"
-  },
-  "engines": [
-    "node >= 0.8.0"
-  ],
-  "main": "index.js",
-  "dependencies": {
-    "qs": "~0.6.0",
-    "json-stringify-safe": "~5.0.0",
-    "forever-agent": "~0.5.0",
-    "tunnel-agent": "~0.3.0",
-    "http-signature": "~0.10.0",
-    "hawk": "~1.0.0",
-    "aws-sign": "~0.3.0",
-    "oauth-sign": "~0.3.0",
-    "cookie-jar": "~0.3.0",
-    "node-uuid": "~1.4.0",
-    "mime": "~1.2.9",
-    "form-data": "~0.1.0"
-  },
-  "scripts": {
-    "test": "node tests/run.js"
-  },
-  "readme": "# Request -- Simplified HTTP client\n\n[![NPM](https://nodei.co/npm/request.png)](https://nodei.co/npm/request/)\n\n## Super simple to use\n\nRequest is designed to be the simplest way possible to make http calls. It supports HTTPS and follows redirects by default.\n\n```javascript\nvar request = require('request');\nrequest('http://www.google.com', function (error, response, body) {\n  if (!error && response.statusCode == 200) {\n    console.log(body) // Print the google web page.\n  }\n})\n```\n\n## Streaming\n\nYou can stream any response to a file stream.\n\n```javascript\nrequest('http://google.com/doodle.png').pipe(fs.createWriteStream('doodle.png'))\n```\n\nYou can also stream a file to a PUT or POST request. This method will also check the file extension against a mapping of file extensions to content-types, in this case `application/json`, and use the proper content-type in the PUT request if one is not already provided in the headers.\n\n```javascript\nfs.createReadStream('file.json').pipe(request.put('http://mysite.com/obj.json'))\n```\n\nRequest can also pipe to itself. When doing so the content-type and content-length will be preserved in the PUT headers.\n\n```javascript\nrequest.get('http://google.com/img.png').pipe(request.put('http://mysite.com/img.png'))\n```\n\nNow let's get fancy.\n\n```javascript\nhttp.createServer(function (req, resp) {\n  if (req.url === '/doodle.png') {\n    if (req.method === 'PUT') {\n      req.pipe(request.put('http://mysite.com/doodle.png'))\n    } else if (req.method === 'GET' || req.method === 'HEAD') {\n      request.get('http://mysite.com/doodle.png').pipe(resp)\n    }\n  }\n})\n```\n\nYou can also pipe() from a http.ServerRequest instance and to a http.ServerResponse instance. The HTTP method and headers will be sent as well as the entity-body data. Which means that, if you don't really care about security, you can do:\n\n```javascript\nhttp.createServer(function (req, resp) {\n  if (req.url === '/doodle.png') {\n    var x = request('http://mysite.com/doodle.png')\n    req.pipe(x)\n    x.pipe(resp)\n  }\n})\n```\n\nAnd since pipe() returns the destination stream in node 0.5.x you can do one line proxying :)\n\n```javascript\nreq.pipe(request('http://mysite.com/doodle.png')).pipe(resp)\n```\n\nAlso, none of this new functionality conflicts with requests previous features, it just expands them.\n\n```javascript\nvar r = request.defaults({'proxy':'http://localproxy.com'})\n\nhttp.createServer(function (req, resp) {\n  if (req.url === '/doodle.png') {\n    r.get('http://google.com/doodle.png').pipe(resp)\n  }\n})\n```\nYou can still use intermediate proxies, the requests will still follow HTTP forwards, etc.\n\n## Forms\n\n`request` supports `application/x-www-form-urlencoded` and `multipart/form-data` form uploads. For `multipart/related` refer to the `multipart` API.\n\nUrl encoded forms are simple\n\n```javascript\nrequest.post('http://service.com/upload', {form:{key:'value'}})\n// or\nrequest.post('http://service.com/upload').form({key:'value'})\n```\n\nFor `multipart/form-data` we use the [form-data](https://github.com/felixge/node-form-data) library by [@felixge](https://github.com/felixge). You don't need to worry about piping the form object or setting the headers, `request` will handle that for you.\n\n```javascript\nvar r = request.post('http://service.com/upload')\nvar form = r.form()\nform.append('my_field', 'my_value')\nform.append('my_buffer', new Buffer([1, 2, 3]))\nform.append('my_file', fs.createReadStream(path.join(__dirname, 'doodle.png'))\nform.append('remote_file', request('http://google.com/doodle.png'))\n```\n\n## HTTP Authentication\n\n```javascript\nrequest.get('http://some.server.com/').auth('username', 'password', false);\n// or\nrequest.get('http://some.server.com/', {\n  'auth': {\n    'user': 'username',\n    'pass': 'password',\n    'sendImmediately': false\n  }\n});\n```\n\nIf passed as an option, `auth` should be a hash containing values `user` || `username`, `password` || `pass`, and `sendImmediately` (optional).  The method form takes parameters `auth(username, password, sendImmediately)`.\n\n`sendImmediately` defaults to true, which will cause a basic authentication header to be sent.  If `sendImmediately` is `false`, then `request` will retry with a proper authentication header after receiving a 401 response from the server (which must contain a `WWW-Authenticate` header indicating the required authentication method).\n\nDigest authentication is supported, but it only works with `sendImmediately` set to `false` (otherwise `request` will send basic authentication on the initial request, which will probably cause the request to fail).\n\n## OAuth Signing\n\n```javascript\n// Twitter OAuth\nvar qs = require('querystring')\n  , oauth =\n    { callback: 'http://mysite.com/callback/'\n    , consumer_key: CONSUMER_KEY\n    , consumer_secret: CONSUMER_SECRET\n    }\n  , url = 'https://api.twitter.com/oauth/request_token'\n  ;\nrequest.post({url:url, oauth:oauth}, function (e, r, body) {\n  // Ideally, you would take the body in the response\n  // and construct a URL that a user clicks on (like a sign in button).\n  // The verifier is only available in the response after a user has\n  // verified with twitter that they are authorizing your app.\n  var access_token = qs.parse(body)\n    , oauth =\n      { consumer_key: CONSUMER_KEY\n      , consumer_secret: CONSUMER_SECRET\n      , token: access_token.oauth_token\n      , verifier: access_token.oauth_verifier\n      }\n    , url = 'https://api.twitter.com/oauth/access_token'\n    ;\n  request.post({url:url, oauth:oauth}, function (e, r, body) {\n    var perm_token = qs.parse(body)\n      , oauth =\n        { consumer_key: CONSUMER_KEY\n        , consumer_secret: CONSUMER_SECRET\n        , token: perm_token.oauth_token\n        , token_secret: perm_token.oauth_token_secret\n        }\n      , url = 'https://api.twitter.com/1/users/show.json?'\n      , params =\n        { screen_name: perm_token.screen_name\n        , user_id: perm_token.user_id\n        }\n      ;\n    url += qs.stringify(params)\n    request.get({url:url, oauth:oauth, json:true}, function (e, r, user) {\n      console.log(user)\n    })\n  })\n})\n```\n\n\n\n### request(options, callback)\n\nThe first argument can be either a url or an options object. The only required option is uri, all others are optional.\n\n* `uri` || `url` - fully qualified uri or a parsed url object from url.parse()\n* `qs` - object containing querystring values to be appended to the uri\n* `method` - http method, defaults to GET\n* `headers` - http headers, defaults to {}\n* `body` - entity body for PATCH, POST and PUT requests. Must be buffer or string.\n* `form` - when passed an object this will set `body` but to a querystring representation of value and adds `Content-type: application/x-www-form-urlencoded; charset=utf-8` header. When passed no option a FormData instance is returned that will be piped to request.\n* `auth` - A hash containing values `user` || `username`, `password` || `pass`, and `sendImmediately` (optional).  See documentation above.\n* `json` - sets `body` but to JSON representation of value and adds `Content-type: application/json` header.  Additionally, parses the response body as json.\n* `multipart` - (experimental) array of objects which contains their own headers and `body` attribute. Sends `multipart/related` request. See example below.\n* `followRedirect` - follow HTTP 3xx responses as redirects. defaults to true.\n* `followAllRedirects` - follow non-GET HTTP 3xx responses as redirects. defaults to false.\n* `maxRedirects` - the maximum number of redirects to follow, defaults to 10.\n* `encoding` - Encoding to be used on `setEncoding` of response data. If set to `null`, the body is returned as a Buffer.\n* `pool` - A hash object containing the agents for these requests. If omitted this request will use the global pool which is set to node's default maxSockets.\n* `pool.maxSockets` - Integer containing the maximum amount of sockets in the pool.\n* `timeout` - Integer containing the number of milliseconds to wait for a request to respond before aborting the request\n* `proxy` - An HTTP proxy to be used. Support proxy Auth with Basic Auth the same way it's supported with the `url` parameter by embedding the auth info in the uri.\n* `oauth` - Options for OAuth HMAC-SHA1 signing, see documentation above.\n* `hawk` - Options for [Hawk signing](https://github.com/hueniverse/hawk). The `credentials` key must contain the necessary signing info, [see hawk docs for details](https://github.com/hueniverse/hawk#usage-example).\n* `strictSSL` - Set to `true` to require that SSL certificates be valid. Note: to use your own certificate authority, you need to specify an agent that was created with that ca as an option.\n* `jar` - Set to `true` if you want cookies to be remembered for future use, or define your custom cookie jar (see examples section)\n* `aws` - object containing aws signing information, should have the properties `key` and `secret` as well as `bucket` unless you're specifying your bucket as part of the path, or you are making a request that doesn't use a bucket (i.e. GET Services)\n* `httpSignature` - Options for the [HTTP Signature Scheme](https://github.com/joyent/node-http-signature/blob/master/http_signing.md) using [Joyent's library](https://github.com/joyent/node-http-signature). The `keyId` and `key` properties must be specified. See the docs for other options.\n* `localAddress` - Local interface to bind for network connections.\n\n\nThe callback argument gets 3 arguments. The first is an error when applicable (usually from the http.Client option not the http.ClientRequest object). The second is an http.ClientResponse object. The third is the response body String or Buffer.\n\n## Convenience methods\n\nThere are also shorthand methods for different HTTP METHODs and some other conveniences.\n\n### request.defaults(options)\n\nThis method returns a wrapper around the normal request API that defaults to whatever options you pass in to it.\n\n### request.put\n\nSame as request() but defaults to `method: \"PUT\"`.\n\n```javascript\nrequest.put(url)\n```\n\n### request.patch\n\nSame as request() but defaults to `method: \"PATCH\"`.\n\n```javascript\nrequest.patch(url)\n```\n\n### request.post\n\nSame as request() but defaults to `method: \"POST\"`.\n\n```javascript\nrequest.post(url)\n```\n\n### request.head\n\nSame as request() but defaults to `method: \"HEAD\"`.\n\n```javascript\nrequest.head(url)\n```\n\n### request.del\n\nSame as request() but defaults to `method: \"DELETE\"`.\n\n```javascript\nrequest.del(url)\n```\n\n### request.get\n\nAlias to normal request method for uniformity.\n\n```javascript\nrequest.get(url)\n```\n### request.cookie\n\nFunction that creates a new cookie.\n\n```javascript\nrequest.cookie('cookie_string_here')\n```\n### request.jar\n\nFunction that creates a new cookie jar.\n\n```javascript\nrequest.jar()\n```\n\n\n## Examples:\n\n```javascript\n  var request = require('request')\n    , rand = Math.floor(Math.random()*100000000).toString()\n    ;\n  request(\n    { method: 'PUT'\n    , uri: 'http://mikeal.iriscouch.com/testjs/' + rand\n    , multipart:\n      [ { 'content-type': 'application/json'\n        ,  body: JSON.stringify({foo: 'bar', _attachments: {'message.txt': {follows: true, length: 18, 'content_type': 'text/plain' }}})\n        }\n      , { body: 'I am an attachment' }\n      ]\n    }\n  , function (error, response, body) {\n      if(response.statusCode == 201){\n        console.log('document saved as: http://mikeal.iriscouch.com/testjs/'+ rand)\n      } else {\n        console.log('error: '+ response.statusCode)\n        console.log(body)\n      }\n    }\n  )\n```\nCookies are disabled by default (else, they would be used in subsequent requests). To enable cookies set jar to true (either in defaults or in the options sent).\n\n```javascript\nvar request = request.defaults({jar: true})\nrequest('http://www.google.com', function () {\n  request('http://images.google.com')\n})\n```\n\nIf you to use a custom cookie jar (instead of letting request use its own global cookie jar) you do so by setting the jar default or by specifying it as an option:\n\n```javascript\nvar j = request.jar()\nvar request = request.defaults({jar:j})\nrequest('http://www.google.com', function () {\n  request('http://images.google.com')\n})\n```\nOR\n\n```javascript\nvar j = request.jar()\nvar cookie = request.cookie('your_cookie_here')\nj.add(cookie)\nrequest({url: 'http://www.google.com', jar: j}, function () {\n  request('http://images.google.com')\n})\n```\n",
-  "readmeFilename": "README.md",
-  "homepage": "https://github.com/mikeal/request",
-  "_id": "request@2.27.0",
-  "_from": "request@~2.27.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/request.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1235 +0,0 @@
-var http = require('http')
-  , https = false
-  , tls = false
-  , url = require('url')
-  , util = require('util')
-  , stream = require('stream')
-  , qs = require('qs')
-  , querystring = require('querystring')
-  , crypto = require('crypto')
-
-  , oauth = require('oauth-sign')
-  , hawk = require('hawk')
-  , aws = require('aws-sign')
-  , httpSignature = require('http-signature')
-  , uuid = require('node-uuid')
-  , mime = require('mime')
-  , tunnel = require('tunnel-agent')
-  , _safeStringify = require('json-stringify-safe')
-
-  , ForeverAgent = require('forever-agent')
-  , FormData = require('form-data')
-
-  , Cookie = require('cookie-jar')
-  , CookieJar = Cookie.Jar
-  , cookieJar = new CookieJar
-
-  , copy = require('./lib/copy')
-  , debug = require('./lib/debug')
-  , getSafe = require('./lib/getSafe')
-  ;
-
-function safeStringify (obj) {
-  var ret
-  try { ret = JSON.stringify(obj) }
-  catch (e) { ret = _safeStringify(obj) }
-  return ret
-}
-
-var globalPool = {}
-var isUrl = /^https?:/i
-
-try {
-  https = require('https')
-} catch (e) {}
-
-try {
-  tls = require('tls')
-} catch (e) {}
-
-
-
-// Hacky fix for pre-0.4.4 https
-if (https && !https.Agent) {
-  https.Agent = function (options) {
-    http.Agent.call(this, options)
-  }
-  util.inherits(https.Agent, http.Agent)
-  https.Agent.prototype._getConnection = function (host, port, cb) {
-    var s = tls.connect(port, host, this.options, function () {
-      // do other checks here?
-      if (cb) cb()
-    })
-    return s
-  }
-}
-
-function isReadStream (rs) {
-  if (rs.readable && rs.path && rs.mode) {
-    return true
-  }
-}
-
-function toBase64 (str) {
-  return (new Buffer(str || "", "ascii")).toString("base64")
-}
-
-function md5 (str) {
-  return crypto.createHash('md5').update(str).digest('hex')
-}
-
-function Request (options) {
-  stream.Stream.call(this)
-  this.readable = true
-  this.writable = true
-
-  if (typeof options === 'string') {
-    options = {uri:options}
-  }
-
-  var reserved = Object.keys(Request.prototype)
-  for (var i in options) {
-    if (reserved.indexOf(i) === -1) {
-      this[i] = options[i]
-    } else {
-      if (typeof options[i] === 'function') {
-        delete options[i]
-      }
-    }
-  }
-
-  if (options.method) {
-    this.explicitMethod = true
-  }
-
-  this.init(options)
-}
-util.inherits(Request, stream.Stream)
-Request.prototype.init = function (options) {
-  // init() contains all the code to setup the request object.
-  // the actual outgoing request is not started until start() is called
-  // this function is called from both the constructor and on redirect.
-  var self = this
-  if (!options) options = {}
-
-  if (!self.method) self.method = options.method || 'GET'
-  self.localAddress = options.localAddress
-
-  debug(options)
-  if (!self.pool && self.pool !== false) self.pool = globalPool
-  self.dests = self.dests || []
-  self.__isRequestRequest = true
-
-  // Protect against double callback
-  if (!self._callback && self.callback) {
-    self._callback = self.callback
-    self.callback = function () {
-      if (self._callbackCalled) return // Print a warning maybe?
-      self._callbackCalled = true
-      self._callback.apply(self, arguments)
-    }
-    self.on('error', self.callback.bind())
-    self.on('complete', self.callback.bind(self, null))
-  }
-
-  if (self.url && !self.uri) {
-    // People use this property instead all the time so why not just support it.
-    self.uri = self.url
-    delete self.url
-  }
-
-  if (!self.uri) {
-    // this will throw if unhandled but is handleable when in a redirect
-    return self.emit('error', new Error("options.uri is a required argument"))
-  } else {
-    if (typeof self.uri == "string") self.uri = url.parse(self.uri)
-  }
-
-  if (self.strictSSL === false) {
-    self.rejectUnauthorized = false
-  }
-
-  if (self.proxy) {
-    if (typeof self.proxy == 'string') self.proxy = url.parse(self.proxy)
-
-    // do the HTTP CONNECT dance using koichik/node-tunnel
-    if (http.globalAgent && self.uri.protocol === "https:") {
-      var tunnelFn = self.proxy.protocol === "http:"
-                   ? tunnel.httpsOverHttp : tunnel.httpsOverHttps
-
-      var tunnelOptions = { proxy: { host: self.proxy.hostname
-                                   , port: +self.proxy.port
-                                   , proxyAuth: self.proxy.auth
-                                   , headers: { Host: self.uri.hostname + ':' +
-                                        (self.uri.port || (self.uri.protocol === 'https:' ? 443 : 80)) }}
-                          , rejectUnauthorized: self.rejectUnauthorized
-                          , ca: this.ca }
-
-      self.agent = tunnelFn(tunnelOptions)
-      self.tunnel = true
-    }
-  }
-
-  if (!self.uri.pathname) {self.uri.pathname = '/'}
-
-  if (!self.uri.host) {
-    // An invalid URI can generate a lot of confusing errors downstream, like
-    // "TypeError: Cannot call method 'indexOf' of undefined" in CookieJar.
-    // Detect and reject it as soon as possible.
-    var faultyUri = url.format(self.uri)
-    var message = 'Invalid URI "' + faultyUri + '"'
-    if (Object.keys(options).length === 0) {
-      // No options? This is probably the result of a redirect.
-      // The user didn't call request directly with this URL, so they cannot
-      // fix it themselves; point out that a redirect may be the cause.
-      message += '. This can be caused by a crappy redirection.'
-    }
-    self.emit('error', new Error(message))
-    return // This error was fatal
-  }
-
-  self._redirectsFollowed = self._redirectsFollowed || 0
-  self.maxRedirects = (self.maxRedirects !== undefined) ? self.maxRedirects : 10
-  self.followRedirect = (self.followRedirect !== undefined) ? self.followRedirect : true
-  self.followAllRedirects = (self.followAllRedirects !== undefined) ? self.followAllRedirects : false
-  if (self.followRedirect || self.followAllRedirects)
-    self.redirects = self.redirects || []
-
-  self.headers = self.headers ? copy(self.headers) : {}
-
-  self.setHost = false
-  if (!self.hasHeader('host')) {
-    self.setHeader('host', self.uri.hostname)
-    if (self.uri.port) {
-      if ( !(self.uri.port === 80 && self.uri.protocol === 'http:') &&
-           !(self.uri.port === 443 && self.uri.protocol === 'https:') )
-      self.setHeader('host', self.getHeader('host') + (':'+self.uri.port) )
-    }
-    self.setHost = true
-  }
-
-  self.jar(self._jar || options.jar)
-
-  if (!self.uri.port) {
-    if (self.uri.protocol == 'http:') {self.uri.port = 80}
-    else if (self.uri.protocol == 'https:') {self.uri.port = 443}
-  }
-
-  if (self.proxy && !self.tunnel) {
-    self.port = self.proxy.port
-    self.host = self.proxy.hostname
-  } else {
-    self.port = self.uri.port
-    self.host = self.uri.hostname
-  }
-
-  self.clientErrorHandler = function (error) {
-    if (self._aborted) return
-
-    if (self.req && self.req._reusedSocket && error.code === 'ECONNRESET'
-        && self.agent.addRequestNoreuse) {
-      self.agent = { addRequest: self.agent.addRequestNoreuse.bind(self.agent) }
-      self.start()
-      self.req.end()
-      return
-    }
-    if (self.timeout && self.timeoutTimer) {
-      clearTimeout(self.timeoutTimer)
-      self.timeoutTimer = null
-    }
-    self.emit('error', error)
-  }
-
-  self._parserErrorHandler = function (error) {
-    if (this.res) {
-      if (this.res.request) {
-        this.res.request.emit('error', error)
-      } else {
-        this.res.emit('error', error)
-      }
-    } else {
-      this._httpMessage.emit('error', error)
-    }
-  }
-
-  if (options.form) {
-    self.form(options.form)
-  }
-
-  if (options.qs) self.qs(options.qs)
-
-  if (self.uri.path) {
-    self.path = self.uri.path
-  } else {
-    self.path = self.uri.pathname + (self.uri.search || "")
-  }
-
-  if (self.path.length === 0) self.path = '/'
-
-
-  // Auth must happen last in case signing is dependent on other headers
-  if (options.oauth) {
-    self.oauth(options.oauth)
-  }
-
-  if (options.aws) {
-    self.aws(options.aws)
-  }
-
-  if (options.hawk) {
-    self.hawk(options.hawk)
-  }
-
-  if (options.httpSignature) {
-    self.httpSignature(options.httpSignature)
-  }
-
-  if (options.auth) {
-    self.auth(
-      (options.auth.user === "") ? options.auth.user : (options.auth.user || options.auth.username),
-      options.auth.pass || options.auth.password,
-      options.auth.sendImmediately)
-  }
-
-  if (self.uri.auth && !self.hasHeader('authorization')) {
-    var authPieces = self.uri.auth.split(':').map(function(item){ return querystring.unescape(item) })
-    self.auth(authPieces[0], authPieces.slice(1).join(':'), true)
-  }
-  if (self.proxy && self.proxy.auth && !self.hasHeader('proxy-authorization') && !self.tunnel) {
-    self.setHeader('proxy-authorization', "Basic " + toBase64(self.proxy.auth.split(':').map(function(item){ return querystring.unescape(item)}).join(':')))
-  }
-
-
-  if (self.proxy && !self.tunnel) self.path = (self.uri.protocol + '//' + self.uri.host + self.path)
-
-  if (options.json) {
-    self.json(options.json)
-  } else if (options.multipart) {
-    self.boundary = uuid()
-    self.multipart(options.multipart)
-  }
-
-  if (self.body) {
-    var length = 0
-    if (!Buffer.isBuffer(self.body)) {
-      if (Array.isArray(self.body)) {
-        for (var i = 0; i < self.body.length; i++) {
-          length += self.body[i].length
-        }
-      } else {
-        self.body = new Buffer(self.body)
-        length = self.body.length
-      }
-    } else {
-      length = self.body.length
-    }
-    if (length) {
-      if (!self.hasHeader('content-length')) self.setHeader('content-length', length)
-    } else {
-      throw new Error('Argument error, options.body.')
-    }
-  }
-
-  var protocol = self.proxy && !self.tunnel ? self.proxy.protocol : self.uri.protocol
-    , defaultModules = {'http:':http, 'https:':https}
-    , httpModules = self.httpModules || {}
-    ;
-  self.httpModule = httpModules[protocol] || defaultModules[protocol]
-
-  if (!self.httpModule) return this.emit('error', new Error("Invalid protocol"))
-
-  if (options.ca) self.ca = options.ca
-
-  if (!self.agent) {
-    if (options.agentOptions) self.agentOptions = options.agentOptions
-
-    if (options.agentClass) {
-      self.agentClass = options.agentClass
-    } else if (options.forever) {
-      self.agentClass = protocol === 'http:' ? ForeverAgent : ForeverAgent.SSL
-    } else {
-      self.agentClass = self.httpModule.Agent
-    }
-  }
-
-  if (self.pool === false) {
-    self.agent = false
-  } else {
-    self.agent = self.agent || self.getAgent()
-    if (self.maxSockets) {
-      // Don't use our pooling if node has the refactored client
-      self.agent.maxSockets = self.maxSockets
-    }
-    if (self.pool.maxSockets) {
-      // Don't use our pooling if node has the refactored client
-      self.agent.maxSockets = self.pool.maxSockets
-    }
-  }
-
-  self.on('pipe', function (src) {
-    if (self.ntick && self._started) throw new Error("You cannot pipe to this stream after the outbound request has started.")
-    self.src = src
-    if (isReadStream(src)) {
-      if (!self.hasHeader('content-type')) self.setHeader('content-type', mime.lookup(src.path))
-    } else {
-      if (src.headers) {
-        for (var i in src.headers) {
-          if (!self.hasHeader(i)) {
-            self.setHeader(i, src.headers[i])
-          }
-        }
-      }
-      if (self._json && !self.hasHeader('content-type'))
-        self.setHeader('content-type', 'application/json')
-      if (src.method && !self.explicitMethod) {
-        self.method = src.method
-      }
-    }
-
-    // self.on('pipe', function () {
-    //   console.error("You have already piped to this stream. Pipeing twice is likely to break the request.")
-    // })
-  })
-
-  process.nextTick(function () {
-    if (self._aborted) return
-
-    if (self._form) {
-      self.setHeaders(self._form.getHeaders())
-      self._form.pipe(self)
-    }
-    if (self.body) {
-      if (Array.isArray(self.body)) {
-        self.body.forEach(function (part) {
-          self.write(part)
-        })
-      } else {
-        self.write(self.body)
-      }
-      self.end()
-    } else if (self.requestBodyStream) {
-      console.warn("options.requestBodyStream is deprecated, please pass the request object to stream.pipe.")
-      self.requestBodyStream.pipe(self)
-    } else if (!self.src) {
-      if (self.method !== 'GET' && typeof self.method !== 'undefined') {
-        self.setHeader('content-length', 0)
-      }
-      self.end()
-    }
-    self.ntick = true
-  })
-}
-
-// Must call this when following a redirect from https to http or vice versa
-// Attempts to keep everything as identical as possible, but update the
-// httpModule, Tunneling agent, and/or Forever Agent in use.
-Request.prototype._updateProtocol = function () {
-  var self = this
-  var protocol = self.uri.protocol
-
-  if (protocol === 'https:') {
-    // previously was doing http, now doing https
-    // if it's https, then we might need to tunnel now.
-    if (self.proxy) {
-      self.tunnel = true
-      var tunnelFn = self.proxy.protocol === 'http:'
-                   ? tunnel.httpsOverHttp : tunnel.httpsOverHttps
-      var tunnelOptions = { proxy: { host: self.proxy.hostname
-                                   , port: +self.proxy.port
-                                   , proxyAuth: self.proxy.auth }
-                          , rejectUnauthorized: self.rejectUnauthorized
-                          , ca: self.ca }
-      self.agent = tunnelFn(tunnelOptions)
-      return
-    }
-
-    self.httpModule = https
-    switch (self.agentClass) {
-      case ForeverAgent:
-        self.agentClass = ForeverAgent.SSL
-        break
-      case http.Agent:
-        self.agentClass = https.Agent
-        break
-      default:
-        // nothing we can do.  Just hope for the best.
-        return
-    }
-
-    // if there's an agent, we need to get a new one.
-    if (self.agent) self.agent = self.getAgent()
-
-  } else {
-    // previously was doing https, now doing http
-    // stop any tunneling.
-    if (self.tunnel) self.tunnel = false
-    self.httpModule = http
-    switch (self.agentClass) {
-      case ForeverAgent.SSL:
-        self.agentClass = ForeverAgent
-        break
-      case https.Agent:
-        self.agentClass = http.Agent
-        break
-      default:
-        // nothing we can do.  just hope for the best
-        return
-    }
-
-    // if there's an agent, then get a new one.
-    if (self.agent) {
-      self.agent = null
-      self.agent = self.getAgent()
-    }
-  }
-}
-
-Request.prototype.getAgent = function () {
-  var Agent = this.agentClass
-  var options = {}
-  if (this.agentOptions) {
-    for (var i in this.agentOptions) {
-      options[i] = this.agentOptions[i]
-    }
-  }
-  if (this.ca) options.ca = this.ca
-  if (typeof this.rejectUnauthorized !== 'undefined') options.rejectUnauthorized = this.rejectUnauthorized
-
-  if (this.cert && this.key) {
-    options.key = this.key
-    options.cert = this.cert
-  }
-
-  var poolKey = ''
-
-  // different types of agents are in different pools
-  if (Agent !== this.httpModule.Agent) {
-    poolKey += Agent.name
-  }
-
-  if (!this.httpModule.globalAgent) {
-    // node 0.4.x
-    options.host = this.host
-    options.port = this.port
-    if (poolKey) poolKey += ':'
-    poolKey += this.host + ':' + this.port
-  }
-
-  // ca option is only relevant if proxy or destination are https
-  var proxy = this.proxy
-  if (typeof proxy === 'string') proxy = url.parse(proxy)
-  var isHttps = (proxy && proxy.protocol === 'https:') || this.uri.protocol === 'https:'
-  if (isHttps) {
-    if (options.ca) {
-      if (poolKey) poolKey += ':'
-      poolKey += options.ca
-    }
-
-    if (typeof options.rejectUnauthorized !== 'undefined') {
-      if (poolKey) poolKey += ':'
-      poolKey += options.rejectUnauthorized
-    }
-
-    if (options.cert)
-      poolKey += options.cert.toString('ascii') + options.key.toString('ascii')
-
-    if (options.ciphers) {
-      if (poolKey) poolKey += ':'
-      poolKey += options.ciphers
-    }
-
-    if (options.secureOptions) {
-      if (poolKey) poolKey += ':'
-      poolKey += options.secureOptions
-    }
-  }
-
-  if (this.pool === globalPool && !poolKey && Object.keys(options).length === 0 && this.httpModule.globalAgent) {
-    // not doing anything special.  Use the globalAgent
-    return this.httpModule.globalAgent
-  }
-
-  // we're using a stored agent.  Make sure it's protocol-specific
-  poolKey = this.uri.protocol + poolKey
-
-  // already generated an agent for this setting
-  if (this.pool[poolKey]) return this.pool[poolKey]
-
-  return this.pool[poolKey] = new Agent(options)
-}
-
-Request.prototype.start = function () {
-  // start() is called once we are ready to send the outgoing HTTP request.
-  // this is usually called on the first write(), end() or on nextTick()
-  var self = this
-
-  if (self._aborted) return
-
-  self._started = true
-  self.method = self.method || 'GET'
-  self.href = self.uri.href
-
-  if (self.src && self.src.stat && self.src.stat.size && !self.hasHeader('content-length')) {
-    self.setHeader('content-length', self.src.stat.size)
-  }
-  if (self._aws) {
-    self.aws(self._aws, true)
-  }
-
-  // We have a method named auth, which is completely different from the
-  // http.request auth option. Strip it so http.request doesn't misinterpret it.
-  var reqOptions = copy(self)
-  delete reqOptions.auth
-
-  debug('make request', self.uri.href)
-  self.req = self.httpModule.request(reqOptions, self.onResponse.bind(self))
-
-  if (self.timeout && !self.timeoutTimer) {
-    self.timeoutTimer = setTimeout(function () {
-      self.req.abort()
-      var e = new Error("ETIMEDOUT")
-      e.code = "ETIMEDOUT"
-      self.emit("error", e)
-    }, self.timeout)
-
-    // Set an additional timeout on the socket, in case the remote
-    // server freezes after sending the headers
-    if (self.req.setTimeout) { // only works on node 0.6+
-      self.req.setTimeout(self.timeout, function () {
-        if (self.req) {
-          self.req.abort()
-          var e = new Error("ESOCKETTIMEDOUT")
-          e.code = "ESOCKETTIMEDOUT"
-          self.emit("error", e)
-        }
-      })
-    }
-  }
-
-  self.req.on('error', self.clientErrorHandler)
-  self.req.on('drain', function() {
-    self.emit('drain')
-  })
-  self.on('end', function() {
-    if ( self.req.connection ) self.req.connection.removeListener('error', self._parserErrorHandler)
-  })
-  self.emit('request', self.req)
-}
-Request.prototype.onResponse = function (response) {
-  var self = this
-  debug('onResponse', self.uri.href, response.statusCode, response.headers)
-  response.on('end', function() {
-    debug('response end', self.uri.href, response.statusCode, response.headers)
-  });
-
-  if (response.connection.listeners('error').indexOf(self._parserErrorHandler) === -1) {
-    response.connection.once('error', self._parserErrorHandler)
-  }
-  if (self._aborted) {
-    debug('aborted', self.uri.href)
-    response.resume()
-    return
-  }
-  if (self._paused) response.pause()
-  else response.resume()
-
-  self.response = response
-  response.request = self
-  response.toJSON = toJSON
-
-  // XXX This is different on 0.10, because SSL is strict by default
-  if (self.httpModule === https &&
-      self.strictSSL &&
-      !response.client.authorized) {
-    debug('strict ssl error', self.uri.href)
-    var sslErr = response.client.authorizationError
-    self.emit('error', new Error('SSL Error: '+ sslErr))
-    return
-  }
-
-  if (self.setHost && self.hasHeader('host')) delete self.headers[self.hasHeader('host')]
-  if (self.timeout && self.timeoutTimer) {
-    clearTimeout(self.timeoutTimer)
-    self.timeoutTimer = null
-  }
-
-  var addCookie = function (cookie) {
-    if (self._jar){
-      if(self._jar.add){
-        self._jar.add(new Cookie(cookie))
-      }
-      else cookieJar.add(new Cookie(cookie))
-    }
-
-  }
-
-  if (hasHeader('set-cookie', response.headers) && (!self._disableCookies)) {
-    var headerName = hasHeader('set-cookie', response.headers)
-    if (Array.isArray(response.headers[headerName])) response.headers[headerName].forEach(addCookie)
-    else addCookie(response.headers[headerName])
-  }
-
-  var redirectTo = null
-  if (response.statusCode >= 300 && response.statusCode < 400 && hasHeader('location', response.headers)) {
-    var location = response.headers[hasHeader('location', response.headers)]
-    debug('redirect', location)
-
-    if (self.followAllRedirects) {
-      redirectTo = location
-    } else if (self.followRedirect) {
-      switch (self.method) {
-        case 'PATCH':
-        case 'PUT':
-        case 'POST':
-        case 'DELETE':
-          // Do not follow redirects
-          break
-        default:
-          redirectTo = location
-          break
-      }
-    }
-  } else if (response.statusCode == 401 && self._hasAuth && !self._sentAuth) {
-    var authHeader = response.headers[hasHeader('www-authenticate', response.headers)]
-    var authVerb = authHeader && authHeader.split(' ')[0]
-    debug('reauth', authVerb)
-
-    switch (authVerb) {
-      case 'Basic':
-        self.auth(self._user, self._pass, true)
-        redirectTo = self.uri
-        break
-
-      case 'Digest':
-        // TODO: More complete implementation of RFC 2617.  For reference:
-        // http://tools.ietf.org/html/rfc2617#section-3
-        // https://github.com/bagder/curl/blob/master/lib/http_digest.c
-
-        var matches = authHeader.match(/([a-z0-9_-]+)="([^"]+)"/gi)
-        var challenge = {}
-
-        for (var i = 0; i < matches.length; i++) {
-          var eqPos = matches[i].indexOf('=')
-          var key = matches[i].substring(0, eqPos)
-          var quotedValue = matches[i].substring(eqPos + 1)
-          challenge[key] = quotedValue.substring(1, quotedValue.length - 1)
-        }
-
-        var ha1 = md5(self._user + ':' + challenge.realm + ':' + self._pass)
-        var ha2 = md5(self.method + ':' + self.uri.path)
-        var digestResponse = md5(ha1 + ':' + challenge.nonce + ':1::auth:' + ha2)
-        var authValues = {
-          username: self._user,
-          realm: challenge.realm,
-          nonce: challenge.nonce,
-          uri: self.uri.path,
-          qop: challenge.qop,
-          response: digestResponse,
-          nc: 1,
-          cnonce: ''
-        }
-
-        authHeader = []
-        for (var k in authValues) {
-          authHeader.push(k + '="' + authValues[k] + '"')
-        }
-        authHeader = 'Digest ' + authHeader.join(', ')
-        self.setHeader('authorization', authHeader)
-        self._sentAuth = true
-
-        redirectTo = self.uri
-        break
-    }
-  }
-
-  if (redirectTo) {
-    debug('redirect to', redirectTo)
-
-    // ignore any potential response body.  it cannot possibly be useful
-    // to us at this point.
-    if (self._paused) response.resume()
-
-    if (self._redirectsFollowed >= self.maxRedirects) {
-      self.emit('error', new Error("Exceeded maxRedirects. Probably stuck in a redirect loop "+self.uri.href))
-      return
-    }
-    self._redirectsFollowed += 1
-
-    if (!isUrl.test(redirectTo)) {
-      redirectTo = url.resolve(self.uri.href, redirectTo)
-    }
-
-    var uriPrev = self.uri
-    self.uri = url.parse(redirectTo)
-
-    // handle the case where we change protocol from https to http or vice versa
-    if (self.uri.protocol !== uriPrev.protocol) {
-      self._updateProtocol()
-    }
-
-    self.redirects.push(
-      { statusCode : response.statusCode
-      , redirectUri: redirectTo
-      }
-    )
-    if (self.followAllRedirects && response.statusCode != 401) self.method = 'GET'
-    // Forcing every redirect to use GET was removed; preserving the method fixes #215
-    delete self.src
-    delete self.req
-    delete self.agent
-    delete self._started
-    if (response.statusCode != 401) {
-      // Remove parameters from the previous response, unless this is the second request
-      // for a server that requires digest authentication.
-      delete self.body
-      delete self._form
-      if (self.headers) {
-        if (self.hasHeader('host')) delete self.headers[self.hasHeader('host')]
-        if (self.hasHeader('content-type')) delete self.headers[self.hasHeader('content-type')]
-        if (self.hasHeader('content-length')) delete self.headers[self.hasHeader('content-length')]
-      }
-    }
-
-    self.emit('redirect');
-
-    self.init()
-    return // Ignore the rest of the response
-  } else {
-    self._redirectsFollowed = self._redirectsFollowed || 0
-    // Be a good stream and emit end when the response is finished.
-    // Hack to emit end on close because of a core bug that never fires end
-    response.on('close', function () {
-      if (!self._ended) self.response.emit('end')
-    })
-
-    if (self.encoding) {
-      if (self.dests.length !== 0) {
-        console.error("Ignoring encoding parameter as this stream is being piped to another stream which makes the encoding option invalid.")
-      } else {
-        response.setEncoding(self.encoding)
-      }
-    }
-
-    self.emit('response', response)
-
-    self.dests.forEach(function (dest) {
-      self.pipeDest(dest)
-    })
-
-    response.on("data", function (chunk) {
-      self._destdata = true
-      self.emit("data", chunk)
-    })
-    response.on("end", function (chunk) {
-      self._ended = true
-      self.emit("end", chunk)
-    })
-    response.on("close", function () {self.emit("close")})
-
-    if (self.callback) {
-      var buffer = []
-      var bodyLen = 0
-      self.on("data", function (chunk) {
-        buffer.push(chunk)
-        bodyLen += chunk.length
-      })
-      self.on("end", function () {
-        debug('end event', self.uri.href)
-        if (self._aborted) {
-          debug('aborted', self.uri.href)
-          return
-        }
-
-        if (buffer.length && Buffer.isBuffer(buffer[0])) {
-          debug('has body', self.uri.href, bodyLen)
-          var body = new Buffer(bodyLen)
-          var i = 0
-          buffer.forEach(function (chunk) {
-            chunk.copy(body, i, 0, chunk.length)
-            i += chunk.length
-          })
-          if (self.encoding === null) {
-            response.body = body
-          } else {
-            response.body = body.toString(self.encoding)
-          }
-        } else if (buffer.length) {
-          // The UTF-8 BOM [0xEF,0xBB,0xBF] decodes to the single character U+FEFF in
-          // JavaScript's UTF-16/UCS-2 string representation.
-          // Strip it when the encoding is 'utf8', as upstream consumers won't expect it and it breaks JSON.parse().
-          if (self.encoding === 'utf8' && buffer[0].length > 0 && buffer[0][0] === "\uFEFF") {
-            buffer[0] = buffer[0].substring(1)
-          }
-          response.body = buffer.join('')
-        }
-
-        if (self._json) {
-          try {
-            response.body = JSON.parse(response.body)
-          } catch (e) {}
-        }
-        debug('emitting complete', self.uri.href)
-        if(response.body == undefined && !self._json) {
-          response.body = "";
-        }
-        self.emit('complete', response, response.body)
-      })
-    }
-    // no callback: just signal completion when the response ends
-    else {
-      self.on("end", function () {
-        if (self._aborted) {
-          debug('aborted', self.uri.href)
-          return
-        }
-        self.emit('complete', response);
-      });
-    }
-  }
-  debug('finish init function', self.uri.href)
-}
-
-Request.prototype.abort = function () {
-  this._aborted = true
-
-  if (this.req) {
-    this.req.abort()
-  }
-  else if (this.response) {
-    this.response.abort()
-  }
-
-  this.emit("abort")
-}
-
-Request.prototype.pipeDest = function (dest) {
-  var response = this.response
-  // Called after the response is received
-  if (dest.headers && !dest.headersSent) {
-    if (hasHeader('content-type', response.headers)) {
-      var ctname = hasHeader('content-type', response.headers)
-      if (dest.setHeader) dest.setHeader(ctname, response.headers[ctname])
-      else dest.headers[ctname] = response.headers[ctname]
-    }
-
-    if (hasHeader('content-length', response.headers)) {
-      var clname = hasHeader('content-length', response.headers)
-      if (dest.setHeader) dest.setHeader(clname, response.headers[clname])
-      else dest.headers[clname] = response.headers[clname]
-    }
-  }
-  if (dest.setHeader && !dest.headersSent) {
-    for (var i in response.headers) {
-      dest.setHeader(i, response.headers[i])
-    }
-    dest.statusCode = response.statusCode
-  }
-  if (this.pipefilter) this.pipefilter(response, dest)
-}
-
-// Composable API
-Request.prototype.setHeader = function (name, value, clobber) {
-  if (clobber === undefined) clobber = true
-  if (clobber || !this.hasHeader(name)) this.headers[name] = value
-  else this.headers[this.hasHeader(name)] += ',' + value
-  return this
-}
-Request.prototype.setHeaders = function (headers) {
-  for (var i in headers) {this.setHeader(i, headers[i])}
-  return this
-}
-Request.prototype.hasHeader = function (header, headers) {
-  var keys = Object.keys(headers || this.headers)
-    , lheaders = keys.map(function (h) {return h.toLowerCase()})
-    ;
-  header = header.toLowerCase()
-  for (var i = 0; i < lheaders.length; i++) {
-    if (lheaders[i] === header) return keys[i]
-  }
-  return false
-}
-
-var hasHeader = Request.prototype.hasHeader
-
-Request.prototype.qs = function (q, clobber) {
-  var base
-  if (!clobber && this.uri.query) base = qs.parse(this.uri.query)
-  else base = {}
-
-  for (var i in q) {
-    base[i] = q[i]
-  }
-
-  if (qs.stringify(base) === ''){
-    return this
-  }
-
-  this.uri = url.parse(this.uri.href.split('?')[0] + '?' + qs.stringify(base))
-  this.url = this.uri
-  this.path = this.uri.path
-
-  return this
-}
-Request.prototype.form = function (form) {
-  if (form) {
-    this.setHeader('content-type', 'application/x-www-form-urlencoded; charset=utf-8')
-    this.body = qs.stringify(form).toString('utf8')
-    return this
-  }
-  // create form-data object
-  this._form = new FormData()
-  return this._form
-}
-Request.prototype.multipart = function (multipart) {
-  var self = this
-  self.body = []
-
-  if (!self.hasHeader('content-type')) {
-    self.setHeader('content-type', 'multipart/related; boundary=' + self.boundary)
-  } else {
-    self.setHeader('content-type', self.headers['content-type'].split(';')[0] + '; boundary=' + self.boundary)
-  }
-
-  if (!multipart.forEach) throw new Error('Argument error, options.multipart.')
-
-  if (self.preambleCRLF) {
-    self.body.push(new Buffer('\r\n'))
-  }
-
-  multipart.forEach(function (part) {
-    var body = part.body
-    if (body == null) throw new Error('Body attribute missing in multipart.')
-    delete part.body
-    var preamble = '--' + self.boundary + '\r\n'
-    Object.keys(part).forEach(function (key) {
-      preamble += key + ': ' + part[key] + '\r\n'
-    })
-    preamble += '\r\n'
-    self.body.push(new Buffer(preamble))
-    self.body.push(new Buffer(body))
-    self.body.push(new Buffer('\r\n'))
-  })
-  self.body.push(new Buffer('--' + self.boundary + '--'))
-  return self
-}
-Request.prototype.json = function (val) {
-  var self = this
-
-  if (!self.hasHeader('accept')) self.setHeader('accept', 'application/json')
-
-  this._json = true
-  if (typeof val === 'boolean') {
-    if (typeof this.body === 'object') {
-      this.body = safeStringify(this.body)
-      self.setHeader('content-type', 'application/json')
-    }
-  } else {
-    this.body = safeStringify(val)
-    self.setHeader('content-type', 'application/json')
-  }
-  return this
-}
-Request.prototype.getHeader = function (name, headers) {
-  var result, re, match
-  if (!headers) headers = this.headers
-  Object.keys(headers).forEach(function (key) {
-    re = new RegExp(name, 'i')
-    match = key.match(re)
-    if (match) result = headers[key]
-  })
-  return result
-}
-var getHeader = Request.prototype.getHeader
-
-Request.prototype.auth = function (user, pass, sendImmediately) {
-  if (typeof user !== 'string' || (pass !== undefined && typeof pass !== 'string')) {
-    throw new Error('auth() received invalid user or password')
-  }
-  this._user = user
-  this._pass = pass
-  this._hasAuth = true
-  var header = typeof pass !== 'undefined' ? user + ':' + pass : user
-  if (sendImmediately || typeof sendImmediately == 'undefined') {
-    this.setHeader('authorization', 'Basic ' + toBase64(header))
-    this._sentAuth = true
-  }
-  return this
-}
-Request.prototype.aws = function (opts, now) {
-  if (!now) {
-    this._aws = opts
-    return this
-  }
-  var date = new Date()
-  this.setHeader('date', date.toUTCString())
-  var auth =
-    { key: opts.key
-    , secret: opts.secret
-    , verb: this.method.toUpperCase()
-    , date: date
-    , contentType: this.getHeader('content-type') || ''
-    , md5: this.getHeader('content-md5') || ''
-    , amazonHeaders: aws.canonicalizeHeaders(this.headers)
-    }
-  if (opts.bucket && this.path) {
-    auth.resource = '/' + opts.bucket + this.path
-  } else if (opts.bucket && !this.path) {
-    auth.resource = '/' + opts.bucket
-  } else if (!opts.bucket && this.path) {
-    auth.resource = this.path
-  } else if (!opts.bucket && !this.path) {
-    auth.resource = '/'
-  }
-  auth.resource = aws.canonicalizeResource(auth.resource)
-  this.setHeader('authorization', aws.authorization(auth))
-
-  return this
-}
-Request.prototype.httpSignature = function (opts) {
-  var req = this
-  httpSignature.signRequest({
-    getHeader: function(header) {
-      return getHeader(header, req.headers)
-    },
-    setHeader: function(header, value) {
-      req.setHeader(header, value)
-    },
-    method: this.method,
-    path: this.path
-  }, opts)
-  debug('httpSignature authorization', this.getHeader('authorization'))
-
-  return this
-}
-
-Request.prototype.hawk = function (opts) {
-  this.setHeader('Authorization', hawk.client.header(this.uri, this.method, opts).field)
-}
-
-Request.prototype.oauth = function (_oauth) {
-  var form
-  if (this.hasHeader('content-type') &&
-      this.getHeader('content-type').slice(0, 'application/x-www-form-urlencoded'.length) ===
-        'application/x-www-form-urlencoded'
-     ) {
-    form = qs.parse(this.body)
-  }
-  if (this.uri.query) {
-    form = qs.parse(this.uri.query)
-  }
-  if (!form) form = {}
-  var oa = {}
-  for (var i in form) oa[i] = form[i]
-  for (var i in _oauth) oa['oauth_'+i] = _oauth[i]
-  if (!oa.oauth_version) oa.oauth_version = '1.0'
-  if (!oa.oauth_timestamp) oa.oauth_timestamp = Math.floor( Date.now() / 1000 ).toString()
-  if (!oa.oauth_nonce) oa.oauth_nonce = uuid().replace(/-/g, '')
-
-  oa.oauth_signature_method = 'HMAC-SHA1'
-
-  var consumer_secret = oa.oauth_consumer_secret
-  delete oa.oauth_consumer_secret
-  var token_secret = oa.oauth_token_secret
-  delete oa.oauth_token_secret
-  var timestamp = oa.oauth_timestamp
-
-  var baseurl = this.uri.protocol + '//' + this.uri.host + this.uri.pathname
-  var signature = oauth.hmacsign(this.method, baseurl, oa, consumer_secret, token_secret)
-
-  // oa.oauth_signature = signature
-  for (var i in form) {
-    if (i.slice('oauth_'.length) in _oauth) {
-      // skip: this field mirrors an explicitly supplied OAuth parameter
-    } else {
-      delete oa['oauth_'+i]
-      if (i !== 'x_auth_mode') delete oa[i]
-    }
-  }
-  oa.oauth_timestamp = timestamp
-  var authHeader = 'OAuth '+Object.keys(oa).sort().map(function (i) {return i+'="'+oauth.rfc3986(oa[i])+'"'}).join(',')
-  authHeader += ',oauth_signature="' + oauth.rfc3986(signature) + '"'
-  this.setHeader('Authorization', authHeader)
-  return this
-}
-Request.prototype.jar = function (jar) {
-  var cookies
-
-  if (this._redirectsFollowed === 0) {
-    this.originalCookieHeader = this.getHeader('cookie')
-  }
-
-  if (!jar) {
-    // disable cookies
-    cookies = false
-    this._disableCookies = true
-  } else if (jar && jar.get) {
-    // fetch cookie from the user defined cookie jar
-    cookies = jar.get({ url: this.uri.href })
-  } else {
-    // fetch cookie from the global cookie jar
-    cookies = cookieJar.get({ url: this.uri.href })
-  }
-
-  if (cookies && cookies.length) {
-    var cookieString = cookies.map(function (c) {
-      return c.name + "=" + c.value
-    }).join("; ")
-
-    if (this.originalCookieHeader) {
-      // Don't overwrite existing Cookie header
-      this.setHeader('cookie', this.originalCookieHeader + '; ' + cookieString)
-    } else {
-      this.setHeader('cookie', cookieString)
-    }
-  }
-  this._jar = jar
-  return this
-}
-
-
-// Stream API
-Request.prototype.pipe = function (dest, opts) {
-  if (this.response) {
-    if (this._destdata) {
-      throw new Error("You cannot pipe after data has been emitted from the response.")
-    } else if (this._ended) {
-      throw new Error("You cannot pipe after the response has been ended.")
-    } else {
-      stream.Stream.prototype.pipe.call(this, dest, opts)
-      this.pipeDest(dest)
-      return dest
-    }
-  } else {
-    this.dests.push(dest)
-    stream.Stream.prototype.pipe.call(this, dest, opts)
-    return dest
-  }
-}
-Request.prototype.write = function () {
-  if (!this._started) this.start()
-  return this.req.write.apply(this.req, arguments)
-}
-Request.prototype.end = function (chunk) {
-  if (chunk) this.write(chunk)
-  if (!this._started) this.start()
-  this.req.end()
-}
-Request.prototype.pause = function () {
-  if (!this.response) this._paused = true
-  else this.response.pause.apply(this.response, arguments)
-}
-Request.prototype.resume = function () {
-  if (!this.response) this._paused = false
-  else this.response.resume.apply(this.response, arguments)
-}
-Request.prototype.destroy = function () {
-  if (!this._ended) this.end()
-  else if (this.response) this.response.destroy()
-}
-
-function toJSON () {
-  return getSafe(this, '__' + (((1+Math.random())*0x10000)|0).toString(16))
-}
-
-Request.prototype.toJSON = toJSON
-
-
-module.exports = Request
\ No newline at end of file
Binary file node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/googledoodle.jpg has changed
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/run.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,40 +0,0 @@
-var spawn = require('child_process').spawn
-  , exitCode = 0
-  , timeout = 10000
-  , fs = require('fs')
-  ;
-
-fs.readdir(__dirname, function (e, files) {
-  if (e) throw e
-
-  var tests = files.filter(function (f) {return f.slice(0, 'test-'.length) === 'test-'})
-
-  var next = function () {
-    if (tests.length === 0) process.exit(exitCode);
-
-    var file = tests.shift()
-    console.log(file)
-    var proc = spawn('node', [ 'tests/' + file ])
-
-    var killed = false
-    var t = setTimeout(function () {
-      proc.kill()
-      exitCode += 1
-      console.error(file + ' timeout')
-      killed = true
-    }, timeout)
-
-    proc.stdout.pipe(process.stdout)
-    proc.stderr.pipe(process.stderr)
-    proc.on('exit', function (code) {
-      if (code && !killed) console.error(file + ' failed')
-      exitCode += code || 0
-      clearTimeout(t)
-      next()
-    })
-  }
-  next()
-    
-})
-
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/server.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,90 +0,0 @@
-var fs = require('fs')
-  , http = require('http')
-  , path = require('path')
-  , https = require('https')
-  , events = require('events')
-  , stream = require('stream')
-  , assert = require('assert')
-  ;
-
-exports.createServer =  function (port) {
-  port = port || 6767
-  var s = http.createServer(function (req, resp) {
-    s.emit(req.url, req, resp);
-  })
-  s.port = port
-  s.url = 'http://localhost:'+port
-  return s;
-}
-
-exports.createSSLServer = function(port, opts) {
-  port = port || 16767
-
-  var options = { 'key' : path.join(__dirname, 'ssl', 'test.key')
-                , 'cert': path.join(__dirname, 'ssl', 'test.crt')
-                }
-  if (opts) {
-    for (var i in opts) options[i] = opts[i]
-  }
-
-  for (var i in options) {
-    options[i] = fs.readFileSync(options[i])
-  }
-
-  var s = https.createServer(options, function (req, resp) {
-    s.emit(req.url, req, resp);
-  })
-  s.port = port
-  s.url = 'https://localhost:'+port
-  return s;
-}
-
-exports.createPostStream = function (text) {
-  var postStream = new stream.Stream();
-  postStream.writeable = true;
-  postStream.readable = true;
-  setTimeout(function () {postStream.emit('data', new Buffer(text)); postStream.emit('end')}, 0);
-  return postStream;
-}
-exports.createPostValidator = function (text, reqContentType) {
-  var l = function (req, resp) {
-    var r = '';
-    req.on('data', function (chunk) {r += chunk})
-    req.on('end', function () {
-      if (req.headers['content-type'] && req.headers['content-type'].indexOf('boundary=') >= 0) {
-        var boundary = req.headers['content-type'].split('boundary=')[1];
-        text = text.replace(/__BOUNDARY__/g, boundary);
-      }
-      if (r !== text) console.log(r, text);
-      assert.equal(r, text)
-      if (reqContentType) {
-        assert.ok(req.headers['content-type'])
-        assert.ok(~req.headers['content-type'].indexOf(reqContentType))
-      }
-      resp.writeHead(200, {'content-type':'text/plain'})
-      resp.write('OK')
-      resp.end()
-    })
-  }
-  return l;
-}
-exports.createGetResponse = function (text, contentType) {
-  var l = function (req, resp) {
-    contentType = contentType || 'text/plain'
-    resp.writeHead(200, {'content-type':contentType})
-    resp.write(text)
-    resp.end()
-  }
-  return l;
-}
-exports.createChunkResponse = function (chunks, contentType) {
-  var l = function (req, resp) {
-    contentType = contentType || 'text/plain'
-    resp.writeHead(200, {'content-type':contentType})
-    chunks.forEach(function (chunk) {
-      resp.write(chunk)
-    })
-    resp.end()
-  }
-  return l;
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/squid.conf	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,77 +0,0 @@
-#
-# Recommended minimum configuration:
-#
-acl manager proto cache_object
-acl localhost src 127.0.0.1/32 ::1
-acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1
-
-# Example rule allowing access from your local networks.
-# Adapt to list your (internal) IP networks from where browsing
-# should be allowed
-acl localnet src 10.0.0.0/8	# RFC1918 possible internal network
-acl localnet src 172.16.0.0/12	# RFC1918 possible internal network
-acl localnet src 192.168.0.0/16	# RFC1918 possible internal network
-acl localnet src fc00::/7       # RFC 4193 local private network range
-acl localnet src fe80::/10      # RFC 4291 link-local (directly plugged) machines
-
-acl SSL_ports port 443
-acl Safe_ports port 80		# http
-acl Safe_ports port 21		# ftp
-acl Safe_ports port 443		# https
-acl Safe_ports port 70		# gopher
-acl Safe_ports port 210		# wais
-acl Safe_ports port 1025-65535	# unregistered ports
-acl Safe_ports port 280		# http-mgmt
-acl Safe_ports port 488		# gss-http
-acl Safe_ports port 591		# filemaker
-acl Safe_ports port 777		# multiling http
-acl CONNECT method CONNECT
-
-#
-# Recommended minimum Access Permission configuration:
-#
-# Only allow cachemgr access from localhost
-http_access allow manager localhost
-http_access deny manager
-
-# Deny requests to certain unsafe ports
-http_access deny !Safe_ports
-
-# Deny CONNECT to other than secure SSL ports
-#http_access deny CONNECT !SSL_ports
-
-# We strongly recommend the following be uncommented to protect innocent
-# web applications running on the proxy server who think the only
-# one who can access services on "localhost" is a local user
-#http_access deny to_localhost
-
-#
-# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
-#
-
-# Example rule allowing access from your local networks.
-# Adapt localnet in the ACL section to list your (internal) IP networks
-# from where browsing should be allowed
-http_access allow localnet
-http_access allow localhost
-
-# And finally deny all other access to this proxy
-http_access deny all
-
-# Squid normally listens to port 3128
-http_port 3128
-
-# We recommend you to use at least the following line.
-hierarchy_stoplist cgi-bin ?
-
-# Uncomment and adjust the following to add a disk cache directory.
-#cache_dir ufs /usr/local/var/cache 100 16 256
-
-# Leave coredumps in the first cache dir
-coredump_dir /usr/local/var/cache
-
-# Add any of your own refresh_pattern entries above these.
-refresh_pattern ^ftp:		1440	20%	10080
-refresh_pattern ^gopher:	1440	0%	1440
-refresh_pattern -i (/cgi-bin/|\?) 0	0%	0
-refresh_pattern .		0	20%	4320
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/ca.cnf	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,20 +0,0 @@
-[ req ]
-default_bits           = 1024
-days                   = 3650
-distinguished_name     = req_distinguished_name
-attributes             = req_attributes
-prompt                 = no
-output_password        = password
-
-[ req_distinguished_name ]
-C                      = US
-ST                     = CA
-L                      = Oakland
-O                      = request
-OU                     = request Certificate Authority
-CN                     = requestCA
-emailAddress           = mikeal@mikealrogers.com
-
-[ req_attributes ]
-challengePassword              = password challenge
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/ca.crt	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,17 +0,0 @@
------BEGIN CERTIFICATE-----
-MIICvTCCAiYCCQDn+P/MSbDsWjANBgkqhkiG9w0BAQUFADCBojELMAkGA1UEBhMC
-VVMxCzAJBgNVBAgTAkNBMRAwDgYDVQQHEwdPYWtsYW5kMRAwDgYDVQQKEwdyZXF1
-ZXN0MSYwJAYDVQQLEx1yZXF1ZXN0IENlcnRpZmljYXRlIEF1dGhvcml0eTESMBAG
-A1UEAxMJcmVxdWVzdENBMSYwJAYJKoZIhvcNAQkBFhdtaWtlYWxAbWlrZWFscm9n
-ZXJzLmNvbTAeFw0xMjAzMDEyMjUwNTZaFw0yMjAyMjcyMjUwNTZaMIGiMQswCQYD
-VQQGEwJVUzELMAkGA1UECBMCQ0ExEDAOBgNVBAcTB09ha2xhbmQxEDAOBgNVBAoT
-B3JlcXVlc3QxJjAkBgNVBAsTHXJlcXVlc3QgQ2VydGlmaWNhdGUgQXV0aG9yaXR5
-MRIwEAYDVQQDEwlyZXF1ZXN0Q0ExJjAkBgkqhkiG9w0BCQEWF21pa2VhbEBtaWtl
-YWxyb2dlcnMuY29tMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQC7t9pQUAK4
-5XJYTI6NrF0n3G2HZsfN+rPYSVzzL8SuVyb1tHXos+vbPm3NKI4E8X1yVAXU8CjJ
-5SqXnp4DAypAhaseho81cbhk7LXUhFz78OvAa+OD+xTAEAnNQ8tGUr4VGyplEjfD
-xsBVuqV2j8GPNTftr+drOCFlqfAgMrBn4wIDAQABMA0GCSqGSIb3DQEBBQUAA4GB
-ADVdTlVAL45R+PACNS7Gs4o81CwSclukBu4FJbxrkd4xGQmurgfRrYYKjtqiopQm
-D7ysRamS3HMN9/VKq2T7r3z1PMHPAy7zM4uoXbbaTKwlnX4j/8pGPn8Ca3qHXYlo
-88L/OOPc6Di7i7qckS3HFbXQCTiULtxWmy97oEuTwrAj
------END CERTIFICATE-----
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/ca.csr	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
------BEGIN CERTIFICATE REQUEST-----
-MIICBjCCAW8CAQAwgaIxCzAJBgNVBAYTAlVTMQswCQYDVQQIEwJDQTEQMA4GA1UE
-BxMHT2FrbGFuZDEQMA4GA1UEChMHcmVxdWVzdDEmMCQGA1UECxMdcmVxdWVzdCBD
-ZXJ0aWZpY2F0ZSBBdXRob3JpdHkxEjAQBgNVBAMTCXJlcXVlc3RDQTEmMCQGCSqG
-SIb3DQEJARYXbWlrZWFsQG1pa2VhbHJvZ2Vycy5jb20wgZ8wDQYJKoZIhvcNAQEB
-BQADgY0AMIGJAoGBALu32lBQArjlclhMjo2sXSfcbYdmx836s9hJXPMvxK5XJvW0
-deiz69s+bc0ojgTxfXJUBdTwKMnlKpeengMDKkCFqx6GjzVxuGTstdSEXPvw68Br
-44P7FMAQCc1Dy0ZSvhUbKmUSN8PGwFW6pXaPwY81N+2v52s4IWWp8CAysGfjAgMB
-AAGgIzAhBgkqhkiG9w0BCQcxFBMScGFzc3dvcmQgY2hhbGxlbmdlMA0GCSqGSIb3
-DQEBBQUAA4GBAGJO7grHeVHXetjHEK8urIxdnvfB2qeZeObz4GPKIkqUurjr0rfj
-bA3EK1kDMR5aeQWR8RunixdM16Q6Ry0lEdLVWkdSwRN9dmirIHT9cypqnD/FYOia
-SdezZ0lUzXgmJIwRYRwB1KSMMocIf52ll/xC2bEGg7/ZAEuAyAgcZV3X
------END CERTIFICATE REQUEST-----
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/ca.key	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
------BEGIN RSA PRIVATE KEY-----
-Proc-Type: 4,ENCRYPTED
-DEK-Info: DES-EDE3-CBC,C8B5887048377F02
-
-nyD5ZH0Wup2uWsDvurq5mKDaDrf8lvNn9w0SH/ZkVnfR1/bkwqrFriqJWvZNUG+q
-nS0iBYczsWLJnbub9a1zLOTENWUKVD5uqbC3aGHhnoUTNSa27DONgP8gHOn6JgR+
-GAKo01HCSTiVT4LjkwN337QKHnMP2fTzg+IoC/CigvMcq09hRLwU1/guq0GJKGwH
-gTxYNuYmQC4Tjh8vdS4liF+Ve/P3qPR2CehZrIOkDT8PHJBGQJRo4xGUIB7Tpk38
-VCk+UZ0JCS2coY8VkY/9tqFJp/ZnnQQVmaNbdRqg7ECKL+bXnNo7yjzmazPZmPe3
-/ShbE0+CTt7LrjCaQAxWbeDzqfo1lQfgN1LulTm8MCXpQaJpv7v1VhIhQ7afjMYb
-4thW/ypHPiYS2YJCAkAVlua9Oxzzh1qJoh8Df19iHtpd79Q77X/qf+1JvITlMu0U
-gi7yEatmQcmYNws1mtTC1q2DXrO90c+NZ0LK/Alse6NRL/xiUdjug2iHeTf/idOR
-Gg/5dSZbnnlj1E5zjSMDkzg6EHAFmHV4jYGSAFLEQgp4V3ZhMVoWZrvvSHgKV/Qh
-FqrAK4INr1G2+/QTd09AIRzfy3/j6yD4A9iNaOsEf9Ua7Qh6RcALRCAZTWR5QtEf
-dX+iSNJ4E85qXs0PqwkMDkoaxIJ+tmIRJY7y8oeylV8cfGAi8Soubt/i3SlR8IHC
-uDMas/2OnwafK3N7ODeE1i7r7wkzQkSHaEz0TrF8XRnP25jAICCSLiMdAAjKfxVb
-EvzsFSuAy3Jt6bU3hSLY9o4YVYKE+68ITMv9yNjvTsEiW+T+IbN34w==
------END RSA PRIVATE KEY-----
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/ca.srl	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-ADF62016AA40C9C3
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/server.cnf	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-[ req ]
-default_bits           = 1024
-days                   = 3650
-distinguished_name     = req_distinguished_name
-attributes             = req_attributes
-prompt                 = no
-
-[ req_distinguished_name ]
-C                      = US
-ST                     = CA
-L                      = Oakland
-O                      = request
-OU                     = testing
-CN                     = testing.request.mikealrogers.com
-emailAddress           = mikeal@mikealrogers.com
-
-[ req_attributes ]
-challengePassword              = password challenge
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/server.crt	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
------BEGIN CERTIFICATE-----
-MIICejCCAeMCCQCt9iAWqkDJwzANBgkqhkiG9w0BAQUFADCBojELMAkGA1UEBhMC
-VVMxCzAJBgNVBAgTAkNBMRAwDgYDVQQHEwdPYWtsYW5kMRAwDgYDVQQKEwdyZXF1
-ZXN0MSYwJAYDVQQLEx1yZXF1ZXN0IENlcnRpZmljYXRlIEF1dGhvcml0eTESMBAG
-A1UEAxMJcmVxdWVzdENBMSYwJAYJKoZIhvcNAQkBFhdtaWtlYWxAbWlrZWFscm9n
-ZXJzLmNvbTAeFw0xMjAzMDEyMjUwNTZaFw0yMjAyMjcyMjUwNTZaMIGjMQswCQYD
-VQQGEwJVUzELMAkGA1UECBMCQ0ExEDAOBgNVBAcTB09ha2xhbmQxEDAOBgNVBAoT
-B3JlcXVlc3QxEDAOBgNVBAsTB3Rlc3RpbmcxKTAnBgNVBAMTIHRlc3RpbmcucmVx
-dWVzdC5taWtlYWxyb2dlcnMuY29tMSYwJAYJKoZIhvcNAQkBFhdtaWtlYWxAbWlr
-ZWFscm9nZXJzLmNvbTBcMA0GCSqGSIb3DQEBAQUAA0sAMEgCQQDgVl0jMumvOpmM
-20W5v9yhGgZj8hPhEQF/N7yCBVBn/rWGYm70IHC8T/pR5c0LkWc5gdnCJEvKWQjh
-DBKxZD8FAgMBAAEwDQYJKoZIhvcNAQEFBQADgYEABShRkNgFbgs4vUWW9R9deNJj
-7HJoiTmvkmoOC7QzcYkjdgHbOxsSq3rBnwxsVjY9PAtPwBn0GRspOeG7KzKRgySB
-kb22LyrCFKbEOfKO/+CJc80ioK9zEPVjGsFMyAB+ftYRqM+s/4cQlTg/m89l01wC
-yapjN3RxZbInGhWR+jA=
------END CERTIFICATE-----
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/server.csr	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,11 +0,0 @@
------BEGIN CERTIFICATE REQUEST-----
-MIIBgjCCASwCAQAwgaMxCzAJBgNVBAYTAlVTMQswCQYDVQQIEwJDQTEQMA4GA1UE
-BxMHT2FrbGFuZDEQMA4GA1UEChMHcmVxdWVzdDEQMA4GA1UECxMHdGVzdGluZzEp
-MCcGA1UEAxMgdGVzdGluZy5yZXF1ZXN0Lm1pa2VhbHJvZ2Vycy5jb20xJjAkBgkq
-hkiG9w0BCQEWF21pa2VhbEBtaWtlYWxyb2dlcnMuY29tMFwwDQYJKoZIhvcNAQEB
-BQADSwAwSAJBAOBWXSMy6a86mYzbRbm/3KEaBmPyE+ERAX83vIIFUGf+tYZibvQg
-cLxP+lHlzQuRZzmB2cIkS8pZCOEMErFkPwUCAwEAAaAjMCEGCSqGSIb3DQEJBzEU
-ExJwYXNzd29yZCBjaGFsbGVuZ2UwDQYJKoZIhvcNAQEFBQADQQBD3E5WekQzCEJw
-7yOcqvtPYIxGaX8gRKkYfLPoj3pm3GF5SGqtJKhylKfi89szHXgktnQgzff9FN+A
-HidVJ/3u
------END CERTIFICATE REQUEST-----
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/server.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-var fs = require("fs")
-var https = require("https")
-var options = { key: fs.readFileSync("./server.key")
-              , cert: fs.readFileSync("./server.crt") }
-
-var server = https.createServer(options, function (req, res) {
-  res.writeHead(200)
-  res.end()
-  server.close()
-})
-server.listen(1337)
-
-var ca = fs.readFileSync("./ca.crt")
-var agent = new https.Agent({ host: "localhost", port: 1337, ca: ca })
-
-https.request({ host: "localhost"
-              , method: "HEAD"
-              , port: 1337
-              , headers: { host: "testing.request.mikealrogers.com" }
-              , agent: agent
-              , ca: [ ca ]
-              , path: "/" }, function (res) {
-  if (res.client.authorized) {
-    console.log("node test: OK")
-  } else {
-    throw new Error(res.client.authorizationError)
-  }
-}).end()
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/ca/server.key	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,9 +0,0 @@
------BEGIN RSA PRIVATE KEY-----
-MIIBOwIBAAJBAOBWXSMy6a86mYzbRbm/3KEaBmPyE+ERAX83vIIFUGf+tYZibvQg
-cLxP+lHlzQuRZzmB2cIkS8pZCOEMErFkPwUCAwEAAQJAK+r8ZM2sze8s7FRo/ApB
-iRBtO9fCaIdJwbwJnXKo4RKwZDt1l2mm+fzZ+/QaQNjY1oTROkIIXmnwRvZWfYlW
-gQIhAPKYsG+YSBN9o8Sdp1DMyZ/rUifKX3OE6q9tINkgajDVAiEA7Ltqh01+cnt0
-JEnud/8HHcuehUBLMofeg0G+gCnSbXECIQCqDvkXsWNNLnS/3lgsnvH0Baz4sbeJ
-rjIpuVEeg8eM5QIgbu0+9JmOV6ybdmmiMV4yAncoF35R/iKGVHDZCAsQzDECIQDZ
-0jGz22tlo5YMcYSqrdD3U4sds1pwiAaWFRbCunoUJw==
------END RSA PRIVATE KEY-----
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/npm-ca.crt	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
------BEGIN CERTIFICATE-----
-MIIChzCCAfACCQDauvz/KHp8ejANBgkqhkiG9w0BAQUFADCBhzELMAkGA1UEBhMC
-VVMxCzAJBgNVBAgTAkNBMRAwDgYDVQQHEwdPYWtsYW5kMQwwCgYDVQQKEwNucG0x
-IjAgBgNVBAsTGW5wbSBDZXJ0aWZpY2F0ZSBBdXRob3JpdHkxDjAMBgNVBAMTBW5w
-bUNBMRcwFQYJKoZIhvcNAQkBFghpQGl6cy5tZTAeFw0xMTA5MDUwMTQ3MTdaFw0y
-MTA5MDIwMTQ3MTdaMIGHMQswCQYDVQQGEwJVUzELMAkGA1UECBMCQ0ExEDAOBgNV
-BAcTB09ha2xhbmQxDDAKBgNVBAoTA25wbTEiMCAGA1UECxMZbnBtIENlcnRpZmlj
-YXRlIEF1dGhvcml0eTEOMAwGA1UEAxMFbnBtQ0ExFzAVBgkqhkiG9w0BCQEWCGlA
-aXpzLm1lMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDLI4tIqPpRW+ACw9GE
-OgBlJZwK5f8nnKCLK629Pv5yJpQKs3DENExAyOgDcyaF0HD0zk8zTp+ZsLaNdKOz
-Gn2U181KGprGKAXP6DU6ByOJDWmTlY6+Ad1laYT0m64fERSpHw/hjD3D+iX4aMOl
-y0HdbT5m1ZGh6SJz3ZqxavhHLQIDAQABMA0GCSqGSIb3DQEBBQUAA4GBAC4ySDbC
-l7W1WpLmtLGEQ/yuMLUf6Jy/vr+CRp4h+UzL+IQpCv8FfxsYE7dhf/bmWTEupBkv
-yNL18lipt2jSvR3v6oAHAReotvdjqhxddpe5Holns6EQd1/xEZ7sB1YhQKJtvUrl
-ZNufy1Jf1r0ldEGeA+0ISck7s+xSh9rQD2Op
------END CERTIFICATE-----
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/test.crt	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,15 +0,0 @@
------BEGIN CERTIFICATE-----
-MIICQzCCAawCCQCO/XWtRFck1jANBgkqhkiG9w0BAQUFADBmMQswCQYDVQQGEwJU
-SDEQMA4GA1UECBMHQmFuZ2tvazEOMAwGA1UEBxMFU2lsb20xGzAZBgNVBAoTElRo
-ZSBSZXF1ZXN0IE1vZHVsZTEYMBYGA1UEAxMPcmVxdWVzdC5leGFtcGxlMB4XDTEx
-MTIwMzAyMjkyM1oXDTIxMTEzMDAyMjkyM1owZjELMAkGA1UEBhMCVEgxEDAOBgNV
-BAgTB0Jhbmdrb2sxDjAMBgNVBAcTBVNpbG9tMRswGQYDVQQKExJUaGUgUmVxdWVz
-dCBNb2R1bGUxGDAWBgNVBAMTD3JlcXVlc3QuZXhhbXBsZTCBnzANBgkqhkiG9w0B
-AQEFAAOBjQAwgYkCgYEAwmctddZqlA48+NXs0yOy92DijcQV1jf87zMiYAIlNUto
-wghVbTWgJU5r0pdKrD16AptnWJTzKanhItEX8XCCPgsNkq1afgTtJP7rNkwu3xcj
-eIMkhJg/ay4ZnkbnhYdsii5VTU5prix6AqWRAhbkBgoA+iVyHyof8wvZyKBoFTMC
-AwEAATANBgkqhkiG9w0BAQUFAAOBgQB6BybMJbpeiABgihDfEVBcAjDoQ8gUMgwV
-l4NulugfKTDmArqnR9aPd4ET5jX5dkMP4bwCHYsvrcYDeWEQy7x5WWuylOdKhua4
-L4cEi2uDCjqEErIG3cc1MCOk6Cl6Ld6tkIzQSf953qfdEACRytOeUqLNQcrXrqeE
-c7U8F6MWLQ==
------END CERTIFICATE-----
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/ssl/test.key	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,15 +0,0 @@
------BEGIN RSA PRIVATE KEY-----
-MIICXgIBAAKBgQDCZy111mqUDjz41ezTI7L3YOKNxBXWN/zvMyJgAiU1S2jCCFVt
-NaAlTmvSl0qsPXoCm2dYlPMpqeEi0RfxcII+Cw2SrVp+BO0k/us2TC7fFyN4gySE
-mD9rLhmeRueFh2yKLlVNTmmuLHoCpZECFuQGCgD6JXIfKh/zC9nIoGgVMwIDAQAB
-AoGBALXFwfUf8vHTSmGlrdZS2AGFPvEtuvldyoxi9K5u8xmdFCvxnOcLsF2RsTHt
-Mu5QYWhUpNJoG+IGLTPf7RJdj/kNtEs7xXqWy4jR36kt5z5MJzqiK+QIgiO9UFWZ
-fjUb6oeDnTIJA9YFBdYi97MDuL89iU/UK3LkJN3hd4rciSbpAkEA+MCkowF5kSFb
-rkOTBYBXZfiAG78itDXN6DXmqb9XYY+YBh3BiQM28oxCeQYyFy6pk/nstnd4TXk6
-V/ryA2g5NwJBAMgRKTY9KvxJWbESeMEFe2iBIV0c26/72Amgi7ZKUCLukLfD4tLF
-+WSZdmTbbqI1079YtwaiOVfiLm45Q/3B0eUCQAaQ/0eWSGE+Yi8tdXoVszjr4GXb
-G81qBi91DMu6U1It+jNfIba+MPsiHLcZJMVb4/oWBNukN7bD1nhwFWdlnu0CQQCf
-Is9WHkdvz2RxbZDxb8verz/7kXXJQJhx5+rZf7jIYFxqX3yvTNv3wf2jcctJaWlZ
-fVZwB193YSivcgt778xlAkEAprYUz3jczjF5r2hrgbizPzPDR94tM5BTO3ki2v3w
-kbf+j2g7FNAx6kZiVN8XwfLc8xEeUGiPKwtq3ddPDFh17w==
------END RSA PRIVATE KEY-----
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-agentOptions.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-var request = require('../index')
-  , http = require('http')
-  , server = require('./server')
-  , assert = require('assert')
-  ;
-
-var s = http.createServer(function (req, resp) {
-  resp.statusCode = 200
-  resp.end('')
-}).listen(6767, function () {
-  // requests without agentOptions should use global agent
-  var r = request('http://localhost:6767', function (e, resp, body) {
-    assert.deepEqual(r.agent, http.globalAgent);
-    assert.equal(Object.keys(r.pool).length, 0);
-
-    // requests with agentOptions should apply agentOptions to new agent in pool
-    var r2 = request('http://localhost:6767', { agentOptions: { foo: 'bar' } }, function (e, resp, body) {
-      assert.deepEqual(r2.agent.options, { foo: 'bar' });
-      assert.equal(Object.keys(r2.pool).length, 1);
-	    s.close()
- 	 });
-  })
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-basic-auth.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,163 +0,0 @@
-var assert = require('assert')
-  , http = require('http')
-  , request = require('../index')
-  ;
-
-var numBasicRequests = 0;
-
-var basicServer = http.createServer(function (req, res) {
-  console.error('Basic auth server: ', req.method, req.url);
-  numBasicRequests++;
-
-  var ok;
-
-  if (req.headers.authorization) {
-    if (req.headers.authorization == 'Basic ' + new Buffer('test:testing2').toString('base64')) {
-      ok = true;
-    } else if ( req.headers.authorization == 'Basic ' + new Buffer(':apassword').toString('base64')) {
-      ok = true;
-    } else if ( req.headers.authorization == 'Basic ' + new Buffer('justauser').toString('base64')) {
-      ok = true;
-    } else {
-      // Bad auth header, don't send back WWW-Authenticate header
-      ok = false;
-    }
-  } else {
-    // No auth header, send back WWW-Authenticate header
-    ok = false;
-    res.setHeader('www-authenticate', 'Basic realm="Private"');
-  }
-
-  if (req.url == '/post/') {
-    var expectedContent = 'data_key=data_value';
-    req.on('data', function(data) {
-      assert.equal(data, expectedContent);
-      console.log('received request data: ' + data);
-    });
-    assert.equal(req.method, 'POST');
-    assert.equal(req.headers['content-length'], '' + expectedContent.length);
-    assert.equal(req.headers['content-type'], 'application/x-www-form-urlencoded; charset=utf-8');
-  }
-
-  if (ok) {
-    console.log('request ok');
-    res.end('ok');
-  } else {
-    console.log('status=401');
-    res.statusCode = 401;
-    res.end('401');
-  }
-});
-
-basicServer.listen(6767);
-
-var tests = [
-  function(next) {
-    request({
-      'method': 'GET',
-      'uri': 'http://localhost:6767/test/',
-      'auth': {
-        'user': 'test',
-        'pass': 'testing2',
-        'sendImmediately': false
-      }
-    }, function(error, res, body) {
-      assert.equal(res.statusCode, 200);
-      assert.equal(numBasicRequests, 2);
-      next();
-    });
-  },
-
-  function(next) {
-    // If we don't set sendImmediately = false, request will send basic auth
-    request({
-      'method': 'GET',
-      'uri': 'http://localhost:6767/test2/',
-      'auth': {
-        'user': 'test',
-        'pass': 'testing2'
-      }
-    }, function(error, res, body) {
-      assert.equal(res.statusCode, 200);
-      assert.equal(numBasicRequests, 3);
-      next();
-    });
-  },
-
-  function(next) {
-    request({
-      'method': 'GET',
-      'uri': 'http://test:testing2@localhost:6767/test2/'
-    }, function(error, res, body) {
-      assert.equal(res.statusCode, 200);
-      assert.equal(numBasicRequests, 4);
-      next();
-    });
-  },
-
-  function(next) {
-    request({
-      'method': 'POST',
-      'form': { 'data_key': 'data_value' },
-      'uri': 'http://localhost:6767/post/',
-      'auth': {
-        'user': 'test',
-        'pass': 'testing2',
-        'sendImmediately': false
-      }
-    }, function(error, res, body) {
-      assert.equal(res.statusCode, 200);
-      assert.equal(numBasicRequests, 6);
-      next();
-    });
-  },
-
-  function(next) {
-    assert.doesNotThrow( function() {
-      request({
-        'method': 'GET',
-        'uri': 'http://localhost:6767/allow_empty_user/',
-        'auth': {
-          'user': '',
-          'pass': 'apassword',
-          'sendImmediately': false
-        }
-      }, function(error, res, body ) {
-        assert.equal(res.statusCode, 200);
-        assert.equal(numBasicRequests, 8);
-        next();
-      });
-    })
-  },
-
-  function(next) {
-    assert.doesNotThrow( function() {
-      request({
-        'method': 'GET',
-        'uri': 'http://localhost:6767/allow_undefined_password/',
-        'auth': {
-          'user': 'justauser',
-          'pass': undefined,
-          'sendImmediately': false
-        }
-      }, function(error, res, body ) {
-        assert.equal(res.statusCode, 200);
-        assert.equal(numBasicRequests, 10);
-        next();
-      });
-    })
-  }
-];
-
-function runTest(i) {
-  if (i < tests.length) {
-    tests[i](function() {
-      runTest(i + 1);
-    });
-  } else {
-    console.log('All tests passed');
-    basicServer.close();
-  }
-}
-
-runTest(0);
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-body.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,122 +0,0 @@
-var server = require('./server')
-  , events = require('events')
-  , stream = require('stream')
-  , assert = require('assert')
-  , request = require('../index')
-  ;
-
-var s = server.createServer();
-
-var tests =
-  { testGet :
-    { resp : server.createGetResponse("TESTING!")
-    , expectBody: "TESTING!"
-    }
-  , testGetChunkBreak :
-    { resp : server.createChunkResponse(
-      [ new Buffer([239])
-      , new Buffer([163])
-      , new Buffer([191])
-      , new Buffer([206])
-      , new Buffer([169])
-      , new Buffer([226])
-      , new Buffer([152])
-      , new Buffer([131])
-      ])
-    , expectBody: "Ω☃"
-    }
-  , testGetBuffer :
-    { resp : server.createGetResponse(new Buffer("TESTING!"))
-    , encoding: null
-    , expectBody: new Buffer("TESTING!")
-    }
-  , testGetEncoding :
-    { resp : server.createGetResponse(new Buffer('efa3bfcea9e29883', 'hex'))
-    , encoding: 'hex'
-    , expectBody: "efa3bfcea9e29883"
-    }
-  , testGetUTF8:
-     { resp: server.createGetResponse(new Buffer([0xEF, 0xBB, 0xBF, 226, 152, 131]))
-     , encoding: "utf8"
-     , expectBody: "☃"
-     }
-  , testGetJSON :
-     { resp : server.createGetResponse('{"test":true}', 'application/json')
-     , json : true
-     , expectBody: {"test":true}
-     }
-  , testPutString :
-    { resp : server.createPostValidator("PUTTINGDATA")
-    , method : "PUT"
-    , body : "PUTTINGDATA"
-    }
-  , testPutBuffer :
-    { resp : server.createPostValidator("PUTTINGDATA")
-    , method : "PUT"
-    , body : new Buffer("PUTTINGDATA")
-    }
-  , testPutJSON :
-    { resp : server.createPostValidator(JSON.stringify({foo: 'bar'}))
-    , method: "PUT"
-    , json: {foo: 'bar'}
-    }
-  , testPutMultipart :
-    { resp: server.createPostValidator(
-        '--__BOUNDARY__\r\n' +
-        'content-type: text/html\r\n' +
-        '\r\n' +
-        '<html><body>Oh hi.</body></html>' +
-        '\r\n--__BOUNDARY__\r\n\r\n' +
-        'Oh hi.' +
-        '\r\n--__BOUNDARY__--'
-        )
-    , method: "PUT"
-    , multipart:
-      [ {'content-type': 'text/html', 'body': '<html><body>Oh hi.</body></html>'}
-      , {'body': 'Oh hi.'}
-      ]
-    }
-  , testPutMultipartPreambleCRLF :
-    { resp: server.createPostValidator(
-        '\r\n--__BOUNDARY__\r\n' +
-        'content-type: text/html\r\n' +
-        '\r\n' +
-        '<html><body>Oh hi.</body></html>' +
-        '\r\n--__BOUNDARY__\r\n\r\n' +
-        'Oh hi.' +
-        '\r\n--__BOUNDARY__--'
-        )
-    , method: "PUT"
-    , preambleCRLF: true
-    , multipart:
-      [ {'content-type': 'text/html', 'body': '<html><body>Oh hi.</body></html>'}
-      , {'body': 'Oh hi.'}
-      ]
-    }
-  }
-
-s.listen(s.port, function () {
-
-  var counter = 0
-
-  for (i in tests) {
-    (function () {
-      var test = tests[i]
-      s.on('/'+i, test.resp)
-      test.uri = s.url + '/' + i
-      request(test, function (err, resp, body) {
-        if (err) throw err
-        if (test.expectBody) {
-          assert.deepEqual(test.expectBody, body)
-        }
-        counter = counter - 1;
-        if (counter === 0) {
-          console.log(Object.keys(tests).length+" tests passed.")
-          s.close()
-        }
-      })
-      counter++
-    })()
-  }
-})
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-defaults.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,129 +0,0 @@
-var server = require('./server')
-  , assert = require('assert')
-  , request = require('../index')
-  ;
-
-var s = server.createServer();
-
-s.listen(s.port, function () {
-  var counter = 0;
-  s.on('/get', function (req, resp) {
-    assert.equal(req.headers.foo, 'bar');
-    assert.equal(req.method, 'GET')
-    resp.writeHead(200, {'Content-Type': 'text/plain'});
-    resp.end('TESTING!');
-  });
-
-  // test get(string, function)
-  request.defaults({headers:{foo:"bar"}})(s.url + '/get', function (e, r, b){
-    if (e) throw e;
-    assert.deepEqual("TESTING!", b);
-    counter += 1;
-  });
-
-  s.on('/post', function (req, resp) {
-    assert.equal(req.headers.foo, 'bar');
-    assert.equal(req.headers['content-type'], null);
-    assert.equal(req.method, 'POST')
-    resp.writeHead(200, {'Content-Type': 'application/json'});
-    resp.end(JSON.stringify({foo:'bar'}));
-  });
-
-  // test post(string, object, function)
-  request.defaults({headers:{foo:"bar"}}).post(s.url + '/post', {json: true}, function (e, r, b){
-    if (e) throw e;
-    assert.deepEqual('bar', b.foo);
-    counter += 1;
-  });
-
-  s.on('/patch', function (req, resp) {
-    assert.equal(req.headers.foo, 'bar');
-    assert.equal(req.headers['content-type'], null);
-    assert.equal(req.method, 'PATCH')
-    resp.writeHead(200, {'Content-Type': 'application/json'});
-    resp.end(JSON.stringify({foo:'bar'}));
-  });
-
-  // test post(string, object, function)
-  request.defaults({headers:{foo:"bar"}}).patch(s.url + '/patch', {json: true}, function (e, r, b){
-    if (e) throw e;
-    assert.deepEqual('bar', b.foo);
-    counter += 1;
-  });
-
-  s.on('/post-body', function (req, resp) {
-    assert.equal(req.headers.foo, 'bar');
-    assert.equal(req.headers['content-type'], 'application/json');
-    assert.equal(req.method, 'POST')
-    resp.writeHead(200, {'Content-Type': 'application/json'});
-    resp.end(JSON.stringify({foo:'bar'}));
-  });
-
-  // test post(string, object, function) with body
-  request.defaults({headers:{foo:"bar"}}).post(s.url + '/post-body', {json: true, body:{bar:"baz"}}, function (e, r, b){
-    if (e) throw e;
-    assert.deepEqual('bar', b.foo);
-    counter += 1;
-  });
-
-  s.on('/del', function (req, resp) {
-    assert.equal(req.headers.foo, 'bar');
-    assert.equal(req.method, 'DELETE')
-    resp.writeHead(200, {'Content-Type': 'application/json'});
-    resp.end(JSON.stringify({foo:'bar'}));
-  });
-
-  // test .del(string, function)
-  request.defaults({headers:{foo:"bar"}, json:true}).del(s.url + '/del', function (e, r, b){
-    if (e) throw e;
-    assert.deepEqual('bar', b.foo);
-    counter += 1;
-  });
-
-  s.on('/head', function (req, resp) {
-    assert.equal(req.headers.foo, 'bar');
-    assert.equal(req.method, 'HEAD')
-    resp.writeHead(200, {'Content-Type': 'text/plain'});
-    resp.end();
-  });
-
-  // test head.(object, function)
-  request.defaults({headers:{foo:"bar"}}).head({uri: s.url + '/head'}, function (e, r, b){
-    if (e) throw e;
-    counter += 1;
-  });
-
-  s.on('/get_custom', function(req, resp) {
-    assert.equal(req.headers.foo, 'bar');
-    assert.equal(req.headers.x, 'y');
-    resp.writeHead(200, {'Content-Type': 'text/plain'});
-    resp.end();
-  });
-
-  // test custom request handler function
-  var defaultRequest = request.defaults({
-    headers:{foo:"bar"}
-    , body: 'TESTING!'
-  }, function(uri, options, callback) {
-    var params = request.initParams(uri, options, callback);
-    options = params.options;
-    options.headers.x = 'y';
-
-    return request(params.uri, params.options, params.callback);
-  });
-
-  var msg = 'defaults test failed. head request should throw earlier';
-  assert.throws(function() {
-    defaultRequest.head(s.url + '/get_custom', function(e, r, b) {
-      throw new Error(msg);
-    });
-    counter+=1;
-  }, msg);
-
-  defaultRequest.get(s.url + '/get_custom', function(e, r, b) {
-    if(e) throw e;
-    counter += 1;
-    console.log(counter.toString() + " tests passed.");
-    s.close();
-  });
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-digest-auth.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,69 +0,0 @@
-var assert = require('assert')
-  , http = require('http')
-  , request = require('../index')
-  ;
-
-// Test digest auth
-// Using header values captured from interaction with Apache
-
-var numDigestRequests = 0;
-
-var digestServer = http.createServer(function (req, res) {
-  console.error('Digest auth server: ', req.method, req.url);
-  numDigestRequests++;
-
-  var ok;
-
-  if (req.headers.authorization) {
-    if (req.headers.authorization == 'Digest username="test", realm="Private", nonce="WpcHS2/TBAA=dffcc0dbd5f96d49a5477166649b7c0ae3866a93", uri="/test/", qop="auth", response="54753ce37c10cb20b09b769f0bed730e", nc="1", cnonce=""') {
-      ok = true;
-    } else {
-      // Bad auth header, don't send back WWW-Authenticate header
-      ok = false;
-    }
-  } else {
-    // No auth header, send back WWW-Authenticate header
-    ok = false;
-    res.setHeader('www-authenticate', 'Digest realm="Private", nonce="WpcHS2/TBAA=dffcc0dbd5f96d49a5477166649b7c0ae3866a93", algorithm=MD5, qop="auth"');
-  }
-
-  if (ok) {
-    console.log('request ok');
-    res.end('ok');
-  } else {
-    console.log('status=401');
-    res.statusCode = 401;
-    res.end('401');
-  }
-});
-
-digestServer.listen(6767);
-
-request({
-  'method': 'GET',
-  'uri': 'http://localhost:6767/test/',
-  'auth': {
-    'user': 'test',
-    'pass': 'testing',
-    'sendImmediately': false
-  }
-}, function(error, response, body) {
-  assert.equal(response.statusCode, 200);
-  assert.equal(numDigestRequests, 2);
-
-  // If we don't set sendImmediately = false, request will send basic auth
-  request({
-    'method': 'GET',
-    'uri': 'http://localhost:6767/test/',
-    'auth': {
-      'user': 'test',
-      'pass': 'testing'
-    }
-  }, function(error, response, body) {
-    assert.equal(response.statusCode, 401);
-    assert.equal(numDigestRequests, 3);
-
-    console.log('All tests passed');
-    digestServer.close();
-  });
-});
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-emptyBody.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,20 +0,0 @@
-var request = require('../index')
-  , http = require('http')
-  , assert = require('assert')
-  ;
-
-var s = http.createServer(function (req, resp) {
-  resp.statusCode = 200
-  resp.end('')
-}).listen(8080, function () {
-  var r = request('http://localhost:8080', function (e, resp, body) {
-    assert.equal(resp.statusCode, 200)
-    assert.equal(body, "")
-
-    var r2 = request({ url: 'http://localhost:8080', json: {} }, function (e, resp, body) {
-	    assert.equal(resp.statusCode, 200)
-	    assert.equal(body, undefined)
-	    s.close()
- 	 });
-  })
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-errors.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,37 +0,0 @@
-var server = require('./server')
-  , events = require('events')
-  , assert = require('assert')
-  , request = require('../index')
-  ;
-
-var local = 'http://localhost:8888/asdf'
-
-try {
-  request({uri:local, body:{}})
-  assert.fail("Should have thrown")
-} catch(e) {
-  assert.equal(e.message, 'Argument error, options.body.')
-}
-
-try {
-  request({uri:local, multipart: 'foo'})
-  assert.fail("Should have thrown")
-} catch(e) {
-  assert.equal(e.message, 'Argument error, options.multipart.')
-}
-
-try {
-  request({uri:local, multipart: [{}]})
-  assert.fail("Should have thrown")
-} catch(e) {
-  assert.equal(e.message, 'Body attribute missing in multipart.')
-}
-
-try {
-  request(local, {multipart: [{}]})
-  assert.fail("Should have thrown")
-} catch(e) {
-  assert.equal(e.message, 'Body attribute missing in multipart.')
-}
-
-console.log("All tests passed.")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-follow-all-303.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,30 +0,0 @@
-var request = require('../index');
-var http = require('http');
-var requests = 0;
-var assert = require('assert');
-
-var server = http.createServer(function (req, res) {
-  console.error(req.method, req.url);
-  requests ++;
-
-  if (req.method === 'POST') {
-    console.error('send 303');
-    res.setHeader('location', req.url);
-    res.statusCode = 303;
-    res.end('try again, i guess\n');
-  } else {
-    console.error('send 200')
-    res.end('ok: ' + requests);
-  }
-});
-server.listen(6767);
-
-request.post({ url: 'http://localhost:6767/foo',
-               followAllRedirects: true,
-               form: { foo: 'bar' } }, function (er, req, body) {
-  if (er) throw er;
-  assert.equal(body, 'ok: 2');
-  assert.equal(requests, 2);
-  console.error('ok - ' + process.version);
-  server.close();
-});
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-follow-all.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,36 +0,0 @@
-var request = require('../index');
-var http = require('http');
-var requests = 0;
-var assert = require('assert');
-
-var server = http.createServer(function (req, res) {
-  requests ++;
-
-  // redirect everything 3 times, no matter what.
-  var c = req.headers.cookie;
-
-  if (!c) c = 0;
-  else c = +c.split('=')[1] || 0;
-
-  if (c > 3) {
-    res.end('ok: '+requests);
-    return;
-  }
-
-  res.setHeader('set-cookie', 'c=' + (c + 1));
-  res.setHeader('location', req.url);
-  res.statusCode = 302;
-  res.end('try again, i guess\n');
-});
-server.listen(6767);
-
-request.post({ url: 'http://localhost:6767/foo',
-               followAllRedirects: true,
-               jar: true,
-               form: { foo: 'bar' } }, function (er, req, body) {
-  if (er) throw er;
-  assert.equal(body, 'ok: 5');
-  assert.equal(requests, 5);
-  console.error('ok - ' + process.version);
-  server.close();
-});
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-form.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,79 +0,0 @@
-var assert = require('assert')
-var http = require('http');
-var path = require('path');
-var mime = require('mime');
-var request = require('../index');
-var fs = require('fs');
-
-var remoteFile = 'http://nodejs.org/images/logo.png';
-
-var FIELDS = [
-  {name: 'my_field', value: 'my_value'},
-  {name: 'my_buffer', value: new Buffer([1, 2, 3])},
-  {name: 'my_file', value: fs.createReadStream(__dirname + '/unicycle.jpg')},
-  {name: 'remote_file', value: request(remoteFile) }
-];
-
-var server = http.createServer(function(req, res) {
-
-  // temp workaround
-  var data = '';
-  req.setEncoding('utf8');
-
-  req.on('data', function(d) {
-    data += d;
-  });
-
-  req.on('end', function() {
-    // check for the fields' traces
-
-    // 1st field : my_field
-    var field = FIELDS.shift();
-    assert.ok( data.indexOf('form-data; name="'+field.name+'"') != -1 );
-    assert.ok( data.indexOf(field.value) != -1 );
-
-    // 2nd field : my_buffer
-    var field = FIELDS.shift();
-    assert.ok( data.indexOf('form-data; name="'+field.name+'"') != -1 );
-    assert.ok( data.indexOf(field.value) != -1 );
-
-    // 3rd field : my_file
-    var field = FIELDS.shift();
-    assert.ok( data.indexOf('form-data; name="'+field.name+'"') != -1 );
-    assert.ok( data.indexOf('; filename="'+path.basename(field.value.path)+'"') != -1 );
-    // check for unicycle.jpg traces
-    assert.ok( data.indexOf('2005:06:21 01:44:12') != -1 );
-    assert.ok( data.indexOf('Content-Type: '+mime.lookup(field.value.path) ) != -1 );
-
-    // 4th field : remote_file
-    var field = FIELDS.shift();
-    assert.ok( data.indexOf('form-data; name="'+field.name+'"') != -1 );
-    assert.ok( data.indexOf('; filename="'+path.basename(field.value.path)+'"') != -1 );
-    // check for http://nodejs.org/images/logo.png traces
-    assert.ok( data.indexOf('ImageReady') != -1 );
-    assert.ok( data.indexOf('Content-Type: '+mime.lookup(remoteFile) ) != -1 );
-
-    res.writeHead(200);
-    res.end('done');
-
-  });
-
-
-});
-
-server.listen(8080, function() {
-
-  var req = request.post('http://localhost:8080/upload', function () {
-    server.close();
-  })
-  var form = req.form()
-  
-  FIELDS.forEach(function(field) {
-    form.append(field.name, field.value);
-  });
-
-});
-
-process.on('exit', function() {
-  assert.strictEqual(FIELDS.length, 0);
-});
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-hawk.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,33 +0,0 @@
-var createServer = require('http').createServer
-  , request = require('../index')
-  , hawk = require('hawk')
-  , assert = require('assert')
-  ;
-
-var server = createServer(function (req, resp) {
-  
-  var getCred = function (id, callback) {
-    assert.equal(id, 'dh37fgj492je')
-    var credentials = 
-      { key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn'
-      , algorithm: 'sha256'
-      , user: 'Steve'
-      }
-    return callback(null, credentials)
-  }
-
-  hawk.server.authenticate(req, getCred, {}, function (err, credentials, attributes) {
-    resp.writeHead(!err ? 200 : 401, { 'Content-Type': 'text/plain' })
-    resp.end(!err ? 'Hello ' + credentials.user : 'Shoosh!')
-  })
-  
-})
-
-server.listen(8080, function () {
-  var creds = {key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn', algorithm: 'sha256', id:'dh37fgj492je'}
-  request('http://localhost:8080', {hawk:{credentials:creds}}, function (e, r, b) {
-    assert.equal(200, r.statusCode)
-    assert.equal(b, 'Hello Steve')
-    server.close()
-  })
-})
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-headers.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,52 +0,0 @@
-var server = require('./server')
-  , assert = require('assert')
-  , request = require('../index')
-  , Cookie = require('cookie-jar')
-  , Jar = Cookie.Jar
-  , s = server.createServer()
-
-s.listen(s.port, function () {
-  var serverUri = 'http://localhost:' + s.port
-    , numTests = 0
-    , numOutstandingTests = 0
-
-  function createTest(requestObj, serverAssertFn) {
-    var testNumber = numTests;
-    numTests += 1;
-    numOutstandingTests += 1;
-    s.on('/' + testNumber, function (req, res) {
-      serverAssertFn(req, res);
-      res.writeHead(200);
-      res.end();
-    });
-    requestObj.url = serverUri + '/' + testNumber
-    request(requestObj, function (err, res, body) {
-      assert.ok(!err)
-      assert.equal(res.statusCode, 200)
-      numOutstandingTests -= 1
-      if (numOutstandingTests === 0) {
-        console.log(numTests + ' tests passed.')
-        s.close()
-      }
-    })
-  }
-
-  // Issue #125: headers.cookie shouldn't be replaced when a cookie jar isn't specified
-  createTest({headers: {cookie: 'foo=bar'}}, function (req, res) {
-    assert.ok(req.headers.cookie)
-    assert.equal(req.headers.cookie, 'foo=bar')
-  })
-
-  // Issue #125: headers.cookie + cookie jar
-  var jar = new Jar()
-  jar.add(new Cookie('quux=baz'));
-  createTest({jar: jar, headers: {cookie: 'foo=bar'}}, function (req, res) {
-    assert.ok(req.headers.cookie)
-    assert.equal(req.headers.cookie, 'foo=bar; quux=baz')
-  })
-
-  // There should be no cookie header when neither headers.cookie nor a cookie jar is specified
-  createTest({}, function (req, res) {
-    assert.ok(!req.headers.cookie)
-  })
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-http-signature.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,106 +0,0 @@
-var createServer = require('http').createServer
-  , request = require('../index')
-  , httpSignature = require('http-signature')
-  , assert = require('assert')
-  ;
-
-var privateKeyPEMs = {}
-
-privateKeyPEMs['key-1'] =
-  '-----BEGIN RSA PRIVATE KEY-----\n' +
-  'MIIEpAIBAAKCAQEAzWSJl+Z9Bqv00FVL5N3+JCUoqmQPjIlya1BbeqQroNQ5yG1i\n' +
-  'VbYTTnMRa1zQtR6r2fNvWeg94DvxivxIG9diDMnrzijAnYlTLOl84CK2vOxkj5b6\n' +
-  '8zrLH9b/Gd6NOHsywo8IjvXvCeTfca5WUHcuVi2lT9VjygFs1ILG4RyeX1BXUumu\n' +
-  'Y8fzmposxLYdMxCqUTzAn0u9Saq2H2OVj5u114wS7OQPigu6G99dpn/iPHa3zBm8\n' +
-  '7baBWDbqZWRW0BP3K6eqq8sut1+NLhNW8ADPTdnO/SO+kvXy7fqd8atSn+HlQcx6\n' +
-  'tW42dhXf3E9uE7K78eZtW0KvfyNGAjsI1Fft2QIDAQABAoIBAG1exe3/LEBrPLfb\n' +
-  'U8iRdY0lxFvHYIhDgIwohC3wUdMYb5SMurpNdEZn+7Sh/fkUVgp/GKJViu1mvh52\n' +
-  'bKd2r52DwG9NQBQjVgkqY/auRYSglIPpr8PpYNSZlcneunCDGeqEY9hMmXc5Ssqs\n' +
-  'PQYoEKKPN+IlDTg6PguDgAfLR4IUvt9KXVvmB/SSgV9tSeTy35LECt1Lq3ozbUgu\n' +
-  '30HZI3U6/7H+X22Pxxf8vzBtzkg5rRCLgv+OeNPo16xMnqbutt4TeqEkxRv5rtOo\n' +
-  '/A1i9khBeki0OJAFJsE82qnaSZodaRsxic59VnN8sWBwEKAt87tEu5A3K3j4XSDU\n' +
-  '/avZxAECgYEA+pS3DvpiQLtHlaO3nAH6MxHRrREOARXWRDe5nUQuUNpS1xq9wte6\n' +
-  'DkFtba0UCvDLic08xvReTCbo9kH0y6zEy3zMpZuJlKbcWCkZf4S5miYPI0RTZtF8\n' +
-  'yps6hWqzYFSiO9hMYws9k4OJLxX0x3sLK7iNZ32ujcSrkPBSiBr0gxkCgYEA0dWl\n' +
-  '637K41AJ/zy0FP0syq+r4eIkfqv+/t6y2aQVUBvxJYrj9ci6XHBqoxpDV8lufVYj\n' +
-  'fUAfeI9/MZaWvQJRbnYLre0I6PJfLuCBIL5eflO77BGso165AF7QJZ+fwtgKv3zv\n' +
-  'ZX75eudCSS/cFo0po9hlbcLMT4B82zEkgT8E2MECgYEAnz+3/wrdOmpLGiyL2dff\n' +
-  '3GjsqmJ2VfY8z+niSrI0BSpbD11tT9Ct67VlCBjA7hsOH6uRfpd6/kaUMzzDiFVq\n' +
-  'VDAiFvV8QD6zNkwYalQ9aFvbrvwTTPrBpjl0vamMCiJ/YC0cjq1sGr2zh3sar1Ph\n' +
-  'S43kP+s97dcZeelhaiJHVrECgYEAsx61q/loJ/LDFeYzs1cLTVn4V7I7hQY9fkOM\n' +
-  'WM0AhInVqD6PqdfXfeFYpjJdGisQ7l0BnoGGW9vir+nkcyPvb2PFRIr6+B8tsU5j\n' +
-  '7BeVgjDoUfQkcrEBK5fEBtnj/ud9BUkY8oMZZBjVNLRuI7IMwZiPvMp0rcj4zAN/\n' +
-  'LfUlpgECgYArBvFcBxSkNAzR3Rtteud1YDboSKluRM37Ey5plrn4BS0DD0jm++aD\n' +
-  '0pG2Hsik000hibw92lCkzvvBVAqF8BuAcnPlAeYfsOaa97PGEjSKEN5bJVWZ9/om\n' +
-  '9FV1axotRN2XWlwrhixZLEaagkREXhgQc540FS5O8IaI2Vpa80Atzg==\n' +
-  '-----END RSA PRIVATE KEY-----'
-
-var publicKeyPEMs = {}
-
-publicKeyPEMs['key-1'] =
-  '-----BEGIN PUBLIC KEY-----\n' +
-  'MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAzWSJl+Z9Bqv00FVL5N3+\n' +
-  'JCUoqmQPjIlya1BbeqQroNQ5yG1iVbYTTnMRa1zQtR6r2fNvWeg94DvxivxIG9di\n' +
-  'DMnrzijAnYlTLOl84CK2vOxkj5b68zrLH9b/Gd6NOHsywo8IjvXvCeTfca5WUHcu\n' +
-  'Vi2lT9VjygFs1ILG4RyeX1BXUumuY8fzmposxLYdMxCqUTzAn0u9Saq2H2OVj5u1\n' +
-  '14wS7OQPigu6G99dpn/iPHa3zBm87baBWDbqZWRW0BP3K6eqq8sut1+NLhNW8ADP\n' +
-  'TdnO/SO+kvXy7fqd8atSn+HlQcx6tW42dhXf3E9uE7K78eZtW0KvfyNGAjsI1Fft\n' +
-  '2QIDAQAB\n' +
-  '-----END PUBLIC KEY-----'
-
-publicKeyPEMs['key-2'] =
-  '-----BEGIN PUBLIC KEY-----\n' +
-  'MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAqp04VVr9OThli9b35Omz\n' +
-  'VqSfWbsoQuRrgyWsrNRn3XkFmbWw4FzZwQ42OgGMzQ84Ta4d9zGKKQyFriTiPjPf\n' +
-  'xhhrsaJnDuybcpVhcr7UNKjSZ0S59tU3hpRiEz6hO+Nc/OSSLkvalG0VKrxOln7J\n' +
-  'LK/h3rNS/l6wDZ5S/KqsI6CYtV2ZLpn3ahLrizvEYNY038Qcm38qMWx+VJAvZ4di\n' +
-  'qqmW7RLIsLT59SWmpXdhFKnkYYGhxrk1Mwl22dBTJNY5SbriU5G3gWgzYkm8pgHr\n' +
-  '6CtrXch9ciJAcDJehPrKXNvNDOdUh8EW3fekNJerF1lWcwQg44/12v8sDPyfbaKB\n' +
-  'dQIDAQAB\n' +
-  '-----END PUBLIC KEY-----'
-
-var server = createServer(function (req, res) {
-  var parsed = httpSignature.parseRequest(req)
-  var publicKeyPEM = publicKeyPEMs[parsed.keyId]
-  var verified = httpSignature.verifySignature(parsed, publicKeyPEM)
-  res.writeHead(verified ? 200 : 400)
-  res.end()
-})
-
-server.listen(8080, function () {
-  function correctKeyTest(callback) {
-    var options = {
-      httpSignature: {
-        keyId: 'key-1',
-        key: privateKeyPEMs['key-1']
-      }
-    }
-    request('http://localhost:8080', options, function (e, r, b) {
-      assert.equal(200, r.statusCode)
-      callback()
-    })
-  }
-
-  function incorrectKeyTest(callback) {
-    var options = {
-      httpSignature: {
-        keyId: 'key-2',
-        key: privateKeyPEMs['key-1']
-      }
-    }
-    request('http://localhost:8080', options, function (e, r, b) {
-      assert.equal(400, r.statusCode)
-      callback()
-    })
-  }
-
-  var tests = [correctKeyTest, incorrectKeyTest]
-  var todo = tests.length;
-  for(var i = 0; i < tests.length; ++i) {
-    tests[i](function() {
-      if(!--todo) {
-        server.close()
-      }
-    })
-  }
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-httpModule.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,94 +0,0 @@
-var http = require('http')
-  , https = require('https')
-  , server = require('./server')
-  , assert = require('assert')
-  , request = require('../index')
-
-
-var faux_requests_made = {'http':0, 'https':0}
-function wrap_request(name, module) {
-  // Just like the http or https module, but note when a request is made.
-  var wrapped = {}
-  Object.keys(module).forEach(function(key) {
-    var value = module[key];
-
-    if(key != 'request')
-      wrapped[key] = value;
-    else
-      wrapped[key] = function(options, callback) {
-        faux_requests_made[name] += 1
-        return value.apply(this, arguments)
-      }
-  })
-
-  return wrapped;
-}
-
-
-var faux_http = wrap_request('http', http)
-  , faux_https = wrap_request('https', https)
-  , plain_server = server.createServer()
-  , https_server = server.createSSLServer()
-
-
-plain_server.listen(plain_server.port, function() {
-  plain_server.on('/plain', function (req, res) {
-    res.writeHead(200)
-    res.end('plain')
-  })
-  plain_server.on('/to_https', function (req, res) {
-    res.writeHead(301, {'location':'https://localhost:'+https_server.port + '/https'})
-    res.end()
-  })
-
-  https_server.listen(https_server.port, function() {
-    https_server.on('/https', function (req, res) {
-      res.writeHead(200)
-      res.end('https')
-    })
-    https_server.on('/to_plain', function (req, res) {
-      res.writeHead(302, {'location':'http://localhost:'+plain_server.port + '/plain'})
-      res.end()
-    })
-
-    run_tests()
-    run_tests({})
-    run_tests({'http:':faux_http})
-    run_tests({'https:':faux_https})
-    run_tests({'http:':faux_http, 'https:':faux_https})
-  })
-})
-
-function run_tests(httpModules) {
-  var to_https = 'http://localhost:'+plain_server.port+'/to_https'
-  var to_plain = 'https://localhost:'+https_server.port+'/to_plain'
-
-  request(to_https, {'httpModules':httpModules, strictSSL:false}, function (er, res, body) {
-    if (er) throw er
-    assert.equal(body, 'https', 'Received HTTPS server body')
-    done()
-  })
-
-  request(to_plain, {'httpModules':httpModules, strictSSL:false}, function (er, res, body) {
-    if (er) throw er
-    assert.equal(body, 'plain', 'Received plain HTTP server body')
-    done()
-  })
-}
-
-
-var passed = 0;
-function done() {
-  passed += 1
-  var expected = 10
-
-  if(passed == expected) {
-    plain_server.close()
-    https_server.close()
-
-    assert.equal(faux_requests_made.http, 4, 'Wrapped http module called appropriately')
-    assert.equal(faux_requests_made.https, 4, 'Wrapped https module called appropriately')
-
-    console.log((expected+2) + ' tests passed.')
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-https-strict.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,97 +0,0 @@
-// a test where we validate the signature of the keys;
-// otherwise exactly the same as the ssl test
-
-var server = require('./server')
-  , assert = require('assert')
-  , request = require('../index')
-  , fs = require('fs')
-  , path = require('path')
-  , opts = { key: path.resolve(__dirname, 'ssl/ca/server.key')
-           , cert: path.resolve(__dirname, 'ssl/ca/server.crt') }
-  , s = server.createSSLServer(null, opts)
-  , caFile = path.resolve(__dirname, 'ssl/ca/ca.crt')
-  , ca = fs.readFileSync(caFile)
-
-var tests =
-  { testGet :
-    { resp : server.createGetResponse("TESTING!")
-    , expectBody: "TESTING!"
-    }
-  , testGetChunkBreak :
-    { resp : server.createChunkResponse(
-      [ new Buffer([239])
-      , new Buffer([163])
-      , new Buffer([191])
-      , new Buffer([206])
-      , new Buffer([169])
-      , new Buffer([226])
-      , new Buffer([152])
-      , new Buffer([131])
-      ])
-    , expectBody: "Ω☃"
-    }
-  , testGetJSON :
-    { resp : server.createGetResponse('{"test":true}', 'application/json')
-    , json : true
-    , expectBody: {"test":true}
-    }
-  , testPutString :
-    { resp : server.createPostValidator("PUTTINGDATA")
-    , method : "PUT"
-    , body : "PUTTINGDATA"
-    }
-  , testPutBuffer :
-    { resp : server.createPostValidator("PUTTINGDATA")
-    , method : "PUT"
-    , body : new Buffer("PUTTINGDATA")
-    }
-  , testPutJSON :
-    { resp : server.createPostValidator(JSON.stringify({foo: 'bar'}))
-    , method: "PUT"
-    , json: {foo: 'bar'}
-    }
-  , testPutMultipart :
-    { resp: server.createPostValidator(
-        '--__BOUNDARY__\r\n' +
-        'content-type: text/html\r\n' +
-        '\r\n' +
-        '<html><body>Oh hi.</body></html>' +
-        '\r\n--__BOUNDARY__\r\n\r\n' +
-        'Oh hi.' +
-        '\r\n--__BOUNDARY__--'
-        )
-    , method: "PUT"
-    , multipart:
-      [ {'content-type': 'text/html', 'body': '<html><body>Oh hi.</body></html>'}
-      , {'body': 'Oh hi.'}
-      ]
-    }
-  }
-
-s.listen(s.port, function () {
-
-  var counter = 0
-
-  for (i in tests) {
-    (function () {
-      var test = tests[i]
-      s.on('/'+i, test.resp)
-      test.uri = s.url + '/' + i
-      test.strictSSL = true
-      test.ca = ca
-      test.headers = { host: 'testing.request.mikealrogers.com' }
-      request(test, function (err, resp, body) {
-        if (err) throw err
-        if (test.expectBody) {
-          assert.deepEqual(test.expectBody, body)
-        }
-        counter = counter - 1;
-        if (counter === 0) {
-          console.log(Object.keys(tests).length+" tests passed.")
-          s.close()
-        }
-      })
-      counter++
-    })()
-  }
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-https.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,87 +0,0 @@
-var server = require('./server')
-  , assert = require('assert')
-  , request = require('../index')
-
-var s = server.createSSLServer();
-
-var tests =
-  { testGet :
-    { resp : server.createGetResponse("TESTING!")
-    , expectBody: "TESTING!"
-    }
-  , testGetChunkBreak :
-    { resp : server.createChunkResponse(
-      [ new Buffer([239])
-      , new Buffer([163])
-      , new Buffer([191])
-      , new Buffer([206])
-      , new Buffer([169])
-      , new Buffer([226])
-      , new Buffer([152])
-      , new Buffer([131])
-      ])
-    , expectBody: "Ω☃"
-    }
-  , testGetJSON :
-    { resp : server.createGetResponse('{"test":true}', 'application/json')
-    , json : true
-    , expectBody: {"test":true}
-    }
-  , testPutString :
-    { resp : server.createPostValidator("PUTTINGDATA")
-    , method : "PUT"
-    , body : "PUTTINGDATA"
-    }
-  , testPutBuffer :
-    { resp : server.createPostValidator("PUTTINGDATA")
-    , method : "PUT"
-    , body : new Buffer("PUTTINGDATA")
-    }
-  , testPutJSON :
-    { resp : server.createPostValidator(JSON.stringify({foo: 'bar'}))
-    , method: "PUT"
-    , json: {foo: 'bar'}
-    }
-  , testPutMultipart :
-    { resp: server.createPostValidator(
-        '--__BOUNDARY__\r\n' +
-        'content-type: text/html\r\n' +
-        '\r\n' +
-        '<html><body>Oh hi.</body></html>' +
-        '\r\n--__BOUNDARY__\r\n\r\n' +
-        'Oh hi.' +
-        '\r\n--__BOUNDARY__--'
-        )
-    , method: "PUT"
-    , multipart:
-      [ {'content-type': 'text/html', 'body': '<html><body>Oh hi.</body></html>'}
-      , {'body': 'Oh hi.'}
-      ]
-    }
-  }
-
-s.listen(s.port, function () {
-
-  var counter = 0
-
-  for (i in tests) {
-    (function () {
-      var test = tests[i]
-      s.on('/'+i, test.resp)
-      test.uri = s.url + '/' + i
-      test.rejectUnauthorized = false
-      request(test, function (err, resp, body) {
-        if (err) throw err
-        if (test.expectBody) {
-          assert.deepEqual(test.expectBody, body)
-        }
-        counter = counter - 1;
-        if (counter === 0) {
-          console.log(Object.keys(tests).length+" tests passed.")
-          s.close()
-        }
-      })
-      counter++
-    })()
-  }
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-isUrl.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,28 +0,0 @@
-var assert = require('assert')
-  , request = require('../index')
-  , http = require('http')
-  ;
-
-var s = http.createServer(function(req, res) {
-  res.statusCode = 200;
-  res.end('');
-}).listen(6767, function () {
-
-  // Test lowercase
-  request('http://localhost:6767', function (err, resp, body) {
-    // just need to get here without throwing an error
-    assert.equal(true, true);
-  })
-
-  // Test uppercase
-  request('HTTP://localhost:6767', function (err, resp, body) {
-    assert.equal(true, true);
-  })
-
-  // Test mixedcase
-  request('HtTp://localhost:6767', function (err, resp, body) {
-    assert.equal(true, true);
-    // clean up
-    s.close();
-  })
-})
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-localAddress.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,15 +0,0 @@
-var request = require('../index')
-  , assert = require('assert')
-  ;
-
-request.get({
-  uri: 'http://www.google.com', localAddress: '1.2.3.4' // some invalid address
-}, function(err, res) {
-  assert(!res) // asserting that no response received
-})
-
-request.get({
-  uri: 'http://www.google.com', localAddress: '127.0.0.1'
-}, function(err, res) {
-  assert(!res) // asserting that no response received
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-oauth.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,117 +0,0 @@
-var hmacsign = require('oauth-sign').hmacsign
-  , assert = require('assert')
-  , qs = require('querystring')
-  , request = require('../index')
-  ;
-
-function getsignature (r) {
-  var sign
-  r.headers.Authorization.slice('OAuth '.length).replace(/,\ /g, ',').split(',').forEach(function (v) {
-    if (v.slice(0, 'oauth_signature="'.length) === 'oauth_signature="') sign = v.slice('oauth_signature="'.length, -1)
-  })
-  return decodeURIComponent(sign)
-}
-
-// Tests from Twitter documentation https://dev.twitter.com/docs/auth/oauth
-
-var reqsign = hmacsign('POST', 'https://api.twitter.com/oauth/request_token', 
-  { oauth_callback: 'http://localhost:3005/the_dance/process_callback?service_provider_id=11'
-  , oauth_consumer_key: 'GDdmIQH6jhtmLUypg82g'
-  , oauth_nonce: 'QP70eNmVz8jvdPevU3oJD2AfF7R7odC2XJcn4XlZJqk'
-  , oauth_signature_method: 'HMAC-SHA1'
-  , oauth_timestamp: '1272323042'
-  , oauth_version: '1.0'
-  }, "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98")
-
-console.log(reqsign)
-console.log('8wUi7m5HFQy76nowoCThusfgB+Q=')
-assert.equal(reqsign, '8wUi7m5HFQy76nowoCThusfgB+Q=')
-
-var accsign = hmacsign('POST', 'https://api.twitter.com/oauth/access_token',
-  { oauth_consumer_key: 'GDdmIQH6jhtmLUypg82g'
-  , oauth_nonce: '9zWH6qe0qG7Lc1telCn7FhUbLyVdjEaL3MO5uHxn8'
-  , oauth_signature_method: 'HMAC-SHA1'
-  , oauth_token: '8ldIZyxQeVrFZXFOZH5tAwj6vzJYuLQpl0WUEYtWc'
-  , oauth_timestamp: '1272323047'
-  , oauth_verifier: 'pDNg57prOHapMbhv25RNf75lVRd6JDsni1AJJIDYoTY'
-  , oauth_version: '1.0'
-  }, "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98", "x6qpRnlEmW9JbQn4PQVVeVG8ZLPEx6A0TOebgwcuA")
-  
-console.log(accsign)
-console.log('PUw/dHA4fnlJYM6RhXk5IU/0fCc=')
-assert.equal(accsign, 'PUw/dHA4fnlJYM6RhXk5IU/0fCc=')
-
-var upsign = hmacsign('POST', 'http://api.twitter.com/1/statuses/update.json', 
-  { oauth_consumer_key: "GDdmIQH6jhtmLUypg82g"
-  , oauth_nonce: "oElnnMTQIZvqvlfXM56aBLAf5noGD0AQR3Fmi7Q6Y"
-  , oauth_signature_method: "HMAC-SHA1"
-  , oauth_token: "819797-Jxq8aYUDRmykzVKrgoLhXSq67TEa5ruc4GJC2rWimw"
-  , oauth_timestamp: "1272325550"
-  , oauth_version: "1.0"
-  , status: 'setting up my twitter 私のさえずりを設定する'
-  }, "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98", "J6zix3FfA9LofH0awS24M3HcBYXO5nI1iYe8EfBA")
-
-console.log(upsign)
-console.log('yOahq5m0YjDDjfjxHaXEsW9D+X0=')
-assert.equal(upsign, 'yOahq5m0YjDDjfjxHaXEsW9D+X0=')
-
-
-var rsign = request.post(
-  { url: 'https://api.twitter.com/oauth/request_token'
-  , oauth: 
-    { callback: 'http://localhost:3005/the_dance/process_callback?service_provider_id=11'
-    , consumer_key: 'GDdmIQH6jhtmLUypg82g'
-    , nonce: 'QP70eNmVz8jvdPevU3oJD2AfF7R7odC2XJcn4XlZJqk'
-    , timestamp: '1272323042'
-    , version: '1.0'
-    , consumer_secret: "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98"
-    }
-  })
-
-setTimeout(function () {
-  console.log(getsignature(rsign))
-  assert.equal(reqsign, getsignature(rsign))
-})
-
-var raccsign = request.post(
-  { url: 'https://api.twitter.com/oauth/access_token'
-  , oauth:  
-    { consumer_key: 'GDdmIQH6jhtmLUypg82g'
-    , nonce: '9zWH6qe0qG7Lc1telCn7FhUbLyVdjEaL3MO5uHxn8'
-    , signature_method: 'HMAC-SHA1'
-    , token: '8ldIZyxQeVrFZXFOZH5tAwj6vzJYuLQpl0WUEYtWc'
-    , timestamp: '1272323047'
-    , verifier: 'pDNg57prOHapMbhv25RNf75lVRd6JDsni1AJJIDYoTY'
-    , version: '1.0'
-    , consumer_secret: "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98"
-    , token_secret: "x6qpRnlEmW9JbQn4PQVVeVG8ZLPEx6A0TOebgwcuA" 
-    }
-  })
-
-setTimeout(function () {
-  console.log(getsignature(raccsign))
-  assert.equal(accsign, getsignature(raccsign))
-}, 1) 
-
-var rupsign = request.post(
-  { url: 'http://api.twitter.com/1/statuses/update.json' 
-  , oauth: 
-    { consumer_key: "GDdmIQH6jhtmLUypg82g"
-    , nonce: "oElnnMTQIZvqvlfXM56aBLAf5noGD0AQR3Fmi7Q6Y"
-    , signature_method: "HMAC-SHA1"
-    , token: "819797-Jxq8aYUDRmykzVKrgoLhXSq67TEa5ruc4GJC2rWimw"
-    , timestamp: "1272325550"
-    , version: "1.0"
-    , consumer_secret: "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98"
-    , token_secret: "J6zix3FfA9LofH0awS24M3HcBYXO5nI1iYe8EfBA"
-    }
-  , form: {status: 'setting up my twitter 私のさえずりを設定する'} 
-  })
-setTimeout(function () {
-  console.log(getsignature(rupsign))
-  assert.equal(upsign, getsignature(rupsign))
-}, 1)
-
-
-
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-onelineproxy.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,46 +0,0 @@
-var http = require('http')
-  , assert = require('assert')
-  , request = require('../index')
-  ;
-
-var server = http.createServer(function (req, resp) {
-  resp.statusCode = 200
-  if (req.url === '/get') {
-    assert.equal(req.method, 'GET')
-    resp.write('content')
-    resp.end()
-    return
-  }
-  if (req.url === '/put') {
-    var x = ''
-    assert.equal(req.method, 'PUT')
-    req.on('data', function (chunk) {
-      x += chunk
-    })
-    req.on('end', function () {
-      assert.equal(x, 'content')
-      resp.write('success')
-      resp.end()
-    })
-    return
-  }
-  if (req.url === '/proxy') {
-    assert.equal(req.method, 'PUT')
-    return req.pipe(request('http://localhost:8080/put')).pipe(resp)
-  }
-
-  if (req.url === '/test') {
-    return request('http://localhost:8080/get').pipe(request.put('http://localhost:8080/proxy')).pipe(resp)
-  }
-  throw new Error('Unknown url', req.url)
-}).listen(8080, function () {
-  request('http://localhost:8080/test', function (e, resp, body) {
-    if (e) throw e
-    if (resp.statusCode !== 200) throw new Error('statusCode not 200 ' + resp.statusCode)
-    assert.equal(body, 'success')
-    server.close()
-  })
-})
-
-
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-params.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,93 +0,0 @@
-var server = require('./server')
-  , assert = require('assert')
-  , request = require('../index')
-  ;
-
-var s = server.createServer();
-
-var tests =
-  { testGet :
-    { resp : server.createGetResponse("TESTING!")
-    , expectBody: "TESTING!"
-    }
-    , testGetChunkBreak :
-      { resp : server.createChunkResponse(
-        [ new Buffer([239])
-        , new Buffer([163])
-        , new Buffer([191])
-        , new Buffer([206])
-        , new Buffer([169])
-        , new Buffer([226])
-        , new Buffer([152])
-        , new Buffer([131])
-        ])
-      , expectBody: "Ω☃"
-      }
-    , testGetBuffer :
-      { resp : server.createGetResponse(new Buffer("TESTING!"))
-      , encoding: null
-      , expectBody: new Buffer("TESTING!")
-      }
-    , testGetJSON :
-       { resp : server.createGetResponse('{"test":true}', 'application/json')
-       , json : true
-       , expectBody: {"test":true}
-       }
-    , testPutString :
-      { resp : server.createPostValidator("PUTTINGDATA")
-      , method : "PUT"
-      , body : "PUTTINGDATA"
-      }
-    , testPutBuffer :
-      { resp : server.createPostValidator("PUTTINGDATA")
-      , method : "PUT"
-      , body : new Buffer("PUTTINGDATA")
-      }
-    , testPutJSON :
-      { resp : server.createPostValidator(JSON.stringify({foo: 'bar'}))
-      , method: "PUT"
-      , json: {foo: 'bar'}
-      }
-    , testPutMultipart :
-      { resp: server.createPostValidator(
-          '--__BOUNDARY__\r\n' +
-          'content-type: text/html\r\n' +
-          '\r\n' +
-          '<html><body>Oh hi.</body></html>' +
-          '\r\n--__BOUNDARY__\r\n\r\n' +
-          'Oh hi.' +
-          '\r\n--__BOUNDARY__--'
-          )
-      , method: "PUT"
-      , multipart:
-        [ {'content-type': 'text/html', 'body': '<html><body>Oh hi.</body></html>'}
-        , {'body': 'Oh hi.'}
-        ]
-      }
-  }
-
-s.listen(s.port, function () {
-
-  var counter = 0
-
-  for (i in tests) {
-    (function () {
-      var test = tests[i]
-      s.on('/'+i, test.resp)
-      //test.uri = s.url + '/' + i
-      request(s.url + '/' + i, test, function (err, resp, body) {
-        if (err) throw err
-        if (test.expectBody) {
-          assert.deepEqual(test.expectBody, body)
-        }
-        counter = counter - 1;
-        if (counter === 0) {
-          assert.notEqual(typeof test.callback, 'function')
-          console.log(1 + Object.keys(tests).length+" tests passed.")
-          s.close()
-        }
-      })
-      counter++
-    })()
-  }
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-piped-redirect.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-var http = require('http')
-  , assert = require('assert')
-  , request = require('../index')
-  ;
-
-var portOne = 8968
-  , portTwo = 8969
-  ;
-
-
-// server one
-var s1 = http.createServer(function (req, resp) {
-  if (req.url == '/original') {
-    resp.writeHeader(302, {'location': '/redirected'})
-    resp.end()
-  } else if (req.url == '/redirected') {
-    resp.writeHeader(200, {'content-type': 'text/plain'})
-    resp.write('OK')
-    resp.end()
-  }
-
-}).listen(portOne);
-
-
-// server two
-var s2 = http.createServer(function (req, resp) {
-  var x = request('http://localhost:'+portOne+'/original')
-  req.pipe(x)
-  x.pipe(resp)
-
-}).listen(portTwo, function () {
-  var r = request('http://localhost:'+portTwo+'/original', function (err, res, body) {
-    assert.equal(body, 'OK')
-
-    s1.close()
-    s2.close()
-  });
-
-  // it hangs, so wait a second :)
-  r.timeout = 1000;
-
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-pipes.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,216 +0,0 @@
-var server = require('./server')
-  , events = require('events')
-  , stream = require('stream')
-  , assert = require('assert')
-  , fs = require('fs')
-  , request = require('../index')
-  , path = require('path')
-  , util = require('util')
-  ;
-
-var s = server.createServer(3453);
-
-function ValidationStream(str) {
-  this.str = str
-  this.buf = ''
-  this.on('data', function (data) {
-    this.buf += data
-  })
-  this.on('end', function () {
-    assert.equal(this.str, this.buf)
-  })
-  this.writable = true
-}
-util.inherits(ValidationStream, stream.Stream)
-ValidationStream.prototype.write = function (chunk) {
-  this.emit('data', chunk)
-}
-ValidationStream.prototype.end = function (chunk) {
-  if (chunk) this.emit('data', chunk)
-  this.emit('end')
-}
-
-s.listen(s.port, function () {
-  counter = 0;
-
-  var check = function () {
-    counter = counter - 1
-    if (counter === 0) {
-      console.log('All tests passed.')
-      setTimeout(function () {
-        process.exit();
-      }, 500)
-    }
-  }
-
-  // Test piping to a request object
-  s.once('/push', server.createPostValidator("mydata"));
-
-  var mydata = new stream.Stream();
-  mydata.readable = true
-
-  counter++
-  var r1 = request.put({url:'http://localhost:3453/push'}, function () {
-    check();
-  })
-  mydata.pipe(r1)
-
-  mydata.emit('data', 'mydata');
-  mydata.emit('end');
-
-  // Test piping to a request object with a json body
-  s.once('/push-json', server.createPostValidator("{\"foo\":\"bar\"}", "application/json"));
-
-  var mybodydata = new stream.Stream();
-  mybodydata.readable = true
-
-  counter++
-  var r2 = request.put({url:'http://localhost:3453/push-json',json:true}, function () {
-    check();
-  })
-  mybodydata.pipe(r2)
-
-  mybodydata.emit('data', JSON.stringify({foo:"bar"}));
-  mybodydata.emit('end');
-
-  // Test piping from a request object.
-  s.once('/pull', server.createGetResponse("mypulldata"));
-
-  var mypulldata = new stream.Stream();
-  mypulldata.writable = true
-
-  counter++
-  request({url:'http://localhost:3453/pull'}).pipe(mypulldata)
-
-  var d = '';
-
-  mypulldata.write = function (chunk) {
-    d += chunk;
-  }
-  mypulldata.end = function () {
-    assert.equal(d, 'mypulldata');
-    check();
-  };
-
-
-  s.on('/cat', function (req, resp) {
-    if (req.method === "GET") {
-      resp.writeHead(200, {'content-type':'text/plain-test', 'content-length':4});
-      resp.end('asdf')
-    } else if (req.method === "PUT") {
-      assert.equal(req.headers['content-type'], 'text/plain-test');
-      assert.equal(req.headers['content-length'], 4)
-      var validate = '';
-
-      req.on('data', function (chunk) {validate += chunk})
-      req.on('end', function () {
-        resp.writeHead(201);
-        resp.end();
-        assert.equal(validate, 'asdf');
-        check();
-      })
-    }
-  })
-  s.on('/pushjs', function (req, resp) {
-    if (req.method === "PUT") {
-      assert.equal(req.headers['content-type'], 'application/javascript');
-      check();
-    }
-  })
-  s.on('/catresp', function (req, resp) {
-    request.get('http://localhost:3453/cat').pipe(resp)
-  })
-  s.on('/doodle', function (req, resp) {
-    if (req.headers['x-oneline-proxy']) {
-      resp.setHeader('x-oneline-proxy', 'yup')
-    }
-    resp.writeHead(200, {'content-type':'image/jpeg'})
-    fs.createReadStream(path.join(__dirname, 'googledoodle.jpg')).pipe(resp)
-  })
-  s.on('/onelineproxy', function (req, resp) {
-    var x = request('http://localhost:3453/doodle')
-    req.pipe(x)
-    x.pipe(resp)
-  })
-
-  counter++
-  fs.createReadStream(__filename).pipe(request.put('http://localhost:3453/pushjs'))
-
-  counter++
-  request.get('http://localhost:3453/cat').pipe(request.put('http://localhost:3453/cat'))
-
-  counter++
-  request.get('http://localhost:3453/catresp', function (e, resp, body) {
-    assert.equal(resp.headers['content-type'], 'text/plain-test');
-    assert.equal(resp.headers['content-length'], 4)
-    check();
-  })
-
-  var doodleWrite = fs.createWriteStream(path.join(__dirname, 'test.jpg'))
-
-  counter++
-  request.get('http://localhost:3453/doodle').pipe(doodleWrite)
-
-  doodleWrite.on('close', function () {
-    assert.deepEqual(fs.readFileSync(path.join(__dirname, 'googledoodle.jpg')), fs.readFileSync(path.join(__dirname, 'test.jpg')))
-    check()
-  })
-
-  process.on('exit', function () {
-    fs.unlinkSync(path.join(__dirname, 'test.jpg'))
-  })
-
-  counter++
-  request.get({uri:'http://localhost:3453/onelineproxy', headers:{'x-oneline-proxy':'nope'}}, function (err, resp, body) {
-    assert.equal(resp.headers['x-oneline-proxy'], 'yup')
-    check()
-  })
-
-  s.on('/afterresponse', function (req, resp) {
-    resp.write('d')
-    resp.end()
-  })
-
-  counter++
-  var afterresp = request.post('http://localhost:3453/afterresponse').on('response', function () {
-    var v = new ValidationStream('d')
-    afterresp.pipe(v)
-    v.on('end', check)
-  })
-  
-  s.on('/forward1', function (req, resp) {
-    resp.writeHead(302, {location:'/forward2'})
-    resp.end()
-  })
-  s.on('/forward2', function (req, resp) {
-    resp.writeHead(200, {'content-type':'image/png'})
-    resp.write('d')
-    resp.end()
-  })
-  
-  counter++
-  var validateForward = new ValidationStream('d')
-  validateForward.on('end', check)
-  request.get('http://localhost:3453/forward1').pipe(validateForward)
-
-  // Test pipe options
-  s.once('/opts', server.createGetResponse('opts response'));
-
-  var optsStream = new stream.Stream();
-  optsStream.writable = true
-  
-  var optsData = '';
-  optsStream.write = function (buf) {
-    optsData += buf;
-    if (optsData === 'opts response') {
-      setTimeout(check, 10);
-    }
-  }
-
-  optsStream.end = function () {
-    assert.fail('end called')
-  };
-
-  counter++
-  request({url:'http://localhost:3453/opts'}).pipe(optsStream, { end : false })
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-pool.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
-var request = require('../index')
-  , http = require('http')
-  , assert = require('assert')
-  ;
-
-var s = http.createServer(function (req, resp) {
-  resp.statusCode = 200;
-  resp.end('asdf');
-}).listen(8080, function () {
-  request({'url': 'http://localhost:8080', 'pool': false}, function (e, resp) {
-    var agent = resp.request.agent;
-    assert.strictEqual(typeof agent, 'boolean');
-    assert.strictEqual(agent, false);
-    s.close();
-  });
-});
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-protocol-changing-redirect.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,61 +0,0 @@
-var server = require('./server')
-  , assert = require('assert')
-  , request = require('../index')
-
-
-var s = server.createServer()
-var ss = server.createSSLServer()
-var sUrl = 'http://localhost:' + s.port
-var ssUrl = 'https://localhost:' + ss.port
-
-s.listen(s.port, bouncy(s, ssUrl))
-ss.listen(ss.port, bouncy(ss, sUrl))
-
-var hits = {}
-var expect = {}
-var pending = 0
-function bouncy (s, server) { return function () {
-
-  var redirs = { a: 'b'
-               , b: 'c'
-               , c: 'd'
-               , d: 'e'
-               , e: 'f'
-               , f: 'g'
-               , g: 'h'
-               , h: 'end' }
-
-  var perm = true
-  Object.keys(redirs).forEach(function (p) {
-    var t = redirs[p]
-
-    // switch type each time
-    var type = perm ? 301 : 302
-    perm = !perm
-    s.on('/' + p, function (req, res) {
-      res.writeHead(type, { location: server + '/' + t })
-      res.end()
-    })
-  })
-
-  s.on('/end', function (req, res) {
-    var h = req.headers['x-test-key']
-    hits[h] = true
-    pending --
-    if (pending === 0) done()
-  })
-}}
-
-for (var i = 0; i < 5; i ++) {
-  pending ++
-  var val = 'test_' + i
-  expect[val] = true
-  request({ url: (i % 2 ? sUrl : ssUrl) + '/a'
-          , headers: { 'x-test-key': val }
-          , rejectUnauthorized: false })
-}
-
-function done () {
-  assert.deepEqual(hits, expect)
-  process.exit(0)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-proxy.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,39 +0,0 @@
-var server = require('./server')
-  , events = require('events')
-  , stream = require('stream')
-  , assert = require('assert')
-  , fs = require('fs')
-  , request = require('../index')
-  , path = require('path')
-  , util = require('util')
-  ;
-
-var port = 6768
-  , called = false
-  , proxiedHost = 'google.com'
-  ;
-
-var s = server.createServer(port)
-s.listen(port, function () {
-  s.on('http://google.com/', function (req, res) {
-    called = true
-    assert.equal(req.headers.host, proxiedHost)
-    res.writeHeader(200)
-    res.end()
-  })
-  request ({
-    url: 'http://'+proxiedHost,
-    proxy: 'http://localhost:'+port
-    /*
-    // should behave as if these arguments were passed:
-    url: 'http://localhost:'+port,
-    headers: {host: proxiedHost}
-    //*/
-  }, function (err, res, body) {
-    s.close()
-  })
-})
-
-process.on('exit', function () {
-  assert.ok(called, 'the request must be made to the proxy server')
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-qs.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-var request = require('../index')
-  , assert = require('assert')
-  ;
- 
-
-// Test adding a querystring
-var req1 = request.get({ uri: 'http://www.google.com', qs: { q : 'search' }})
-setTimeout(function() {
-	assert.equal('/?q=search', req1.path)
-}, 1)
-
-// Test replacing a querystring value
-var req2 = request.get({ uri: 'http://www.google.com?q=abc', qs: { q : 'search' }})
-setTimeout(function() {
-	assert.equal('/?q=search', req2.path)
-}, 1)
-
-// Test appending a querystring value to the ones present in the uri
-var req3 = request.get({ uri: 'http://www.google.com?x=y', qs: { q : 'search' }})
-setTimeout(function() {
-	assert.equal('/?x=y&q=search', req3.path)
-}, 1)
-
-// Test leaving a querystring alone
-var req4 = request.get({ uri: 'http://www.google.com?x=y'})
-setTimeout(function() {
-	assert.equal('/?x=y', req4.path)
-}, 1)
-
-// Test giving empty qs property
-var req5 = request.get({ uri: 'http://www.google.com', qs: {}})
-setTimeout(function(){
-	assert.equal('/', req5.path)
-}, 1)
-
-
-// Test modifying the qs after creating the request
-var req6 = request.get({ uri: 'http://www.google.com', qs: {}});
-req6.qs({ q: "test" });
-process.nextTick(function() {
-    assert.equal('/?q=test', req6.path);
-});
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-redirect.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,155 +0,0 @@
-var server = require('./server')
-  , assert = require('assert')
-  , request = require('../index')
-  , Cookie = require('cookie-jar')
-  , Jar = Cookie.Jar
-  ;
-
-var s = server.createServer()
-
-s.listen(s.port, function () {
-  var server = 'http://localhost:' + s.port;
-  var hits = {}
-  var passed = 0;
-
-  bouncer(301, 'temp')
-  bouncer(302, 'perm')
-  bouncer(302, 'nope')
-
-  function bouncer(code, label) {
-    var landing = label+'_landing';
-
-    s.on('/'+label, function (req, res) {
-      hits[label] = true;
-      res.writeHead(code, {
-        'location':server + '/'+landing,
-        'set-cookie': 'ham=eggs'
-      })
-      res.end()
-    })
-
-    s.on('/'+landing, function (req, res) {
-      if (req.method !== 'GET') { // We should only accept GET redirects
-        console.error("Got a non-GET request to the redirect destination URL");
-        res.writeHead(400);
-        res.end();
-        return;
-      }
-      // Make sure the cookie doesn't get included twice, see #139:
-      // Make sure cookies are set properly after redirect
-      assert.equal(req.headers.cookie, 'foo=bar; quux=baz; ham=eggs');
-      hits[landing] = true;
-      res.writeHead(200)
-      res.end(landing)
-    })
-  }
-
-  // Permanent bounce
-  var jar = new Jar()
-  jar.add(new Cookie('quux=baz'))
-  request({uri: server+'/perm', jar: jar, headers: {cookie: 'foo=bar'}}, function (er, res, body) {
-    if (er) throw er
-    if (res.statusCode !== 200) throw new Error('Status is not 200: '+res.statusCode)
-    assert.ok(hits.perm, 'Original request is to /perm')
-    assert.ok(hits.perm_landing, 'Forward to permanent landing URL')
-    assert.equal(body, 'perm_landing', 'Got permanent landing content')
-    passed += 1
-    done()
-  })
-  
-  // Temporary bounce
-  request({uri: server+'/temp', jar: jar, headers: {cookie: 'foo=bar'}}, function (er, res, body) {
-    if (er) throw er
-    if (res.statusCode !== 200) throw new Error('Status is not 200: '+res.statusCode)
-    assert.ok(hits.temp, 'Original request is to /temp')
-    assert.ok(hits.temp_landing, 'Forward to temporary landing URL')
-    assert.equal(body, 'temp_landing', 'Got temporary landing content')
-    passed += 1
-    done()
-  })
-  
-  // Prevent bouncing.
-  request({uri:server+'/nope', jar: jar, headers: {cookie: 'foo=bar'}, followRedirect:false}, function (er, res, body) {
-    if (er) throw er
-    if (res.statusCode !== 302) throw new Error('Status is not 302: '+res.statusCode)
-    assert.ok(hits.nope, 'Original request to /nope')
-    assert.ok(!hits.nope_landing, 'No chasing the redirect')
-    assert.equal(res.statusCode, 302, 'Response is the bounce itself')
-    passed += 1
-    done()
-  })
-  
-  // Should not follow post redirects by default
-  request.post(server+'/temp', { jar: jar, headers: {cookie: 'foo=bar'}}, function (er, res, body) {
-    if (er) throw er
-    if (res.statusCode !== 301) throw new Error('Status is not 301: '+res.statusCode)
-    assert.ok(hits.temp, 'Original request is to /temp')
-    assert.ok(!hits.temp_landing, 'No chasing the redirect when post')
-    assert.equal(res.statusCode, 301, 'Response is the bounce itself')
-    passed += 1
-    done()
-  })
-  
-  // Should follow post redirects when followAllRedirects true
-  request.post({uri:server+'/temp', followAllRedirects:true, jar: jar, headers: {cookie: 'foo=bar'}}, function (er, res, body) {
-    if (er) throw er
-    if (res.statusCode !== 200) throw new Error('Status is not 200: '+res.statusCode)
-    assert.ok(hits.temp, 'Original request is to /temp')
-    assert.ok(hits.temp_landing, 'Forward to temporary landing URL')
-    assert.equal(body, 'temp_landing', 'Got temporary landing content')
-    passed += 1
-    done()
-  })
-  
-  request.post({uri:server+'/temp', followAllRedirects:false, jar: jar, headers: {cookie: 'foo=bar'}}, function (er, res, body) {
-    if (er) throw er
-    if (res.statusCode !== 301) throw new Error('Status is not 301: '+res.statusCode)
-    assert.ok(hits.temp, 'Original request is to /temp')
-    assert.ok(!hits.temp_landing, 'No chasing the redirect')
-    assert.equal(res.statusCode, 301, 'Response is the bounce itself')
-    passed += 1
-    done()
-  })
-
-  // Should not follow delete redirects by default
-  request.del(server+'/temp', { jar: jar, headers: {cookie: 'foo=bar'}}, function (er, res, body) {
-    if (er) throw er
-    if (res.statusCode < 301) throw new Error('Status is not a redirect.')
-    assert.ok(hits.temp, 'Original request is to /temp')
-    assert.ok(!hits.temp_landing, 'No chasing the redirect when delete')
-    assert.equal(res.statusCode, 301, 'Response is the bounce itself')
-    passed += 1
-    done()
-  })
-  
-  // Should not follow delete redirects even if followRedirect is set to true
-  request.del(server+'/temp', { followRedirect: true, jar: jar, headers: {cookie: 'foo=bar'}}, function (er, res, body) {
-    if (er) throw er
-    if (res.statusCode !== 301) throw new Error('Status is not 301: '+res.statusCode)
-    assert.ok(hits.temp, 'Original request is to /temp')
-    assert.ok(!hits.temp_landing, 'No chasing the redirect when delete')
-    assert.equal(res.statusCode, 301, 'Response is the bounce itself')
-    passed += 1
-    done()
-  })
-  
-  // Should follow delete redirects when followAllRedirects true
-  request.del(server+'/temp', {followAllRedirects:true, jar: jar, headers: {cookie: 'foo=bar'}}, function (er, res, body) {
-    if (er) throw er
-    if (res.statusCode !== 200) throw new Error('Status is not 200: '+res.statusCode)
-    assert.ok(hits.temp, 'Original request is to /temp')
-    assert.ok(hits.temp_landing, 'Forward to temporary landing URL')
-    assert.equal(body, 'temp_landing', 'Got temporary landing content')
-    passed += 1
-    done()
-  })
-
-  var reqs_done = 0;
-  function done() {
-    reqs_done += 1;
-    if(reqs_done == 9) {
-      console.log(passed + ' tests passed.')
-      s.close()
-    }
-  }
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-s3.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,13 +0,0 @@
-var request = require('../index')
-
-var r = request.get('https://log.curlybracecast.com.s3.amazonaws.com/', 
-  { aws: 
-    { key: 'AKIAI6KIQRRVMGK3WK5Q'
-    , secret: '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'
-    , bucket: 'log.curlybracecast.com'
-    }
-  }, function (e, resp, body) {
-    console.log(r.headers)
-    console.log(body)
-  }
-)
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-timeout.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,87 +0,0 @@
-var server = require('./server')
-  , events = require('events')
-  , stream = require('stream')
-  , assert = require('assert')
-  , request = require('../index')
-  ;
-
-var s = server.createServer();
-var expectedBody = "waited";
-var remainingTests = 5;
-
-s.listen(s.port, function () {
-  // Request that waits for 200ms
-  s.on('/timeout', function (req, resp) {
-    setTimeout(function(){
-      resp.writeHead(200, {'content-type':'text/plain'})
-      resp.write(expectedBody)
-      resp.end()
-    }, 200);
-  });
-
-  // Scenario that should timeout
-  var shouldTimeout = {
-    url: s.url + "/timeout",
-    timeout:100
-  }
-
-
-  request(shouldTimeout, function (err, resp, body) {
-    assert.equal(err.code, "ETIMEDOUT");
-    checkDone();
-  })
-
-
-  // Scenario that shouldn't timeout
-  var shouldntTimeout = {
-    url: s.url + "/timeout",
-    timeout:300
-  }
-
-  request(shouldntTimeout, function (err, resp, body) {
-    assert.equal(err, null);
-    assert.equal(expectedBody, body)
-    checkDone();
-  })
-
-  // Scenario with no timeout set, so shouldn't timeout
-  var noTimeout = {
-    url: s.url + "/timeout"
-  }
-
-  request(noTimeout, function (err, resp, body) {
-    assert.equal(err, null);
-    assert.equal(expectedBody, body)
-    checkDone();
-  })
-
-  // Scenario with a negative timeout value, which should be treated as zero or the minimum delay
-  var negativeTimeout = {
-    url: s.url + "/timeout",
-    timeout:-1000
-  }
-
-  request(negativeTimeout, function (err, resp, body) {
-    assert.equal(err.code, "ETIMEDOUT");
-    checkDone();
-  })
-
-  // Scenario with a float timeout value, should be rounded by setTimeout anyway
-  var floatTimeout = {
-    url: s.url + "/timeout",
-    timeout: 100.76
-  }
-
-  request(floatTimeout, function (err, resp, body) {
-    assert.equal(err.code, "ETIMEDOUT");
-    checkDone();
-  })
-
-  function checkDone() {
-    if(--remainingTests == 0) {
-      s.close();
-      console.log("All tests passed.");
-    }
-  }
-})
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-toJSON.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,14 +0,0 @@
-var request = require('../index')
-  , http = require('http')
-  , assert = require('assert')
-  ;
-
-var s = http.createServer(function (req, resp) {
-  resp.statusCode = 200
-  resp.end('asdf')
-}).listen(8080, function () {
-  var r = request('http://localhost:8080', function (e, resp) {
-    assert.equal(JSON.parse(JSON.stringify(r)).response.statusCode, 200)
-    s.close()
-  })
-})
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/test-tunnel.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,75 +0,0 @@
-// test that we can tunnel a https request over an http proxy
-// keeping all the CA and whatnot intact.
-//
-// Note: this requires that squid is installed.
-// If the proxy fails to start, we'll just log a warning and assume success.
-
-var server = require('./server')
-  , assert = require('assert')
-  , request = require('../index')
-  , fs = require('fs')
-  , path = require('path')
-  , caFile = path.resolve(__dirname, 'ssl/npm-ca.crt')
-  , ca = fs.readFileSync(caFile)
-  , child_process = require('child_process')
-  , sqConf = path.resolve(__dirname, 'squid.conf')
-  , sqArgs = ['-f', sqConf, '-N', '-d', '5']
-  , proxy = 'http://localhost:3128'
-  , hadError = null
-
-var squid = child_process.spawn('squid', sqArgs);
-var ready = false
-
-squid.stderr.on('data', function (c) {
-  console.error('SQUIDERR ' + c.toString().trim().split('\n')
-               .join('\nSQUIDERR '))
-  ready = c.toString().match(/ready to serve requests|Accepting HTTP Socket connections/i)
-})
-
-squid.stdout.on('data', function (c) {
-  console.error('SQUIDOUT ' + c.toString().trim().split('\n')
-               .join('\nSQUIDOUT '))
-})
-
-squid.on('error', function (c) {
-  console.error('squid: error '+c)
-  if (c && !ready) {
-    notInstalled()
-    return
-  }
-})
-
-squid.on('exit', function (c) {
-  console.error('squid: exit '+c)
-  if (c && !ready) {
-    notInstalled()
-    return
-  }
-
-  if (c) {
-    hadError = hadError || new Error('Squid exited with '+c)
-  }
-  if (hadError) throw hadError
-})
-
-setTimeout(function F () {
-  if (!ready) return setTimeout(F, 100)
-  request({ uri: 'https://registry.npmjs.org/'
-          , proxy: 'http://localhost:3128'
-          , strictSSL: true
-          , ca: ca
-          , json: true }, function (er, body) {
-    hadError = er
-    console.log(er || typeof body)
-    if (!er) console.log("ok")
-    squid.kill('SIGKILL')
-  })
-}, 100)
-
-function notInstalled() {
-  console.error('squid must be installed to run this test.')
-  console.error('skipping this test. please install squid and run again if you need to test tunneling.')
-  c = null
-  hadError = null
-  process.exit(0)
-}
Binary file node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/request/tests/unicycle.jpg has changed
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-/node_modules/*
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/License	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,21 +0,0 @@
-Copyright (c) 2011:
-Tim Koschützki (tim@debuggable.com)
-Felix Geisendörfer (felix@debuggable.com)
-
- Permission is hereby granted, free of charge, to any person obtaining a copy
- of this software and associated documentation files (the "Software"), to deal
- in the Software without restriction, including without limitation the rights
- to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
- copies of the Software, and to permit persons to whom the Software is
- furnished to do so, subject to the following conditions:
-
- The above copyright notice and this permission notice shall be included in
- all copies or substantial portions of the Software.
-
- THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
- IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
- FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
- AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
- LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
- OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
- THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/Makefile	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,7 +0,0 @@
-SHELL := /bin/bash
-
-test:
-	@node test/runner.js
-
-.PHONY: test
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/Readme.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,167 +0,0 @@
-# retry
-
-Abstraction for exponential and custom retry strategies for failed operations.
-
-## Installation
-
-    npm install retry
-
-## Current Status
-
-This module has been tested and is ready to be used.
-
-## Tutorial
-
-The example below will retry a potentially failing `dns.resolve` operation
-`10` times using an exponential backoff strategy. With the default settings, this
-means the last attempt is made after `17 minutes and 3 seconds`.
-
-``` javascript
-var dns = require('dns');
-var retry = require('retry');
-
-function faultTolerantResolve(address, cb) {
-  var operation = retry.operation();
-
-  operation.attempt(function(currentAttempt) {
-    dns.resolve(address, function(err, addresses) {
-      if (operation.retry(err)) {
-        return;
-      }
-
-      cb(operation.mainError(), addresses);
-    });
-  });
-}
-
-faultTolerantResolve('nodejs.org', function(err, addresses) {
-  console.log(err, addresses);
-});
-```
-
-Of course you can also configure the factors that go into the exponential
-backoff. See the API documentation below for all available settings.
-`currentAttempt` is an integer representing the number of attempts made so far.
-
-``` javascript
-var operation = retry.operation({
-  retries: 5,
-  factor: 3,
-  minTimeout: 1 * 1000,
-  maxTimeout: 60 * 1000,
-  randomize: true,
-});
-```
-
-## API
-
-### retry.operation([options])
-
-Creates a new `RetryOperation` object. See the `retry.timeouts()` function
-below for available `options`.
-
-### retry.timeouts([options])
-
-Returns an array of timeouts. All time `options` and return values are in
-milliseconds. If `options` is an array, a copy of that array is returned.
-
-`options` is a JS object that can contain any of the following keys:
-
-* `retries`: The maximum number of times to retry the operation. Default is `10`.
-* `factor`: The exponential factor to use. Default is `2`.
-* `minTimeout`: The amount of time before starting the first retry. Default is `1000`.
-* `maxTimeout`: The maximum amount of time between two retries. Default is `Infinity`.
-* `randomize`: Randomizes the timeouts by multiplying them by a random factor between `1` and `2`. Default is `false`.
-
-The formula used to calculate the individual timeouts is:
-
-```
-var timeout = Math.min(random * minTimeout * Math.pow(factor, attempt), maxTimeout);
-```
-
-Have a look at [this article][article] for a better explanation of the approach.
-
-If you want to tune your `factor` / `retries` settings so that the last retry
-is attempted after a certain amount of time, you can use Wolfram Alpha. For
-example, in order to tune for `10` attempts in `5 minutes`, you can use this
-equation:
-
-![screenshot](https://github.com/tim-kos/node-retry/raw/master/equation.gif)
-
-Explaining the various values from left to right:
-
-* `k = 0 ... 9`:  The `retries` value (10)
-* `1000`: The `minTimeout` value in ms (1000)
-* `x^k`: No need to change this, `x` will be your resulting factor
-* `5 * 60 * 1000`: The desired total amount of time for retrying in ms (5 minutes)
-
-To make this a little easier for you, use wolfram alpha to do the calculations:
-
-<http://www.wolframalpha.com/input/?i=Sum%5B1000*x^k%2C+{k%2C+0%2C+9}%5D+%3D+5+*+60+*+1000>
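-If you'd rather do the calculation in code, the same equation can be solved
-for the factor by bisection (`solveFactor` is a hypothetical helper, not part
-of the module):

```javascript
// Solve Sum[minTimeout * x^k, {k, 0, retries-1}] = totalMs for the factor x.
function solveFactor(retries, minTimeout, totalMs) {
  function totalFor(x) {
    var sum = 0;
    for (var k = 0; k < retries; k++) sum += minTimeout * Math.pow(x, k);
    return sum;
  }
  // totalFor is increasing in x for x >= 1, so bisect on [1, 10].
  var lo = 1, hi = 10;
  for (var i = 0; i < 100; i++) {
    var mid = (lo + hi) / 2;
    if (totalFor(mid) < totalMs) lo = mid; else hi = mid;
  }
  return (lo + hi) / 2;
}

// 10 attempts spread over 5 minutes, with a 1 second minimum timeout:
console.log(solveFactor(10, 1000, 5 * 60 * 1000)); // ≈ 1.71
```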
-
-[article]: http://dthain.blogspot.com/2009/02/exponential-backoff-in-distributed.html
-
-### new RetryOperation(timeouts)
-
-Creates a new `RetryOperation` where `timeouts` is an array where each value is
-a timeout given in milliseconds.
-
-#### retryOperation.errors()
-
-Returns an array of all errors that have been passed to
-`retryOperation.retry()` so far.
-
-#### retryOperation.mainError()
-
-A reference to the error object that occurred most frequently. Errors are
-compared using the `error.message` property.
-
-If multiple error messages occurred the same number of times, the last error
-object with that message is returned.
-
-If no errors occurred so far, the value is `null`.
-
-#### retryOperation.attempt(fn, timeoutOps)
-
-Defines the function `fn` that is to be retried and executes it for the first
-time right away. The `fn` function can receive an optional `currentAttempt`
-parameter representing the number of attempts to execute `fn` so far.
-
-Optionally defines `timeoutOps`, an object with a `timeout` property in
-milliseconds and a `cb` callback function. Whenever your retry operation takes
-longer than `timeout` to execute, the timeout callback function `cb` is called.
-
-
-#### retryOperation.try(fn)
-
-This is a deprecated alias for `retryOperation.attempt(fn)`.
-
-#### retryOperation.start(fn)
-
-This is a deprecated alias for `retryOperation.attempt(fn)`.
-
-#### retryOperation.retry(error)
-
-Returns `false` when no `error` value is given, or when the maximum number of
-retries has been reached.
-
-Otherwise it returns `true`, and retries the operation after the timeout for
-the current attempt number.
-
-#### retryOperation.attempts()
-
-Returns an integer representing the number of attempts it took to call `fn` before it was successful.
-
-## License
-
-retry is licensed under the MIT license.
-
-
-# Changelog
-
-0.6.0 Introduced an optional `timeoutOps` parameter for the `attempt()` function, an object with a `timeout` property in milliseconds and a `cb` callback function. Whenever your retry operation takes longer than `timeout` to execute, the timeout callback function `cb` is called.
-
-0.5.0 Some minor refactorings.
-
-0.4.0 Changed retryOperation.try() to retryOperation.attempt(). Deprecated the aliases start() and try() for it.
-
-0.3.0 Added retryOperation.start() which is an alias for retryOperation.try().
-
-0.2.0 Added attempts() function and parameter to retryOperation.try() representing the number of attempts it took to call fn().
Binary file node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/equation.gif has changed
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/example/dns.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,31 +0,0 @@
-var dns = require('dns');
-var retry = require('../lib/retry');
-
-function faultTolerantResolve(address, cb) {
-  var opts = {
-    retries: 2,
-    factor: 2,
-    minTimeout: 1 * 1000,
-    maxTimeout: 2 * 1000,
-    randomize: true
-  };
-  var operation = retry.operation(opts);
-
-  operation.attempt(function(currentAttempt) {
-    dns.resolve(address, function(err, addresses) {
-      if (operation.retry(err)) {
-        return;
-      }
-
-      cb(operation.mainError(), operation.errors(), addresses);
-    });
-  });
-}
-
-faultTolerantResolve('nodejs.org', function(err, errors, addresses) {
-  console.warn('err:');
-  console.log(err);
-
-  console.warn('addresses:');
-  console.log(addresses);
-});
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require('./lib/retry');
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/lib/retry.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,50 +0,0 @@
-var RetryOperation = require('./retry_operation');
-
-exports.operation = function(options) {
-  var timeouts = exports.timeouts(options);
-  return new RetryOperation(timeouts);
-};
-
-exports.timeouts = function(options) {
-  if (options instanceof Array) {
-    return [].concat(options);
-  }
-
-  var opts = {
-    retries: 10,
-    factor: 2,
-    minTimeout: 1 * 1000,
-    maxTimeout: Infinity,
-    randomize: false
-  };
-  for (var key in options) {
-    opts[key] = options[key];
-  }
-
-  if (opts.minTimeout > opts.maxTimeout) {
-    throw new Error('minTimeout is greater than maxTimeout');
-  }
-
-  var timeouts = [];
-  for (var i = 0; i < opts.retries; i++) {
-    timeouts.push(this._createTimeout(i, opts));
-  }
-
-  // sort the array numerically ascending
-  timeouts.sort(function(a,b) {
-    return a - b;
-  });
-
-  return timeouts;
-};
-
-exports._createTimeout = function(attempt, opts) {
-  var random = (opts.randomize)
-    ? (Math.random() + 1)
-    : 1;
-
-  var timeout = Math.round(random * opts.minTimeout * Math.pow(opts.factor, attempt));
-  timeout = Math.min(timeout, opts.maxTimeout);
-
-  return timeout;
-};
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/lib/retry_operation.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,109 +0,0 @@
-function RetryOperation(timeouts) {
-  this._timeouts = timeouts;
-  this._fn = null;
-  this._errors = [];
-  this._attempts = 1;
-  this._operationTimeout = null;
-  this._operationTimeoutCb = null;
-  this._timeout = null;
-}
-module.exports = RetryOperation;
-
-RetryOperation.prototype.retry = function(err) {
-  if (this._timeout) {
-    clearTimeout(this._timeout);
-  }
-
-  if (!err) {
-    return false;
-  }
-
-  this._errors.push(err);
-
-  var timeout = this._timeouts.shift();
-  if (timeout === undefined) {
-    return false;
-  }
-
-  this._attempts++;
-
-  var self = this;
-  setTimeout(function() {
-    self._fn(self._attempts);
-
-    if (self._operationTimeoutCb) {
-      self._timeout = setTimeout(function() {
-        self._operationTimeoutCb(self._attempts);
-      }, self._operationTimeout);
-    }
-  }, timeout);
-
-  return true;
-};
-
-RetryOperation.prototype.attempt = function(fn, timeoutOps) {
-  this._fn = fn;
-
-  if (timeoutOps) {
-    if (timeoutOps.timeout) {
-      this._operationTimeout = timeoutOps.timeout;
-    }
-    if (timeoutOps.cb) {
-      this._operationTimeoutCb = timeoutOps.cb;
-    }
-  }
-
-  this._fn(this._attempts);
-
-  var self = this;
-  if (this._operationTimeoutCb) {
-    this._timeout = setTimeout(function() {
-      self._operationTimeoutCb();
-    }, self._operationTimeout);
-  }
-};
-
-RetryOperation.prototype.try = function(fn) {
-  console.log('Using RetryOperation.try() is deprecated');
-  this.attempt(fn);
-};
-
-RetryOperation.prototype.start = function(fn) {
-  console.log('Using RetryOperation.start() is deprecated');
-  this.attempt(fn);
-};
-
-
-RetryOperation.prototype.errors = function() {
-  return this._errors;
-};
-
-RetryOperation.prototype.attempts = function() {
-  return this._attempts;
-};
-
-RetryOperation.prototype.mainError = function() {
-  if (this._errors.length === 0) {
-    return null;
-  }
-
-  var counts = {};
-  var mainError = null;
-  var mainErrorCount = 0;
-
-  for (var i = 0; i < this._errors.length; i++) {
-    var error = this._errors[i];
-    var message = error.message;
-    var count = (counts[message] || 0) + 1;
-
-    counts[message] = count;
-
-    if (count >= mainErrorCount) {
-      mainError = error;
-      mainErrorCount = count;
-    }
-  }
-
-  return mainError;
-};
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/retry/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,29 +0,0 @@
-{
-  "author": {
-    "name": "Tim Koschützki",
-    "email": "tim@debuggable.com",
-    "url": "http://debuggable.com/"
-  },
-  "name": "retry",
-  "description": "Abstraction for exponential and custom retry strategies for failed operations.",
-  "version": "0.6.0",
-  "homepage": "https://github.com/tim-kos/node-retry",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/felixge/node-retry.git"
-  },
-  "directories": {
-    "lib": "./lib"
-  },
-  "main": "index",
-  "engines": {
-    "node": "*"
-  },
-  "dependencies": {},
-  "devDependencies": {
-    "fake": "0.2.0",
-    "far": "0.0.1"
-  },
-  "_id": "retry@0.6.0",
-  "_from": "retry"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/rimraf/AUTHORS	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-# Authors sorted by whether or not they're me.
-Isaac Z. Schlueter <i@izs.me> (http://blog.izs.me)
-Wayne Larsen <wayne@larsen.st> (http://github.com/wvl)
-ritch <skawful@gmail.com>
-Marcel Laverdet
-Yosef Dinerstein <yosefd@microsoft.com>
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/rimraf/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-Copyright 2009, 2010, 2011 Isaac Z. Schlueter.
-All rights reserved.
-
-Permission is hereby granted, free of charge, to any person
-obtaining a copy of this software and associated documentation
-files (the "Software"), to deal in the Software without
-restriction, including without limitation the rights to use,
-copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the
-Software is furnished to do so, subject to the following
-conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
-NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
-HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
-WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
-OTHER DEALINGS IN THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/rimraf/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,26 +0,0 @@
-A `rm -rf` for node.
-
-Install with `npm install rimraf`, or just drop rimraf.js somewhere.
-
-## API
-
-`rimraf(f, callback)`
-
-The callback will be called with an error if there is one.  Certain
-errors are handled for you:
-
-* `EBUSY` -  rimraf will back off a maximum of opts.maxBusyTries times
-  before giving up.
-* `EMFILE` - If too many file descriptors get opened, rimraf will
-  patiently wait until more become available.
-
-
-## rimraf.sync
-
-It can remove stuff synchronously, too.  But that's not so good.  Use
-the async API.  It's better.
-
-## CLI
-
-If installed with `npm install rimraf -g` it can be used as a global
-command `rimraf <path>` which is useful for cross platform support.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/rimraf/bin.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,33 +0,0 @@
-#!/usr/bin/env node
-
-var rimraf = require('./')
-
-var help = false
-var dashdash = false
-var args = process.argv.slice(2).filter(function(arg) {
-  if (dashdash)
-    return !!arg
-  else if (arg === '--')
-    dashdash = true
-  else if (arg.match(/^(-+|\/)(h(elp)?|\?)$/))
-    help = true
-  else
-    return !!arg
-});
-
-if (help || args.length === 0) {
-  // If they didn't ask for help, then this is not a "success"
-  var log = help ? console.log : console.error
-  log('Usage: rimraf <path>')
-  log('')
-  log('  Deletes all files and folders at "path" recursively.')
-  log('')
-  log('Options:')
-  log('')
-  log('  -h, --help    Display this usage info')
-  process.exit(help ? 0 : 1)
-} else {
-  args.forEach(function(arg) {
-    rimraf.sync(arg)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/rimraf/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,65 +0,0 @@
-{
-  "name": "rimraf",
-  "version": "2.2.2",
-  "main": "rimraf.js",
-  "description": "A deep deletion module for node (like `rm -rf`)",
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "license": {
-    "type": "MIT",
-    "url": "https://github.com/isaacs/rimraf/raw/master/LICENSE"
-  },
-  "optionalDependencies": {
-    "graceful-fs": "~2"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/rimraf.git"
-  },
-  "scripts": {
-    "test": "cd test && bash run.sh"
-  },
-  "bin": {
-    "rimraf": "./bin.js"
-  },
-  "contributors": [
-    {
-      "name": "Isaac Z. Schlueter",
-      "email": "i@izs.me",
-      "url": "http://blog.izs.me"
-    },
-    {
-      "name": "Wayne Larsen",
-      "email": "wayne@larsen.st",
-      "url": "http://github.com/wvl"
-    },
-    {
-      "name": "ritch",
-      "email": "skawful@gmail.com"
-    },
-    {
-      "name": "Marcel Laverdet"
-    },
-    {
-      "name": "Yosef Dinerstein",
-      "email": "yosefd@microsoft.com"
-    }
-  ],
-  "readme": "A `rm -rf` for node.\n\nInstall with `npm install rimraf`, or just drop rimraf.js somewhere.\n\n## API\n\n`rimraf(f, callback)`\n\nThe callback will be called with an error if there is one.  Certain\nerrors are handled for you:\n\n* `EBUSY` -  rimraf will back off a maximum of opts.maxBusyTries times\n  before giving up.\n* `EMFILE` - If too many file descriptors get opened, rimraf will\n  patiently wait until more become available.\n\n\n## rimraf.sync\n\nIt can remove stuff synchronously, too.  But that's not so good.  Use\nthe async API.  It's better.\n\n## CLI\n\nIf installed with `npm install rimraf -g` it can be used as a global\ncommand `rimraf <path>` which is useful for cross platform support.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/rimraf/issues"
-  },
-  "dependencies": {
-    "graceful-fs": "~2"
-  },
-  "_id": "rimraf@2.2.2",
-  "dist": {
-    "shasum": "d99ec41dc646e55bf7a7a44a255c28bef33a8abf"
-  },
-  "_from": "rimraf@2.2.2",
-  "_resolved": "https://registry.npmjs.org/rimraf/-/rimraf-2.2.2.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/rimraf/rimraf.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,184 +0,0 @@
-module.exports = rimraf
-rimraf.sync = rimrafSync
-
-var path = require("path")
-  , fs
-
-try {
-  // optional dependency
-  fs = require("graceful-fs")
-} catch (er) {
-  fs = require("fs")
-}
-
-// for EMFILE handling
-var timeout = 0
-exports.EMFILE_MAX = 1000
-exports.BUSYTRIES_MAX = 3
-
-var isWindows = (process.platform === "win32")
-
-function rimraf (p, cb) {
-  if (!cb) throw new Error("No callback passed to rimraf()")
-
-  var busyTries = 0
-  rimraf_(p, function CB (er) {
-    if (er) {
-      if (er.code === "EBUSY" && busyTries < exports.BUSYTRIES_MAX) {
-        busyTries ++
-        var time = busyTries * 100
-        // try again, with the same exact callback as this one.
-        return setTimeout(function () {
-          rimraf_(p, CB)
-        }, time)
-      }
-
-      // this one won't happen if graceful-fs is used.
-      if (er.code === "EMFILE" && timeout < exports.EMFILE_MAX) {
-        return setTimeout(function () {
-          rimraf_(p, CB)
-        }, timeout ++)
-      }
-
-      // already gone
-      if (er.code === "ENOENT") er = null
-    }
-
-    timeout = 0
-    cb(er)
-  })
-}
-
-// Two possible strategies.
-// 1. Assume it's a file.  unlink it, then do the dir stuff on EPERM or EISDIR
-// 2. Assume it's a directory.  readdir, then do the file stuff on ENOTDIR
-//
-// Both result in an extra syscall when you guess wrong.  However, there
-// are likely far more normal files in the world than directories.  This
-// is based on the assumption that the average number of files per
-// directory is >= 1.
-//
-// If anyone ever complains about this, then I guess the strategy could
-// be made configurable somehow.  But until then, YAGNI.
-function rimraf_ (p, cb) {
-  fs.unlink(p, function (er) {
-    if (er) {
-      if (er.code === "ENOENT")
-        return cb()
-      if (er.code === "EPERM")
-        return (isWindows) ? fixWinEPERM(p, er, cb) : rmdir(p, er, cb)
-      if (er.code === "EISDIR")
-        return rmdir(p, er, cb)
-    }
-    return cb(er)
-  })
-}
-
-function fixWinEPERM (p, er, cb) {
-  fs.chmod(p, 0666, function (er2) {
-    if (er2)
-      cb(er2.code === "ENOENT" ? null : er)
-    else
-      fs.stat(p, function(er3, stats) {
-        if (er3)
-          cb(er3.code === "ENOENT" ? null : er)
-        else if (stats.isDirectory())
-          rmdir(p, er, cb)
-        else
-          fs.unlink(p, cb)
-      })
-  })
-}
-
-function fixWinEPERMSync (p, er, cb) {
-  try {
-    fs.chmodSync(p, 0666)
-  } catch (er2) {
-    if (er2.code !== "ENOENT")
-      throw er
-  }
-
-  try {
-    var stats = fs.statSync(p)
-  } catch (er3) {
-    if (er3.code !== "ENOENT")
-      throw er
-    return
-  }
-
-  if (stats.isDirectory())
-    rmdirSync(p, er)
-  else
-    fs.unlinkSync(p)
-}
-
-function rmdir (p, originalEr, cb) {
-  // try to rmdir first, and only readdir on ENOTEMPTY or EEXIST (SunOS)
-  // if we guessed wrong, and it's not a directory, then
-  // raise the original error.
-  fs.rmdir(p, function (er) {
-    if (er && (er.code === "ENOTEMPTY" || er.code === "EEXIST"))
-      rmkids(p, cb)
-    else if (er && er.code === "ENOTDIR")
-      cb(originalEr)
-    else
-      cb(er)
-  })
-}
-
-function rmkids(p, cb) {
-  fs.readdir(p, function (er, files) {
-    if (er)
-      return cb(er)
-    var n = files.length
-    if (n === 0)
-      return fs.rmdir(p, cb)
-    var errState
-    files.forEach(function (f) {
-      rimraf(path.join(p, f), function (er) {
-        if (errState)
-          return
-        if (er)
-          return cb(errState = er)
-        if (--n === 0)
-          fs.rmdir(p, cb)
-      })
-    })
-  })
-}
-
-// this looks simpler, and is strictly *faster*, but will
-// tie up the JavaScript thread and fail on excessively
-// deep directory trees.
-function rimrafSync (p) {
-  try {
-    fs.unlinkSync(p)
-  } catch (er) {
-    if (er.code === "ENOENT")
-      return
-    if (er.code === "EPERM")
-      return isWindows ? fixWinEPERMSync(p, er) : rmdirSync(p, er)
-    if (er.code !== "EISDIR")
-      throw er
-    rmdirSync(p, er)
-  }
-}
-
-function rmdirSync (p, originalEr) {
-  try {
-    fs.rmdirSync(p)
-  } catch (er) {
-    if (er.code === "ENOENT")
-      return
-    if (er.code === "ENOTDIR")
-      throw originalEr
-    if (er.code === "ENOTEMPTY" || er.code === "EEXIST")
-      rmkidsSync(p)
-  }
-}
-
-function rmkidsSync (p) {
-  fs.readdirSync(p).forEach(function (f) {
-    rimrafSync(path.join(p, f))
-  })
-  fs.rmdirSync(p)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-# nada
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/Makefile	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-files =  semver.browser.js \
-         semver.min.js \
-				 semver.browser.js.gz \
-				 semver.min.js.gz
-
-all: $(files)
-
-clean:
-	rm -f $(files)
-
-semver.browser.js: head.js semver.js foot.js
-	( cat head.js; \
-		cat semver.js | \
-			egrep -v '^ *\/\* nomin \*\/' | \
-			perl -pi -e 's/debug\([^\)]+\)//g'; \
-		cat foot.js ) > semver.browser.js
-
-semver.min.js: semver.browser.js
-	uglifyjs -m <semver.browser.js >semver.min.js
-
-%.gz: %
-	gzip --stdout -9 <$< >$@
-
-.PHONY: all clean
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,142 +0,0 @@
-semver(1) -- The semantic versioner for npm
-===========================================
-
-## Usage
-
-    $ npm install semver
-
-    semver.valid('1.2.3') // '1.2.3'
-    semver.valid('a.b.c') // null
-    semver.clean('  =v1.2.3   ') // '1.2.3'
-    semver.satisfies('1.2.3', '1.x || >=2.5.0 || 5.0.0 - 7.2.3') // true
-    semver.gt('1.2.3', '9.8.7') // false
-    semver.lt('1.2.3', '9.8.7') // true
-
-As a command-line utility:
-
-    $ semver -h
-
-    Usage: semver <version> [<version> [...]] [-r <range> | -i <inc> | -d <dec>]
-    Test if version(s) satisfy the supplied range(s), and sort them.
-
-    Multiple versions or ranges may be supplied, unless increment
-    or decrement options are specified.  In that case, only a single
-    version may be used, and it is incremented by the specified level
-
-    Program exits successfully if any valid version satisfies
-    all supplied ranges, and prints all satisfying versions.
-
-    If no versions are valid, or ranges are not satisfied,
-    then exits failure.
-
-    Versions are printed in ascending order, so supplying
-    multiple versions to the utility will just sort them.
-
-## Versions
-
-A "version" is described by the v2.0.0 specification found at
-<http://semver.org/>.
-
-A leading `"="` or `"v"` character is stripped off and ignored.
-
-## Ranges
-
-The following range styles are supported:
-
-* `1.2.3` A specific version.  When nothing else will do.  Note that
-  build metadata is still ignored, so `1.2.3+build2012` will satisfy
-  this range.
-* `>1.2.3` Greater than a specific version.
-* `<1.2.3` Less than a specific version.  If there is no prerelease
-  tag on the version range, then no prerelease version will be allowed
-  either, even though these are technically "less than".
-* `>=1.2.3` Greater than or equal to.  Note that prerelease versions
-  are NOT equal to their "normal" equivalents, so `1.2.3-beta` will
-  not satisfy this range, but `2.3.0-beta` will.
-* `<=1.2.3` Less than or equal to.  In this case, prerelease versions
-  ARE allowed, so `1.2.3-beta` would satisfy.
-* `1.2.3 - 2.3.4` := `>=1.2.3 <=2.3.4`
-* `~1.2.3` := `>=1.2.3-0 <1.3.0-0`  "Reasonably close to 1.2.3".  When
-  using tilde operators, prerelease versions are supported as well,
-  but a prerelease of the next significant digit will NOT be
-  satisfactory, so `1.3.0-beta` will not satisfy `~1.2.3`.
-* `^1.2.3` := `>=1.2.3-0 <2.0.0-0`  "Compatible with 1.2.3".  When
-  using caret operators, anything from the specified version (including
-  prerelease) will be supported up to, but not including, the next
-  major version (or its prereleases). `1.5.1` will satisfy `^1.2.3`,
-  while `1.2.2` and `2.0.0-beta` will not.
-* `^0.1.3` := `>=0.1.3-0 <0.2.0-0` "Compatible with 0.1.3". 0.x.x versions are
-  special: the first non-zero component indicates potentially breaking changes,
-  meaning the caret operator matches any version with the same first non-zero
-  component starting at the specified version.
-* `^0.0.2` := `=0.0.2` "Only the version 0.0.2 is considered compatible"
-* `~1.2` := `>=1.2.0-0 <1.3.0-0` "Any version starting with 1.2"
-* `^1.2` := `>=1.2.0-0 <2.0.0-0` "Any version compatible with 1.2"
-* `1.2.x` := `>=1.2.0-0 <1.3.0-0` "Any version starting with 1.2"
-* `~1` := `>=1.0.0-0 <2.0.0-0` "Any version starting with 1"
-* `^1` := `>=1.0.0-0 <2.0.0-0` "Any version compatible with 1"
-* `1.x` := `>=1.0.0-0 <2.0.0-0` "Any version starting with 1"
-
-
-Ranges can be joined with either a space (which implies "and") or a
-`||` (which implies "or").
-
-## Functions
-
-All methods and classes take a final `loose` boolean argument that, if
-true, will be more forgiving about not-quite-valid semver strings.
-The resulting output will always be 100% strict, of course.
-
-Strict-mode Comparators and Ranges will be strict about the SemVer
-strings that they parse.
-
-* valid(v): Return the parsed version, or null if it's not valid.
-* inc(v, release): Return the version incremented by the release type
-  (major, minor, patch, or prerelease), or null if it's not valid.
-
-### Comparison
-
-* gt(v1, v2): `v1 > v2`
-* gte(v1, v2): `v1 >= v2`
-* lt(v1, v2): `v1 < v2`
-* lte(v1, v2): `v1 <= v2`
-* eq(v1, v2): `v1 == v2` This is true if they're logically equivalent,
-  even if they're not the exact same string.  You already know how to
-  compare strings.
-* neq(v1, v2): `v1 != v2` The opposite of eq.
-* cmp(v1, comparator, v2): Pass in a comparison string, and it'll call
-  the corresponding function above.  `"==="` and `"!=="` do simple
-  string comparison, but are included for completeness.  Throws if an
-  invalid comparison string is provided.
-* compare(v1, v2): Return 0 if v1 == v2, or 1 if v1 is greater, or -1 if
-  v2 is greater.  Sorts in ascending order if passed to Array.sort().
-* rcompare(v1, v2): The reverse of compare.  Sorts an array of versions
-  in descending order when passed to Array.sort().
-
-
-### Ranges
-
-* validRange(range): Return the valid range or null if it's not valid
-* satisfies(version, range): Return true if the version satisfies the
-  range.
-* maxSatisfying(versions, range): Return the highest version in the list
-  that satisfies the range, or null if none of them do.
-* gtr(version, range): Return true if version is greater than all the
-  versions possible in the range.
-* ltr(version, range): Return true if version is less than all the
-  versions possible in the range.
-* outside(version, range, hilo): Return true if the version is outside
-  the bounds of the range in either the high or low direction.  The
-  `hilo` argument must be either the string `'>'` or `'<'`.  (This is
-  the function called by `gtr` and `ltr`.)
-
-Note that, since ranges may be non-contiguous, a version might not be
-greater than a range, less than a range, *or* satisfy a range!  For
-example, the range `1.2 <1.2.9 || >2.0.0` would have a hole from `1.2.9`
-until `2.0.0`, so the version `1.2.10` would not be greater than the
-range (because 2.0.1 satisfies, which is higher), nor less than the
-range (since 1.2.8 satisfies, which is lower), and it also does not
-satisfy the range.
-
-If you want to know if a version satisfies or does not satisfy a
-range, use the `satisfies(version, range)` function.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/bin/semver	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,124 +0,0 @@
-#!/usr/bin/env node
-// Standalone semver comparison program.
-// Exits successfully and prints matching version(s) if
-// any supplied version is valid and passes all tests.
-
-var argv = process.argv.slice(2)
-  , versions = []
-  , range = []
-  , gt = []
-  , lt = []
-  , eq = []
-  , inc = null
-  , version = require("../package.json").version
-  , loose = false
-  , semver = require("../semver")
-  , reverse = false
-
-main()
-
-function main () {
-  if (!argv.length) return help()
-  while (argv.length) {
-    var a = argv.shift()
-    var i = a.indexOf('=')
-    if (i !== -1) {
-      a = a.slice(0, i)
-      argv.unshift(a.slice(i + 1))
-    }
-    switch (a) {
-      case "-rv": case "-rev": case "--rev": case "--reverse":
-        reverse = true
-        break
-      case "-l": case "--loose":
-        loose = true
-        break
-      case "-v": case "--version":
-        versions.push(argv.shift())
-        break
-      case "-i": case "--inc": case "--increment":
-        switch (argv[0]) {
-          case "major": case "minor": case "patch": case "prerelease":
-            inc = argv.shift()
-            break
-          default:
-            inc = "patch"
-            break
-        }
-        break
-      case "-r": case "--range":
-        range.push(argv.shift())
-        break
-      case "-h": case "--help": case "-?":
-        return help()
-      default:
-        versions.push(a)
-        break
-    }
-  }
-
-  versions = versions.filter(function (v) {
-    return semver.valid(v, loose)
-  })
-  if (!versions.length) return fail()
-  if (inc && (versions.length !== 1 || range.length))
-    return failInc()
-
-  for (var i = 0, l = range.length; i < l ; i ++) {
-    versions = versions.filter(function (v) {
-      return semver.satisfies(v, range[i], loose)
-    })
-    if (!versions.length) return fail()
-  }
-  return success(versions)
-}
-
-function failInc () {
-  console.error("--inc can only be used on a single version with no range")
-  fail()
-}
-
-function fail () { process.exit(1) }
-
-function success () {
-  var compare = reverse ? "rcompare" : "compare"
-  versions.sort(function (a, b) {
-    return semver[compare](a, b, loose)
-  }).map(function (v) {
-    return semver.clean(v, loose)
-  }).map(function (v) {
-    return inc ? semver.inc(v, inc, loose) : v
-  }).forEach(function (v,i,_) { console.log(v) })
-}
-
-function help () {
-  console.log(["SemVer " + version
-              ,""
-              ,"A JavaScript implementation of the http://semver.org/ specification"
-              ,"Copyright Isaac Z. Schlueter"
-              ,""
-              ,"Usage: semver [options] <version> [<version> [...]]"
-              ,"Prints valid versions sorted by SemVer precedence"
-              ,""
-              ,"Options:"
-              ,"-r --range <range>"
-              ,"        Print versions that match the specified range."
-              ,""
-              ,"-i --increment [<level>]"
-              ,"        Increment a version by the specified level.  Level can"
-              ,"        be one of: major, minor, patch, or prerelease"
-              ,"        Default level is 'patch'."
-              ,"        Only one version may be specified."
-              ,""
-              ,"-l --loose"
-              ,"        Interpret versions and ranges loosely"
-              ,""
-              ,"Program exits successfully if any valid version satisfies"
-              ,"all supplied ranges, and prints all satisfying versions."
-              ,""
-              ,"If no satisfying versions are found, then exits failure."
-              ,""
-              ,"Versions are printed in ascending order, so supplying"
-              ,"multiple versions to the utility will just sort them."
-              ].join("\n"))
-}
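The `success()` pipeline in the deleted bin script above sorts, cleans, and prints versions. A standalone sketch of that sort-then-print step, with `semver.compare` swapped for a toy dotted-number comparator (`toyCompare` is hypothetical, purely for illustration):

```javascript
// Minimal sketch of the success() pipeline: sort versions ascending,
// then print one per line. The real script delegates comparison to
// semver[compare]; this toy comparator only handles plain x.y.z triples.
function toyCompare(a, b) {
  var pa = a.split('.').map(Number);
  var pb = b.split('.').map(Number);
  for (var i = 0; i < 3; i++) {
    if (pa[i] !== pb[i]) return pa[i] < pb[i] ? -1 : 1;
  }
  return 0;
}

var sorted = ['1.10.0', '1.2.3', '0.9.9'].sort(toyCompare);
console.log(sorted.join('\n')); // 0.9.9, 1.2.3, 1.10.0 (ascending)
```

Note that numeric comparison is what makes `1.10.0` sort after `1.2.3`; a plain string sort would get that wrong.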
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/foot.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-
-})(
-  typeof exports === 'object' ? exports :
-  typeof define === 'function' && define.amd ? {} :
-  semver = {}
-);
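head.js and foot.js above wrap semver.browser.js in a minimal UMD-style shim: CommonJS `exports` if present, an AMD define, else a global. The environment-detection ternary boils down to this sketch (names here are illustrative, not from the deleted files):

```javascript
// Same detection order as foot.js: CommonJS first, then AMD, then a
// browser global. `typeof` on an undeclared identifier is safe, so this
// never throws even when `exports` or `define` does not exist.
var target =
  typeof exports === 'object' ? 'commonjs' :
  typeof define === 'function' && define.amd ? 'amd' :
  'browser-global';
console.log(target);
```

Under Node's CommonJS loader this reports `commonjs`; in a plain browser script it would fall through to `browser-global`.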
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/head.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,2 +0,0 @@
-;(function(exports) {
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,32 +0,0 @@
-{
-  "name": "semver",
-  "version": "2.2.1",
-  "description": "The semantic version parser used by npm.",
-  "main": "semver.js",
-  "browser": "semver.browser.js",
-  "min": "semver.min.js",
-  "scripts": {
-    "test": "tap test/*.js",
-    "prepublish": "make"
-  },
-  "devDependencies": {
-    "tap": "0.x >=0.0.4",
-    "uglify-js": "~2.3.6"
-  },
-  "license": "BSD",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/node-semver.git"
-  },
-  "bin": {
-    "semver": "./bin/semver"
-  },
-  "readme": "semver(1) -- The semantic versioner for npm\n===========================================\n\n## Usage\n\n    $ npm install semver\n\n    semver.valid('1.2.3') // '1.2.3'\n    semver.valid('a.b.c') // null\n    semver.clean('  =v1.2.3   ') // '1.2.3'\n    semver.satisfies('1.2.3', '1.x || >=2.5.0 || 5.0.0 - 7.2.3') // true\n    semver.gt('1.2.3', '9.8.7') // false\n    semver.lt('1.2.3', '9.8.7') // true\n\nAs a command-line utility:\n\n    $ semver -h\n\n    Usage: semver <version> [<version> [...]] [-r <range> | -i <inc> | -d <dec>]\n    Test if version(s) satisfy the supplied range(s), and sort them.\n\n    Multiple versions or ranges may be supplied, unless increment\n    or decrement options are specified.  In that case, only a single\n    version may be used, and it is incremented by the specified level\n\n    Program exits successfully if any valid version satisfies\n    all supplied ranges, and prints all satisfying versions.\n\n    If no versions are valid, or ranges are not satisfied,\n    then exits failure.\n\n    Versions are printed in ascending order, so supplying\n    multiple versions to the utility will just sort them.\n\n## Versions\n\nA \"version\" is described by the v2.0.0 specification found at\n<http://semver.org/>.\n\nA leading `\"=\"` or `\"v\"` character is stripped off and ignored.\n\n## Ranges\n\nThe following range styles are supported:\n\n* `1.2.3` A specific version.  When nothing else will do.  Note that\n  build metadata is still ignored, so `1.2.3+build2012` will satisfy\n  this range.\n* `>1.2.3` Greater than a specific version.\n* `<1.2.3` Less than a specific version.  If there is no prerelease\n  tag on the version range, then no prerelease version will be allowed\n  either, even though these are technically \"less than\".\n* `>=1.2.3` Greater than or equal to.  
Note that prerelease versions\n  are NOT equal to their \"normal\" equivalents, so `1.2.3-beta` will\n  not satisfy this range, but `2.3.0-beta` will.\n* `<=1.2.3` Less than or equal to.  In this case, prerelease versions\n  ARE allowed, so `1.2.3-beta` would satisfy.\n* `1.2.3 - 2.3.4` := `>=1.2.3 <=2.3.4`\n* `~1.2.3` := `>=1.2.3-0 <1.3.0-0`  \"Reasonably close to 1.2.3\".  When\n  using tilde operators, prerelease versions are supported as well,\n  but a prerelease of the next significant digit will NOT be\n  satisfactory, so `1.3.0-beta` will not satisfy `~1.2.3`.\n* `^1.2.3` := `>=1.2.3-0 <2.0.0-0`  \"Compatible with 1.2.3\".  When\n  using caret operators, anything from the specified version (including\n  prerelease) will be supported up to, but not including, the next\n  major version (or its prereleases). `1.5.1` will satisfy `^1.2.3`,\n  while `1.2.2` and `2.0.0-beta` will not.\n* `^0.1.3` := `>=0.1.3-0 <0.2.0-0` \"Compatible with 0.1.3\". 0.x.x versions are\n  special: the first non-zero component indicates potentially breaking changes,\n  meaning the caret operator matches any version with the same first non-zero\n  component starting at the specified version.\n* `^0.0.2` := `=0.0.2` \"Only the version 0.0.2 is considered compatible\"\n* `~1.2` := `>=1.2.0-0 <1.3.0-0` \"Any version starting with 1.2\"\n* `^1.2` := `>=1.2.0-0 <2.0.0-0` \"Any version compatible with 1.2\"\n* `1.2.x` := `>=1.2.0-0 <1.3.0-0` \"Any version starting with 1.2\"\n* `~1` := `>=1.0.0-0 <2.0.0-0` \"Any version starting with 1\"\n* `^1` := `>=1.0.0-0 <2.0.0-0` \"Any version compatible with 1\"\n* `1.x` := `>=1.0.0-0 <2.0.0-0` \"Any version starting with 1\"\n\n\nRanges can be joined with either a space (which implies \"and\") or a\n`||` (which implies \"or\").\n\n## Functions\n\nAll methods and classes take a final `loose` boolean argument that, if\ntrue, will be more forgiving about not-quite-valid semver strings.\nThe resulting output will always be 100% strict, of 
course.\n\nStrict-mode Comparators and Ranges will be strict about the SemVer\nstrings that they parse.\n\n* valid(v): Return the parsed version, or null if it's not valid.\n* inc(v, release): Return the version incremented by the release type\n  (major, minor, patch, or prerelease), or null if it's not valid.\n\n### Comparison\n\n* gt(v1, v2): `v1 > v2`\n* gte(v1, v2): `v1 >= v2`\n* lt(v1, v2): `v1 < v2`\n* lte(v1, v2): `v1 <= v2`\n* eq(v1, v2): `v1 == v2` This is true if they're logically equivalent,\n  even if they're not the exact same string.  You already know how to\n  compare strings.\n* neq(v1, v2): `v1 != v2` The opposite of eq.\n* cmp(v1, comparator, v2): Pass in a comparison string, and it'll call\n  the corresponding function above.  `\"===\"` and `\"!==\"` do simple\n  string comparison, but are included for completeness.  Throws if an\n  invalid comparison string is provided.\n* compare(v1, v2): Return 0 if v1 == v2, or 1 if v1 is greater, or -1 if\n  v2 is greater.  Sorts in ascending order if passed to Array.sort().\n* rcompare(v1, v2): The reverse of compare.  Sorts an array of versions\n  in descending order when passed to Array.sort().\n\n\n### Ranges\n\n* validRange(range): Return the valid range or null if it's not valid\n* satisfies(version, range): Return true if the version satisfies the\n  range.\n* maxSatisfying(versions, range): Return the highest version in the list\n  that satisfies the range, or null if none of them do.\n* gtr(version, range): Return true if version is greater than all the\n  versions possible in the range.\n* ltr(version, range): Return true if version is less than all the\n  versions possible in the range.\n* outside(version, range, hilo): Return true if the version is outside\n  the bounds of the range in either the high or low direction.  The\n  `hilo` argument must be either the string `'>'` or `'<'`.  
(This is\n  the function called by `gtr` and `ltr`.)\n\nNote that, since ranges may be non-contiguous, a version might not be\ngreater than a range, less than a range, *or* satisfy a range!  For\nexample, the range `1.2 <1.2.9 || >2.0.0` would have a hole from `1.2.9`\nuntil `2.0.0`, so the version `1.2.10` would not be greater than the\nrange (because 2.0.1 satisfies, which is higher), nor less than the\nrange (since 1.2.8 satisfies, which is lower), and it also does not\nsatisfy the range.\n\nIf you want to know if a version satisfies or does not satisfy a\nrange, use the `satisfies(version, range)` function.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/node-semver/issues"
-  },
-  "homepage": "https://github.com/isaacs/node-semver",
-  "_id": "semver@2.2.1",
-  "_from": "semver@latest"
-}
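semver.browser.js below assembles its strict version regex from `src[]` fragments. Hand-assembled into one expression (a sketch equivalent to `src[FULL]`, not an exported constant of the module), it looks like this:

```javascript
// Equivalent of src[FULL] from semver 2.2.1: optional "v", three numeric
// identifiers, optional pre-release, optional build metadata. Only the
// major/minor/patch/prerelease/build parts are capturing groups.
var NUMERIC = '0|[1-9]\\d*';
var NONNUMERIC = '\\d*[a-zA-Z-][a-zA-Z0-9-]*';
var PREID = '(?:' + NUMERIC + '|' + NONNUMERIC + ')';
var FULL = new RegExp(
  '^v?(' + NUMERIC + ')\\.(' + NUMERIC + ')\\.(' + NUMERIC + ')' +
  '(?:-(' + PREID + '(?:\\.' + PREID + ')*))?' +
  '(?:\\+([0-9A-Za-z-]+(?:\\.[0-9A-Za-z-]+)*))?$'
);

console.log(FULL.test('1.2.3'));         // true
console.log(FULL.test('v1.2.3-beta.1')); // true
console.log(FULL.test('01.2.3'));        // false (leading zero rejected)

var m = FULL.exec('1.2.3-alpha+001');
console.log(m[4], m[5]); // "alpha" "001"
```

The `0|[1-9]\d*` alternation is what rejects leading zeros in strict mode; the loose variant (`[0-9]+`) accepts them.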
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/semver.browser.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1007 +0,0 @@
-;(function(exports) {
-
-// export the class if we are in a Node-like system.
-if (typeof module === 'object' && module.exports === exports)
-  exports = module.exports = SemVer;
-
-// The debug function is excluded entirely from the minified version.
-
-// Note: this is the semver.org version of the spec that it implements
-// Not necessarily the package version of this code.
-exports.SEMVER_SPEC_VERSION = '2.0.0';
-
-// The actual regexps go on exports.re
-var re = exports.re = [];
-var src = exports.src = [];
-var R = 0;
-
-// The following Regular Expressions can be used for tokenizing,
-// validating, and parsing SemVer version strings.
-
-// ## Numeric Identifier
-// A single `0`, or a non-zero digit followed by zero or more digits.
-
-var NUMERICIDENTIFIER = R++;
-src[NUMERICIDENTIFIER] = '0|[1-9]\\d*';
-var NUMERICIDENTIFIERLOOSE = R++;
-src[NUMERICIDENTIFIERLOOSE] = '[0-9]+';
-
-
-// ## Non-numeric Identifier
-// Zero or more digits, followed by a letter or hyphen, and then zero or
-// more letters, digits, or hyphens.
-
-var NONNUMERICIDENTIFIER = R++;
-src[NONNUMERICIDENTIFIER] = '\\d*[a-zA-Z-][a-zA-Z0-9-]*';
-
-
-// ## Main Version
-// Three dot-separated numeric identifiers.
-
-var MAINVERSION = R++;
-src[MAINVERSION] = '(' + src[NUMERICIDENTIFIER] + ')\\.' +
-                   '(' + src[NUMERICIDENTIFIER] + ')\\.' +
-                   '(' + src[NUMERICIDENTIFIER] + ')';
-
-var MAINVERSIONLOOSE = R++;
-src[MAINVERSIONLOOSE] = '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\.' +
-                        '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\.' +
-                        '(' + src[NUMERICIDENTIFIERLOOSE] + ')';
-
-// ## Pre-release Version Identifier
-// A numeric identifier, or a non-numeric identifier.
-
-var PRERELEASEIDENTIFIER = R++;
-src[PRERELEASEIDENTIFIER] = '(?:' + src[NUMERICIDENTIFIER] +
-                            '|' + src[NONNUMERICIDENTIFIER] + ')';
-
-var PRERELEASEIDENTIFIERLOOSE = R++;
-src[PRERELEASEIDENTIFIERLOOSE] = '(?:' + src[NUMERICIDENTIFIERLOOSE] +
-                                 '|' + src[NONNUMERICIDENTIFIER] + ')';
-
-
-// ## Pre-release Version
-// Hyphen, followed by one or more dot-separated pre-release version
-// identifiers.
-
-var PRERELEASE = R++;
-src[PRERELEASE] = '(?:-(' + src[PRERELEASEIDENTIFIER] +
-                  '(?:\\.' + src[PRERELEASEIDENTIFIER] + ')*))';
-
-var PRERELEASELOOSE = R++;
-src[PRERELEASELOOSE] = '(?:-?(' + src[PRERELEASEIDENTIFIERLOOSE] +
-                       '(?:\\.' + src[PRERELEASEIDENTIFIERLOOSE] + ')*))';
-
-// ## Build Metadata Identifier
-// Any combination of digits, letters, or hyphens.
-
-var BUILDIDENTIFIER = R++;
-src[BUILDIDENTIFIER] = '[0-9A-Za-z-]+';
-
-// ## Build Metadata
-// Plus sign, followed by one or more period-separated build metadata
-// identifiers.
-
-var BUILD = R++;
-src[BUILD] = '(?:\\+(' + src[BUILDIDENTIFIER] +
-             '(?:\\.' + src[BUILDIDENTIFIER] + ')*))';
-
-
-// ## Full Version String
-// A main version, followed optionally by a pre-release version and
-// build metadata.
-
-// Note that only the major, minor, patch, and pre-release sections of
-// the version string are capturing groups.  The build metadata is not a
-// capturing group, because it should not ever be used in version
-// comparison.
-
-var FULL = R++;
-var FULLPLAIN = 'v?' + src[MAINVERSION] +
-                src[PRERELEASE] + '?' +
-                src[BUILD] + '?';
-
-src[FULL] = '^' + FULLPLAIN + '$';
-
-// like full, but allows v1.2.3 and =1.2.3, which people do sometimes.
-// also, 1.0.0alpha1 (prerelease without the hyphen) which is pretty
-// common in the npm registry.
-var LOOSEPLAIN = '[v=\\s]*' + src[MAINVERSIONLOOSE] +
-                 src[PRERELEASELOOSE] + '?' +
-                 src[BUILD] + '?';
-
-var LOOSE = R++;
-src[LOOSE] = '^' + LOOSEPLAIN + '$';
-
-var GTLT = R++;
-src[GTLT] = '((?:<|>)?=?)';
-
-// Something like "2.*" or "1.2.x".
-// Note that "x.x" is a valid xRange identifer, meaning "any version"
-// Only the first item is strictly required.
-var XRANGEIDENTIFIERLOOSE = R++;
-src[XRANGEIDENTIFIERLOOSE] = src[NUMERICIDENTIFIERLOOSE] + '|x|X|\\*';
-var XRANGEIDENTIFIER = R++;
-src[XRANGEIDENTIFIER] = src[NUMERICIDENTIFIER] + '|x|X|\\*';
-
-var XRANGEPLAIN = R++;
-src[XRANGEPLAIN] = '[v=\\s]*(' + src[XRANGEIDENTIFIER] + ')' +
-                   '(?:\\.(' + src[XRANGEIDENTIFIER] + ')' +
-                   '(?:\\.(' + src[XRANGEIDENTIFIER] + ')' +
-                   '(?:(' + src[PRERELEASE] + ')' +
-                   ')?)?)?';
-
-var XRANGEPLAINLOOSE = R++;
-src[XRANGEPLAINLOOSE] = '[v=\\s]*(' + src[XRANGEIDENTIFIERLOOSE] + ')' +
-                        '(?:\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' +
-                        '(?:\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' +
-                        '(?:(' + src[PRERELEASELOOSE] + ')' +
-                        ')?)?)?';
-
-// >=2.x, for example, means >=2.0.0-0
-// <1.x would be the same as "<1.0.0-0", though.
-var XRANGE = R++;
-src[XRANGE] = '^' + src[GTLT] + '\\s*' + src[XRANGEPLAIN] + '$';
-var XRANGELOOSE = R++;
-src[XRANGELOOSE] = '^' + src[GTLT] + '\\s*' + src[XRANGEPLAINLOOSE] + '$';
-
-// Tilde ranges.
-// Meaning is "reasonably at or greater than"
-var LONETILDE = R++;
-src[LONETILDE] = '(?:~>?)';
-
-var TILDETRIM = R++;
-src[TILDETRIM] = '(\\s*)' + src[LONETILDE] + '\\s+';
-re[TILDETRIM] = new RegExp(src[TILDETRIM], 'g');
-var tildeTrimReplace = '$1~';
-
-var TILDE = R++;
-src[TILDE] = '^' + src[LONETILDE] + src[XRANGEPLAIN] + '$';
-var TILDELOOSE = R++;
-src[TILDELOOSE] = '^' + src[LONETILDE] + src[XRANGEPLAINLOOSE] + '$';
-
-// Caret ranges.
-// Meaning is "at least and backwards compatible with"
-var LONECARET = R++;
-src[LONECARET] = '(?:\\^)';
-
-var CARETTRIM = R++;
-src[CARETTRIM] = '(\\s*)' + src[LONECARET] + '\\s+';
-re[CARETTRIM] = new RegExp(src[CARETTRIM], 'g');
-var caretTrimReplace = '$1^';
-
-var CARET = R++;
-src[CARET] = '^' + src[LONECARET] + src[XRANGEPLAIN] + '$';
-var CARETLOOSE = R++;
-src[CARETLOOSE] = '^' + src[LONECARET] + src[XRANGEPLAINLOOSE] + '$';
-
-// A simple gt/lt/eq thing, or just "" to indicate "any version"
-var COMPARATORLOOSE = R++;
-src[COMPARATORLOOSE] = '^' + src[GTLT] + '\\s*(' + LOOSEPLAIN + ')$|^$';
-var COMPARATOR = R++;
-src[COMPARATOR] = '^' + src[GTLT] + '\\s*(' + FULLPLAIN + ')$|^$';
-
-
-// An expression to strip any whitespace between the gtlt and the thing
-// it modifies, so that `> 1.2.3` ==> `>1.2.3`
-var COMPARATORTRIM = R++;
-src[COMPARATORTRIM] = '(\\s*)' + src[GTLT] +
-                      '\\s*(' + LOOSEPLAIN + '|' + src[XRANGEPLAIN] + ')';
-
-// this one has to use the /g flag
-re[COMPARATORTRIM] = new RegExp(src[COMPARATORTRIM], 'g');
-var comparatorTrimReplace = '$1$2$3';
-
-
-// Something like `1.2.3 - 1.2.4`
-// Note that these all use the loose form, because they'll be
-// checked against either the strict or loose comparator form
-// later.
-var HYPHENRANGE = R++;
-src[HYPHENRANGE] = '^\\s*(' + src[XRANGEPLAIN] + ')' +
-                   '\\s+-\\s+' +
-                   '(' + src[XRANGEPLAIN] + ')' +
-                   '\\s*$';
-
-var HYPHENRANGELOOSE = R++;
-src[HYPHENRANGELOOSE] = '^\\s*(' + src[XRANGEPLAINLOOSE] + ')' +
-                        '\\s+-\\s+' +
-                        '(' + src[XRANGEPLAINLOOSE] + ')' +
-                        '\\s*$';
-
-// Star ranges basically just allow anything at all.
-var STAR = R++;
-src[STAR] = '(<|>)?=?\\s*\\*';
-
-// Compile to actual regexp objects.
-// All are flag-free, unless they were created above with a flag.
-for (var i = 0; i < R; i++) {
-  ;
-  if (!re[i])
-    re[i] = new RegExp(src[i]);
-}
-
-exports.parse = parse;
-function parse(version, loose) {
-  var r = loose ? re[LOOSE] : re[FULL];
-  return (r.test(version)) ? new SemVer(version, loose) : null;
-}
-
-exports.valid = valid;
-function valid(version, loose) {
-  var v = parse(version, loose);
-  return v ? v.version : null;
-}
-
-
-exports.clean = clean;
-function clean(version, loose) {
-  var s = parse(version, loose);
-  return s ? s.version : null;
-}
-
-exports.SemVer = SemVer;
-
-function SemVer(version, loose) {
-  if (version instanceof SemVer) {
-    if (version.loose === loose)
-      return version;
-    else
-      version = version.version;
-  }
-
-  if (!(this instanceof SemVer))
-    return new SemVer(version, loose);
-
-  ;
-  this.loose = loose;
-  var m = version.trim().match(loose ? re[LOOSE] : re[FULL]);
-
-  if (!m)
-    throw new TypeError('Invalid Version: ' + version);
-
-  this.raw = version;
-
-  // these are actually numbers
-  this.major = +m[1];
-  this.minor = +m[2];
-  this.patch = +m[3];
-
-  // numberify any prerelease numeric ids
-  if (!m[4])
-    this.prerelease = [];
-  else
-    this.prerelease = m[4].split('.').map(function(id) {
-      return (/^[0-9]+$/.test(id)) ? +id : id;
-    });
-
-  this.build = m[5] ? m[5].split('.') : [];
-  this.format();
-}
-
-SemVer.prototype.format = function() {
-  this.version = this.major + '.' + this.minor + '.' + this.patch;
-  if (this.prerelease.length)
-    this.version += '-' + this.prerelease.join('.');
-  return this.version;
-};
-
-SemVer.prototype.inspect = function() {
-  return '<SemVer "' + this + '">';
-};
-
-SemVer.prototype.toString = function() {
-  return this.version;
-};
-
-SemVer.prototype.compare = function(other) {
-  ;
-  if (!(other instanceof SemVer))
-    other = new SemVer(other, this.loose);
-
-  return this.compareMain(other) || this.comparePre(other);
-};
-
-SemVer.prototype.compareMain = function(other) {
-  if (!(other instanceof SemVer))
-    other = new SemVer(other, this.loose);
-
-  return compareIdentifiers(this.major, other.major) ||
-         compareIdentifiers(this.minor, other.minor) ||
-         compareIdentifiers(this.patch, other.patch);
-};
-
-SemVer.prototype.comparePre = function(other) {
-  if (!(other instanceof SemVer))
-    other = new SemVer(other, this.loose);
-
-  // NOT having a prerelease is > having one
-  if (this.prerelease.length && !other.prerelease.length)
-    return -1;
-  else if (!this.prerelease.length && other.prerelease.length)
-    return 1;
-    else if (!this.prerelease.length && !other.prerelease.length)
-    return 0;
-
-  var i = 0;
-  do {
-    var a = this.prerelease[i];
-    var b = other.prerelease[i];
-    ;
-    if (a === undefined && b === undefined)
-      return 0;
-    else if (b === undefined)
-      return 1;
-    else if (a === undefined)
-      return -1;
-    else if (a === b)
-      continue;
-    else
-      return compareIdentifiers(a, b);
-  } while (++i);
-};
-
-SemVer.prototype.inc = function(release) {
-  switch (release) {
-    case 'major':
-      this.major++;
-      this.minor = -1;
-    case 'minor':
-      this.minor++;
-      this.patch = -1;
-    case 'patch':
-      this.patch++;
-      this.prerelease = [];
-      break;
-    case 'prerelease':
-      if (this.prerelease.length === 0)
-        this.prerelease = [0];
-      else {
-        var i = this.prerelease.length;
-        while (--i >= 0) {
-          if (typeof this.prerelease[i] === 'number') {
-            this.prerelease[i]++;
-            i = -2;
-          }
-        }
-        if (i === -1) // didn't increment anything
-          this.prerelease.push(0);
-      }
-      break;
-
-    default:
-      throw new Error('invalid increment argument: ' + release);
-  }
-  this.format();
-  return this;
-};
-
-exports.inc = inc;
-function inc(version, release, loose) {
-  try {
-    return new SemVer(version, loose).inc(release).version;
-  } catch (er) {
-    return null;
-  }
-}
-
-exports.compareIdentifiers = compareIdentifiers;
-
-var numeric = /^[0-9]+$/;
-function compareIdentifiers(a, b) {
-  var anum = numeric.test(a);
-  var bnum = numeric.test(b);
-
-  if (anum && bnum) {
-    a = +a;
-    b = +b;
-  }
-
-  return (anum && !bnum) ? -1 :
-         (bnum && !anum) ? 1 :
-         a < b ? -1 :
-         a > b ? 1 :
-         0;
-}
-
-exports.rcompareIdentifiers = rcompareIdentifiers;
-function rcompareIdentifiers(a, b) {
-  return compareIdentifiers(b, a);
-}
-
-exports.compare = compare;
-function compare(a, b, loose) {
-  return new SemVer(a, loose).compare(b);
-}
-
-exports.compareLoose = compareLoose;
-function compareLoose(a, b) {
-  return compare(a, b, true);
-}
-
-exports.rcompare = rcompare;
-function rcompare(a, b, loose) {
-  return compare(b, a, loose);
-}
-
-exports.sort = sort;
-function sort(list, loose) {
-  return list.sort(function(a, b) {
-    return exports.compare(a, b, loose);
-  });
-}
-
-exports.rsort = rsort;
-function rsort(list, loose) {
-  return list.sort(function(a, b) {
-    return exports.rcompare(a, b, loose);
-  });
-}
-
-exports.gt = gt;
-function gt(a, b, loose) {
-  return compare(a, b, loose) > 0;
-}
-
-exports.lt = lt;
-function lt(a, b, loose) {
-  return compare(a, b, loose) < 0;
-}
-
-exports.eq = eq;
-function eq(a, b, loose) {
-  return compare(a, b, loose) === 0;
-}
-
-exports.neq = neq;
-function neq(a, b, loose) {
-  return compare(a, b, loose) !== 0;
-}
-
-exports.gte = gte;
-function gte(a, b, loose) {
-  return compare(a, b, loose) >= 0;
-}
-
-exports.lte = lte;
-function lte(a, b, loose) {
-  return compare(a, b, loose) <= 0;
-}
-
-exports.cmp = cmp;
-function cmp(a, op, b, loose) {
-  var ret;
-  switch (op) {
-    case '===': ret = a === b; break;
-    case '!==': ret = a !== b; break;
-    case '': case '=': case '==': ret = eq(a, b, loose); break;
-    case '!=': ret = neq(a, b, loose); break;
-    case '>': ret = gt(a, b, loose); break;
-    case '>=': ret = gte(a, b, loose); break;
-    case '<': ret = lt(a, b, loose); break;
-    case '<=': ret = lte(a, b, loose); break;
-    default: throw new TypeError('Invalid operator: ' + op);
-  }
-  return ret;
-}
-
-exports.Comparator = Comparator;
-function Comparator(comp, loose) {
-  if (comp instanceof Comparator) {
-    if (comp.loose === loose)
-      return comp;
-    else
-      comp = comp.value;
-  }
-
-  if (!(this instanceof Comparator))
-    return new Comparator(comp, loose);
-
-  ;
-  this.loose = loose;
-  this.parse(comp);
-
-  if (this.semver === ANY)
-    this.value = '';
-  else
-    this.value = this.operator + this.semver.version;
-}
-
-var ANY = {};
-Comparator.prototype.parse = function(comp) {
-  var r = this.loose ? re[COMPARATORLOOSE] : re[COMPARATOR];
-  var m = comp.match(r);
-
-  if (!m)
-    throw new TypeError('Invalid comparator: ' + comp);
-
-  this.operator = m[1];
-  // if it literally is just '>' or '' then allow anything.
-  if (!m[2])
-    this.semver = ANY;
-  else {
-    this.semver = new SemVer(m[2], this.loose);
-
-    // <1.2.3-rc DOES allow 1.2.3-beta (has prerelease)
-    // >=1.2.3 DOES NOT allow 1.2.3-beta
-    // <=1.2.3 DOES allow 1.2.3-beta
-    // However, <1.2.3 does NOT allow 1.2.3-beta,
-    // even though `1.2.3-beta < 1.2.3`
-    // The assumption is that the 1.2.3 version has something you
-    // *don't* want, so we push the prerelease down to the minimum.
-    if (this.operator === '<' && !this.semver.prerelease.length) {
-      this.semver.prerelease = ['0'];
-      this.semver.format();
-    }
-  }
-};
-
-Comparator.prototype.inspect = function() {
-  return '<SemVer Comparator "' + this + '">';
-};
-
-Comparator.prototype.toString = function() {
-  return this.value;
-};
-
-Comparator.prototype.test = function(version) {
-  ;
-  return (this.semver === ANY) ? true :
-         cmp(version, this.operator, this.semver, this.loose);
-};
-
-
-exports.Range = Range;
-function Range(range, loose) {
-  if ((range instanceof Range) && range.loose === loose)
-    return range;
-
-  if (!(this instanceof Range))
-    return new Range(range, loose);
-
-  this.loose = loose;
-
-  // First, split based on boolean or ||
-  this.raw = range;
-  this.set = range.split(/\s*\|\|\s*/).map(function(range) {
-    return this.parseRange(range.trim());
-  }, this).filter(function(c) {
-    // throw out any that are not relevant for whatever reason
-    return c.length;
-  });
-
-  if (!this.set.length) {
-    throw new TypeError('Invalid SemVer Range: ' + range);
-  }
-
-  this.format();
-}
-
-Range.prototype.inspect = function() {
-  return '<SemVer Range "' + this.range + '">';
-};
-
-Range.prototype.format = function() {
-  this.range = this.set.map(function(comps) {
-    return comps.join(' ').trim();
-  }).join('||').trim();
-  return this.range;
-};
-
-Range.prototype.toString = function() {
-  return this.range;
-};
-
-Range.prototype.parseRange = function(range) {
-  var loose = this.loose;
-  range = range.trim();
-  ;
-  // `1.2.3 - 1.2.4` => `>=1.2.3 <=1.2.4`
-  var hr = loose ? re[HYPHENRANGELOOSE] : re[HYPHENRANGE];
-  range = range.replace(hr, hyphenReplace);
-  ;
-  // `> 1.2.3 < 1.2.5` => `>1.2.3 <1.2.5`
-  range = range.replace(re[COMPARATORTRIM], comparatorTrimReplace);
-  ;
-
-  // `~ 1.2.3` => `~1.2.3`
-  range = range.replace(re[TILDETRIM], tildeTrimReplace);
-
-  // `^ 1.2.3` => `^1.2.3`
-  range = range.replace(re[CARETTRIM], caretTrimReplace);
-
-  // normalize spaces
-  range = range.split(/\s+/).join(' ');
-
-  // At this point, the range is completely trimmed and
-  // ready to be split into comparators.
-
-  var compRe = loose ? re[COMPARATORLOOSE] : re[COMPARATOR];
-  var set = range.split(' ').map(function(comp) {
-    return parseComparator(comp, loose);
-  }).join(' ').split(/\s+/);
-  if (this.loose) {
-    // in loose mode, throw out any that are not valid comparators
-    set = set.filter(function(comp) {
-      return !!comp.match(compRe);
-    });
-  }
-  set = set.map(function(comp) {
-    return new Comparator(comp, loose);
-  });
-
-  return set;
-};
-
-// Mostly just for testing and legacy API reasons
-exports.toComparators = toComparators;
-function toComparators(range, loose) {
-  return new Range(range, loose).set.map(function(comp) {
-    return comp.map(function(c) {
-      return c.value;
-    }).join(' ').trim().split(' ');
-  });
-}
-
-// composed of xranges, tildes, stars, and gtlt's at this point.
-// already replaced the hyphen ranges
-// turn into a set of JUST comparators.
-function parseComparator(comp, loose) {
-  ;
-  comp = replaceCarets(comp, loose);
-  ;
-  comp = replaceTildes(comp, loose);
-  ;
-  comp = replaceXRanges(comp, loose);
-  ;
-  comp = replaceStars(comp, loose);
-  ;
-  return comp;
-}
-
-function isX(id) {
-  return !id || id.toLowerCase() === 'x' || id === '*';
-}
-
-// ~, ~> --> * (any, kinda silly)
-// ~2, ~2.x, ~2.x.x, ~>2, ~>2.x ~>2.x.x --> >=2.0.0 <3.0.0
-// ~2.0, ~2.0.x, ~>2.0, ~>2.0.x --> >=2.0.0 <2.1.0
-// ~1.2, ~1.2.x, ~>1.2, ~>1.2.x --> >=1.2.0 <1.3.0
-// ~1.2.3, ~>1.2.3 --> >=1.2.3 <1.3.0
-// ~1.2.0, ~>1.2.0 --> >=1.2.0 <1.3.0
-function replaceTildes(comp, loose) {
-  return comp.trim().split(/\s+/).map(function(comp) {
-    return replaceTilde(comp, loose);
-  }).join(' ');
-}
-
-function replaceTilde(comp, loose) {
-  var r = loose ? re[TILDELOOSE] : re[TILDE];
-  return comp.replace(r, function(_, M, m, p, pr) {
-    ;
-    var ret;
-
-    if (isX(M))
-      ret = '';
-    else if (isX(m))
-      ret = '>=' + M + '.0.0-0 <' + (+M + 1) + '.0.0-0';
-    else if (isX(p))
-      // ~1.2 == >=1.2.0- <1.3.0-
-      ret = '>=' + M + '.' + m + '.0-0 <' + M + '.' + (+m + 1) + '.0-0';
-    else if (pr) {
-      ;
-      if (pr.charAt(0) !== '-')
-        pr = '-' + pr;
-      ret = '>=' + M + '.' + m + '.' + p + pr +
-            ' <' + M + '.' + (+m + 1) + '.0-0';
-    } else
-      // ~1.2.3 == >=1.2.3-0 <1.3.0-0
-      ret = '>=' + M + '.' + m + '.' + p + '-0' +
-            ' <' + M + '.' + (+m + 1) + '.0-0';
-
-    ;
-    return ret;
-  });
-}
-
-// ^ --> * (any, kinda silly)
-// ^2, ^2.x, ^2.x.x --> >=2.0.0 <3.0.0
-// ^2.0, ^2.0.x --> >=2.0.0 <3.0.0
-// ^1.2, ^1.2.x --> >=1.2.0 <2.0.0
-// ^1.2.3 --> >=1.2.3 <2.0.0
-// ^1.2.0 --> >=1.2.0 <2.0.0
-function replaceCarets(comp, loose) {
-  return comp.trim().split(/\s+/).map(function(comp) {
-    return replaceCaret(comp, loose);
-  }).join(' ');
-}
-
-function replaceCaret(comp, loose) {
-  var r = loose ? re[CARETLOOSE] : re[CARET];
-  return comp.replace(r, function(_, M, m, p, pr) {
-    ;
-    var ret;
-
-    if (isX(M))
-      ret = '';
-    else if (isX(m))
-      ret = '>=' + M + '.0.0-0 <' + (+M + 1) + '.0.0-0';
-    else if (isX(p)) {
-      if (M === '0')
-        ret = '>=' + M + '.' + m + '.0-0 <' + M + '.' + (+m + 1) + '.0-0';
-      else
-        ret = '>=' + M + '.' + m + '.0-0 <' + (+M + 1) + '.0.0-0';
-    } else if (pr) {
-      ;
-      if (pr.charAt(0) !== '-')
-        pr = '-' + pr;
-      if (M === '0') {
-        if (m === '0')
-          ret = '=' + M + '.' + m + '.' + p + pr;
-        else
-          ret = '>=' + M + '.' + m + '.' + p + pr +
-                ' <' + M + '.' + (+m + 1) + '.0-0';
-      } else
-        ret = '>=' + M + '.' + m + '.' + p + pr +
-              ' <' + (+M + 1) + '.0.0-0';
-    } else {
-      if (M === '0') {
-        if (m === '0')
-          ret = '=' + M + '.' + m + '.' + p;
-        else
-          ret = '>=' + M + '.' + m + '.' + p + '-0' +
-                ' <' + M + '.' + (+m + 1) + '.0-0';
-      } else
-        ret = '>=' + M + '.' + m + '.' + p + '-0' +
-              ' <' + (+M + 1) + '.0.0-0';
-    }
-
-    ;
-    return ret;
-  });
-}
-
-function replaceXRanges(comp, loose) {
-  ;
-  return comp.split(/\s+/).map(function(comp) {
-    return replaceXRange(comp, loose);
-  }).join(' ');
-}
-
-function replaceXRange(comp, loose) {
-  comp = comp.trim();
-  var r = loose ? re[XRANGELOOSE] : re[XRANGE];
-  return comp.replace(r, function(ret, gtlt, M, m, p, pr) {
-    ;
-    var xM = isX(M);
-    var xm = xM || isX(m);
-    var xp = xm || isX(p);
-    var anyX = xp;
-
-    if (gtlt === '=' && anyX)
-      gtlt = '';
-
-    if (gtlt && anyX) {
-      // replace X with 0, and then append the -0 min-prerelease
-      if (xM)
-        M = 0;
-      if (xm)
-        m = 0;
-      if (xp)
-        p = 0;
-
-      if (gtlt === '>') {
-        // >1 => >=2.0.0-0
-        // >1.2 => >=1.3.0-0
-        // >1.2.3 => >= 1.2.4-0
-        gtlt = '>=';
-        if (xM) {
-          // no change
-        } else if (xm) {
-          M = +M + 1;
-          m = 0;
-          p = 0;
-        } else if (xp) {
-          m = +m + 1;
-          p = 0;
-        }
-      }
-
-
-      ret = gtlt + M + '.' + m + '.' + p + '-0';
-    } else if (xM) {
-      // allow any
-      ret = '*';
-    } else if (xm) {
-      // append '-0' onto the version, otherwise
-      // '1.x.x' matches '2.0.0-beta', since the tag
-      // *lowers* the version value
-      ret = '>=' + M + '.0.0-0 <' + (+M + 1) + '.0.0-0';
-    } else if (xp) {
-      ret = '>=' + M + '.' + m + '.0-0 <' + M + '.' + (+m + 1) + '.0-0';
-    }
-
-    ;
-
-    return ret;
-  });
-}
-
-// Because * is AND-ed with everything else in the comparator,
-// and '' means "any version", just remove the *s entirely.
-function replaceStars(comp, loose) {
-  ;
-  // Looseness is ignored here.  star is always as loose as it gets!
-  return comp.trim().replace(re[STAR], '');
-}
-
-// This function is passed to string.replace(re[HYPHENRANGE])
-// M, m, patch, prerelease, build
-// 1.2 - 3.4.5 => >=1.2.0-0 <=3.4.5
-// 1.2.3 - 3.4 => >=1.2.0-0 <3.5.0-0 Any 3.4.x will do
-// 1.2 - 3.4 => >=1.2.0-0 <3.5.0-0
-function hyphenReplace($0,
-                       from, fM, fm, fp, fpr, fb,
-                       to, tM, tm, tp, tpr, tb) {
-
-  if (isX(fM))
-    from = '';
-  else if (isX(fm))
-    from = '>=' + fM + '.0.0-0';
-  else if (isX(fp))
-    from = '>=' + fM + '.' + fm + '.0-0';
-  else
-    from = '>=' + from;
-
-  if (isX(tM))
-    to = '';
-  else if (isX(tm))
-    to = '<' + (+tM + 1) + '.0.0-0';
-  else if (isX(tp))
-    to = '<' + tM + '.' + (+tm + 1) + '.0-0';
-  else if (tpr)
-    to = '<=' + tM + '.' + tm + '.' + tp + '-' + tpr;
-  else
-    to = '<=' + to;
-
-  return (from + ' ' + to).trim();
-}
-
-
-// if ANY of the sets match ALL of its comparators, then pass
-Range.prototype.test = function(version) {
-  if (!version)
-    return false;
-  for (var i = 0; i < this.set.length; i++) {
-    if (testSet(this.set[i], version))
-      return true;
-  }
-  return false;
-};
-
-function testSet(set, version) {
-  for (var i = 0; i < set.length; i++) {
-    if (!set[i].test(version))
-      return false;
-  }
-  return true;
-}
-
-exports.satisfies = satisfies;
-function satisfies(version, range, loose) {
-  try {
-    range = new Range(range, loose);
-  } catch (er) {
-    return false;
-  }
-  return range.test(version);
-}
-
-exports.maxSatisfying = maxSatisfying;
-function maxSatisfying(versions, range, loose) {
-  return versions.filter(function(version) {
-    return satisfies(version, range, loose);
-  }).sort(function(a, b) {
-    return rcompare(a, b, loose);
-  })[0] || null;
-}
-
-exports.validRange = validRange;
-function validRange(range, loose) {
-  try {
-    // Return '*' instead of '' so that truthiness works.
-    // This will throw if it's invalid anyway
-    return new Range(range, loose).range || '*';
-  } catch (er) {
-    return null;
-  }
-}
-
-// Determine if version is less than all the versions possible in the range
-exports.ltr = ltr;
-function ltr(version, range, loose) {
-  return outside(version, range, '<', loose);
-}
-
-// Determine if version is greater than all the versions possible in the range.
-exports.gtr = gtr;
-function gtr(version, range, loose) {
-  return outside(version, range, '>', loose);
-}
-
-exports.outside = outside;
-function outside(version, range, hilo, loose) {
-  version = new SemVer(version, loose);
-  range = new Range(range, loose);
-
-  var gtfn, ltefn, ltfn, comp, ecomp;
-  switch (hilo) {
-    case '>':
-      gtfn = gt;
-      ltefn = lte;
-      ltfn = lt;
-      comp = '>';
-      ecomp = '>=';
-      break;
-    case '<':
-      gtfn = lt;
-      ltefn = gte;
-      ltfn = gt;
-      comp = '<';
-      ecomp = '<=';
-      break;
-    default:
-      throw new TypeError('Must provide a hilo val of "<" or ">"');
-  }
-
-  // If it satisfies the range it is not outside
-  if (satisfies(version, range, loose)) {
-    return false;
-  }
-
-  // From now on, variable terms are as if we're in "gtr" mode.
-  // but note that everything is flipped for the "ltr" function.
-
-  for (var i = 0; i < range.set.length; ++i) {
-    var comparators = range.set[i];
-
-    var high = null;
-    var low = null;
-
-    comparators.forEach(function(comparator) {
-      high = high || comparator;
-      low = low || comparator;
-      if (gtfn(comparator.semver, high.semver, loose)) {
-        high = comparator;
-      } else if (ltfn(comparator.semver, low.semver, loose)) {
-        low = comparator;
-      }
-    });
-
-    // If the edge version comparator has an operator then our version
-    // isn't outside it
-    if (high.operator === comp || high.operator === ecomp) {
-      return false;
-    }
-
-    // If the lowest version comparator has an operator and our version
-    // is less than it then it isn't higher than the range
-    if ((!low.operator || low.operator === comp) &&
-        ltefn(version, low.semver)) {
-      return false;
-    } else if (low.operator === ecomp && ltfn(version, low.semver)) {
-      return false;
-    }
-  }
-  return true;
-}
-
-// Use the define() function if we're in AMD land
-if (typeof define === 'function' && define.amd)
-  define(exports);
-
-})(
-  typeof exports === 'object' ? exports :
-  typeof define === 'function' && define.amd ? {} :
-  semver = {}
-);
Binary file node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/semver.browser.js.gz has changed
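The deleted `semver.browser.js` above ends with the range-matching half of the module; the precedence rules it relies on come from `compareIdentifiers`. As a self-contained sketch (extracted from the deleted source, not a replacement for the module), the rule is: identifiers that look numeric compare as numbers, and a numeric identifier always sorts below an alphanumeric one.

```javascript
// Sketch of semver's compareIdentifiers rule, lifted from the deleted file:
// numeric identifiers compare numerically and sort lower than alphanumeric ones.
var numeric = /^[0-9]+$/;

function compareIdentifiers(a, b) {
  var anum = numeric.test(a);
  var bnum = numeric.test(b);

  if (anum && bnum) {
    // both numeric: compare as numbers, so '2' < '10'
    a = +a;
    b = +b;
  }

  return (anum && !bnum) ? -1 :  // numeric sorts before alphanumeric
         (bnum && !anum) ? 1 :
         a < b ? -1 :
         a > b ? 1 :
         0;
}
```

This is why `1.0.0-2` has lower precedence than `1.0.0-alpha`, and `1.0.0-2` lower than `1.0.0-10`.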
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/semver.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1011 +0,0 @@
-// export the class if we are in a Node-like system.
-if (typeof module === 'object' && module.exports === exports)
-  exports = module.exports = SemVer;
-
-// The debug function is excluded entirely from the minified version.
-/* nomin */ var debug;
-/* nomin */ if (typeof process === 'object' &&
-    /* nomin */ process.env &&
-    /* nomin */ process.env.NODE_DEBUG &&
-    /* nomin */ /\bsemver\b/i.test(process.env.NODE_DEBUG))
-  /* nomin */ debug = function() {
-    /* nomin */ var args = Array.prototype.slice.call(arguments, 0);
-    /* nomin */ args.unshift('SEMVER');
-    /* nomin */ console.log.apply(console, args);
-    /* nomin */ };
-/* nomin */ else
-  /* nomin */ debug = function() {};
-
-// Note: this is the semver.org version of the spec that it implements
-// Not necessarily the package version of this code.
-exports.SEMVER_SPEC_VERSION = '2.0.0';
-
-// The actual regexps go on exports.re
-var re = exports.re = [];
-var src = exports.src = [];
-var R = 0;
-
-// The following Regular Expressions can be used for tokenizing,
-// validating, and parsing SemVer version strings.
-
-// ## Numeric Identifier
-// A single `0`, or a non-zero digit followed by zero or more digits.
-
-var NUMERICIDENTIFIER = R++;
-src[NUMERICIDENTIFIER] = '0|[1-9]\\d*';
-var NUMERICIDENTIFIERLOOSE = R++;
-src[NUMERICIDENTIFIERLOOSE] = '[0-9]+';
-
-
-// ## Non-numeric Identifier
-// Zero or more digits, followed by a letter or hyphen, and then zero or
-// more letters, digits, or hyphens.
-
-var NONNUMERICIDENTIFIER = R++;
-src[NONNUMERICIDENTIFIER] = '\\d*[a-zA-Z-][a-zA-Z0-9-]*';
-
-
-// ## Main Version
-// Three dot-separated numeric identifiers.
-
-var MAINVERSION = R++;
-src[MAINVERSION] = '(' + src[NUMERICIDENTIFIER] + ')\\.' +
-                   '(' + src[NUMERICIDENTIFIER] + ')\\.' +
-                   '(' + src[NUMERICIDENTIFIER] + ')';
-
-var MAINVERSIONLOOSE = R++;
-src[MAINVERSIONLOOSE] = '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\.' +
-                        '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\.' +
-                        '(' + src[NUMERICIDENTIFIERLOOSE] + ')';
-
-// ## Pre-release Version Identifier
-// A numeric identifier, or a non-numeric identifier.
-
-var PRERELEASEIDENTIFIER = R++;
-src[PRERELEASEIDENTIFIER] = '(?:' + src[NUMERICIDENTIFIER] +
-                            '|' + src[NONNUMERICIDENTIFIER] + ')';
-
-var PRERELEASEIDENTIFIERLOOSE = R++;
-src[PRERELEASEIDENTIFIERLOOSE] = '(?:' + src[NUMERICIDENTIFIERLOOSE] +
-                                 '|' + src[NONNUMERICIDENTIFIER] + ')';
-
-
-// ## Pre-release Version
-// Hyphen, followed by one or more dot-separated pre-release version
-// identifiers.
-
-var PRERELEASE = R++;
-src[PRERELEASE] = '(?:-(' + src[PRERELEASEIDENTIFIER] +
-                  '(?:\\.' + src[PRERELEASEIDENTIFIER] + ')*))';
-
-var PRERELEASELOOSE = R++;
-src[PRERELEASELOOSE] = '(?:-?(' + src[PRERELEASEIDENTIFIERLOOSE] +
-                       '(?:\\.' + src[PRERELEASEIDENTIFIERLOOSE] + ')*))';
-
-// ## Build Metadata Identifier
-// Any combination of digits, letters, or hyphens.
-
-var BUILDIDENTIFIER = R++;
-src[BUILDIDENTIFIER] = '[0-9A-Za-z-]+';
-
-// ## Build Metadata
-// Plus sign, followed by one or more period-separated build metadata
-// identifiers.
-
-var BUILD = R++;
-src[BUILD] = '(?:\\+(' + src[BUILDIDENTIFIER] +
-             '(?:\\.' + src[BUILDIDENTIFIER] + ')*))';
-
-
-// ## Full Version String
-// A main version, followed optionally by a pre-release version and
-// build metadata.
-
-// Note that only the major, minor, patch, and pre-release sections of
-// the version string are capturing groups.  The build metadata is not a
-// capturing group, because it should not ever be used in version
-// comparison.
-
-var FULL = R++;
-var FULLPLAIN = 'v?' + src[MAINVERSION] +
-                src[PRERELEASE] + '?' +
-                src[BUILD] + '?';
-
-src[FULL] = '^' + FULLPLAIN + '$';
-
-// like full, but allows v1.2.3 and =1.2.3, which people do sometimes.
-// also, 1.0.0alpha1 (prerelease without the hyphen) which is pretty
-// common in the npm registry.
-var LOOSEPLAIN = '[v=\\s]*' + src[MAINVERSIONLOOSE] +
-                 src[PRERELEASELOOSE] + '?' +
-                 src[BUILD] + '?';
-
-var LOOSE = R++;
-src[LOOSE] = '^' + LOOSEPLAIN + '$';
-
-var GTLT = R++;
-src[GTLT] = '((?:<|>)?=?)';
-
-// Something like "2.*" or "1.2.x".
-// Note that "x.x" is a valid xRange identifier, meaning "any version"
-// Only the first item is strictly required.
-var XRANGEIDENTIFIERLOOSE = R++;
-src[XRANGEIDENTIFIERLOOSE] = src[NUMERICIDENTIFIERLOOSE] + '|x|X|\\*';
-var XRANGEIDENTIFIER = R++;
-src[XRANGEIDENTIFIER] = src[NUMERICIDENTIFIER] + '|x|X|\\*';
-
-var XRANGEPLAIN = R++;
-src[XRANGEPLAIN] = '[v=\\s]*(' + src[XRANGEIDENTIFIER] + ')' +
-                   '(?:\\.(' + src[XRANGEIDENTIFIER] + ')' +
-                   '(?:\\.(' + src[XRANGEIDENTIFIER] + ')' +
-                   '(?:(' + src[PRERELEASE] + ')' +
-                   ')?)?)?';
-
-var XRANGEPLAINLOOSE = R++;
-src[XRANGEPLAINLOOSE] = '[v=\\s]*(' + src[XRANGEIDENTIFIERLOOSE] + ')' +
-                        '(?:\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' +
-                        '(?:\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' +
-                        '(?:(' + src[PRERELEASELOOSE] + ')' +
-                        ')?)?)?';
-
-// >=2.x, for example, means >=2.0.0-0
-// <1.x would be the same as "<1.0.0-0", though.
-var XRANGE = R++;
-src[XRANGE] = '^' + src[GTLT] + '\\s*' + src[XRANGEPLAIN] + '$';
-var XRANGELOOSE = R++;
-src[XRANGELOOSE] = '^' + src[GTLT] + '\\s*' + src[XRANGEPLAINLOOSE] + '$';
-
-// Tilde ranges.
-// Meaning is "reasonably at or greater than"
-var LONETILDE = R++;
-src[LONETILDE] = '(?:~>?)';
-
-var TILDETRIM = R++;
-src[TILDETRIM] = '(\\s*)' + src[LONETILDE] + '\\s+';
-re[TILDETRIM] = new RegExp(src[TILDETRIM], 'g');
-var tildeTrimReplace = '$1~';
-
-var TILDE = R++;
-src[TILDE] = '^' + src[LONETILDE] + src[XRANGEPLAIN] + '$';
-var TILDELOOSE = R++;
-src[TILDELOOSE] = '^' + src[LONETILDE] + src[XRANGEPLAINLOOSE] + '$';
-
-// Caret ranges.
-// Meaning is "at least and backwards compatible with"
-var LONECARET = R++;
-src[LONECARET] = '(?:\\^)';
-
-var CARETTRIM = R++;
-src[CARETTRIM] = '(\\s*)' + src[LONECARET] + '\\s+';
-re[CARETTRIM] = new RegExp(src[CARETTRIM], 'g');
-var caretTrimReplace = '$1^';
-
-var CARET = R++;
-src[CARET] = '^' + src[LONECARET] + src[XRANGEPLAIN] + '$';
-var CARETLOOSE = R++;
-src[CARETLOOSE] = '^' + src[LONECARET] + src[XRANGEPLAINLOOSE] + '$';
-
-// A simple gt/lt/eq thing, or just "" to indicate "any version"
-var COMPARATORLOOSE = R++;
-src[COMPARATORLOOSE] = '^' + src[GTLT] + '\\s*(' + LOOSEPLAIN + ')$|^$';
-var COMPARATOR = R++;
-src[COMPARATOR] = '^' + src[GTLT] + '\\s*(' + FULLPLAIN + ')$|^$';
-
-
-// An expression to strip any whitespace between the gtlt and the thing
-// it modifies, so that `> 1.2.3` ==> `>1.2.3`
-var COMPARATORTRIM = R++;
-src[COMPARATORTRIM] = '(\\s*)' + src[GTLT] +
-                      '\\s*(' + LOOSEPLAIN + '|' + src[XRANGEPLAIN] + ')';
-
-// this one has to use the /g flag
-re[COMPARATORTRIM] = new RegExp(src[COMPARATORTRIM], 'g');
-var comparatorTrimReplace = '$1$2$3';
-
-
-// Something like `1.2.3 - 1.2.4`
-// Note that these all use the loose form, because they'll be
-// checked against either the strict or loose comparator form
-// later.
-var HYPHENRANGE = R++;
-src[HYPHENRANGE] = '^\\s*(' + src[XRANGEPLAIN] + ')' +
-                   '\\s+-\\s+' +
-                   '(' + src[XRANGEPLAIN] + ')' +
-                   '\\s*$';
-
-var HYPHENRANGELOOSE = R++;
-src[HYPHENRANGELOOSE] = '^\\s*(' + src[XRANGEPLAINLOOSE] + ')' +
-                        '\\s+-\\s+' +
-                        '(' + src[XRANGEPLAINLOOSE] + ')' +
-                        '\\s*$';
-
-// Star ranges basically just allow anything at all.
-var STAR = R++;
-src[STAR] = '(<|>)?=?\\s*\\*';
-
-// Compile to actual regexp objects.
-// All are flag-free, unless they were created above with a flag.
-for (var i = 0; i < R; i++) {
-  debug(i, src[i]);
-  if (!re[i])
-    re[i] = new RegExp(src[i]);
-}
-
-exports.parse = parse;
-function parse(version, loose) {
-  var r = loose ? re[LOOSE] : re[FULL];
-  return (r.test(version)) ? new SemVer(version, loose) : null;
-}
-
-exports.valid = valid;
-function valid(version, loose) {
-  var v = parse(version, loose);
-  return v ? v.version : null;
-}
-
-
-exports.clean = clean;
-function clean(version, loose) {
-  var s = parse(version, loose);
-  return s ? s.version : null;
-}
-
-exports.SemVer = SemVer;
-
-function SemVer(version, loose) {
-  if (version instanceof SemVer) {
-    if (version.loose === loose)
-      return version;
-    else
-      version = version.version;
-  }
-
-  if (!(this instanceof SemVer))
-    return new SemVer(version, loose);
-
-  debug('SemVer', version, loose);
-  this.loose = loose;
-  var m = version.trim().match(loose ? re[LOOSE] : re[FULL]);
-
-  if (!m)
-    throw new TypeError('Invalid Version: ' + version);
-
-  this.raw = version;
-
-  // these are actually numbers
-  this.major = +m[1];
-  this.minor = +m[2];
-  this.patch = +m[3];
-
-  // numberify any prerelease numeric ids
-  if (!m[4])
-    this.prerelease = [];
-  else
-    this.prerelease = m[4].split('.').map(function(id) {
-      return (/^[0-9]+$/.test(id)) ? +id : id;
-    });
-
-  this.build = m[5] ? m[5].split('.') : [];
-  this.format();
-}
-
-SemVer.prototype.format = function() {
-  this.version = this.major + '.' + this.minor + '.' + this.patch;
-  if (this.prerelease.length)
-    this.version += '-' + this.prerelease.join('.');
-  return this.version;
-};
-
-SemVer.prototype.inspect = function() {
-  return '<SemVer "' + this + '">';
-};
-
-SemVer.prototype.toString = function() {
-  return this.version;
-};
-
-SemVer.prototype.compare = function(other) {
-  debug('SemVer.compare', this.version, this.loose, other);
-  if (!(other instanceof SemVer))
-    other = new SemVer(other, this.loose);
-
-  return this.compareMain(other) || this.comparePre(other);
-};
-
-SemVer.prototype.compareMain = function(other) {
-  if (!(other instanceof SemVer))
-    other = new SemVer(other, this.loose);
-
-  return compareIdentifiers(this.major, other.major) ||
-         compareIdentifiers(this.minor, other.minor) ||
-         compareIdentifiers(this.patch, other.patch);
-};
-
-SemVer.prototype.comparePre = function(other) {
-  if (!(other instanceof SemVer))
-    other = new SemVer(other, this.loose);
-
-  // NOT having a prerelease is > having one
-  if (this.prerelease.length && !other.prerelease.length)
-    return -1;
-  else if (!this.prerelease.length && other.prerelease.length)
-    return 1;
-  else if (!this.prerelease.length && !other.prerelease.length)
-    return 0;
-
-  var i = 0;
-  do {
-    var a = this.prerelease[i];
-    var b = other.prerelease[i];
-    debug('prerelease compare', i, a, b);
-    if (a === undefined && b === undefined)
-      return 0;
-    else if (b === undefined)
-      return 1;
-    else if (a === undefined)
-      return -1;
-    else if (a === b)
-      continue;
-    else
-      return compareIdentifiers(a, b);
-  } while (++i);
-};
-
-SemVer.prototype.inc = function(release) {
-  switch (release) {
-    case 'major':
-      this.major++;
-      this.minor = -1;
-    case 'minor':
-      this.minor++;
-      this.patch = -1;
-    case 'patch':
-      this.patch++;
-      this.prerelease = [];
-      break;
-    case 'prerelease':
-      if (this.prerelease.length === 0)
-        this.prerelease = [0];
-      else {
-        var i = this.prerelease.length;
-        while (--i >= 0) {
-          if (typeof this.prerelease[i] === 'number') {
-            this.prerelease[i]++;
-            i = -2;
-          }
-        }
-        if (i === -1) // didn't increment anything
-          this.prerelease.push(0);
-      }
-      break;
-
-    default:
-      throw new Error('invalid increment argument: ' + release);
-  }
-  this.format();
-  return this;
-};
-
-exports.inc = inc;
-function inc(version, release, loose) {
-  try {
-    return new SemVer(version, loose).inc(release).version;
-  } catch (er) {
-    return null;
-  }
-}
-
-exports.compareIdentifiers = compareIdentifiers;
-
-var numeric = /^[0-9]+$/;
-function compareIdentifiers(a, b) {
-  var anum = numeric.test(a);
-  var bnum = numeric.test(b);
-
-  if (anum && bnum) {
-    a = +a;
-    b = +b;
-  }
-
-  return (anum && !bnum) ? -1 :
-         (bnum && !anum) ? 1 :
-         a < b ? -1 :
-         a > b ? 1 :
-         0;
-}
-
-exports.rcompareIdentifiers = rcompareIdentifiers;
-function rcompareIdentifiers(a, b) {
-  return compareIdentifiers(b, a);
-}
-
-exports.compare = compare;
-function compare(a, b, loose) {
-  return new SemVer(a, loose).compare(b);
-}
-
-exports.compareLoose = compareLoose;
-function compareLoose(a, b) {
-  return compare(a, b, true);
-}
-
-exports.rcompare = rcompare;
-function rcompare(a, b, loose) {
-  return compare(b, a, loose);
-}
-
-exports.sort = sort;
-function sort(list, loose) {
-  return list.sort(function(a, b) {
-    return exports.compare(a, b, loose);
-  });
-}
-
-exports.rsort = rsort;
-function rsort(list, loose) {
-  return list.sort(function(a, b) {
-    return exports.rcompare(a, b, loose);
-  });
-}
-
-exports.gt = gt;
-function gt(a, b, loose) {
-  return compare(a, b, loose) > 0;
-}
-
-exports.lt = lt;
-function lt(a, b, loose) {
-  return compare(a, b, loose) < 0;
-}
-
-exports.eq = eq;
-function eq(a, b, loose) {
-  return compare(a, b, loose) === 0;
-}
-
-exports.neq = neq;
-function neq(a, b, loose) {
-  return compare(a, b, loose) !== 0;
-}
-
-exports.gte = gte;
-function gte(a, b, loose) {
-  return compare(a, b, loose) >= 0;
-}
-
-exports.lte = lte;
-function lte(a, b, loose) {
-  return compare(a, b, loose) <= 0;
-}
-
-exports.cmp = cmp;
-function cmp(a, op, b, loose) {
-  var ret;
-  switch (op) {
-    case '===': ret = a === b; break;
-    case '!==': ret = a !== b; break;
-    case '': case '=': case '==': ret = eq(a, b, loose); break;
-    case '!=': ret = neq(a, b, loose); break;
-    case '>': ret = gt(a, b, loose); break;
-    case '>=': ret = gte(a, b, loose); break;
-    case '<': ret = lt(a, b, loose); break;
-    case '<=': ret = lte(a, b, loose); break;
-    default: throw new TypeError('Invalid operator: ' + op);
-  }
-  return ret;
-}
-
-exports.Comparator = Comparator;
-function Comparator(comp, loose) {
-  if (comp instanceof Comparator) {
-    if (comp.loose === loose)
-      return comp;
-    else
-      comp = comp.value;
-  }
-
-  if (!(this instanceof Comparator))
-    return new Comparator(comp, loose);
-
-  debug('comparator', comp, loose);
-  this.loose = loose;
-  this.parse(comp);
-
-  if (this.semver === ANY)
-    this.value = '';
-  else
-    this.value = this.operator + this.semver.version;
-}
-
-var ANY = {};
-Comparator.prototype.parse = function(comp) {
-  var r = this.loose ? re[COMPARATORLOOSE] : re[COMPARATOR];
-  var m = comp.match(r);
-
-  if (!m)
-    throw new TypeError('Invalid comparator: ' + comp);
-
-  this.operator = m[1];
-  // if it literally is just '>' or '' then allow anything.
-  if (!m[2])
-    this.semver = ANY;
-  else {
-    this.semver = new SemVer(m[2], this.loose);
-
-    // <1.2.3-rc DOES allow 1.2.3-beta (has prerelease)
-    // >=1.2.3 DOES NOT allow 1.2.3-beta
-    // <=1.2.3 DOES allow 1.2.3-beta
-    // However, <1.2.3 does NOT allow 1.2.3-beta,
-    // even though `1.2.3-beta < 1.2.3`
-    // The assumption is that the 1.2.3 version has something you
-    // *don't* want, so we push the prerelease down to the minimum.
-    if (this.operator === '<' && !this.semver.prerelease.length) {
-      this.semver.prerelease = ['0'];
-      this.semver.format();
-    }
-  }
-};
-
-Comparator.prototype.inspect = function() {
-  return '<SemVer Comparator "' + this + '">';
-};
-
-Comparator.prototype.toString = function() {
-  return this.value;
-};
-
-Comparator.prototype.test = function(version) {
-  debug('Comparator.test', version, this.loose);
-  return (this.semver === ANY) ? true :
-         cmp(version, this.operator, this.semver, this.loose);
-};
-
-
-exports.Range = Range;
-function Range(range, loose) {
-  if ((range instanceof Range) && range.loose === loose)
-    return range;
-
-  if (!(this instanceof Range))
-    return new Range(range, loose);
-
-  this.loose = loose;
-
-  // First, split based on boolean or ||
-  this.raw = range;
-  this.set = range.split(/\s*\|\|\s*/).map(function(range) {
-    return this.parseRange(range.trim());
-  }, this).filter(function(c) {
-    // throw out any that are not relevant for whatever reason
-    return c.length;
-  });
-
-  if (!this.set.length) {
-    throw new TypeError('Invalid SemVer Range: ' + range);
-  }
-
-  this.format();
-}
-
-Range.prototype.inspect = function() {
-  return '<SemVer Range "' + this.range + '">';
-};
-
-Range.prototype.format = function() {
-  this.range = this.set.map(function(comps) {
-    return comps.join(' ').trim();
-  }).join('||').trim();
-  return this.range;
-};
-
-Range.prototype.toString = function() {
-  return this.range;
-};
-
-Range.prototype.parseRange = function(range) {
-  var loose = this.loose;
-  range = range.trim();
-  debug('range', range, loose);
-  // `1.2.3 - 1.2.4` => `>=1.2.3 <=1.2.4`
-  var hr = loose ? re[HYPHENRANGELOOSE] : re[HYPHENRANGE];
-  range = range.replace(hr, hyphenReplace);
-  debug('hyphen replace', range);
-  // `> 1.2.3 < 1.2.5` => `>1.2.3 <1.2.5`
-  range = range.replace(re[COMPARATORTRIM], comparatorTrimReplace);
-  debug('comparator trim', range, re[COMPARATORTRIM]);
-
-  // `~ 1.2.3` => `~1.2.3`
-  range = range.replace(re[TILDETRIM], tildeTrimReplace);
-
-  // `^ 1.2.3` => `^1.2.3`
-  range = range.replace(re[CARETTRIM], caretTrimReplace);
-
-  // normalize spaces
-  range = range.split(/\s+/).join(' ');
-
-  // At this point, the range is completely trimmed and
-  // ready to be split into comparators.
-
-  var compRe = loose ? re[COMPARATORLOOSE] : re[COMPARATOR];
-  var set = range.split(' ').map(function(comp) {
-    return parseComparator(comp, loose);
-  }).join(' ').split(/\s+/);
-  if (this.loose) {
-    // in loose mode, throw out any that are not valid comparators
-    set = set.filter(function(comp) {
-      return !!comp.match(compRe);
-    });
-  }
-  set = set.map(function(comp) {
-    return new Comparator(comp, loose);
-  });
-
-  return set;
-};
-
-// Mostly just for testing and legacy API reasons
-exports.toComparators = toComparators;
-function toComparators(range, loose) {
-  return new Range(range, loose).set.map(function(comp) {
-    return comp.map(function(c) {
-      return c.value;
-    }).join(' ').trim().split(' ');
-  });
-}
-
-// comprised of xranges, tildes, stars, and gtlt's at this point.
-// already replaced the hyphen ranges
-// turn into a set of JUST comparators.
-function parseComparator(comp, loose) {
-  debug('comp', comp);
-  comp = replaceCarets(comp, loose);
-  debug('caret', comp);
-  comp = replaceTildes(comp, loose);
-  debug('tildes', comp);
-  comp = replaceXRanges(comp, loose);
-  debug('xrange', comp);
-  comp = replaceStars(comp, loose);
-  debug('stars', comp);
-  return comp;
-}
-
-function isX(id) {
-  return !id || id.toLowerCase() === 'x' || id === '*';
-}
-
-// ~, ~> --> * (any, kinda silly)
-// ~2, ~2.x, ~2.x.x, ~>2, ~>2.x, ~>2.x.x --> >=2.0.0 <3.0.0
-// ~2.0, ~2.0.x, ~>2.0, ~>2.0.x --> >=2.0.0 <2.1.0
-// ~1.2, ~1.2.x, ~>1.2, ~>1.2.x --> >=1.2.0 <1.3.0
-// ~1.2.3, ~>1.2.3 --> >=1.2.3 <1.3.0
-// ~1.2.0, ~>1.2.0 --> >=1.2.0 <1.3.0
-function replaceTildes(comp, loose) {
-  return comp.trim().split(/\s+/).map(function(comp) {
-    return replaceTilde(comp, loose);
-  }).join(' ');
-}
-
-function replaceTilde(comp, loose) {
-  var r = loose ? re[TILDELOOSE] : re[TILDE];
-  return comp.replace(r, function(_, M, m, p, pr) {
-    debug('tilde', comp, _, M, m, p, pr);
-    var ret;
-
-    if (isX(M))
-      ret = '';
-    else if (isX(m))
-      ret = '>=' + M + '.0.0-0 <' + (+M + 1) + '.0.0-0';
-    else if (isX(p))
-      // ~1.2 == >=1.2.0- <1.3.0-
-      ret = '>=' + M + '.' + m + '.0-0 <' + M + '.' + (+m + 1) + '.0-0';
-    else if (pr) {
-      debug('replaceTilde pr', pr);
-      if (pr.charAt(0) !== '-')
-        pr = '-' + pr;
-      ret = '>=' + M + '.' + m + '.' + p + pr +
-            ' <' + M + '.' + (+m + 1) + '.0-0';
-    } else
-      // ~1.2.3 == >=1.2.3-0 <1.3.0-0
-      ret = '>=' + M + '.' + m + '.' + p + '-0' +
-            ' <' + M + '.' + (+m + 1) + '.0-0';
-
-    debug('tilde return', ret);
-    return ret;
-  });
-}
-
-// ^ --> * (any, kinda silly)
-// ^2, ^2.x, ^2.x.x --> >=2.0.0 <3.0.0
-// ^2.0, ^2.0.x --> >=2.0.0 <3.0.0
-// ^1.2, ^1.2.x --> >=1.2.0 <2.0.0
-// ^1.2.3 --> >=1.2.3 <2.0.0
-// ^1.2.0 --> >=1.2.0 <2.0.0
-function replaceCarets(comp, loose) {
-  return comp.trim().split(/\s+/).map(function(comp) {
-    return replaceCaret(comp, loose);
-  }).join(' ');
-}
-
-function replaceCaret(comp, loose) {
-  var r = loose ? re[CARETLOOSE] : re[CARET];
-  return comp.replace(r, function(_, M, m, p, pr) {
-    debug('caret', comp, _, M, m, p, pr);
-    var ret;
-
-    if (isX(M))
-      ret = '';
-    else if (isX(m))
-      ret = '>=' + M + '.0.0-0 <' + (+M + 1) + '.0.0-0';
-    else if (isX(p)) {
-      if (M === '0')
-        ret = '>=' + M + '.' + m + '.0-0 <' + M + '.' + (+m + 1) + '.0-0';
-      else
-        ret = '>=' + M + '.' + m + '.0-0 <' + (+M + 1) + '.0.0-0';
-    } else if (pr) {
-      debug('replaceCaret pr', pr);
-      if (pr.charAt(0) !== '-')
-        pr = '-' + pr;
-      if (M === '0') {
-        if (m === '0')
-          ret = '=' + M + '.' + m + '.' + p + pr;
-        else
-          ret = '>=' + M + '.' + m + '.' + p + pr +
-                ' <' + M + '.' + (+m + 1) + '.0-0';
-      } else
-        ret = '>=' + M + '.' + m + '.' + p + pr +
-              ' <' + (+M + 1) + '.0.0-0';
-    } else {
-      if (M === '0') {
-        if (m === '0')
-          ret = '=' + M + '.' + m + '.' + p;
-        else
-          ret = '>=' + M + '.' + m + '.' + p + '-0' +
-                ' <' + M + '.' + (+m + 1) + '.0-0';
-      } else
-        ret = '>=' + M + '.' + m + '.' + p + '-0' +
-              ' <' + (+M + 1) + '.0.0-0';
-    }
-
-    debug('caret return', ret);
-    return ret;
-  });
-}
-
-function replaceXRanges(comp, loose) {
-  debug('replaceXRanges', comp, loose);
-  return comp.split(/\s+/).map(function(comp) {
-    return replaceXRange(comp, loose);
-  }).join(' ');
-}
-
-function replaceXRange(comp, loose) {
-  comp = comp.trim();
-  var r = loose ? re[XRANGELOOSE] : re[XRANGE];
-  return comp.replace(r, function(ret, gtlt, M, m, p, pr) {
-    debug('xRange', comp, ret, gtlt, M, m, p, pr);
-    var xM = isX(M);
-    var xm = xM || isX(m);
-    var xp = xm || isX(p);
-    var anyX = xp;
-
-    if (gtlt === '=' && anyX)
-      gtlt = '';
-
-    if (gtlt && anyX) {
-      // replace X with 0, and then append the -0 min-prerelease
-      if (xM)
-        M = 0;
-      if (xm)
-        m = 0;
-      if (xp)
-        p = 0;
-
-      if (gtlt === '>') {
-        // >1 => >=2.0.0-0
-        // >1.2 => >=1.3.0-0
-        // >1.2.3 => >= 1.2.4-0
-        gtlt = '>=';
-        if (xM) {
-          // no change
-        } else if (xm) {
-          M = +M + 1;
-          m = 0;
-          p = 0;
-        } else if (xp) {
-          m = +m + 1;
-          p = 0;
-        }
-      }
-
-
-      ret = gtlt + M + '.' + m + '.' + p + '-0';
-    } else if (xM) {
-      // allow any
-      ret = '*';
-    } else if (xm) {
-      // append '-0' onto the version, otherwise
-      // '1.x.x' matches '2.0.0-beta', since the tag
-      // *lowers* the version value
-      ret = '>=' + M + '.0.0-0 <' + (+M + 1) + '.0.0-0';
-    } else if (xp) {
-      ret = '>=' + M + '.' + m + '.0-0 <' + M + '.' + (+m + 1) + '.0-0';
-    }
-
-    debug('xRange return', ret);
-
-    return ret;
-  });
-}
-
-// Because * is AND-ed with everything else in the comparator,
-// and '' means "any version", just remove the *s entirely.
-function replaceStars(comp, loose) {
-  debug('replaceStars', comp, loose);
-  // Looseness is ignored here.  star is always as loose as it gets!
-  return comp.trim().replace(re[STAR], '');
-}
-
-// This function is passed to string.replace(re[HYPHENRANGE])
-// M, m, patch, prerelease, build
-// 1.2 - 3.4.5 => >=1.2.0-0 <=3.4.5
-// 1.2.3 - 3.4 => >=1.2.0-0 <3.5.0-0 Any 3.4.x will do
-// 1.2 - 3.4 => >=1.2.0-0 <3.5.0-0
-function hyphenReplace($0,
-                       from, fM, fm, fp, fpr, fb,
-                       to, tM, tm, tp, tpr, tb) {
-
-  if (isX(fM))
-    from = '';
-  else if (isX(fm))
-    from = '>=' + fM + '.0.0-0';
-  else if (isX(fp))
-    from = '>=' + fM + '.' + fm + '.0-0';
-  else
-    from = '>=' + from;
-
-  if (isX(tM))
-    to = '';
-  else if (isX(tm))
-    to = '<' + (+tM + 1) + '.0.0-0';
-  else if (isX(tp))
-    to = '<' + tM + '.' + (+tm + 1) + '.0-0';
-  else if (tpr)
-    to = '<=' + tM + '.' + tm + '.' + tp + '-' + tpr;
-  else
-    to = '<=' + to;
-
-  return (from + ' ' + to).trim();
-}
-
-
-// if ANY of the sets match ALL of its comparators, then pass
-Range.prototype.test = function(version) {
-  if (!version)
-    return false;
-  for (var i = 0; i < this.set.length; i++) {
-    if (testSet(this.set[i], version))
-      return true;
-  }
-  return false;
-};
-
-function testSet(set, version) {
-  for (var i = 0; i < set.length; i++) {
-    if (!set[i].test(version))
-      return false;
-  }
-  return true;
-}
-
-exports.satisfies = satisfies;
-function satisfies(version, range, loose) {
-  try {
-    range = new Range(range, loose);
-  } catch (er) {
-    return false;
-  }
-  return range.test(version);
-}
-
-exports.maxSatisfying = maxSatisfying;
-function maxSatisfying(versions, range, loose) {
-  return versions.filter(function(version) {
-    return satisfies(version, range, loose);
-  }).sort(function(a, b) {
-    return rcompare(a, b, loose);
-  })[0] || null;
-}
-
-exports.validRange = validRange;
-function validRange(range, loose) {
-  try {
-    // Return '*' instead of '' so that truthiness works.
-    // This will throw if it's invalid anyway
-    return new Range(range, loose).range || '*';
-  } catch (er) {
-    return null;
-  }
-}
-
-// Determine if version is less than all the versions possible in the range
-exports.ltr = ltr;
-function ltr(version, range, loose) {
-  return outside(version, range, '<', loose);
-}
-
-// Determine if version is greater than all the versions possible in the range.
-exports.gtr = gtr;
-function gtr(version, range, loose) {
-  return outside(version, range, '>', loose);
-}
-
-exports.outside = outside;
-function outside(version, range, hilo, loose) {
-  version = new SemVer(version, loose);
-  range = new Range(range, loose);
-
-  var gtfn, ltefn, ltfn, comp, ecomp;
-  switch (hilo) {
-    case '>':
-      gtfn = gt;
-      ltefn = lte;
-      ltfn = lt;
-      comp = '>';
-      ecomp = '>=';
-      break;
-    case '<':
-      gtfn = lt;
-      ltefn = gte;
-      ltfn = gt;
-      comp = '<';
-      ecomp = '<=';
-      break;
-    default:
-      throw new TypeError('Must provide a hilo val of "<" or ">"');
-  }
-
-  // If it satisfies the range it is not outside
-  if (satisfies(version, range, loose)) {
-    return false;
-  }
-
-  // From now on, variable terms are as if we're in "gtr" mode.
-  // but note that everything is flipped for the "ltr" function.
-
-  for (var i = 0; i < range.set.length; ++i) {
-    var comparators = range.set[i];
-
-    var high = null;
-    var low = null;
-
-    comparators.forEach(function(comparator) {
-      high = high || comparator;
-      low = low || comparator;
-      if (gtfn(comparator.semver, high.semver, loose)) {
-        high = comparator;
-      } else if (ltfn(comparator.semver, low.semver, loose)) {
-        low = comparator;
-      }
-    });
-
-    // If the edge version comparator has an operator then our version
-    // isn't outside it
-    if (high.operator === comp || high.operator === ecomp) {
-      return false;
-    }
-
-    // If the lowest version comparator has an operator and our version
-    // is less than it then it isn't higher than the range
-    if ((!low.operator || low.operator === comp) &&
-        ltefn(version, low.semver)) {
-      return false;
-    } else if (low.operator === ecomp && ltfn(version, low.semver)) {
-      return false;
-    }
-  }
-  return true;
-}
-
-// Use the define() function if we're in AMD land
-if (typeof define === 'function' && define.amd)
-  define(exports);
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/semver.min.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-(function(e){if(typeof module==="object"&&module.exports===e)e=module.exports=H;e.SEMVER_SPEC_VERSION="2.0.0";var r=e.re=[];var t=e.src=[];var n=0;var i=n++;t[i]="0|[1-9]\\d*";var s=n++;t[s]="[0-9]+";var o=n++;t[o]="\\d*[a-zA-Z-][a-zA-Z0-9-]*";var a=n++;t[a]="("+t[i]+")\\."+"("+t[i]+")\\."+"("+t[i]+")";var f=n++;t[f]="("+t[s]+")\\."+"("+t[s]+")\\."+"("+t[s]+")";var u=n++;t[u]="(?:"+t[i]+"|"+t[o]+")";var l=n++;t[l]="(?:"+t[s]+"|"+t[o]+")";var c=n++;t[c]="(?:-("+t[u]+"(?:\\."+t[u]+")*))";var p=n++;t[p]="(?:-?("+t[l]+"(?:\\."+t[l]+")*))";var h=n++;t[h]="[0-9A-Za-z-]+";var v=n++;t[v]="(?:\\+("+t[h]+"(?:\\."+t[h]+")*))";var m=n++;var g="v?"+t[a]+t[c]+"?"+t[v]+"?";t[m]="^"+g+"$";var w="[v=\\s]*"+t[f]+t[p]+"?"+t[v]+"?";var d=n++;t[d]="^"+w+"$";var y=n++;t[y]="((?:<|>)?=?)";var $=n++;t[$]=t[s]+"|x|X|\\*";var j=n++;t[j]=t[i]+"|x|X|\\*";var b=n++;t[b]="[v=\\s]*("+t[j]+")"+"(?:\\.("+t[j]+")"+"(?:\\.("+t[j]+")"+"(?:("+t[c]+")"+")?)?)?";var E=n++;t[E]="[v=\\s]*("+t[$]+")"+"(?:\\.("+t[$]+")"+"(?:\\.("+t[$]+")"+"(?:("+t[p]+")"+")?)?)?";var k=n++;t[k]="^"+t[y]+"\\s*"+t[b]+"$";var x=n++;t[x]="^"+t[y]+"\\s*"+t[E]+"$";var R=n++;t[R]="(?:~>?)";var S=n++;t[S]="(\\s*)"+t[R]+"\\s+";r[S]=new RegExp(t[S],"g");var V="$1~";var I=n++;t[I]="^"+t[R]+t[b]+"$";var A=n++;t[A]="^"+t[R]+t[E]+"$";var C=n++;t[C]="(?:\\^)";var T=n++;t[T]="(\\s*)"+t[C]+"\\s+";r[T]=new RegExp(t[T],"g");var M="$1^";var z=n++;t[z]="^"+t[C]+t[b]+"$";var P=n++;t[P]="^"+t[C]+t[E]+"$";var Z=n++;t[Z]="^"+t[y]+"\\s*("+w+")$|^$";var q=n++;t[q]="^"+t[y]+"\\s*("+g+")$|^$";var L=n++;t[L]="(\\s*)"+t[y]+"\\s*("+w+"|"+t[b]+")";r[L]=new RegExp(t[L],"g");var X="$1$2$3";var _=n++;t[_]="^\\s*("+t[b]+")"+"\\s+-\\s+"+"("+t[b]+")"+"\\s*$";var N=n++;t[N]="^\\s*("+t[E]+")"+"\\s+-\\s+"+"("+t[E]+")"+"\\s*$";var O=n++;t[O]="(<|>)?=?\\s*\\*";for(var B=0;B<n;B++){if(!r[B])r[B]=new RegExp(t[B])}e.parse=D;function D(e,t){var n=t?r[d]:r[m];return n.test(e)?new H(e,t):null}e.valid=F;function F(e,r){var t=D(e,r);return 
t?t.version:null}e.clean=G;function G(e,r){var t=D(e,r);return t?t.version:null}e.SemVer=H;function H(e,t){if(e instanceof H){if(e.loose===t)return e;else e=e.version}if(!(this instanceof H))return new H(e,t);this.loose=t;var n=e.trim().match(t?r[d]:r[m]);if(!n)throw new TypeError("Invalid Version: "+e);this.raw=e;this.major=+n[1];this.minor=+n[2];this.patch=+n[3];if(!n[4])this.prerelease=[];else this.prerelease=n[4].split(".").map(function(e){return/^[0-9]+$/.test(e)?+e:e});this.build=n[5]?n[5].split("."):[];this.format()}H.prototype.format=function(){this.version=this.major+"."+this.minor+"."+this.patch;if(this.prerelease.length)this.version+="-"+this.prerelease.join(".");return this.version};H.prototype.inspect=function(){return'<SemVer "'+this+'">'};H.prototype.toString=function(){return this.version};H.prototype.compare=function(e){if(!(e instanceof H))e=new H(e,this.loose);return this.compareMain(e)||this.comparePre(e)};H.prototype.compareMain=function(e){if(!(e instanceof H))e=new H(e,this.loose);return Q(this.major,e.major)||Q(this.minor,e.minor)||Q(this.patch,e.patch)};H.prototype.comparePre=function(e){if(!(e instanceof H))e=new H(e,this.loose);if(this.prerelease.length&&!e.prerelease.length)return-1;else if(!this.prerelease.length&&e.prerelease.length)return 1;else if(!this.prerelease.lenth&&!e.prerelease.length)return 0;var r=0;do{var t=this.prerelease[r];var n=e.prerelease[r];if(t===undefined&&n===undefined)return 0;else if(n===undefined)return 1;else if(t===undefined)return-1;else if(t===n)continue;else return Q(t,n)}while(++r)};H.prototype.inc=function(e){switch(e){case"major":this.major++;this.minor=-1;case"minor":this.minor++;this.patch=-1;case"patch":this.patch++;this.prerelease=[];break;case"prerelease":if(this.prerelease.length===0)this.prerelease=[0];else{var r=this.prerelease.length;while(--r>=0){if(typeof this.prerelease[r]==="number"){this.prerelease[r]++;r=-2}}if(r===-1)this.prerelease.push(0)}break;default:throw new Error("invalid 
increment argument: "+e)}this.format();return this};e.inc=J;function J(e,r,t){try{return new H(e,t).inc(r).version}catch(n){return null}}e.compareIdentifiers=Q;var K=/^[0-9]+$/;function Q(e,r){var t=K.test(e);var n=K.test(r);if(t&&n){e=+e;r=+r}return t&&!n?-1:n&&!t?1:e<r?-1:e>r?1:0}e.rcompareIdentifiers=U;function U(e,r){return Q(r,e)}e.compare=W;function W(e,r,t){return new H(e,t).compare(r)}e.compareLoose=Y;function Y(e,r){return W(e,r,true)}e.rcompare=er;function er(e,r,t){return W(r,e,t)}e.sort=rr;function rr(r,t){return r.sort(function(r,n){return e.compare(r,n,t)})}e.rsort=tr;function tr(r,t){return r.sort(function(r,n){return e.rcompare(r,n,t)})}e.gt=nr;function nr(e,r,t){return W(e,r,t)>0}e.lt=ir;function ir(e,r,t){return W(e,r,t)<0}e.eq=sr;function sr(e,r,t){return W(e,r,t)===0}e.neq=or;function or(e,r,t){return W(e,r,t)!==0}e.gte=ar;function ar(e,r,t){return W(e,r,t)>=0}e.lte=fr;function fr(e,r,t){return W(e,r,t)<=0}e.cmp=ur;function ur(e,r,t,n){var i;switch(r){case"===":i=e===t;break;case"!==":i=e!==t;break;case"":case"=":case"==":i=sr(e,t,n);break;case"!=":i=or(e,t,n);break;case">":i=nr(e,t,n);break;case">=":i=ar(e,t,n);break;case"<":i=ir(e,t,n);break;case"<=":i=fr(e,t,n);break;default:throw new TypeError("Invalid operator: "+r)}return i}e.Comparator=lr;function lr(e,r){if(e instanceof lr){if(e.loose===r)return e;else e=e.value}if(!(this instanceof lr))return new lr(e,r);this.loose=r;this.parse(e);if(this.semver===cr)this.value="";else this.value=this.operator+this.semver.version}var cr={};lr.prototype.parse=function(e){var t=this.loose?r[Z]:r[q];var n=e.match(t);if(!n)throw new TypeError("Invalid comparator: "+e);this.operator=n[1];if(!n[2])this.semver=cr;else{this.semver=new H(n[2],this.loose);if(this.operator==="<"&&!this.semver.prerelease.length){this.semver.prerelease=["0"];this.semver.format()}}};lr.prototype.inspect=function(){return'<SemVer Comparator "'+this+'">'};lr.prototype.toString=function(){return 
this.value};lr.prototype.test=function(e){return this.semver===cr?true:ur(e,this.operator,this.semver,this.loose)};e.Range=pr;function pr(e,r){if(e instanceof pr&&e.loose===r)return e;if(!(this instanceof pr))return new pr(e,r);this.loose=r;this.raw=e;this.set=e.split(/\s*\|\|\s*/).map(function(e){return this.parseRange(e.trim())},this).filter(function(e){return e.length});if(!this.set.length){throw new TypeError("Invalid SemVer Range: "+e)}this.format()}pr.prototype.inspect=function(){return'<SemVer Range "'+this.range+'">'};pr.prototype.format=function(){this.range=this.set.map(function(e){return e.join(" ").trim()}).join("||").trim();return this.range};pr.prototype.toString=function(){return this.range};pr.prototype.parseRange=function(e){var t=this.loose;e=e.trim();var n=t?r[N]:r[_];e=e.replace(n,Er);e=e.replace(r[L],X);e=e.replace(r[S],V);e=e.replace(r[T],M);e=e.split(/\s+/).join(" ");var i=t?r[Z]:r[q];var s=e.split(" ").map(function(e){return vr(e,t)}).join(" ").split(/\s+/);if(this.loose){s=s.filter(function(e){return!!e.match(i)})}s=s.map(function(e){return new lr(e,t)});return s};e.toComparators=hr;function hr(e,r){return new pr(e,r).set.map(function(e){return e.map(function(e){return e.value}).join(" ").trim().split(" ")})}function vr(e,r){e=dr(e,r);e=gr(e,r);e=$r(e,r);e=br(e,r);return e}function mr(e){return!e||e.toLowerCase()==="x"||e==="*"}function gr(e,r){return e.trim().split(/\s+/).map(function(e){return wr(e,r)}).join(" ")}function wr(e,t){var n=t?r[A]:r[I];return e.replace(n,function(e,r,t,n,i){var s;if(mr(r))s="";else if(mr(t))s=">="+r+".0.0-0 <"+(+r+1)+".0.0-0";else if(mr(n))s=">="+r+"."+t+".0-0 <"+r+"."+(+t+1)+".0-0";else if(i){if(i.charAt(0)!=="-")i="-"+i;s=">="+r+"."+t+"."+n+i+" <"+r+"."+(+t+1)+".0-0"}else s=">="+r+"."+t+"."+n+"-0"+" <"+r+"."+(+t+1)+".0-0";return s})}function dr(e,r){return e.trim().split(/\s+/).map(function(e){return yr(e,r)}).join(" ")}function yr(e,t){var n=t?r[P]:r[z];return e.replace(n,function(e,r,t,n,i){var 
s;if(mr(r))s="";else if(mr(t))s=">="+r+".0.0-0 <"+(+r+1)+".0.0-0";else if(mr(n)){if(r==="0")s=">="+r+"."+t+".0-0 <"+r+"."+(+t+1)+".0-0";else s=">="+r+"."+t+".0-0 <"+(+r+1)+".0.0-0"}else if(i){if(i.charAt(0)!=="-")i="-"+i;if(r==="0"){if(t==="0")s="="+r+"."+t+"."+n+i;else s=">="+r+"."+t+"."+n+i+" <"+r+"."+(+t+1)+".0-0"}else s=">="+r+"."+t+"."+n+i+" <"+(+r+1)+".0.0-0"}else{if(r==="0"){if(t==="0")s="="+r+"."+t+"."+n;else s=">="+r+"."+t+"."+n+"-0"+" <"+r+"."+(+t+1)+".0-0"}else s=">="+r+"."+t+"."+n+"-0"+" <"+(+r+1)+".0.0-0"}return s})}function $r(e,r){return e.split(/\s+/).map(function(e){return jr(e,r)}).join(" ")}function jr(e,t){e=e.trim();var n=t?r[x]:r[k];return e.replace(n,function(e,r,t,n,i,s){var o=mr(t);var a=o||mr(n);var f=a||mr(i);var u=f;if(r==="="&&u)r="";if(r&&u){if(o)t=0;if(a)n=0;if(f)i=0;if(r===">"){r=">=";if(o){}else if(a){t=+t+1;n=0;i=0}else if(f){n=+n+1;i=0}}e=r+t+"."+n+"."+i+"-0"}else if(o){e="*"}else if(a){e=">="+t+".0.0-0 <"+(+t+1)+".0.0-0"}else if(f){e=">="+t+"."+n+".0-0 <"+t+"."+(+n+1)+".0-0"}return e})}function br(e,t){return e.trim().replace(r[O],"")}function Er(e,r,t,n,i,s,o,a,f,u,l,c,p){if(mr(t))r="";else if(mr(n))r=">="+t+".0.0-0";else if(mr(i))r=">="+t+"."+n+".0-0";else r=">="+r;if(mr(f))a="";else if(mr(u))a="<"+(+f+1)+".0.0-0";else if(mr(l))a="<"+f+"."+(+u+1)+".0-0";else if(c)a="<="+f+"."+u+"."+l+"-"+c;else a="<="+a;return(r+" "+a).trim()}pr.prototype.test=function(e){if(!e)return false;for(var r=0;r<this.set.length;r++){if(kr(this.set[r],e))return true}return false};function kr(e,r){for(var t=0;t<e.length;t++){if(!e[t].test(r))return false}return true}e.satisfies=xr;function xr(e,r,t){try{r=new pr(r,t)}catch(n){return false}return r.test(e)}e.maxSatisfying=Rr;function Rr(e,r,t){return e.filter(function(e){return xr(e,r,t)}).sort(function(e,r){return er(e,r,t)})[0]||null}e.validRange=Sr;function Sr(e,r){try{return new pr(e,r).range||"*"}catch(t){return null}}e.ltr=Vr;function Vr(e,r,t){return Ar(e,r,"<",t)}e.gtr=Ir;function Ir(e,r,t){return 
Ar(e,r,">",t)}e.outside=Ar;function Ar(e,r,t,n){e=new H(e,n);r=new pr(r,n);var i,s,o,a,f;switch(t){case">":i=nr;s=fr;o=ir;a=">";f=">=";break;case"<":i=ir;s=ar;o=nr;a="<";f="<=";break;default:throw new TypeError('Must provide a hilo val of "<" or ">"')}if(xr(e,r,n)){return false}for(var u=0;u<r.set.length;++u){var l=r.set[u];var c=null;var p=null;l.forEach(function(e){c=c||e;p=p||e;if(i(e.semver,c.semver,n)){c=e}else if(o(e.semver,p.semver,n)){p=e}});if(c.operator===a||c.operator===f){return false}if((!p.operator||p.operator===a)&&s(e,p.semver)){return false}else if(p.operator===f&&o(e,p.semver)){return false}}return true}if(typeof define==="function"&&define.amd)define(e)})(typeof exports==="object"?exports:typeof define==="function"&&define.amd?{}:semver={});
\ No newline at end of file
Binary file node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/semver/semver.min.js.gz has changed
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,4 +0,0 @@
-node_modules
-test
-.gitignore
-.travis.yml
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,46 +0,0 @@
-Copyright (c) 2013 Forbes Lindesay
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-
-The MIT License (MIT)
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-THE SOFTWARE.
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,49 +0,0 @@
-# sha
-
-Check and get file hashes (using any algorithm)
-
-[![Build Status](https://travis-ci.org/ForbesLindesay/sha.png?branch=master)](https://travis-ci.org/ForbesLindesay/sha)
-[![Dependency Status](https://gemnasium.com/ForbesLindesay/sha.png)](https://gemnasium.com/ForbesLindesay/sha)
-[![NPM version](https://badge.fury.io/js/sha.png)](http://badge.fury.io/js/sha)
-
-## Installation
-
-    $ npm install sha
-
-## API
-
-### check(fileName, expected, [options,] cb) / checkSync(fileName, expected, [options])
-
-Asynchronously check that `fileName` has a "hash" of `expected`.  The callback will be called with either `null` or an error (indicating that they did not match).
-
-Options:
-
-- algorithm: defaults to `sha1` and can be any of the algorithms supported by `crypto.createHash`
-
-### get(fileName, [options,] cb) / getSync(fileName, [options])
-
-Asynchronously get the "hash" of `fileName`.  The callback will be called with an optional `error` object and the (lowercased) hex digest of the hash.
-
-Options:
-
-- algorithm: defaults to `sha1` and can be any of the algorithms supported by `crypto.createHash`
-
-### stream(expected, [options])
-
-Check the hash of a stream without ever buffering it.  This is a pass through stream so you can do things like:
-
-```js
-fs.createReadStream('src')
-  .pipe(sha.stream('expected'))
-  .pipe(fs.createWriteStream('dest'))
-```
-
-`dest` will be a complete copy of `src` and an error will be emitted if the hash did not match `'expected'`.
-
-Options:
-
-- algorithm: defaults to `sha1` and can be any of the algorithms supported by `crypto.createHash`
-
-## License
-
-You may use this software under the BSD or the MIT license.  Take your pick.  If you want me to release it under another license, open a pull request.
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,120 +0,0 @@
-'use strict'
-
-var Transform = require('stream').Transform || require('readable-stream').Transform
-var crypto = require('crypto')
-var fs
-try {
-  fs = require('graceful-fs')
-} catch (ex) {
-  fs = require('fs')
-}
-try {
-  process.binding('crypto')
-} catch (e) {
-  var er = new Error( 'crypto binding not found.\n'
-                    + 'Please build node with openssl.\n'
-                    + e.message )
-  throw er
-}
-
-exports.check = check
-exports.checkSync = checkSync
-exports.get = get
-exports.getSync = getSync
-exports.stream = stream
-
-function check(file, expected, options, cb) {
-  if (typeof options === 'function') {
-    cb = options
-    options = undefined
-  }
-  expected = expected.toLowerCase().trim()
-  get(file, options, function (er, actual) {
-    if (er) {
-      if (er.message) er.message += ' while getting shasum for ' + file
-      return cb(er)
-    }
-    if (actual === expected) return cb(null)
-    cb(new Error(
-        'shasum check failed for ' + file + '\n'
-      + 'Expected: ' + expected + '\n'
-      + 'Actual:   ' + actual))
-  })
-}
-function checkSync(file, expected, options) {
-  expected = expected.toLowerCase().trim()
-  var actual
-  try {
-    actual = getSync(file, options)
-  } catch (er) {
-    if (er.message) er.message += ' while getting shasum for ' + file
-    throw er
-  }
-  if (actual !== expected) {
-    var ex = new Error(
-        'shasum check failed for ' + file + '\n'
-      + 'Expected: ' + expected + '\n'
-      + 'Actual:   ' + actual)
-    throw ex
-  }
-}
-
-
-function get(file, options, cb) {
-  if (typeof options === 'function') {
-    cb = options
-    options = undefined
-  }
-  options = options || {}
-  var algorithm = options.algorithm || 'sha1'
-  var hash = crypto.createHash(algorithm)
-  var source = fs.createReadStream(file)
-  var errState = null
-  source
-    .on('error', function (er) {
-      if (errState) return
-      return cb(errState = er)
-    })
-    .on('data', function (chunk) {
-      if (errState) return
-      hash.update(chunk)
-    })
-    .on('end', function () {
-      if (errState) return
-      var actual = hash.digest("hex").toLowerCase().trim()
-      cb(null, actual)
-    })
-}
-
-function getSync(file, options) {
-  options = options || {}
-  var algorithm = options.algorithm || 'sha1'
-  var hash = crypto.createHash(algorithm)
-  var source = fs.readFileSync(file)
-  hash.update(source)
-  return hash.digest("hex").toLowerCase().trim()
-}
-
-function stream(expected, options) {
-  expected = expected.toLowerCase().trim()
-  options = options || {}
-  var algorithm = options.algorithm || 'sha1'
-  var hash = crypto.createHash(algorithm)
-
-  var stream = new Transform()
-  stream._transform = function (chunk, encoding, callback) {
-    hash.update(chunk)
-    stream.push(chunk)
-    callback()
-  }
-  stream._flush = function (cb) {
-    var actual = hash.digest("hex").toLowerCase().trim()
-    if (actual === expected) return cb(null)
-    cb(new Error(
-        'shasum check failed for:\n'
-      + '  Expected: ' + expected + '\n'
-      + '  Actual:   ' + actual))
-    this.push(null)
-  }
-  return stream
-}
\ No newline at end of file
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,27 +0,0 @@
-Copyright (c) Isaac Z. Schlueter ("Author")
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,768 +0,0 @@
-# readable-stream
-
-A new class of streams for Node.js
-
-This module provides the new Stream base classes introduced in Node
-v0.10, for use in Node v0.8.  You can use it to have programs that
-have to work with node v0.8, while being forward-compatible for v0.10
-and beyond.  When you drop support for v0.8, you can remove this
-module, and only use the native streams.
-
-This is almost exactly the same codebase as appears in Node v0.10.
-However:
-
-1. The exported object is actually the Readable class.  Decorating the
-   native `stream` module would be global pollution.
-2. In v0.10, you can safely use `base64` as an argument to
-   `setEncoding` in Readable streams.  However, in v0.8, the
-   StringDecoder class has no `end()` method, which is problematic for
-   Base64.  So, don't use that, because it'll break and be weird.
-
-Other than that, the API is the same as `require('stream')` in v0.10,
-so the API docs are reproduced below.
-
-----------
-
-    Stability: 2 - Unstable
-
-A stream is an abstract interface implemented by various objects in
-Node.  For example a request to an HTTP server is a stream, as is
-stdout. Streams are readable, writable, or both. All streams are
-instances of [EventEmitter][].
-
-You can load the Stream base classes by doing `require('stream')`.
-There are base classes provided for Readable streams, Writable
-streams, Duplex streams, and Transform streams.
-
-## Compatibility
-
-In earlier versions of Node, the Readable stream interface was
-simpler, but also less powerful and less useful.
-
-* Rather than waiting for you to call the `read()` method, `'data'`
-  events would start emitting immediately.  If you needed to do some
-  I/O to decide how to handle data, then you had to store the chunks
-  in some kind of buffer so that they would not be lost.
-* The `pause()` method was advisory, rather than guaranteed.  This
-  meant that you still had to be prepared to receive `'data'` events
-  even when the stream was in a paused state.
-
-In Node v0.10, the Readable class described below was added.  For
-backwards compatibility with older Node programs, Readable streams
-switch into "old mode" when a `'data'` event handler is added, or when
-the `pause()` or `resume()` methods are called.  The effect is that,
-even if you are not using the new `read()` method and `'readable'`
-event, you no longer have to worry about losing `'data'` chunks.
-
-Most programs will continue to function normally.  However, this
-introduces an edge case in the following conditions:
-
-* No `'data'` event handler is added.
-* The `pause()` and `resume()` methods are never called.
-
-For example, consider the following code:
-
-```javascript
-// WARNING!  BROKEN!
-net.createServer(function(socket) {
-
-  // we add an 'end' method, but never consume the data
-  socket.on('end', function() {
-    // It will never get here.
-    socket.end('I got your message (but didnt read it)\n');
-  });
-
-}).listen(1337);
-```
-
-In versions of node prior to v0.10, the incoming message data would be
-simply discarded.  However, in Node v0.10 and beyond, the socket will
-remain paused forever.
-
-The workaround in this situation is to call the `resume()` method to
-trigger "old mode" behavior:
-
-```javascript
-// Workaround
-net.createServer(function(socket) {
-
-  socket.on('end', function() {
-    socket.end('I got your message (but didnt read it)\n');
-  });
-
-  // start the flow of data, discarding it.
-  socket.resume();
-
-}).listen(1337);
-```
-
-In addition to new Readable streams switching into old-mode, pre-v0.10
-style streams can be wrapped in a Readable class using the `wrap()`
-method.
-
-## Class: stream.Readable
-
-<!--type=class-->
-
-A `Readable Stream` has the following methods, members, and events.
-
-Note that `stream.Readable` is an abstract class designed to be
-extended with an underlying implementation of the `_read(size)`
-method. (See below.)
-
-### new stream.Readable([options])
-
-* `options` {Object}
-  * `highWaterMark` {Number} The maximum number of bytes to store in
-    the internal buffer before ceasing to read from the underlying
-    resource.  Default=16kb
-  * `encoding` {String} If specified, then buffers will be decoded to
-    strings using the specified encoding.  Default=null
-  * `objectMode` {Boolean} Whether this stream should behave
-    as a stream of objects, meaning that `stream.read(n)` returns
-    a single value instead of a Buffer of size `n`
-
-In classes that extend the Readable class, make sure to call the
-constructor so that the buffering settings can be properly
-initialized.
-
-### readable.\_read(size)
-
-* `size` {Number} Number of bytes to read asynchronously
-
-Note: **This function should NOT be called directly.**  It should be
-implemented by child classes, and called by the internal Readable
-class methods only.
-
-All Readable stream implementations must provide a `_read` method
-to fetch data from the underlying resource.
-
-This method is prefixed with an underscore because it is internal to
-the class that defines it, and should not be called directly by user
-programs.  However, you **are** expected to override this method in
-your own extension classes.
-
-When data is available, put it into the read queue by calling
-`readable.push(chunk)`.  If `push` returns false, then you should stop
-reading.  When `_read` is called again, you should start pushing more
-data.
-
-The `size` argument is advisory.  Implementations where a "read" is a
-single call that returns data can use this to know how much data to
-fetch.  Implementations where that is not relevant, such as TCP or
-TLS, may ignore this argument, and simply provide data whenever it
-becomes available.  There is no need, for example, to "wait" until
-`size` bytes are available before calling `stream.push(chunk)`.
-
-### readable.push(chunk)
-
-* `chunk` {Buffer | null | String} Chunk of data to push into the read queue
-* return {Boolean} Whether or not more pushes should be performed
-
-Note: **This function should be called by Readable implementors, NOT
-by consumers of Readable subclasses.**  The `_read()` function will not
-be called again until at least one `push(chunk)` call is made.  If no
-data is available, then you MAY call `push('')` (an empty string) to
-allow a future `_read` call, without adding any data to the queue.
-
-The `Readable` class works by putting data into a read queue to be
-pulled out later by calling the `read()` method when the `'readable'`
-event fires.
-
-The `push()` method will explicitly insert some data into the read
-queue.  If it is called with `null` then it will signal the end of the
-data.
-
-In some cases, you may be wrapping a lower-level source which has some
-sort of pause/resume mechanism, and a data callback.  In those cases,
-you could wrap the low-level source object by doing something like
-this:
-
-```javascript
-// source is an object with readStop() and readStart() methods,
-// and an `ondata` member that gets called when it has data, and
-// an `onend` member that gets called when the data is over.
-
-var stream = new Readable();
-
-source.ondata = function(chunk) {
-  // if push() returns false, then we need to stop reading from source
-  if (!stream.push(chunk))
-    source.readStop();
-};
-
-source.onend = function() {
-  stream.push(null);
-};
-
-// _read will be called when the stream wants to pull more data in
-// the advisory size argument is ignored in this case.
-stream._read = function(n) {
-  source.readStart();
-};
-```
-
-### readable.unshift(chunk)
-
-* `chunk` {Buffer | null | String} Chunk of data to unshift onto the read queue
-* return {Boolean} Whether or not more pushes should be performed
-
-This is the counterpart of `readable.push(chunk)`.  Rather than putting
-the data at the *end* of the read queue, it puts it at the *front* of
-the read queue.
-
-This is useful in certain use-cases where a stream is being consumed
-by a parser, which needs to "un-consume" some data that it has
-optimistically pulled out of the source.
-
-```javascript
-// A parser for a simple data protocol.
-// The "header" is a JSON object, followed by 2 \n characters, and
-// then a message body.
-//
-// Note: This can be done more simply as a Transform stream.  See below.
-
-function SimpleProtocol(source, options) {
-  if (!(this instanceof SimpleProtocol))
-    return new SimpleProtocol(source, options);
-
-  Readable.call(this, options);
-  this._inBody = false;
-  this._sawFirstCr = false;
-
-  // source is a readable stream, such as a socket or file
-  this._source = source;
-
-  var self = this;
-  source.on('end', function() {
-    self.push(null);
-  });
-
-  // give it a kick whenever the source is readable
-  // read(0) will not consume any bytes
-  source.on('readable', function() {
-    self.read(0);
-  });
-
-  this._rawHeader = [];
-  this.header = null;
-}
-
-SimpleProtocol.prototype = Object.create(
-  Readable.prototype, { constructor: { value: SimpleProtocol }});
-
-SimpleProtocol.prototype._read = function(n) {
-  if (!this._inBody) {
-    var chunk = this._source.read();
-
-    // if the source doesn't have data, we don't have data yet.
-    if (chunk === null)
-      return this.push('');
-
-    // check if the chunk has a \n\n
-    var split = -1;
-    for (var i = 0; i < chunk.length; i++) {
-      if (chunk[i] === 10) { // '\n'
-        if (this._sawFirstCr) {
-          split = i;
-          break;
-        } else {
-          this._sawFirstCr = true;
-        }
-      } else {
-        this._sawFirstCr = false;
-      }
-    }
-
-    if (split === -1) {
-      // still waiting for the \n\n
-      // stash the chunk, and try again.
-      this._rawHeader.push(chunk);
-      this.push('');
-    } else {
-      this._inBody = true;
-      var h = chunk.slice(0, split);
-      this._rawHeader.push(h);
-      var header = Buffer.concat(this._rawHeader).toString();
-      try {
-        this.header = JSON.parse(header);
-      } catch (er) {
-        this.emit('error', new Error('invalid simple protocol data'));
-        return;
-      }
-      // now, because we got some extra data, unshift the rest
-      // back into the read queue so that our consumer will see it.
-      var b = chunk.slice(split);
-      this.unshift(b);
-
-      // and let them know that we are done parsing the header.
-      this.emit('header', this.header);
-    }
-  } else {
-    // from there on, just provide the data to our consumer.
-    // careful not to push(null), since that would indicate EOF.
-    var chunk = this._source.read();
-    if (chunk) this.push(chunk);
-  }
-};
-
-// Usage:
-var parser = new SimpleProtocol(source);
-// Now parser is a readable stream that will emit 'header'
-// with the parsed header data.
-```
-
-### readable.wrap(stream)
-
-* `stream` {Stream} An "old style" readable stream
-
-If you are using an older Node library that emits `'data'` events and
-has a `pause()` method that is advisory only, then you can use the
-`wrap()` method to create a Readable stream that uses the old stream
-as its data source.
-
-For example:
-
-```javascript
-var OldReader = require('./old-api-module.js').OldReader;
-var oreader = new OldReader;
-var Readable = require('stream').Readable;
-var myReader = new Readable().wrap(oreader);
-
-myReader.on('readable', function() {
-  myReader.read(); // etc.
-});
-```
-
-### Event: 'readable'
-
-When there is data ready to be consumed, this event will fire.
-
-When this event emits, call the `read()` method to consume the data.
-
-### Event: 'end'
-
-Emitted when the stream has received an EOF (FIN in TCP terminology).
-Indicates that no more `'data'` events will happen. If the stream is
-also writable, it may be possible to continue writing.
-
-### Event: 'data'
-
-The `'data'` event emits either a `Buffer` (by default) or a string if
-`setEncoding()` was used.
-
-Note that adding a `'data'` event listener will switch the Readable
-stream into "old mode", where data is emitted as soon as it is
-available, rather than waiting for you to call `read()` to consume it.
-
-### Event: 'error'
-
-Emitted if there was an error receiving data.
-
-### Event: 'close'
-
-Emitted when the underlying resource (for example, the backing file
-descriptor) has been closed. Not all streams will emit this.
-
-### readable.setEncoding(encoding)
-
-Makes the `'data'` event emit a string instead of a `Buffer`. `encoding`
-can be `'utf8'`, `'utf16le'` (`'ucs2'`), `'ascii'`, or `'hex'`.
-
-The encoding can also be set by specifying an `encoding` field to the
-constructor.
-
-### readable.read([size])
-
-* `size` {Number | null} Optional number of bytes to read.
-* Return: {Buffer | String | null}
-
-Note: **This function SHOULD be called by Readable stream users.**
-
-Call this method to consume data once the `'readable'` event is
-emitted.
-
-The `size` argument will set a minimum number of bytes that you are
-interested in.  If not set, then the entire content of the internal
-buffer is returned.
-
-If there is no data to consume, or if there are fewer bytes in the
-internal buffer than the `size` argument, then `null` is returned, and
-a future `'readable'` event will be emitted when more is available.
-
-Calling `stream.read(0)` will always return `null`, and will trigger a
-refresh of the internal buffer, but otherwise be a no-op.
-
-### readable.pipe(destination, [options])
-
-* `destination` {Writable Stream}
-* `options` {Object} Optional
-  * `end` {Boolean} Default=true
-
-Connects this readable stream to the `destination` writable stream.
-Incoming data on this stream gets written to `destination`.  This
-method properly manages back-pressure so that a slow destination will
-not be overwhelmed by a fast readable stream.
-
-This function returns the `destination` stream.
-
-For example, emulating the Unix `cat` command:
-
-    process.stdin.pipe(process.stdout);
-
-By default `end()` is called on the destination when the source stream
-emits `end`, so that `destination` is no longer writable. Pass `{ end:
-false }` as `options` to keep the destination stream open.
-
-This keeps `writer` open so that "Goodbye" can be written at the
-end.
-
-    reader.pipe(writer, { end: false });
-    reader.on("end", function() {
-      writer.end("Goodbye\n");
-    });
-
-Note that `process.stderr` and `process.stdout` are never closed until
-the process exits, regardless of the specified options.
-
-### readable.unpipe([destination])
-
-* `destination` {Writable Stream} Optional
-
-Undo a previously established `pipe()`.  If no destination is
-provided, then all previously established pipes are removed.
-
-### readable.pause()
-
-Switches the readable stream into "old mode", where data is emitted
-using a `'data'` event rather than being buffered for consumption via
-the `read()` method.
-
-Ceases the flow of data.  No `'data'` events are emitted while the
-stream is in a paused state.
-
-### readable.resume()
-
-Switches the readable stream into "old mode", where data is emitted
-using a `'data'` event rather than being buffered for consumption via
-the `read()` method.
-
-Resumes the incoming `'data'` events after a `pause()`.
-
-
-## Class: stream.Writable
-
-<!--type=class-->
-
-A `Writable` Stream has the following methods, members, and events.
-
-Note that `stream.Writable` is an abstract class designed to be
-extended with an underlying implementation of the
-`_write(chunk, encoding, cb)` method. (See below.)
-
-### new stream.Writable([options])
-
-* `options` {Object}
-  * `highWaterMark` {Number} Buffer level when `write()` starts
-    returning false. Default=16kb
-  * `decodeStrings` {Boolean} Whether or not to decode strings into
-    Buffers before passing them to `_write()`.  Default=true
-
-In classes that extend the Writable class, make sure to call the
-constructor so that the buffering settings can be properly
-initialized.
-
-### writable.\_write(chunk, encoding, callback)
-
-* `chunk` {Buffer | String} The chunk to be written.  Will always
-  be a buffer unless the `decodeStrings` option was set to `false`.
-* `encoding` {String} If the chunk is a string, then this is the
-  encoding type.  Ignored if chunk is a buffer.  Note that chunk will
-  **always** be a buffer unless the `decodeStrings` option is
-  explicitly set to `false`.
-* `callback` {Function} Call this function (optionally with an error
-  argument) when you are done processing the supplied chunk.
-
-All Writable stream implementations must provide a `_write` method to
-send data to the underlying resource.
-
-Note: **This function MUST NOT be called directly.**  It should be
-implemented by child classes, and called by the internal Writable
-class methods only.
-
-Call the callback using the standard `callback(error)` pattern to
-signal that the write completed successfully or with an error.
-
-If the `decodeStrings` flag is set in the constructor options, then
-`chunk` may be a string rather than a Buffer, and `encoding` will
-indicate the sort of string that it is.  This is to support
-implementations that have an optimized handling for certain string
-data encodings.  If you do not explicitly set the `decodeStrings`
-option to `false`, then you can safely ignore the `encoding` argument,
-and assume that `chunk` will always be a Buffer.
-
-This method is prefixed with an underscore because it is internal to
-the class that defines it, and should not be called directly by user
-programs.  However, you **are** expected to override this method in
-your own extension classes.
-
-
-### writable.write(chunk, [encoding], [callback])
-
-* `chunk` {Buffer | String} Data to be written
-* `encoding` {String} Optional.  If `chunk` is a string, then encoding
-  defaults to `'utf8'`
-* `callback` {Function} Optional.  Called when this chunk is
-  successfully written.
-* Returns {Boolean}
-
-Writes `chunk` to the stream.  Returns `true` if the data has been
-flushed to the underlying resource.  Returns `false` to indicate that
-the buffer is full, and the data will be sent out in the future. The
-`'drain'` event will indicate when the buffer is empty again.
-
-The point at which `write()` will return `false` is determined by the
-`highWaterMark` option provided to the constructor.
-
-### writable.end([chunk], [encoding], [callback])
-
-* `chunk` {Buffer | String} Optional final data to be written
-* `encoding` {String} Optional.  If `chunk` is a string, then encoding
-  defaults to `'utf8'`
-* `callback` {Function} Optional.  Called when the final chunk is
-  successfully written.
-
-Call this method to signal the end of the data being written to the
-stream.
-
-### Event: 'drain'
-
-Emitted when the stream's write queue empties and it's safe to write
-without buffering again. Listen for it when `stream.write()` returns
-`false`.
-
-### Event: 'close'
-
-Emitted when the underlying resource (for example, the backing file
-descriptor) has been closed. Not all streams will emit this.
-
-### Event: 'finish'
-
-When `end()` is called and there are no more chunks to write, this
-event is emitted.
-
-### Event: 'pipe'
-
-* `source` {Readable Stream}
-
-Emitted when the stream is passed to a readable stream's pipe method.
-
-### Event: 'unpipe'
-
-* `source` {Readable Stream}
-
-Emitted when a previously established `pipe()` is removed using the
-source Readable stream's `unpipe()` method.
-
-## Class: stream.Duplex
-
-<!--type=class-->
-
-A "duplex" stream is one that is both Readable and Writable, such as a
-TCP socket connection.
-
-Note that `stream.Duplex` is an abstract class designed to be
-extended with an underlying implementation of the `_read(size)`
-and `_write(chunk, encoding, callback)` methods as you would with a Readable or
-Writable stream class.
-
-Since JavaScript doesn't have multiple prototypal inheritance, this
-class prototypally inherits from Readable, and then parasitically from
-Writable.  It is thus up to the user to implement both the lowlevel
-`_read(n)` method as well as the lowlevel `_write(chunk, encoding, cb)` method
-on extension duplex classes.
-
-### new stream.Duplex(options)
-
-* `options` {Object} Passed to both Writable and Readable
-  constructors. Also has the following fields:
-  * `allowHalfOpen` {Boolean} Default=true.  If set to `false`, then
-    the stream will automatically end the readable side when the
-    writable side ends and vice versa.
-
-In classes that extend the Duplex class, make sure to call the
-constructor so that the buffering settings can be properly
-initialized.
-
-## Class: stream.Transform
-
-A "transform" stream is a duplex stream where the output is causally
-connected in some way to the input, such as a zlib stream or a crypto
-stream.
-
-There is no requirement that the output be the same size as the input,
-arrive in the same number of chunks, or arrive at the same time.  For
-example, a Hash stream will only ever have a single chunk of output,
-which is provided when the input is ended.  A zlib stream will produce
-output that is either much smaller or much larger than its input.
-
-Rather than implement the `_read()` and `_write()` methods, Transform
-classes must implement the `_transform()` method, and may optionally
-also implement the `_flush()` method.  (See below.)
-
-### new stream.Transform([options])
-
-* `options` {Object} Passed to both Writable and Readable
-  constructors.
-
-In classes that extend the Transform class, make sure to call the
-constructor so that the buffering settings can be properly
-initialized.
-
-### transform.\_transform(chunk, encoding, callback)
-
-* `chunk` {Buffer | String} The chunk to be transformed.  Will always
-  be a buffer unless the `decodeStrings` option was set to `false`.
-* `encoding` {String} If the chunk is a string, then this is the
-  encoding type.  (Ignored if chunk is a buffer.)
-* `callback` {Function} Call this function (optionally with an error
-  argument) when you are done processing the supplied chunk.
-
-Note: **This function MUST NOT be called directly.**  It should be
-implemented by child classes, and called by the internal Transform
-class methods only.
-
-All Transform stream implementations must provide a `_transform`
-method to accept input and produce output.
-
-`_transform` should do whatever has to be done in this specific
-Transform class, to handle the bytes being written, and pass them off
-to the readable portion of the interface.  Do asynchronous I/O,
-process things, and so on.
-
-Call `transform.push(outputChunk)` 0 or more times to generate output
-from this input chunk, depending on how much data you want to output
-as a result of this chunk.
-
-Call the callback function only when the current chunk is completely
-consumed.  Note that there may or may not be output as a result of any
-particular input chunk.
-
-This method is prefixed with an underscore because it is internal to
-the class that defines it, and should not be called directly by user
-programs.  However, you **are** expected to override this method in
-your own extension classes.
-
-### transform.\_flush(callback)
-
-* `callback` {Function} Call this function (optionally with an error
-  argument) when you are done flushing any remaining data.
-
-Note: **This function MUST NOT be called directly.**  It MAY be implemented
-by child classes, and if so, will be called by the internal Transform
-class methods only.
-
-In some cases, your transform operation may need to emit a bit more
-data at the end of the stream.  For example, a `Zlib` compression
-stream will store up some internal state so that it can optimally
-compress the output.  At the end, however, it needs to do the best it
-can with what is left, so that the data will be complete.
-
-In those cases, you can implement a `_flush` method, which will be
-called at the very end, after all the written data is consumed, but
-before emitting `end` to signal the end of the readable side.  Just
-like with `_transform`, call `transform.push(chunk)` zero or more
-times, as appropriate, and call `callback` when the flush operation is
-complete.
-
-This method is prefixed with an underscore because it is internal to
-the class that defines it, and should not be called directly by user
-programs.  However, you **are** expected to override this method in
-your own extension classes.
-
-### Example: `SimpleProtocol` parser
-
-The example above of a simple protocol parser can be implemented much
-more simply by using the higher level `Transform` stream class.
-
-In this example, rather than providing the input as an argument, it
-would be piped into the parser, which is a more idiomatic Node stream
-approach.
-
-```javascript
-function SimpleProtocol(options) {
-  if (!(this instanceof SimpleProtocol))
-    return new SimpleProtocol(options);
-
-  Transform.call(this, options);
-  this._inBody = false;
-  this._sawFirstCr = false;
-  this._rawHeader = [];
-  this.header = null;
-}
-
-SimpleProtocol.prototype = Object.create(
-  Transform.prototype, { constructor: { value: SimpleProtocol }});
-
-SimpleProtocol.prototype._transform = function(chunk, encoding, done) {
-  if (!this._inBody) {
-    // check if the chunk has a \n\n
-    var split = -1;
-    for (var i = 0; i < chunk.length; i++) {
-      if (chunk[i] === 10) { // '\n'
-        if (this._sawFirstCr) {
-          split = i;
-          break;
-        } else {
-          this._sawFirstCr = true;
-        }
-      } else {
-        this._sawFirstCr = false;
-      }
-    }
-
-    if (split === -1) {
-      // still waiting for the \n\n
-      // stash the chunk, and try again.
-      this._rawHeader.push(chunk);
-    } else {
-      this._inBody = true;
-      var h = chunk.slice(0, split);
-      this._rawHeader.push(h);
-      var header = Buffer.concat(this._rawHeader).toString();
-      try {
-        this.header = JSON.parse(header);
-      } catch (er) {
-        this.emit('error', new Error('invalid simple protocol data'));
-        return;
-      }
-      // and let them know that we are done parsing the header.
-      this.emit('header', this.header);
-
-      // now, because we got some extra data, push the rest of the
-      // chunk so that our consumer will see it.
-      var rest = chunk.slice(split);
-      this.push(rest);
-    }
-  } else {
-    // from there on, just provide the data to our consumer as-is.
-    this.push(chunk);
-  }
-  done();
-};
-
-// Usage: `source` is any readable stream, such as a socket or file.
-var parser = new SimpleProtocol();
-source.pipe(parser);
-
-// Now parser is a readable stream that will emit 'header'
-// with the parsed header data.
-```
-
-
-## Class: stream.PassThrough
-
-This is a trivial implementation of a `Transform` stream that simply
-passes the input bytes across to the output.  Its purpose is mainly
-for examples and testing, but there are occasionally use cases where
-it can come in handy.
-
-
-[EventEmitter]: events.html#events_class_events_eventemitter
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/duplex.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require("./lib/_stream_duplex.js")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/examples/CAPSLOCKTYPER.JS	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,32 +0,0 @@
-var Transform = require('../transform');
-var inherits = require('util').inherits;
-
-// subclass
-function MyStream () {
-  Transform.call(this, {
-    lowWaterMark: 0,
-    encoding: 'utf8'
-  });
-}
-inherits(MyStream, Transform);
-
-MyStream.prototype._transform = function (chunk, outputFn, callback) {
-  outputFn(new Buffer(String(chunk).toUpperCase()));
-  callback();
-};
-
-// use it!
-var s = new MyStream();
-process.stdin.resume();
-process.stdin.pipe(s).pipe(process.stdout);
-if (process.stdin.setRawMode)
-  process.stdin.setRawMode(true);
-process.stdin.on('data', function (c) {
-  c = c.toString();
-  if (c === '\u0003' || c === '\u0004') {
-    process.stdin.pause();
-    s.end();
-  }
-  if (c === '\r')
-    process.stdout.write('\n');
-});
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/examples/typer-fsr.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,15 +0,0 @@
-var fs = require('fs');
-var FSReadable = require('../fs.js');
-var rst = new FSReadable(__filename);
-
-rst.on('end', function() {
-  process.stdin.pause();
-});
-
-process.stdin.setRawMode(true);
-process.stdin.on('data', function() {
-  var c = rst.read(3);
-  if (!c) return;
-  process.stdout.write(c);
-});
-process.stdin.resume();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/examples/typer.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,17 +0,0 @@
-var fs = require('fs');
-var fst = fs.createReadStream(__filename);
-var Readable = require('../readable.js');
-var rst = new Readable();
-rst.wrap(fst);
-
-rst.on('end', function() {
-  process.stdin.pause();
-});
-
-process.stdin.setRawMode(true);
-process.stdin.on('data', function() {
-  var c = rst.read(3);
-  if (!c) return setTimeout(process.exit, 500)
-  process.stdout.write(c);
-});
-process.stdin.resume();
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/float.patch	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,68 +0,0 @@
-diff --git a/lib/_stream_duplex.js b/lib/_stream_duplex.js
-index c5a741c..a2e0d8e 100644
---- a/lib/_stream_duplex.js
-+++ b/lib/_stream_duplex.js
-@@ -26,8 +26,8 @@
- 
- module.exports = Duplex;
- var util = require('util');
--var Readable = require('_stream_readable');
--var Writable = require('_stream_writable');
-+var Readable = require('./_stream_readable');
-+var Writable = require('./_stream_writable');
- 
- util.inherits(Duplex, Readable);
- 
-diff --git a/lib/_stream_passthrough.js b/lib/_stream_passthrough.js
-index a5e9864..330c247 100644
---- a/lib/_stream_passthrough.js
-+++ b/lib/_stream_passthrough.js
-@@ -25,7 +25,7 @@
- 
- module.exports = PassThrough;
- 
--var Transform = require('_stream_transform');
-+var Transform = require('./_stream_transform');
- var util = require('util');
- util.inherits(PassThrough, Transform);
- 
-diff --git a/lib/_stream_readable.js b/lib/_stream_readable.js
-index 2259d2e..e6681ee 100644
---- a/lib/_stream_readable.js
-+++ b/lib/_stream_readable.js
-@@ -23,6 +23,9 @@ module.exports = Readable;
- Readable.ReadableState = ReadableState;
- 
- var EE = require('events').EventEmitter;
-+if (!EE.listenerCount) EE.listenerCount = function(emitter, type) {
-+  return emitter.listeners(type).length;
-+};
- var Stream = require('stream');
- var util = require('util');
- var StringDecoder;
-diff --git a/lib/_stream_transform.js b/lib/_stream_transform.js
-index e925b4b..f08b05e 100644
---- a/lib/_stream_transform.js
-+++ b/lib/_stream_transform.js
-@@ -64,7 +64,7 @@
- 
- module.exports = Transform;
- 
--var Duplex = require('_stream_duplex');
-+var Duplex = require('./_stream_duplex');
- var util = require('util');
- util.inherits(Transform, Duplex);
- 
-diff --git a/lib/_stream_writable.js b/lib/_stream_writable.js
-index a26f711..56ca47d 100644
---- a/lib/_stream_writable.js
-+++ b/lib/_stream_writable.js
-@@ -109,7 +109,7 @@ function WritableState(options, stream) {
- function Writable(options) {
-   // Writable ctor is applied to Duplexes, though they're not
-   // instanceof Writable, they're instanceof Readable.
--  if (!(this instanceof Writable) && !(this instanceof Stream.Duplex))
-+  if (!(this instanceof Writable) && !(this instanceof require('./_stream_duplex')))
-     return new Writable(options);
- 
-   this._writableState = new WritableState(options, this);
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/fs.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1705 +0,0 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-
-// Maintainers, keep in mind that octal literals are not allowed
-// in strict mode. Use the decimal value and add a comment with
-// the octal value. Example:
-//
-//   var mode = 438; /* mode=0666 */
-
-var util = require('util');
-var pathModule = require('path');
-
-var binding = process.binding('fs');
-var constants = process.binding('constants');
-var fs = exports;
-var Stream = require('stream').Stream;
-var EventEmitter = require('events').EventEmitter;
-
-var Readable = require('./lib/_stream_readable.js');
-var Writable = require('./lib/_stream_writable.js');
-
-var kMinPoolSpace = 128;
-var kPoolSize = 40 * 1024;
-
-var O_APPEND = constants.O_APPEND || 0;
-var O_CREAT = constants.O_CREAT || 0;
-var O_DIRECTORY = constants.O_DIRECTORY || 0;
-var O_EXCL = constants.O_EXCL || 0;
-var O_NOCTTY = constants.O_NOCTTY || 0;
-var O_NOFOLLOW = constants.O_NOFOLLOW || 0;
-var O_RDONLY = constants.O_RDONLY || 0;
-var O_RDWR = constants.O_RDWR || 0;
-var O_SYMLINK = constants.O_SYMLINK || 0;
-var O_SYNC = constants.O_SYNC || 0;
-var O_TRUNC = constants.O_TRUNC || 0;
-var O_WRONLY = constants.O_WRONLY || 0;
-
-var isWindows = process.platform === 'win32';
-
-var DEBUG = process.env.NODE_DEBUG && /fs/.test(process.env.NODE_DEBUG);
-
-function rethrow() {
-  // Only enable in debug mode. A backtrace uses ~1000 bytes of heap space and
-  // is fairly slow to generate.
-  if (DEBUG) {
-    var backtrace = new Error;
-    return function(err) {
-      if (err) {
-        backtrace.message = err.message;
-        err = backtrace;
-        throw err;
-      }
-    };
-  }
-
-  return function(err) {
-    if (err) {
-      throw err;  // Forgot a callback but don't know where? Use NODE_DEBUG=fs
-    }
-  };
-}
-
-function maybeCallback(cb) {
-  return typeof cb === 'function' ? cb : rethrow();
-}
-
-// Ensure that callbacks run in the global context. Only use this function
-// for callbacks that are passed to the binding layer, callbacks that are
-// invoked from JS already run in the proper scope.
-function makeCallback(cb) {
-  if (typeof cb !== 'function') {
-    return rethrow();
-  }
-
-  return function() {
-    return cb.apply(null, arguments);
-  };
-}
-
-function assertEncoding(encoding) {
-  if (encoding && !Buffer.isEncoding(encoding)) {
-    throw new Error('Unknown encoding: ' + encoding);
-  }
-}
-
-function nullCheck(path, callback) {
-  if (('' + path).indexOf('\u0000') !== -1) {
-    var er = new Error('Path must be a string without null bytes.');
-    if (!callback)
-      throw er;
-    process.nextTick(function() {
-      callback(er);
-    });
-    return false;
-  }
-  return true;
-}
-
-fs.Stats = binding.Stats;
-
-fs.Stats.prototype._checkModeProperty = function(property) {
-  return ((this.mode & constants.S_IFMT) === property);
-};
-
-fs.Stats.prototype.isDirectory = function() {
-  return this._checkModeProperty(constants.S_IFDIR);
-};
-
-fs.Stats.prototype.isFile = function() {
-  return this._checkModeProperty(constants.S_IFREG);
-};
-
-fs.Stats.prototype.isBlockDevice = function() {
-  return this._checkModeProperty(constants.S_IFBLK);
-};
-
-fs.Stats.prototype.isCharacterDevice = function() {
-  return this._checkModeProperty(constants.S_IFCHR);
-};
-
-fs.Stats.prototype.isSymbolicLink = function() {
-  return this._checkModeProperty(constants.S_IFLNK);
-};
-
-fs.Stats.prototype.isFIFO = function() {
-  return this._checkModeProperty(constants.S_IFIFO);
-};
-
-fs.Stats.prototype.isSocket = function() {
-  return this._checkModeProperty(constants.S_IFSOCK);
-};
-
-fs.exists = function(path, callback) {
-  if (!nullCheck(path, cb)) return;
-  binding.stat(pathModule._makeLong(path), cb);
-  function cb(err, stats) {
-    if (callback) callback(err ? false : true);
-  }
-};
-
-fs.existsSync = function(path) {
-  try {
-    nullCheck(path);
-    binding.stat(pathModule._makeLong(path));
-    return true;
-  } catch (e) {
-    return false;
-  }
-};
-
-fs.readFile = function(path, encoding_) {
-  var encoding = typeof(encoding_) === 'string' ? encoding_ : null;
-  var callback = maybeCallback(arguments[arguments.length - 1]);
-
-  assertEncoding(encoding);
-
-  // first, stat the file, so we know the size.
-  var size;
-  var buffer; // single buffer with file data
-  var buffers; // list for when size is unknown
-  var pos = 0;
-  var fd;
-
-  fs.open(path, constants.O_RDONLY, 438 /*=0666*/, function(er, fd_) {
-    if (er) return callback(er);
-    fd = fd_;
-
-    fs.fstat(fd, function(er, st) {
-      if (er) return callback(er);
-      size = st.size;
-      if (size === 0) {
-        // the kernel lies about many files.
-        // Go ahead and try to read some bytes.
-        buffers = [];
-        return read();
-      }
-
-      buffer = new Buffer(size);
-      read();
-    });
-  });
-
-  function read() {
-    if (size === 0) {
-      buffer = new Buffer(8192);
-      fs.read(fd, buffer, 0, 8192, -1, afterRead);
-    } else {
-      fs.read(fd, buffer, pos, size - pos, -1, afterRead);
-    }
-  }
-
-  function afterRead(er, bytesRead) {
-    if (er) {
-      return fs.close(fd, function(er2) {
-        return callback(er);
-      });
-    }
-
-    if (bytesRead === 0) {
-      return close();
-    }
-
-    pos += bytesRead;
-    if (size !== 0) {
-      if (pos === size) close();
-      else read();
-    } else {
-      // unknown size, just read until we don't get bytes.
-      buffers.push(buffer.slice(0, bytesRead));
-      read();
-    }
-  }
-
-  function close() {
-    fs.close(fd, function(er) {
-      if (size === 0) {
-        // collected the data into the buffers list.
-        buffer = Buffer.concat(buffers, pos);
-      } else if (pos < size) {
-        buffer = buffer.slice(0, pos);
-      }
-
-      if (encoding) buffer = buffer.toString(encoding);
-      return callback(er, buffer);
-    });
-  }
-};
-
-fs.readFileSync = function(path, encoding) {
-  assertEncoding(encoding);
-
-  var fd = fs.openSync(path, constants.O_RDONLY, 438 /*=0666*/);
-
-  var size;
-  var threw = true;
-  try {
-    size = fs.fstatSync(fd).size;
-    threw = false;
-  } finally {
-    if (threw) fs.closeSync(fd);
-  }
-
-  var pos = 0;
-  var buffer; // single buffer with file data
-  var buffers; // list for when size is unknown
-
-  if (size === 0) {
-    buffers = [];
-  } else {
-    buffer = new Buffer(size);
-  }
-
-  var done = false;
-  while (!done) {
-    var threw = true;
-    try {
-      if (size !== 0) {
-        var bytesRead = fs.readSync(fd, buffer, pos, size - pos);
-      } else {
-        // the kernel lies about many files.
-        // Go ahead and try to read some bytes.
-        buffer = new Buffer(8192);
-        var bytesRead = fs.readSync(fd, buffer, 0, 8192);
-        if (bytesRead) {
-          buffers.push(buffer.slice(0, bytesRead));
-        }
-      }
-      threw = false;
-    } finally {
-      if (threw) fs.closeSync(fd);
-    }
-
-    pos += bytesRead;
-    done = (bytesRead === 0) || (size !== 0 && pos >= size);
-  }
-
-  fs.closeSync(fd);
-
-  if (size === 0) {
-    // data was collected into the buffers list.
-    buffer = Buffer.concat(buffers, pos);
-  } else if (pos < size) {
-    buffer = buffer.slice(0, pos);
-  }
-
-  if (encoding) buffer = buffer.toString(encoding);
-  return buffer;
-};
-
-
-// Used by binding.open and friends
-function stringToFlags(flag) {
-  // Only mess with strings
-  if (typeof flag !== 'string') {
-    return flag;
-  }
-
-  // O_EXCL is mandated by POSIX, Windows supports it too.
-  // Let's add a check anyway, just in case.
-  if (!O_EXCL && ~flag.indexOf('x')) {
-    throw errnoException('ENOSYS', 'fs.open(O_EXCL)');
-  }
-
-  switch (flag) {
-    case 'r' : return O_RDONLY;
-    case 'rs' : return O_RDONLY | O_SYNC;
-    case 'r+' : return O_RDWR;
-    case 'rs+' : return O_RDWR | O_SYNC;
-
-    case 'w' : return O_TRUNC | O_CREAT | O_WRONLY;
-    case 'wx' : // fall through
-    case 'xw' : return O_TRUNC | O_CREAT | O_WRONLY | O_EXCL;
-
-    case 'w+' : return O_TRUNC | O_CREAT | O_RDWR;
-    case 'wx+': // fall through
-    case 'xw+': return O_TRUNC | O_CREAT | O_RDWR | O_EXCL;
-
-    case 'a' : return O_APPEND | O_CREAT | O_WRONLY;
-    case 'ax' : // fall through
-    case 'xa' : return O_APPEND | O_CREAT | O_WRONLY | O_EXCL;
-
-    case 'a+' : return O_APPEND | O_CREAT | O_RDWR;
-    case 'ax+': // fall through
-    case 'xa+': return O_APPEND | O_CREAT | O_RDWR | O_EXCL;
-  }
-
-  throw new Error('Unknown file open flag: ' + flag);
-}
-
-// exported but hidden, only used by test/simple/test-fs-open-flags.js
-Object.defineProperty(exports, '_stringToFlags', {
-  enumerable: false,
-  value: stringToFlags
-});
-
-
-// Yes, the following could easily be DRYed up, but I provide the
-// explicit list to make the arguments clear.
-
-fs.close = function(fd, callback) {
-  binding.close(fd, makeCallback(callback));
-};
-
-fs.closeSync = function(fd) {
-  return binding.close(fd);
-};
-
-function modeNum(m, def) {
-  switch (typeof m) {
-    case 'number': return m;
-    case 'string': return parseInt(m, 8);
-    default:
-      if (def) {
-        return modeNum(def);
-      } else {
-        return undefined;
-      }
-  }
-}
-
-fs.open = function(path, flags, mode, callback) {
-  callback = makeCallback(arguments[arguments.length - 1]);
-  mode = modeNum(mode, 438 /*=0666*/);
-
-  if (!nullCheck(path, callback)) return;
-  binding.open(pathModule._makeLong(path),
-               stringToFlags(flags),
-               mode,
-               callback);
-};
-
-fs.openSync = function(path, flags, mode) {
-  mode = modeNum(mode, 438 /*=0666*/);
-  nullCheck(path);
-  return binding.open(pathModule._makeLong(path), stringToFlags(flags), mode);
-};
-
-fs.read = function(fd, buffer, offset, length, position, callback) {
-  if (!Buffer.isBuffer(buffer)) {
-    // legacy string interface (fd, length, position, encoding, callback)
-    var cb = arguments[4],
-        encoding = arguments[3];
-
-    assertEncoding(encoding);
-
-    position = arguments[2];
-    length = arguments[1];
-    buffer = new Buffer(length);
-    offset = 0;
-
-    callback = function(err, bytesRead) {
-      if (!cb) return;
-
-      var str = (bytesRead > 0) ? buffer.toString(encoding, 0, bytesRead) : '';
-
-      (cb)(err, str, bytesRead);
-    };
-  }
-
-  function wrapper(err, bytesRead) {
-    // Retain a reference to buffer so that it can't be GC'ed too soon.
-    callback && callback(err, bytesRead || 0, buffer);
-  }
-
-  binding.read(fd, buffer, offset, length, position, wrapper);
-};
-
-fs.readSync = function(fd, buffer, offset, length, position) {
-  var legacy = false;
-  if (!Buffer.isBuffer(buffer)) {
-    // legacy string interface (fd, length, position, encoding)
-    legacy = true;
-    var encoding = arguments[3];
-
-    assertEncoding(encoding);
-
-    position = arguments[2];
-    length = arguments[1];
-    buffer = new Buffer(length);
-
-    offset = 0;
-  }
-
-  var r = binding.read(fd, buffer, offset, length, position);
-  if (!legacy) {
-    return r;
-  }
-
-  var str = (r > 0) ? buffer.toString(encoding, 0, r) : '';
-  return [str, r];
-};
-
-fs.write = function(fd, buffer, offset, length, position, callback) {
-  if (!Buffer.isBuffer(buffer)) {
-    // legacy string interface (fd, data, position, encoding, callback)
-    callback = arguments[4];
-    position = arguments[2];
-    assertEncoding(arguments[3]);
-
-    buffer = new Buffer('' + arguments[1], arguments[3]);
-    offset = 0;
-    length = buffer.length;
-  }
-
-  if (!length) {
-    if (typeof callback == 'function') {
-      process.nextTick(function() {
-        callback(undefined, 0);
-      });
-    }
-    return;
-  }
-
-  callback = maybeCallback(callback);
-
-  function wrapper(err, written) {
-    // Retain a reference to buffer so that it can't be GC'ed too soon.
-    callback(err, written || 0, buffer);
-  }
-
-  binding.write(fd, buffer, offset, length, position, wrapper);
-};
-
-fs.writeSync = function(fd, buffer, offset, length, position) {
-  if (!Buffer.isBuffer(buffer)) {
-    // legacy string interface (fd, data, position, encoding)
-    position = arguments[2];
-    assertEncoding(arguments[3]);
-
-    buffer = new Buffer('' + arguments[1], arguments[3]);
-    offset = 0;
-    length = buffer.length;
-  }
-  if (!length) return 0;
-
-  return binding.write(fd, buffer, offset, length, position);
-};
-
-fs.rename = function(oldPath, newPath, callback) {
-  callback = makeCallback(callback);
-  if (!nullCheck(oldPath, callback)) return;
-  if (!nullCheck(newPath, callback)) return;
-  binding.rename(pathModule._makeLong(oldPath),
-                 pathModule._makeLong(newPath),
-                 callback);
-};
-
-fs.renameSync = function(oldPath, newPath) {
-  nullCheck(oldPath);
-  nullCheck(newPath);
-  return binding.rename(pathModule._makeLong(oldPath),
-                        pathModule._makeLong(newPath));
-};
-
-fs.truncate = function(path, len, callback) {
-  if (typeof path === 'number') {
-    // legacy
-    return fs.ftruncate(path, len, callback);
-  }
-  if (typeof len === 'function') {
-    callback = len;
-    len = 0;
-  } else if (typeof len === 'undefined') {
-    len = 0;
-  }
-  callback = maybeCallback(callback);
-  fs.open(path, 'w', function(er, fd) {
-    if (er) return callback(er);
-    binding.ftruncate(fd, len, function(er) {
-      fs.close(fd, function(er2) {
-        callback(er || er2);
-      });
-    });
-  });
-};
-
-fs.truncateSync = function(path, len) {
-  if (typeof path === 'number') {
-    // legacy
-    return fs.ftruncateSync(path, len);
-  }
-  if (typeof len === 'undefined') {
-    len = 0;
-  }
-  // allow error to be thrown, but still close fd.
-  var fd = fs.openSync(path, 'w');
-  try {
-    var ret = fs.ftruncateSync(fd, len);
-  } finally {
-    fs.closeSync(fd);
-  }
-  return ret;
-};
-
-fs.ftruncate = function(fd, len, callback) {
-  if (typeof len === 'function') {
-    callback = len;
-    len = 0;
-  } else if (typeof len === 'undefined') {
-    len = 0;
-  }
-  binding.ftruncate(fd, len, makeCallback(callback));
-};
-
-fs.ftruncateSync = function(fd, len) {
-  if (typeof len === 'undefined') {
-    len = 0;
-  }
-  return binding.ftruncate(fd, len);
-};
-
-fs.rmdir = function(path, callback) {
-  callback = makeCallback(callback);
-  if (!nullCheck(path, callback)) return;
-  binding.rmdir(pathModule._makeLong(path), callback);
-};
-
-fs.rmdirSync = function(path) {
-  nullCheck(path);
-  return binding.rmdir(pathModule._makeLong(path));
-};
-
-fs.fdatasync = function(fd, callback) {
-  binding.fdatasync(fd, makeCallback(callback));
-};
-
-fs.fdatasyncSync = function(fd) {
-  return binding.fdatasync(fd);
-};
-
-fs.fsync = function(fd, callback) {
-  binding.fsync(fd, makeCallback(callback));
-};
-
-fs.fsyncSync = function(fd) {
-  return binding.fsync(fd);
-};
-
-fs.mkdir = function(path, mode, callback) {
-  if (typeof mode === 'function') callback = mode;
-  callback = makeCallback(callback);
-  if (!nullCheck(path, callback)) return;
-  binding.mkdir(pathModule._makeLong(path),
-                modeNum(mode, 511 /*=0777*/),
-                callback);
-};
-
-fs.mkdirSync = function(path, mode) {
-  nullCheck(path);
-  return binding.mkdir(pathModule._makeLong(path),
-                       modeNum(mode, 511 /*=0777*/));
-};
-
-fs.sendfile = function(outFd, inFd, inOffset, length, callback) {
-  binding.sendfile(outFd, inFd, inOffset, length, makeCallback(callback));
-};
-
-fs.sendfileSync = function(outFd, inFd, inOffset, length) {
-  return binding.sendfile(outFd, inFd, inOffset, length);
-};
-
-fs.readdir = function(path, callback) {
-  callback = makeCallback(callback);
-  if (!nullCheck(path, callback)) return;
-  binding.readdir(pathModule._makeLong(path), callback);
-};
-
-fs.readdirSync = function(path) {
-  nullCheck(path);
-  return binding.readdir(pathModule._makeLong(path));
-};
-
-fs.fstat = function(fd, callback) {
-  binding.fstat(fd, makeCallback(callback));
-};
-
-fs.lstat = function(path, callback) {
-  callback = makeCallback(callback);
-  if (!nullCheck(path, callback)) return;
-  binding.lstat(pathModule._makeLong(path), callback);
-};
-
-fs.stat = function(path, callback) {
-  callback = makeCallback(callback);
-  if (!nullCheck(path, callback)) return;
-  binding.stat(pathModule._makeLong(path), callback);
-};
-
-fs.fstatSync = function(fd) {
-  return binding.fstat(fd);
-};
-
-fs.lstatSync = function(path) {
-  nullCheck(path);
-  return binding.lstat(pathModule._makeLong(path));
-};
-
-fs.statSync = function(path) {
-  nullCheck(path);
-  return binding.stat(pathModule._makeLong(path));
-};
-
-fs.readlink = function(path, callback) {
-  callback = makeCallback(callback);
-  if (!nullCheck(path, callback)) return;
-  binding.readlink(pathModule._makeLong(path), callback);
-};
-
-fs.readlinkSync = function(path) {
-  nullCheck(path);
-  return binding.readlink(pathModule._makeLong(path));
-};
-
-function preprocessSymlinkDestination(path, type) {
-  if (!isWindows) {
-    // No preprocessing is needed on Unix.
-    return path;
-  } else if (type === 'junction') {
-    // Junction paths need to be absolute and \\?\-prefixed.
-    return pathModule._makeLong(path);
-  } else {
-    // Windows symlinks don't tolerate forward slashes.
-    return ('' + path).replace(/\//g, '\\');
-  }
-}
-
-fs.symlink = function(destination, path, type_, callback) {
-  var type = (typeof type_ === 'string' ? type_ : null);
-  var callback = makeCallback(arguments[arguments.length - 1]);
-
-  if (!nullCheck(destination, callback)) return;
-  if (!nullCheck(path, callback)) return;
-
-  binding.symlink(preprocessSymlinkDestination(destination, type),
-                  pathModule._makeLong(path),
-                  type,
-                  callback);
-};
-
-fs.symlinkSync = function(destination, path, type) {
-  type = (typeof type === 'string' ? type : null);
-
-  nullCheck(destination);
-  nullCheck(path);
-
-  return binding.symlink(preprocessSymlinkDestination(destination, type),
-                         pathModule._makeLong(path),
-                         type);
-};
-
-fs.link = function(srcpath, dstpath, callback) {
-  callback = makeCallback(callback);
-  if (!nullCheck(srcpath, callback)) return;
-  if (!nullCheck(dstpath, callback)) return;
-
-  binding.link(pathModule._makeLong(srcpath),
-               pathModule._makeLong(dstpath),
-               callback);
-};
-
-fs.linkSync = function(srcpath, dstpath) {
-  nullCheck(srcpath);
-  nullCheck(dstpath);
-  return binding.link(pathModule._makeLong(srcpath),
-                      pathModule._makeLong(dstpath));
-};
-
-fs.unlink = function(path, callback) {
-  callback = makeCallback(callback);
-  if (!nullCheck(path, callback)) return;
-  binding.unlink(pathModule._makeLong(path), callback);
-};
-
-fs.unlinkSync = function(path) {
-  nullCheck(path);
-  return binding.unlink(pathModule._makeLong(path));
-};
-
-fs.fchmod = function(fd, mode, callback) {
-  binding.fchmod(fd, modeNum(mode), makeCallback(callback));
-};
-
-fs.fchmodSync = function(fd, mode) {
-  return binding.fchmod(fd, modeNum(mode));
-};
-
-if (constants.hasOwnProperty('O_SYMLINK')) {
-  fs.lchmod = function(path, mode, callback) {
-    callback = maybeCallback(callback);
-    fs.open(path, constants.O_WRONLY | constants.O_SYMLINK, function(err, fd) {
-      if (err) {
-        callback(err);
-        return;
-      }
-      // prefer to return the chmod error, if one occurs,
-      // but still try to close, and report closing errors if they occur.
-      fs.fchmod(fd, mode, function(err) {
-        fs.close(fd, function(err2) {
-          callback(err || err2);
-        });
-      });
-    });
-  };
-
-  fs.lchmodSync = function(path, mode) {
-    var fd = fs.openSync(path, constants.O_WRONLY | constants.O_SYMLINK);
-
-    // prefer to return the chmod error, if one occurs,
-    // but still try to close, and report closing errors if they occur.
-    var err, err2;
-    try {
-      var ret = fs.fchmodSync(fd, mode);
-    } catch (er) {
-      err = er;
-    }
-    try {
-      fs.closeSync(fd);
-    } catch (er) {
-      err2 = er;
-    }
-    if (err || err2) throw (err || err2);
-    return ret;
-  };
-}
-
-
-fs.chmod = function(path, mode, callback) {
-  callback = makeCallback(callback);
-  if (!nullCheck(path, callback)) return;
-  binding.chmod(pathModule._makeLong(path),
-                modeNum(mode),
-                callback);
-};
-
-fs.chmodSync = function(path, mode) {
-  nullCheck(path);
-  return binding.chmod(pathModule._makeLong(path), modeNum(mode));
-};
-
-if (constants.hasOwnProperty('O_SYMLINK')) {
-  fs.lchown = function(path, uid, gid, callback) {
-    callback = maybeCallback(callback);
-    fs.open(path, constants.O_WRONLY | constants.O_SYMLINK, function(err, fd) {
-      if (err) {
-        callback(err);
-        return;
-      }
-      fs.fchown(fd, uid, gid, callback);
-    });
-  };
-
-  fs.lchownSync = function(path, uid, gid) {
-    var fd = fs.openSync(path, constants.O_WRONLY | constants.O_SYMLINK);
-    return fs.fchownSync(fd, uid, gid);
-  };
-}
-
-fs.fchown = function(fd, uid, gid, callback) {
-  binding.fchown(fd, uid, gid, makeCallback(callback));
-};
-
-fs.fchownSync = function(fd, uid, gid) {
-  return binding.fchown(fd, uid, gid);
-};
-
-fs.chown = function(path, uid, gid, callback) {
-  callback = makeCallback(callback);
-  if (!nullCheck(path, callback)) return;
-  binding.chown(pathModule._makeLong(path), uid, gid, callback);
-};
-
-fs.chownSync = function(path, uid, gid) {
-  nullCheck(path);
-  return binding.chown(pathModule._makeLong(path), uid, gid);
-};
-
-// converts Date or number to a fractional UNIX timestamp
-function toUnixTimestamp(time) {
-  if (typeof time == 'number') {
-    return time;
-  }
-  if (time instanceof Date) {
-    // convert to 123.456 UNIX timestamp
-    return time.getTime() / 1000;
-  }
-  throw new Error('Cannot parse time: ' + time);
-}
-
-// exported for unit tests, not for public consumption
-fs._toUnixTimestamp = toUnixTimestamp;
-
-fs.utimes = function(path, atime, mtime, callback) {
-  callback = makeCallback(callback);
-  if (!nullCheck(path, callback)) return;
-  binding.utimes(pathModule._makeLong(path),
-                 toUnixTimestamp(atime),
-                 toUnixTimestamp(mtime),
-                 callback);
-};
-
-fs.utimesSync = function(path, atime, mtime) {
-  nullCheck(path);
-  atime = toUnixTimestamp(atime);
-  mtime = toUnixTimestamp(mtime);
-  binding.utimes(pathModule._makeLong(path), atime, mtime);
-};
-
-fs.futimes = function(fd, atime, mtime, callback) {
-  atime = toUnixTimestamp(atime);
-  mtime = toUnixTimestamp(mtime);
-  binding.futimes(fd, atime, mtime, makeCallback(callback));
-};
-
-fs.futimesSync = function(fd, atime, mtime) {
-  atime = toUnixTimestamp(atime);
-  mtime = toUnixTimestamp(mtime);
-  binding.futimes(fd, atime, mtime);
-};
-
-function writeAll(fd, buffer, offset, length, position, callback) {
-  callback = maybeCallback(arguments[arguments.length - 1]);
-
-  // write(fd, buffer, offset, length, position, callback)
-  fs.write(fd, buffer, offset, length, position, function(writeErr, written) {
-    if (writeErr) {
-      fs.close(fd, function() {
-        if (callback) callback(writeErr);
-      });
-    } else {
-      if (written === length) {
-        fs.close(fd, callback);
-      } else {
-        offset += written;
-        length -= written;
-        position += written;
-        writeAll(fd, buffer, offset, length, position, callback);
-      }
-    }
-  });
-}
-
-fs.writeFile = function(path, data, encoding_, callback) {
-  var encoding = (typeof(encoding_) == 'string' ? encoding_ : 'utf8');
-  assertEncoding(encoding);
-
-  callback = maybeCallback(arguments[arguments.length - 1]);
-  fs.open(path, 'w', 438 /*=0666*/, function(openErr, fd) {
-    if (openErr) {
-      if (callback) callback(openErr);
-    } else {
-      var buffer = Buffer.isBuffer(data) ? data : new Buffer('' + data,
-          encoding);
-      writeAll(fd, buffer, 0, buffer.length, 0, callback);
-    }
-  });
-};
-
-fs.writeFileSync = function(path, data, encoding) {
-  assertEncoding(encoding);
-
-  var fd = fs.openSync(path, 'w');
-  if (!Buffer.isBuffer(data)) {
-    data = new Buffer('' + data, encoding || 'utf8');
-  }
-  var written = 0;
-  var length = data.length;
-  try {
-    while (written < length) {
-      written += fs.writeSync(fd, data, written, length - written, written);
-    }
-  } finally {
-    fs.closeSync(fd);
-  }
-};
-
-fs.appendFile = function(path, data, encoding_, callback) {
-  var encoding = (typeof(encoding_) == 'string' ? encoding_ : 'utf8');
-  assertEncoding(encoding);
-
-  callback = maybeCallback(arguments[arguments.length - 1]);
-
-  fs.open(path, 'a', 438 /*=0666*/, function(err, fd) {
-    if (err) return callback(err);
-    var buffer = Buffer.isBuffer(data) ? data : new Buffer('' + data, encoding);
-    writeAll(fd, buffer, 0, buffer.length, null, callback);
-  });
-};
-
-fs.appendFileSync = function(path, data, encoding) {
-  assertEncoding(encoding);
-
-  var fd = fs.openSync(path, 'a');
-  if (!Buffer.isBuffer(data)) {
-    data = new Buffer('' + data, encoding || 'utf8');
-  }
-  var written = 0;
-  var position = null;
-  var length = data.length;
-
-  try {
-    while (written < length) {
-      written += fs.writeSync(fd, data, written, length - written, position);
-      position += written; // XXX not safe with multiple concurrent writers?
-    }
-  } finally {
-    fs.closeSync(fd);
-  }
-};
-
-function errnoException(errorno, syscall) {
-  // TODO make this more compatible with ErrnoException from src/node.cc
-  // Once all of Node is using this function the ErrnoException from
-  // src/node.cc should be removed.
-  var e = new Error(syscall + ' ' + errorno);
-  e.errno = e.code = errorno;
-  e.syscall = syscall;
-  return e;
-}
-
-
-function FSWatcher() {
-  EventEmitter.call(this);
-
-  var self = this;
-  var FSEvent = process.binding('fs_event_wrap').FSEvent;
-  this._handle = new FSEvent();
-  this._handle.owner = this;
-
-  this._handle.onchange = function(status, event, filename) {
-    if (status) {
-      self._handle.close();
-      self.emit('error', errnoException(errno, 'watch'));
-    } else {
-      self.emit('change', event, filename);
-    }
-  };
-}
-util.inherits(FSWatcher, EventEmitter);
-
-FSWatcher.prototype.start = function(filename, persistent) {
-  nullCheck(filename);
-  var r = this._handle.start(pathModule._makeLong(filename), persistent);
-
-  if (r) {
-    this._handle.close();
-    throw errnoException(errno, 'watch');
-  }
-};
-
-FSWatcher.prototype.close = function() {
-  this._handle.close();
-};
-
-fs.watch = function(filename) {
-  nullCheck(filename);
-  var watcher;
-  var options;
-  var listener;
-
-  if ('object' == typeof arguments[1]) {
-    options = arguments[1];
-    listener = arguments[2];
-  } else {
-    options = {};
-    listener = arguments[1];
-  }
-
-  if (options.persistent === undefined) options.persistent = true;
-
-  watcher = new FSWatcher();
-  watcher.start(filename, options.persistent);
-
-  if (listener) {
-    watcher.addListener('change', listener);
-  }
-
-  return watcher;
-};
-
-
-// Stat Change Watchers
-
-function StatWatcher() {
-  EventEmitter.call(this);
-
-  var self = this;
-  this._handle = new binding.StatWatcher();
-
-  // uv_fs_poll is a little more powerful than ev_stat but we curb it for
-  // the sake of backwards compatibility
-  var oldStatus = -1;
-
-  this._handle.onchange = function(current, previous, newStatus) {
-    if (oldStatus === -1 &&
-        newStatus === -1 &&
-        current.nlink === previous.nlink) return;
-
-    oldStatus = newStatus;
-    self.emit('change', current, previous);
-  };
-
-  this._handle.onstop = function() {
-    self.emit('stop');
-  };
-}
-util.inherits(StatWatcher, EventEmitter);
-
-
-StatWatcher.prototype.start = function(filename, persistent, interval) {
-  nullCheck(filename);
-  this._handle.start(pathModule._makeLong(filename), persistent, interval);
-};
-
-
-StatWatcher.prototype.stop = function() {
-  this._handle.stop();
-};
-
-
-var statWatchers = {};
-function inStatWatchers(filename) {
-  return Object.prototype.hasOwnProperty.call(statWatchers, filename) &&
-      statWatchers[filename];
-}
-
-
-fs.watchFile = function(filename) {
-  nullCheck(filename);
-  var stat;
-  var listener;
-
-  var options = {
-    // Poll interval in milliseconds. 5007 is what libev used to use. It's
-    // a little on the slow side but let's stick with it for now to keep
-    // behavioral changes to a minimum.
-    interval: 5007,
-    persistent: true
-  };
-
-  if ('object' == typeof arguments[1]) {
-    options = util._extend(options, arguments[1]);
-    listener = arguments[2];
-  } else {
-    listener = arguments[1];
-  }
-
-  if (!listener) {
-    throw new Error('watchFile requires a listener function');
-  }
-
-  if (inStatWatchers(filename)) {
-    stat = statWatchers[filename];
-  } else {
-    stat = statWatchers[filename] = new StatWatcher();
-    stat.start(filename, options.persistent, options.interval);
-  }
-  stat.addListener('change', listener);
-  return stat;
-};
-
-fs.unwatchFile = function(filename, listener) {
-  nullCheck(filename);
-  if (!inStatWatchers(filename)) return;
-
-  var stat = statWatchers[filename];
-
-  if (typeof listener === 'function') {
-    stat.removeListener('change', listener);
-  } else {
-    stat.removeAllListeners('change');
-  }
-
-  if (stat.listeners('change').length === 0) {
-    stat.stop();
-    statWatchers[filename] = undefined;
-  }
-};
-
-// Realpath
-// Not using realpath(2) because it's bad.
-// See: http://insanecoding.blogspot.com/2007/11/pathmax-simply-isnt.html
-
-var normalize = pathModule.normalize;
-
-// Regexp that finds the next portion of a (partial) path
-// result is [base_with_slash, base], e.g. ['somedir/', 'somedir']
-if (isWindows) {
-  var nextPartRe = /(.*?)(?:[\/\\]+|$)/g;
-} else {
-  var nextPartRe = /(.*?)(?:[\/]+|$)/g;
-}
-
-// Regex to find the device root, including trailing slash. E.g. 'c:\\'.
-if (isWindows) {
-  var splitRootRe = /^(?:[a-zA-Z]:|[\\\/]{2}[^\\\/]+[\\\/][^\\\/]+)?[\\\/]*/;
-} else {
-  var splitRootRe = /^[\/]*/;
-}
-
-fs.realpathSync = function realpathSync(p, cache) {
-  // make p absolute
-  p = pathModule.resolve(p);
-
-  if (cache && Object.prototype.hasOwnProperty.call(cache, p)) {
-    return cache[p];
-  }
-
-  var original = p,
-      seenLinks = {},
-      knownHard = {};
-
-  // current character position in p
-  var pos;
-  // the partial path so far, including a trailing slash if any
-  var current;
-  // the partial path without a trailing slash (except when pointing at a root)
-  var base;
-  // the partial path scanned in the previous round, with slash
-  var previous;
-
-  start();
-
-  function start() {
-    // Skip over roots
-    var m = splitRootRe.exec(p);
-    pos = m[0].length;
-    current = m[0];
-    base = m[0];
-    previous = '';
-
-    // On Windows, check that the root exists. On Unix there is no need.
-    if (isWindows && !knownHard[base]) {
-      fs.lstatSync(base);
-      knownHard[base] = true;
-    }
-  }
-
-  // walk down the path, swapping out linked pathparts for their real
-  // values
-  // NB: p.length changes.
-  while (pos < p.length) {
-    // find the next part
-    nextPartRe.lastIndex = pos;
-    var result = nextPartRe.exec(p);
-    previous = current;
-    current += result[0];
-    base = previous + result[1];
-    pos = nextPartRe.lastIndex;
-
-    // continue if not a symlink
-    if (knownHard[base] || (cache && cache[base] === base)) {
-      continue;
-    }
-
-    var resolvedLink;
-    if (cache && Object.prototype.hasOwnProperty.call(cache, base)) {
-      // some known symbolic link.  no need to stat again.
-      resolvedLink = cache[base];
-    } else {
-      var stat = fs.lstatSync(base);
-      if (!stat.isSymbolicLink()) {
-        knownHard[base] = true;
-        if (cache) cache[base] = base;
-        continue;
-      }
-
-      // read the link if it wasn't read before
-      // dev/ino always return 0 on windows, so skip the check.
-      var linkTarget = null;
-      if (!isWindows) {
-        var id = stat.dev.toString(32) + ':' + stat.ino.toString(32);
-        if (seenLinks.hasOwnProperty(id)) {
-          linkTarget = seenLinks[id];
-        }
-      }
-      if (linkTarget === null) {
-        fs.statSync(base);
-        linkTarget = fs.readlinkSync(base);
-      }
-      resolvedLink = pathModule.resolve(previous, linkTarget);
-      // track this, if given a cache.
-      if (cache) cache[base] = resolvedLink;
-      if (!isWindows) seenLinks[id] = linkTarget;
-    }
-
-    // resolve the link, then start over
-    p = pathModule.resolve(resolvedLink, p.slice(pos));
-    start();
-  }
-
-  if (cache) cache[original] = p;
-
-  return p;
-};
-
-
-fs.realpath = function realpath(p, cache, cb) {
-  if (typeof cb !== 'function') {
-    cb = maybeCallback(cache);
-    cache = null;
-  }
-
-  // make p absolute
-  p = pathModule.resolve(p);
-
-  if (cache && Object.prototype.hasOwnProperty.call(cache, p)) {
-    return process.nextTick(cb.bind(null, null, cache[p]));
-  }
-
-  var original = p,
-      seenLinks = {},
-      knownHard = {};
-
-  // current character position in p
-  var pos;
-  // the partial path so far, including a trailing slash if any
-  var current;
-  // the partial path without a trailing slash (except when pointing at a root)
-  var base;
-  // the partial path scanned in the previous round, with slash
-  var previous;
-
-  start();
-
-  function start() {
-    // Skip over roots
-    var m = splitRootRe.exec(p);
-    pos = m[0].length;
-    current = m[0];
-    base = m[0];
-    previous = '';
-
-    // On Windows, check that the root exists. On Unix there is no need.
-    if (isWindows && !knownHard[base]) {
-      fs.lstat(base, function(err) {
-        if (err) return cb(err);
-        knownHard[base] = true;
-        LOOP();
-      });
-    } else {
-      process.nextTick(LOOP);
-    }
-  }
-
-  // walk down the path, swapping out linked pathparts for their real
-  // values
-  function LOOP() {
-    // stop if scanned past end of path
-    if (pos >= p.length) {
-      if (cache) cache[original] = p;
-      return cb(null, p);
-    }
-
-    // find the next part
-    nextPartRe.lastIndex = pos;
-    var result = nextPartRe.exec(p);
-    previous = current;
-    current += result[0];
-    base = previous + result[1];
-    pos = nextPartRe.lastIndex;
-
-    // continue if not a symlink
-    if (knownHard[base] || (cache && cache[base] === base)) {
-      return process.nextTick(LOOP);
-    }
-
-    if (cache && Object.prototype.hasOwnProperty.call(cache, base)) {
-      // known symbolic link.  no need to stat again.
-      return gotResolvedLink(cache[base]);
-    }
-
-    return fs.lstat(base, gotStat);
-  }
-
-  function gotStat(err, stat) {
-    if (err) return cb(err);
-
-    // if not a symlink, skip to the next path part
-    if (!stat.isSymbolicLink()) {
-      knownHard[base] = true;
-      if (cache) cache[base] = base;
-      return process.nextTick(LOOP);
-    }
-
-    // stat & read the link if not read before
-    // call gotTarget as soon as the link target is known
-    // dev/ino always return 0 on windows, so skip the check.
-    if (!isWindows) {
-      var id = stat.dev.toString(32) + ':' + stat.ino.toString(32);
-      if (seenLinks.hasOwnProperty(id)) {
-        return gotTarget(null, seenLinks[id], base);
-      }
-    }
-    fs.stat(base, function(err) {
-      if (err) return cb(err);
-
-      fs.readlink(base, function(err, target) {
-        if (!isWindows) seenLinks[id] = target;
-        gotTarget(err, target);
-      });
-    });
-  }
-
-  function gotTarget(err, target, base) {
-    if (err) return cb(err);
-
-    var resolvedLink = pathModule.resolve(previous, target);
-    if (cache) cache[base] = resolvedLink;
-    gotResolvedLink(resolvedLink);
-  }
-
-  function gotResolvedLink(resolvedLink) {
-    // resolve the link, then start over
-    p = pathModule.resolve(resolvedLink, p.slice(pos));
-    start();
-  }
-};
-
-
-
-var pool;
-
-function allocNewPool() {
-  pool = new Buffer(kPoolSize);
-  pool.used = 0;
-}
-
-
-
-fs.createReadStream = function(path, options) {
-  return new ReadStream(path, options);
-};
-
-util.inherits(ReadStream, Readable);
-fs.ReadStream = ReadStream;
-
-function ReadStream(path, options) {
-  if (!(this instanceof ReadStream))
-    return new ReadStream(path, options);
-
-  // a little bit bigger buffer and water marks by default
-  options = util._extend({
-    bufferSize: 64 * 1024,
-    lowWaterMark: 16 * 1024,
-    highWaterMark: 64 * 1024
-  }, options || {});
-
-  Readable.call(this, options);
-
-  this.path = path;
-  this.fd = options.hasOwnProperty('fd') ? options.fd : null;
-  this.flags = options.hasOwnProperty('flags') ? options.flags : 'r';
-  this.mode = options.hasOwnProperty('mode') ? options.mode : 438; /*=0666*/
-
-  this.start = options.hasOwnProperty('start') ? options.start : undefined;
-  this.end = options.hasOwnProperty('end') ? options.end : undefined;
-  this.pos = undefined;
-
-  if (this.start !== undefined) {
-    if ('number' !== typeof this.start) {
-      throw TypeError('start must be a Number');
-    }
-    if (this.end === undefined) {
-      this.end = Infinity;
-    } else if ('number' !== typeof this.end) {
-      throw TypeError('end must be a Number');
-    }
-
-    if (this.start > this.end) {
-      throw new Error('start must be <= end');
-    }
-
-    this.pos = this.start;
-  }
-
-  if (typeof this.fd !== 'number')
-    this.open();
-
-  this.on('end', function() {
-    this.destroy();
-  });
-}
-
-fs.FileReadStream = fs.ReadStream; // support the legacy name
-
-ReadStream.prototype.open = function() {
-  var self = this;
-  fs.open(this.path, this.flags, this.mode, function(er, fd) {
-    if (er) {
-      self.destroy();
-      self.emit('error', er);
-      return;
-    }
-
-    self.fd = fd;
-    self.emit('open', fd);
-    // start the flow of data.
-    self.read();
-  });
-};
-
-ReadStream.prototype._read = function(n, cb) {
-  if (typeof this.fd !== 'number')
-    return this.once('open', function() {
-      this._read(n, cb);
-    });
-
-  if (this.destroyed)
-    return;
-
-  if (!pool || pool.length - pool.used < kMinPoolSpace) {
-    // discard the old pool. Can't add to the free list because
-    // users might have references to slices on it.
-    pool = null;
-    allocNewPool();
-  }
-
-  // Grab another reference to the pool in the case that while we're
-  // in the thread pool another read() finishes up the pool, and
-  // allocates a new one.
-  var thisPool = pool;
-  var toRead = Math.min(pool.length - pool.used, n);
-  var start = pool.used;
-
-  if (this.pos !== undefined)
-    toRead = Math.min(this.end - this.pos + 1, toRead);
-
-  // already read everything we were supposed to read!
-  // treat as EOF.
-  if (toRead <= 0)
-    return cb();
-
-  // the actual read.
-  var self = this;
-  fs.read(this.fd, pool, pool.used, toRead, this.pos, onread);
-
-  // move the pool positions, and internal position for reading.
-  if (this.pos !== undefined)
-    this.pos += toRead;
-  pool.used += toRead;
-
-  function onread(er, bytesRead) {
-    if (er) {
-      self.destroy();
-      return cb(er);
-    }
-
-    var b = null;
-    if (bytesRead > 0)
-      b = thisPool.slice(start, start + bytesRead);
-
-    cb(null, b);
-  }
-};
-
-
-ReadStream.prototype.destroy = function() {
-  if (this.destroyed)
-    return;
-  this.destroyed = true;
-  if ('number' === typeof this.fd)
-    this.close();
-};
-
-
-ReadStream.prototype.close = function(cb) {
-  if (cb)
-    this.once('close', cb);
-  if (this.closed || 'number' !== typeof this.fd) {
-    if ('number' !== typeof this.fd)
-      this.once('open', close);
-    return process.nextTick(this.emit.bind(this, 'close'));
-  }
-  this.closed = true;
-  var self = this;
-  close();
-
-  function close() {
-    fs.close(self.fd, function(er) {
-      if (er)
-        self.emit('error', er);
-      else
-        self.emit('close');
-    });
-  }
-};
-
-
-
-
-fs.createWriteStream = function(path, options) {
-  return new WriteStream(path, options);
-};
-
-util.inherits(WriteStream, Writable);
-fs.WriteStream = WriteStream;
-function WriteStream(path, options) {
-  if (!(this instanceof WriteStream))
-    return new WriteStream(path, options);
-
-  // a little bit bigger buffer and water marks by default
-  options = util._extend({
-    bufferSize: 64 * 1024,
-    lowWaterMark: 16 * 1024,
-    highWaterMark: 64 * 1024
-  }, options || {});
-
-  Writable.call(this, options);
-
-  this.path = path;
-  this.fd = null;
-
-  this.fd = options.hasOwnProperty('fd') ? options.fd : null;
-  this.flags = options.hasOwnProperty('flags') ? options.flags : 'w';
-  this.mode = options.hasOwnProperty('mode') ? options.mode : 438; /*=0666*/
-
-  this.start = options.hasOwnProperty('start') ? options.start : undefined;
-  this.pos = undefined;
-  this.bytesWritten = 0;
-
-  if (this.start !== undefined) {
-    if ('number' !== typeof this.start) {
-      throw TypeError('start must be a Number');
-    }
-    if (this.start < 0) {
-      throw new Error('start must be >= zero');
-    }
-
-    this.pos = this.start;
-  }
-
-  if ('number' !== typeof this.fd)
-    this.open();
-
-  // dispose on finish.
-  this.once('finish', this.close);
-}
-
-fs.FileWriteStream = fs.WriteStream; // support the legacy name
-
-
-WriteStream.prototype.open = function() {
-  fs.open(this.path, this.flags, this.mode, function(er, fd) {
-    if (er) {
-      this.destroy();
-      this.emit('error', er);
-      return;
-    }
-
-    this.fd = fd;
-    this.emit('open', fd);
-  }.bind(this));
-};
-
-
-WriteStream.prototype._write = function(data, cb) {
-  if (!Buffer.isBuffer(data))
-    return this.emit('error', new Error('Invalid data'));
-
-  if (typeof this.fd !== 'number')
-    return this.once('open', this._write.bind(this, data, cb));
-
-  fs.write(this.fd, data, 0, data.length, this.pos, function(er, bytes) {
-    if (er) {
-      this.destroy();
-      return cb(er);
-    }
-    this.bytesWritten += bytes;
-    cb();
-  }.bind(this));
-
-  if (this.pos !== undefined)
-    this.pos += data.length;
-};
-
-
-WriteStream.prototype.destroy = ReadStream.prototype.destroy;
-WriteStream.prototype.close = ReadStream.prototype.close;
-
-// There is no shutdown() for files.
-WriteStream.prototype.destroySoon = WriteStream.prototype.end;
-
-
-// SyncWriteStream is internal. DO NOT USE.
-// Temporary hack for process.stdout and process.stderr when piped to files.
-function SyncWriteStream(fd) {
-  Stream.call(this);
-
-  this.fd = fd;
-  this.writable = true;
-  this.readable = false;
-}
-
-util.inherits(SyncWriteStream, Stream);
-
-
-// Export
-fs.SyncWriteStream = SyncWriteStream;
-
-
-SyncWriteStream.prototype.write = function(data, arg1, arg2) {
-  var encoding, cb;
-
-  // parse arguments
-  if (arg1) {
-    if (typeof arg1 === 'string') {
-      encoding = arg1;
-      cb = arg2;
-    } else if (typeof arg1 === 'function') {
-      cb = arg1;
-    } else {
-      throw new Error('bad arg');
-    }
-  }
-  assertEncoding(encoding);
-
-  // Change strings to buffers. SLOW
-  if (typeof data == 'string') {
-    data = new Buffer(data, encoding);
-  }
-
-  fs.writeSync(this.fd, data, 0, data.length);
-
-  if (cb) {
-    process.nextTick(cb);
-  }
-
-  return true;
-};
-
-
-SyncWriteStream.prototype.end = function(data, arg1, arg2) {
-  if (data) {
-    this.write(data, arg1, arg2);
-  }
-  this.destroy();
-};
-
-
-SyncWriteStream.prototype.destroy = function() {
-  fs.closeSync(this.fd);
-  this.fd = null;
-  this.emit('close');
-  return true;
-};
-
-SyncWriteStream.prototype.destroySoon = SyncWriteStream.prototype.destroy;
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_duplex.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,69 +0,0 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-
-// a duplex stream is just a stream that is both readable and writable.
-// Since JS doesn't have multiple prototypal inheritance, this class
-// prototypally inherits from Readable, and then parasitically from
-// Writable.
-
-module.exports = Duplex;
-var util = require('util');
-var Readable = require('./_stream_readable');
-var Writable = require('./_stream_writable');
-
-util.inherits(Duplex, Readable);
-
-Object.keys(Writable.prototype).forEach(function(method) {
-  if (!Duplex.prototype[method])
-    Duplex.prototype[method] = Writable.prototype[method];
-});
-
-function Duplex(options) {
-  if (!(this instanceof Duplex))
-    return new Duplex(options);
-
-  Readable.call(this, options);
-  Writable.call(this, options);
-
-  if (options && options.readable === false)
-    this.readable = false;
-
-  if (options && options.writable === false)
-    this.writable = false;
-
-  this.allowHalfOpen = true;
-  if (options && options.allowHalfOpen === false)
-    this.allowHalfOpen = false;
-
-  this.once('end', onend);
-}
-
-// the no-half-open enforcer
-function onend() {
-  // if we allow half-open state, or if the writable side ended,
-  // then we're ok.
-  if (this.allowHalfOpen || this._writableState.ended)
-    return;
-
-  // no more data can be written.
-  // But allow more writes to happen in this tick.
-  process.nextTick(this.end.bind(this));
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_passthrough.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,41 +0,0 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-
-// a passthrough stream.
-// basically just the most minimal sort of Transform stream.
-// Every written chunk gets output as-is.
-
-module.exports = PassThrough;
-
-var Transform = require('./_stream_transform');
-var util = require('util');
-util.inherits(PassThrough, Transform);
-
-function PassThrough(options) {
-  if (!(this instanceof PassThrough))
-    return new PassThrough(options);
-
-  Transform.call(this, options);
-}
-
-PassThrough.prototype._transform = function(chunk, encoding, cb) {
-  cb(null, chunk);
-};
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_readable.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,927 +0,0 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-
-module.exports = Readable;
-Readable.ReadableState = ReadableState;
-
-var EE = require('events').EventEmitter;
-if (!EE.listenerCount) EE.listenerCount = function(emitter, type) {
-  return emitter.listeners(type).length;
-};
-var Stream = require('stream');
-var util = require('util');
-var StringDecoder;
-
-util.inherits(Readable, Stream);
-
-function ReadableState(options, stream) {
-  options = options || {};
-
-  // the point at which it stops calling _read() to fill the buffer
-  // Note: 0 is a valid value, means "don't call _read preemptively ever"
-  var hwm = options.highWaterMark;
-  this.highWaterMark = (hwm || hwm === 0) ? hwm : 16 * 1024;
-
-  // cast to ints.
-  this.highWaterMark = ~~this.highWaterMark;
-
-  this.buffer = [];
-  this.length = 0;
-  this.pipes = null;
-  this.pipesCount = 0;
-  this.flowing = false;
-  this.ended = false;
-  this.endEmitted = false;
-  this.reading = false;
-
-  // In streams that never have any data, and do push(null) right away,
-  // the consumer can miss the 'end' event if they do some I/O before
-  // consuming the stream.  So, we don't emit('end') until some reading
-  // happens.
-  this.calledRead = false;
-
-  // a flag to be able to tell if the onwrite cb is called immediately,
-  // or on a later tick.  We set this to true at first, because any
-  // actions that shouldn't happen until "later" should generally also
-  // not happen before the first write call.
-  this.sync = true;
-
-  // whenever we return null, then we set a flag to say
-  // that we're awaiting a 'readable' event emission.
-  this.needReadable = false;
-  this.emittedReadable = false;
-  this.readableListening = false;
-
-
-  // object stream flag. Used to make read(n) ignore n and to
-  // make all the buffer merging and length checks go away
-  this.objectMode = !!options.objectMode;
-
-  // Crypto is kind of old and crusty.  Historically, its default string
-  // encoding is 'binary' so we have to make this configurable.
-  // Everything else in the universe uses 'utf8', though.
-  this.defaultEncoding = options.defaultEncoding || 'utf8';
-
-  // when piping, we only care about 'readable' events that happen
-  // after read()ing all the bytes and not getting any pushback.
-  this.ranOut = false;
-
-  // the number of writers that are awaiting a drain event in .pipe()s
-  this.awaitDrain = 0;
-
-  // if true, a maybeReadMore has been scheduled
-  this.readingMore = false;
-
-  this.decoder = null;
-  this.encoding = null;
-  if (options.encoding) {
-    if (!StringDecoder)
-      StringDecoder = require('string_decoder').StringDecoder;
-    this.decoder = new StringDecoder(options.encoding);
-    this.encoding = options.encoding;
-  }
-}
-
-function Readable(options) {
-  if (!(this instanceof Readable))
-    return new Readable(options);
-
-  this._readableState = new ReadableState(options, this);
-
-  // legacy
-  this.readable = true;
-
-  Stream.call(this);
-}
-
-// Manually shove something into the read() buffer.
-// This returns true if the highWaterMark has not been hit yet,
-// similar to how Writable.write() returns true if you should
-// write() some more.
-Readable.prototype.push = function(chunk, encoding) {
-  var state = this._readableState;
-
-  if (typeof chunk === 'string' && !state.objectMode) {
-    encoding = encoding || state.defaultEncoding;
-    if (encoding !== state.encoding) {
-      chunk = new Buffer(chunk, encoding);
-      encoding = '';
-    }
-  }
-
-  return readableAddChunk(this, state, chunk, encoding, false);
-};
-
-// Unshift should *always* be something directly out of read()
-Readable.prototype.unshift = function(chunk) {
-  var state = this._readableState;
-  return readableAddChunk(this, state, chunk, '', true);
-};
-
-function readableAddChunk(stream, state, chunk, encoding, addToFront) {
-  var er = chunkInvalid(state, chunk);
-  if (er) {
-    stream.emit('error', er);
-  } else if (chunk === null || chunk === undefined) {
-    state.reading = false;
-    if (!state.ended)
-      onEofChunk(stream, state);
-  } else if (state.objectMode || chunk && chunk.length > 0) {
-    if (state.ended && !addToFront) {
-      var e = new Error('stream.push() after EOF');
-      stream.emit('error', e);
-    } else if (state.endEmitted && addToFront) {
-      var e = new Error('stream.unshift() after end event');
-      stream.emit('error', e);
-    } else {
-      if (state.decoder && !addToFront && !encoding)
-        chunk = state.decoder.write(chunk);
-
-      // update the buffer info.
-      state.length += state.objectMode ? 1 : chunk.length;
-      if (addToFront) {
-        state.buffer.unshift(chunk);
-      } else {
-        state.reading = false;
-        state.buffer.push(chunk);
-      }
-
-      if (state.needReadable)
-        emitReadable(stream);
-
-      maybeReadMore(stream, state);
-    }
-  } else if (!addToFront) {
-    state.reading = false;
-  }
-
-  return needMoreData(state);
-}
-
-
-
-// if it's past the high water mark, we can push in some more.
-// Also, if we have no data yet, we can stand some
-// more bytes.  This is to work around cases where hwm=0,
-// such as the repl.  Also, if the push() triggered a
-// readable event, and the user called read(largeNumber) such that
-// needReadable was set, then we ought to push more, so that another
-// 'readable' event will be triggered.
-function needMoreData(state) {
-  return !state.ended &&
-         (state.needReadable ||
-          state.length < state.highWaterMark ||
-          state.length === 0);
-}
-
-// backwards compatibility.
-Readable.prototype.setEncoding = function(enc) {
-  if (!StringDecoder)
-    StringDecoder = require('string_decoder').StringDecoder;
-  this._readableState.decoder = new StringDecoder(enc);
-  this._readableState.encoding = enc;
-};
-
-// Don't raise the hwm > 128MB
-var MAX_HWM = 0x800000;
-function roundUpToNextPowerOf2(n) {
-  if (n >= MAX_HWM) {
-    n = MAX_HWM;
-  } else {
-    // Get the next highest power of 2
-    n--;
-    for (var p = 1; p < 32; p <<= 1) n |= n >> p;
-    n++;
-  }
-  return n;
-}
-
-function howMuchToRead(n, state) {
-  if (state.length === 0 && state.ended)
-    return 0;
-
-  if (state.objectMode)
-    return n === 0 ? 0 : 1;
-
-  if (isNaN(n) || n === null) {
-    // only flow one buffer at a time
-    if (state.flowing && state.buffer.length)
-      return state.buffer[0].length;
-    else
-      return state.length;
-  }
-
-  if (n <= 0)
-    return 0;
-
-  // If we're asking for more than the target buffer level,
-  // then raise the water mark.  Bump up to the next highest
-  // power of 2, to prevent increasing it excessively in tiny
-  // amounts.
-  if (n > state.highWaterMark)
-    state.highWaterMark = roundUpToNextPowerOf2(n);
-
-  // don't have that much.  return null, unless we've ended.
-  if (n > state.length) {
-    if (!state.ended) {
-      state.needReadable = true;
-      return 0;
-    } else
-      return state.length;
-  }
-
-  return n;
-}
-
-// you can override either this method, or the async _read(n) below.
-Readable.prototype.read = function(n) {
-  var state = this._readableState;
-  state.calledRead = true;
-  var nOrig = n;
-
-  if (typeof n !== 'number' || n > 0)
-    state.emittedReadable = false;
-
-  // if we're doing read(0) to trigger a readable event, but we
-  // already have a bunch of data in the buffer, then just trigger
-  // the 'readable' event and move on.
-  if (n === 0 &&
-      state.needReadable &&
-      (state.length >= state.highWaterMark || state.ended)) {
-    emitReadable(this);
-    return null;
-  }
-
-  n = howMuchToRead(n, state);
-
-  // if we've ended, and we're now clear, then finish it up.
-  if (n === 0 && state.ended) {
-    if (state.length === 0)
-      endReadable(this);
-    return null;
-  }
-
-  // All the actual chunk generation logic needs to be
-  // *below* the call to _read.  The reason is that in certain
-  // synthetic stream cases, such as passthrough streams, _read
-  // may be a completely synchronous operation which may change
-  // the state of the read buffer, providing enough data when
-  // before there was *not* enough.
-  //
-  // So, the steps are:
-  // 1. Figure out what the state of things will be after we do
-  // a read from the buffer.
-  //
-  // 2. If that resulting state will trigger a _read, then call _read.
-  // Note that this may be asynchronous, or synchronous.  Yes, it is
-  // deeply ugly to write APIs this way, but that still doesn't mean
-  // that the Readable class should behave improperly, as streams are
-  // designed to be sync/async agnostic.
-  // Take note if the _read call is sync or async (ie, if the read call
-  // has returned yet), so that we know whether or not it's safe to emit
-  // 'readable' etc.
-  //
-  // 3. Actually pull the requested chunks out of the buffer and return.
-
-  // if we need a readable event, then we need to do some reading.
-  var doRead = state.needReadable;
-
-  // if we currently have less than the highWaterMark, then also read some
-  if (state.length - n <= state.highWaterMark)
-    doRead = true;
-
-  // however, if we've ended, then there's no point, and if we're already
-  // reading, then it's unnecessary.
-  if (state.ended || state.reading)
-    doRead = false;
-
-  if (doRead) {
-    state.reading = true;
-    state.sync = true;
-    // if the length is currently zero, then we *need* a readable event.
-    if (state.length === 0)
-      state.needReadable = true;
-    // call internal read method
-    this._read(state.highWaterMark);
-    state.sync = false;
-  }
-
-  // If _read called its callback synchronously, then `reading`
-  // will be false, and we need to re-evaluate how much data we
-  // can return to the user.
-  if (doRead && !state.reading)
-    n = howMuchToRead(nOrig, state);
-
-  var ret;
-  if (n > 0)
-    ret = fromList(n, state);
-  else
-    ret = null;
-
-  if (ret === null) {
-    state.needReadable = true;
-    n = 0;
-  }
-
-  state.length -= n;
-
-  // If we have nothing in the buffer, then we want to know
-  // as soon as we *do* get something into the buffer.
-  if (state.length === 0 && !state.ended)
-    state.needReadable = true;
-
-  // If we happened to read() exactly the remaining amount in the
-  // buffer, and the EOF has been seen at this point, then make sure
-  // that we emit 'end' on the very next tick.
-  if (state.ended && !state.endEmitted && state.length === 0)
-    endReadable(this);
-
-  return ret;
-};
-
-function chunkInvalid(state, chunk) {
-  var er = null;
-  if (!Buffer.isBuffer(chunk) &&
-      'string' !== typeof chunk &&
-      chunk !== null &&
-      chunk !== undefined &&
-      !state.objectMode &&
-      !er) {
-    er = new TypeError('Invalid non-string/buffer chunk');
-  }
-  return er;
-}
-
-
-function onEofChunk(stream, state) {
-  if (state.decoder && !state.ended) {
-    var chunk = state.decoder.end();
-    if (chunk && chunk.length) {
-      state.buffer.push(chunk);
-      state.length += state.objectMode ? 1 : chunk.length;
-    }
-  }
-  state.ended = true;
-
-  // if we've ended and we have some data left, then emit
-  // 'readable' now to make sure it gets picked up.
-  if (state.length > 0)
-    emitReadable(stream);
-  else
-    endReadable(stream);
-}
-
-// Don't emit readable right away in sync mode, because this can trigger
-// another read() call => stack overflow.  This way, it might trigger
-// a nextTick recursion warning, but that's not so bad.
-function emitReadable(stream) {
-  var state = stream._readableState;
-  state.needReadable = false;
-  if (state.emittedReadable)
-    return;
-
-  state.emittedReadable = true;
-  if (state.sync)
-    process.nextTick(function() {
-      emitReadable_(stream);
-    });
-  else
-    emitReadable_(stream);
-}
-
-function emitReadable_(stream) {
-  stream.emit('readable');
-}
-
-
-// at this point, the user has presumably seen the 'readable' event,
-// and called read() to consume some data.  that may have triggered
-// in turn another _read(n) call, in which case reading = true if
-// it's in progress.
-// However, if we're not ended, or reading, and the length < hwm,
-// then go ahead and try to read some more preemptively.
-function maybeReadMore(stream, state) {
-  if (!state.readingMore) {
-    state.readingMore = true;
-    process.nextTick(function() {
-      maybeReadMore_(stream, state);
-    });
-  }
-}
-
-function maybeReadMore_(stream, state) {
-  var len = state.length;
-  while (!state.reading && !state.flowing && !state.ended &&
-         state.length < state.highWaterMark) {
-    stream.read(0);
-    if (len === state.length)
-      // didn't get any data, stop spinning.
-      break;
-    else
-      len = state.length;
-  }
-  state.readingMore = false;
-}
-
-// abstract method.  to be overridden in specific implementation classes.
-// call cb(er, data) where data is <= n in length.
-// for virtual (non-string, non-buffer) streams, "length" is somewhat
-// arbitrary, and perhaps not very meaningful.
-Readable.prototype._read = function(n) {
-  this.emit('error', new Error('not implemented'));
-};
-
-Readable.prototype.pipe = function(dest, pipeOpts) {
-  var src = this;
-  var state = this._readableState;
-
-  switch (state.pipesCount) {
-    case 0:
-      state.pipes = dest;
-      break;
-    case 1:
-      state.pipes = [state.pipes, dest];
-      break;
-    default:
-      state.pipes.push(dest);
-      break;
-  }
-  state.pipesCount += 1;
-
-  var doEnd = (!pipeOpts || pipeOpts.end !== false) &&
-              dest !== process.stdout &&
-              dest !== process.stderr;
-
-  var endFn = doEnd ? onend : cleanup;
-  if (state.endEmitted)
-    process.nextTick(endFn);
-  else
-    src.once('end', endFn);
-
-  dest.on('unpipe', onunpipe);
-  function onunpipe(readable) {
-    if (readable !== src) return;
-    cleanup();
-  }
-
-  function onend() {
-    dest.end();
-  }
-
-  // when the dest drains, it reduces the awaitDrain counter
-  // on the source.  This would be more elegant with a .once()
-  // handler in flow(), but adding and removing repeatedly is
-  // too slow.
-  var ondrain = pipeOnDrain(src);
-  dest.on('drain', ondrain);
-
-  function cleanup() {
-    // cleanup event handlers once the pipe is broken
-    dest.removeListener('close', onclose);
-    dest.removeListener('finish', onfinish);
-    dest.removeListener('drain', ondrain);
-    dest.removeListener('error', onerror);
-    dest.removeListener('unpipe', onunpipe);
-    src.removeListener('end', onend);
-    src.removeListener('end', cleanup);
-
-    // if the reader is waiting for a drain event from this
-    // specific writer, then it would cause it to never start
-    // flowing again.
-    // So, if this is awaiting a drain, then we just call it now.
-    // If we don't know, then assume that we are waiting for one.
-    if (!dest._writableState || dest._writableState.needDrain)
-      ondrain();
-  }
-
-  // if the dest has an error, then stop piping into it.
-  // however, don't suppress the throwing behavior for this.
-  function onerror(er) {
-    unpipe();
-    dest.removeListener('error', onerror);
-    if (EE.listenerCount(dest, 'error') === 0)
-      dest.emit('error', er);
-  }
-  // This is a brutally ugly hack to make sure that our error handler
-  // is attached before any userland ones.  NEVER DO THIS.
-  if (!dest._events.error)
-    dest.on('error', onerror);
-  else if (Array.isArray(dest._events.error))
-    dest._events.error.unshift(onerror);
-  else
-    dest._events.error = [onerror, dest._events.error];
-
-
-
-  // Both close and finish should trigger unpipe, but only once.
-  function onclose() {
-    dest.removeListener('finish', onfinish);
-    unpipe();
-  }
-  dest.once('close', onclose);
-  function onfinish() {
-    dest.removeListener('close', onclose);
-    unpipe();
-  }
-  dest.once('finish', onfinish);
-
-  function unpipe() {
-    src.unpipe(dest);
-  }
-
-  // tell the dest that it's being piped to
-  dest.emit('pipe', src);
-
-  // start the flow if it hasn't been started already.
-  if (!state.flowing) {
-    // the handler that waits for readable events after all
-    // the data gets sucked out in flow.
-    // This would be easier to follow with a .once() handler
-    // in flow(), but that is too slow.
-    this.on('readable', pipeOnReadable);
-
-    state.flowing = true;
-    process.nextTick(function() {
-      flow(src);
-    });
-  }
-
-  return dest;
-};
-
-function pipeOnDrain(src) {
-  return function() {
-    var dest = this;
-    var state = src._readableState;
-    state.awaitDrain--;
-    if (state.awaitDrain === 0)
-      flow(src);
-  };
-}
-
-function flow(src) {
-  var state = src._readableState;
-  var chunk;
-  state.awaitDrain = 0;
-
-  function write(dest, i, list) {
-    var written = dest.write(chunk);
-    if (false === written) {
-      state.awaitDrain++;
-    }
-  }
-
-  while (state.pipesCount && null !== (chunk = src.read())) {
-
-    if (state.pipesCount === 1)
-      write(state.pipes, 0, null);
-    else
-      state.pipes.forEach(write);
-
-    src.emit('data', chunk);
-
-    // if anyone needs a drain, then we have to wait for that.
-    if (state.awaitDrain > 0)
-      return;
-  }
-
-  // if every destination was unpiped, either before entering this
-  // function, or in the while loop, then stop flowing.
-  //
-  // NB: This is a pretty rare edge case.
-  if (state.pipesCount === 0) {
-    state.flowing = false;
-
-    // if there were data event listeners added, then switch to old mode.
-    if (EE.listenerCount(src, 'data') > 0)
-      emitDataEvents(src);
-    return;
-  }
-
-  // at this point, no one needed a drain, so we just ran out of data.
-  // On the next readable event, start it over again.
-  state.ranOut = true;
-}
-
-function pipeOnReadable() {
-  if (this._readableState.ranOut) {
-    this._readableState.ranOut = false;
-    flow(this);
-  }
-}
-
-
-Readable.prototype.unpipe = function(dest) {
-  var state = this._readableState;
-
-  // if we're not piping anywhere, then do nothing.
-  if (state.pipesCount === 0)
-    return this;
-
-  // just one destination.  most common case.
-  if (state.pipesCount === 1) {
-    // passed in one, but it's not the right one.
-    if (dest && dest !== state.pipes)
-      return this;
-
-    if (!dest)
-      dest = state.pipes;
-
-    // got a match.
-    state.pipes = null;
-    state.pipesCount = 0;
-    this.removeListener('readable', pipeOnReadable);
-    state.flowing = false;
-    if (dest)
-      dest.emit('unpipe', this);
-    return this;
-  }
-
-  // slow case. multiple pipe destinations.
-
-  if (!dest) {
-    // remove all.
-    var dests = state.pipes;
-    var len = state.pipesCount;
-    state.pipes = null;
-    state.pipesCount = 0;
-    this.removeListener('readable', pipeOnReadable);
-    state.flowing = false;
-
-    for (var i = 0; i < len; i++)
-      dests[i].emit('unpipe', this);
-    return this;
-  }
-
-  // try to find the right one.
-  var i = state.pipes.indexOf(dest);
-  if (i === -1)
-    return this;
-
-  state.pipes.splice(i, 1);
-  state.pipesCount -= 1;
-  if (state.pipesCount === 1)
-    state.pipes = state.pipes[0];
-
-  dest.emit('unpipe', this);
-
-  return this;
-};
-
-// set up data events if they are asked for
-// Ensure readable listeners eventually get something
-Readable.prototype.on = function(ev, fn) {
-  var res = Stream.prototype.on.call(this, ev, fn);
-
-  if (ev === 'data' && !this._readableState.flowing)
-    emitDataEvents(this);
-
-  if (ev === 'readable' && this.readable) {
-    var state = this._readableState;
-    if (!state.readableListening) {
-      state.readableListening = true;
-      state.emittedReadable = false;
-      state.needReadable = true;
-      if (!state.reading) {
-        this.read(0);
-      } else if (state.length) {
-        emitReadable(this, state);
-      }
-    }
-  }
-
-  return res;
-};
-Readable.prototype.addListener = Readable.prototype.on;
-
-// pause() and resume() are remnants of the legacy readable stream API
-// If the user uses them, then switch into old mode.
-Readable.prototype.resume = function() {
-  emitDataEvents(this);
-  this.read(0);
-  this.emit('resume');
-};
-
-Readable.prototype.pause = function() {
-  emitDataEvents(this, true);
-  this.emit('pause');
-};
-
-function emitDataEvents(stream, startPaused) {
-  var state = stream._readableState;
-
-  if (state.flowing) {
-    // https://github.com/isaacs/readable-stream/issues/16
-    throw new Error('Cannot switch to old mode now.');
-  }
-
-  var paused = startPaused || false;
-  var readable = false;
-
-  // convert to an old-style stream.
-  stream.readable = true;
-  stream.pipe = Stream.prototype.pipe;
-  stream.on = stream.addListener = Stream.prototype.on;
-
-  stream.on('readable', function() {
-    readable = true;
-
-    var c;
-    while (!paused && (null !== (c = stream.read())))
-      stream.emit('data', c);
-
-    if (c === null) {
-      readable = false;
-      stream._readableState.needReadable = true;
-    }
-  });
-
-  stream.pause = function() {
-    paused = true;
-    this.emit('pause');
-  };
-
-  stream.resume = function() {
-    paused = false;
-    if (readable)
-      process.nextTick(function() {
-        stream.emit('readable');
-      });
-    else
-      this.read(0);
-    this.emit('resume');
-  };
-
-  // now make it start, just in case it hadn't already.
-  stream.emit('readable');
-}
-
-// wrap an old-style stream as the async data source.
-// This is *not* part of the readable stream interface.
-// It is an ugly unfortunate mess of history.
-Readable.prototype.wrap = function(stream) {
-  var state = this._readableState;
-  var paused = false;
-
-  var self = this;
-  stream.on('end', function() {
-    if (state.decoder && !state.ended) {
-      var chunk = state.decoder.end();
-      if (chunk && chunk.length)
-        self.push(chunk);
-    }
-
-    self.push(null);
-  });
-
-  stream.on('data', function(chunk) {
-    if (state.decoder)
-      chunk = state.decoder.write(chunk);
-    if (!chunk || !state.objectMode && !chunk.length)
-      return;
-
-    var ret = self.push(chunk);
-    if (!ret) {
-      paused = true;
-      stream.pause();
-    }
-  });
-
-  // proxy all the other methods.
-  // important when wrapping filters and duplexes.
-  for (var i in stream) {
-    if (typeof stream[i] === 'function' &&
-        typeof this[i] === 'undefined') {
-      this[i] = function(method) { return function() {
-        return stream[method].apply(stream, arguments);
-      }}(i);
-    }
-  }
-
-  // proxy certain important events.
-  var events = ['error', 'close', 'destroy', 'pause', 'resume'];
-  events.forEach(function(ev) {
-    stream.on(ev, self.emit.bind(self, ev));
-  });
-
-  // when we try to consume some more bytes, simply unpause the
-  // underlying stream.
-  self._read = function(n) {
-    if (paused) {
-      paused = false;
-      stream.resume();
-    }
-  };
-
-  return self;
-};
-
-
-
-// exposed for testing purposes only.
-Readable._fromList = fromList;
-
-// Pluck off n bytes from an array of buffers.
-// Length is the combined lengths of all the buffers in the list.
-function fromList(n, state) {
-  var list = state.buffer;
-  var length = state.length;
-  var stringMode = !!state.decoder;
-  var objectMode = !!state.objectMode;
-  var ret;
-
-  // nothing in the list, definitely empty.
-  if (list.length === 0)
-    return null;
-
-  if (length === 0)
-    ret = null;
-  else if (objectMode)
-    ret = list.shift();
-  else if (!n || n >= length) {
-    // read it all, truncate the array.
-    if (stringMode)
-      ret = list.join('');
-    else
-      ret = Buffer.concat(list, length);
-    list.length = 0;
-  } else {
-    // read just some of it.
-    if (n < list[0].length) {
-      // just take a part of the first list item.
-      // slice is the same for buffers and strings.
-      var buf = list[0];
-      ret = buf.slice(0, n);
-      list[0] = buf.slice(n);
-    } else if (n === list[0].length) {
-      // first list is a perfect match
-      ret = list.shift();
-    } else {
-      // complex case.
-      // we have enough to cover it, but it spans past the first buffer.
-      if (stringMode)
-        ret = '';
-      else
-        ret = new Buffer(n);
-
-      var c = 0;
-      for (var i = 0, l = list.length; i < l && c < n; i++) {
-        var buf = list[0];
-        var cpy = Math.min(n - c, buf.length);
-
-        if (stringMode)
-          ret += buf.slice(0, cpy);
-        else
-          buf.copy(ret, c, 0, cpy);
-
-        if (cpy < buf.length)
-          list[0] = buf.slice(cpy);
-        else
-          list.shift();
-
-        c += cpy;
-      }
-    }
-  }
-
-  return ret;
-}
-
-function endReadable(stream) {
-  var state = stream._readableState;
-
-  // If we get here before consuming all the bytes, then that is a
-  // bug in node.  Should never happen.
-  if (state.length > 0)
-    throw new Error('endReadable called on non-empty stream');
-
-  if (!state.endEmitted && state.calledRead) {
-    state.ended = true;
-    process.nextTick(function() {
-      // Check that we didn't get one last unshift.
-      if (!state.endEmitted && state.length === 0) {
-        state.endEmitted = true;
-        stream.readable = false;
-        stream.emit('end');
-      }
-    });
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_transform.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,205 +0,0 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-
-
-// a transform stream is a readable/writable stream where you do
-// something with the data.  Sometimes it's called a "filter",
-// but that's not a great name for it, since that implies a thing where
-// some bits pass through, and others are simply ignored.  (That would
-// be a valid example of a transform, of course.)
-//
-// While the output is causally related to the input, it's not a
-// necessarily symmetric or synchronous transformation.  For example,
-// a zlib stream might take multiple plain-text writes(), and then
-// emit a single compressed chunk some time in the future.
-//
-// Here's how this works:
-//
-// The Transform stream has all the aspects of the readable and writable
-// stream classes.  When you write(chunk), that calls _write(chunk,cb)
-// internally, and returns false if there's a lot of pending writes
-// buffered up.  When you call read(), that calls _read(n) until
-// there's enough pending readable data buffered up.
-//
-// In a transform stream, the written data is placed in a buffer.  When
-// _read(n) is called, it transforms the queued up data, calling the
-// buffered _write cb's as it consumes chunks.  If consuming a single
-// written chunk would result in multiple output chunks, then the first
-// outputted bit calls the readcb, and subsequent chunks just go into
-// the read buffer, and will cause it to emit 'readable' if necessary.
-//
-// This way, back-pressure is actually determined by the reading side,
-// since _read has to be called to start processing a new chunk.  However,
-// a pathological inflate type of transform can cause excessive buffering
-// here.  For example, imagine a stream where every byte of input is
-// interpreted as an integer from 0-255, and then results in that many
-// bytes of output.  Writing the 4 bytes {ff,ff,ff,ff} would result in
-// 1kb of data being output.  In this case, you could write a very small
-// amount of input, and end up with a very large amount of output.  In
-// such a pathological inflating mechanism, there'd be no way to tell
-// the system to stop doing the transform.  A single 4MB write could
-// cause the system to run out of memory.
-//
-// However, even in such a pathological case, only a single written chunk
-// would be consumed, and then the rest would wait (un-transformed) until
-// the results of the previous transformed chunk were consumed.
-
-module.exports = Transform;
-
-var Duplex = require('./_stream_duplex');
-var util = require('util');
-util.inherits(Transform, Duplex);
-
-
-function TransformState(options, stream) {
-  this.afterTransform = function(er, data) {
-    return afterTransform(stream, er, data);
-  };
-
-  this.needTransform = false;
-  this.transforming = false;
-  this.writecb = null;
-  this.writechunk = null;
-}
-
-function afterTransform(stream, er, data) {
-  var ts = stream._transformState;
-  ts.transforming = false;
-
-  var cb = ts.writecb;
-
-  if (!cb)
-    return stream.emit('error', new Error('no writecb in Transform class'));
-
-  ts.writechunk = null;
-  ts.writecb = null;
-
-  if (data !== null && data !== undefined)
-    stream.push(data);
-
-  if (cb)
-    cb(er);
-
-  var rs = stream._readableState;
-  rs.reading = false;
-  if (rs.needReadable || rs.length < rs.highWaterMark) {
-    stream._read(rs.highWaterMark);
-  }
-}
-
-
-function Transform(options) {
-  if (!(this instanceof Transform))
-    return new Transform(options);
-
-  Duplex.call(this, options);
-
-  var ts = this._transformState = new TransformState(options, this);
-
-  // when the writable side finishes, then flush out anything remaining.
-  var stream = this;
-
-  // start out asking for a readable event once data is transformed.
-  this._readableState.needReadable = true;
-
-  // we have implemented the _read method, and done the other things
-  // that Readable wants before the first _read call, so unset the
-  // sync guard flag.
-  this._readableState.sync = false;
-
-  this.once('finish', function() {
-    if ('function' === typeof this._flush)
-      this._flush(function(er) {
-        done(stream, er);
-      });
-    else
-      done(stream);
-  });
-}
-
-Transform.prototype.push = function(chunk, encoding) {
-  this._transformState.needTransform = false;
-  return Duplex.prototype.push.call(this, chunk, encoding);
-};
-
-// This is the part where you do stuff!
-// override this function in implementation classes.
-// 'chunk' is an input chunk.
-//
-// Call `push(newChunk)` to pass along transformed output
-// to the readable side.  You may call 'push' zero or more times.
-//
-// Call `cb(err)` when you are done with this chunk.  If you pass
-// an error, then that'll put the hurt on the whole operation.  If you
-// never call cb(), then you'll never get another chunk.
-Transform.prototype._transform = function(chunk, encoding, cb) {
-  throw new Error('not implemented');
-};
-
-Transform.prototype._write = function(chunk, encoding, cb) {
-  var ts = this._transformState;
-  ts.writecb = cb;
-  ts.writechunk = chunk;
-  ts.writeencoding = encoding;
-  if (!ts.transforming) {
-    var rs = this._readableState;
-    if (ts.needTransform ||
-        rs.needReadable ||
-        rs.length < rs.highWaterMark)
-      this._read(rs.highWaterMark);
-  }
-};
-
-// Doesn't matter what the args are here.
-// _transform does all the work.
-// That we got here means that the readable side wants more data.
-Transform.prototype._read = function(n) {
-  var ts = this._transformState;
-
-  if (ts.writechunk && ts.writecb && !ts.transforming) {
-    ts.transforming = true;
-    this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform);
-  } else {
-    // mark that we need a transform, so that any data that comes in
-    // will get processed, now that we've asked for it.
-    ts.needTransform = true;
-  }
-};
-
-
-function done(stream, er) {
-  if (er)
-    return stream.emit('error', er);
-
-  // if there's nothing in the write buffer, then that means
-  // that nothing more will ever be provided
-  var ws = stream._writableState;
-  var rs = stream._readableState;
-  var ts = stream._transformState;
-
-  if (ws.length)
-    throw new Error('calling transform done when ws.length != 0');
-
-  if (ts.transforming)
-    throw new Error('calling transform done when still transforming');
-
-  return stream.push(null);
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_writable.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,367 +0,0 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-
-// A bit simpler than readable streams.
-// Implement an async ._write(chunk, cb), and it'll handle all
-// the drain event emission and buffering.
-
-module.exports = Writable;
-Writable.WritableState = WritableState;
-
-var util = require('util');
-var assert = require('assert');
-var Stream = require('stream');
-
-util.inherits(Writable, Stream);
-
-function WriteReq(chunk, encoding, cb) {
-  this.chunk = chunk;
-  this.encoding = encoding;
-  this.callback = cb;
-}
-
-function WritableState(options, stream) {
-  options = options || {};
-
-  // the point at which write() starts returning false
-  // Note: 0 is a valid value, means that we always return false if
-  // the entire buffer is not flushed immediately on write()
-  var hwm = options.highWaterMark;
-  this.highWaterMark = (hwm || hwm === 0) ? hwm : 16 * 1024;
-
-  // object stream flag to indicate whether or not this stream
-  // contains buffers or objects.
-  this.objectMode = !!options.objectMode;
-
-  // cast to ints.
-  this.highWaterMark = ~~this.highWaterMark;
-
-  this.needDrain = false;
-  // at the start of calling end()
-  this.ending = false;
-  // when end() has been called, and returned
-  this.ended = false;
-  // when 'finish' is emitted
-  this.finished = false;
-
-  // should we decode strings into buffers before passing to _write?
-  // this is here so that some node-core streams can optimize string
-  // handling at a lower level.
-  var noDecode = options.decodeStrings === false;
-  this.decodeStrings = !noDecode;
-
-  // Crypto is kind of old and crusty.  Historically, its default string
-  // encoding is 'binary' so we have to make this configurable.
-  // Everything else in the universe uses 'utf8', though.
-  this.defaultEncoding = options.defaultEncoding || 'utf8';
-
-  // not an actual buffer we keep track of, but a measurement
-  // of how much we're waiting to get pushed to some underlying
-  // socket or file.
-  this.length = 0;
-
-  // a flag to see when we're in the middle of a write.
-  this.writing = false;
-
-  // a flag to be able to tell if the onwrite cb is called immediately,
-  // or on a later tick.  We set this to true at first, becuase any
-  // actions that shouldn't happen until "later" should generally also
-  // not happen before the first write call.
-  this.sync = true;
-
-  // a flag to know if we're processing previously buffered items, which
-  // may call the _write() callback in the same tick, so that we don't
-  // end up in an overlapped onwrite situation.
-  this.bufferProcessing = false;
-
-  // the callback that's passed to _write(chunk,cb)
-  this.onwrite = function(er) {
-    onwrite(stream, er);
-  };
-
-  // the callback that the user supplies to write(chunk,encoding,cb)
-  this.writecb = null;
-
-  // the amount that is being written when _write is called.
-  this.writelen = 0;
-
-  this.buffer = [];
-}
-
-function Writable(options) {
-  // Writable ctor is applied to Duplexes, though they're not
-  // instanceof Writable, they're instanceof Readable.
-  if (!(this instanceof Writable) && !(this instanceof require('./_stream_duplex')))
-    return new Writable(options);
-
-  this._writableState = new WritableState(options, this);
-
-  // legacy.
-  this.writable = true;
-
-  Stream.call(this);
-}
-
-// Otherwise people can pipe Writable streams, which is just wrong.
-Writable.prototype.pipe = function() {
-  this.emit('error', new Error('Cannot pipe. Not readable.'));
-};
-
-
-function writeAfterEnd(stream, state, cb) {
-  var er = new Error('write after end');
-  // TODO: defer error events consistently everywhere, not just the cb
-  stream.emit('error', er);
-  process.nextTick(function() {
-    cb(er);
-  });
-}
-
-// If we get something that is not a buffer, string, null, or undefined,
-// and we're not in objectMode, then that's an error.
-// Otherwise stream chunks are all considered to be of length=1, and the
-// watermarks determine how many objects to keep in the buffer, rather than
-// how many bytes or characters.
-function validChunk(stream, state, chunk, cb) {
-  var valid = true;
-  if (!Buffer.isBuffer(chunk) &&
-      'string' !== typeof chunk &&
-      chunk !== null &&
-      chunk !== undefined &&
-      !state.objectMode) {
-    var er = new TypeError('Invalid non-string/buffer chunk');
-    stream.emit('error', er);
-    process.nextTick(function() {
-      cb(er);
-    });
-    valid = false;
-  }
-  return valid;
-}
-
-Writable.prototype.write = function(chunk, encoding, cb) {
-  var state = this._writableState;
-  var ret = false;
-
-  if (typeof encoding === 'function') {
-    cb = encoding;
-    encoding = null;
-  }
-
-  if (Buffer.isBuffer(chunk))
-    encoding = 'buffer';
-  else if (!encoding)
-    encoding = state.defaultEncoding;
-
-  if (typeof cb !== 'function')
-    cb = function() {};
-
-  if (state.ended)
-    writeAfterEnd(this, state, cb);
-  else if (validChunk(this, state, chunk, cb))
-    ret = writeOrBuffer(this, state, chunk, encoding, cb);
-
-  return ret;
-};
-
-function decodeChunk(state, chunk, encoding) {
-  if (!state.objectMode &&
-      state.decodeStrings !== false &&
-      typeof chunk === 'string') {
-    chunk = new Buffer(chunk, encoding);
-  }
-  return chunk;
-}
-
-// if we're already writing something, then just put this
-// in the queue, and wait our turn.  Otherwise, call _write
-// If we return false, then we need a drain event, so set that flag.
-function writeOrBuffer(stream, state, chunk, encoding, cb) {
-  chunk = decodeChunk(state, chunk, encoding);
-  var len = state.objectMode ? 1 : chunk.length;
-
-  state.length += len;
-
-  var ret = state.length < state.highWaterMark;
-  state.needDrain = !ret;
-
-  if (state.writing)
-    state.buffer.push(new WriteReq(chunk, encoding, cb));
-  else
-    doWrite(stream, state, len, chunk, encoding, cb);
-
-  return ret;
-}
-
-function doWrite(stream, state, len, chunk, encoding, cb) {
-  state.writelen = len;
-  state.writecb = cb;
-  state.writing = true;
-  state.sync = true;
-  stream._write(chunk, encoding, state.onwrite);
-  state.sync = false;
-}
-
-function onwriteError(stream, state, sync, er, cb) {
-  if (sync)
-    process.nextTick(function() {
-      cb(er);
-    });
-  else
-    cb(er);
-
-  stream.emit('error', er);
-}
-
-function onwriteStateUpdate(state) {
-  state.writing = false;
-  state.writecb = null;
-  state.length -= state.writelen;
-  state.writelen = 0;
-}
-
-function onwrite(stream, er) {
-  var state = stream._writableState;
-  var sync = state.sync;
-  var cb = state.writecb;
-
-  onwriteStateUpdate(state);
-
-  if (er)
-    onwriteError(stream, state, sync, er, cb);
-  else {
-    // Check if we're actually ready to finish, but don't emit yet
-    var finished = needFinish(stream, state);
-
-    if (!finished && !state.bufferProcessing && state.buffer.length)
-      clearBuffer(stream, state);
-
-    if (sync) {
-      process.nextTick(function() {
-        afterWrite(stream, state, finished, cb);
-      });
-    } else {
-      afterWrite(stream, state, finished, cb);
-    }
-  }
-}
-
-function afterWrite(stream, state, finished, cb) {
-  if (!finished)
-    onwriteDrain(stream, state);
-  cb();
-  if (finished)
-    finishMaybe(stream, state);
-}
-
-// Must force callback to be called on nextTick, so that we don't
-// emit 'drain' before the write() consumer gets the 'false' return
-// value, and has a chance to attach a 'drain' listener.
-function onwriteDrain(stream, state) {
-  if (state.length === 0 && state.needDrain) {
-    state.needDrain = false;
-    stream.emit('drain');
-  }
-}
-
-
-// if there's something in the buffer waiting, then process it
-function clearBuffer(stream, state) {
-  state.bufferProcessing = true;
-
-  for (var c = 0; c < state.buffer.length; c++) {
-    var entry = state.buffer[c];
-    var chunk = entry.chunk;
-    var encoding = entry.encoding;
-    var cb = entry.callback;
-    var len = state.objectMode ? 1 : chunk.length;
-
-    doWrite(stream, state, len, chunk, encoding, cb);
-
-    // if we didn't call the onwrite immediately, then
-    // it means that we need to wait until it does.
-    // also, that means that the chunk and cb are currently
-    // being processed, so move the buffer counter past them.
-    if (state.writing) {
-      c++;
-      break;
-    }
-  }
-
-  state.bufferProcessing = false;
-  if (c < state.buffer.length)
-    state.buffer = state.buffer.slice(c);
-  else
-    state.buffer.length = 0;
-}
-
-Writable.prototype._write = function(chunk, encoding, cb) {
-  cb(new Error('not implemented'));
-};
-
-Writable.prototype.end = function(chunk, encoding, cb) {
-  var state = this._writableState;
-
-  if (typeof chunk === 'function') {
-    cb = chunk;
-    chunk = null;
-    encoding = null;
-  } else if (typeof encoding === 'function') {
-    cb = encoding;
-    encoding = null;
-  }
-
-  if (typeof chunk !== 'undefined' && chunk !== null)
-    this.write(chunk, encoding);
-
-  // ignore unnecessary end() calls.
-  if (!state.ending && !state.finished)
-    endWritable(this, state, cb);
-};
-
-
-function needFinish(stream, state) {
-  return (state.ending &&
-          state.length === 0 &&
-          !state.finished &&
-          !state.writing);
-}
-
-function finishMaybe(stream, state) {
-  var need = needFinish(stream, state);
-  if (need) {
-    state.finished = true;
-    stream.emit('finish');
-  }
-  return need;
-}
-
-function endWritable(stream, state, cb) {
-  state.ending = true;
-  finishMaybe(stream, state);
-  if (cb) {
-    if (state.finished)
-      process.nextTick(cb);
-    else
-      stream.once('finish', cb);
-  }
-  state.ended = true;
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,39 +0,0 @@
-{
-  "name": "readable-stream",
-  "version": "1.0.17",
-  "description": "An exploration of a new kind of readable streams for Node.js",
-  "main": "readable.js",
-  "dependencies": {},
-  "devDependencies": {
-    "tap": "~0.2.6"
-  },
-  "scripts": {
-    "test": "tap test/simple/*.js"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/readable-stream"
-  },
-  "keywords": [
-    "readable",
-    "stream",
-    "pipe"
-  ],
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "license": "BSD",
-  "readme": "# readable-stream\n\nA new class of streams for Node.js\n\nThis module provides the new Stream base classes introduced in Node\nv0.10, for use in Node v0.8.  You can use it to have programs that\nhave to work with node v0.8, while being forward-compatible for v0.10\nand beyond.  When you drop support for v0.8, you can remove this\nmodule, and only use the native streams.\n\nThis is almost exactly the same codebase as appears in Node v0.10.\nHowever:\n\n1. The exported object is actually the Readable class.  Decorating the\n   native `stream` module would be global pollution.\n2. In v0.10, you can safely use `base64` as an argument to\n   `setEncoding` in Readable streams.  However, in v0.8, the\n   StringDecoder class has no `end()` method, which is problematic for\n   Base64.  So, don't use that, because it'll break and be weird.\n\nOther than that, the API is the same as `require('stream')` in v0.10,\nso the API docs are reproduced below.\n\n----------\n\n    Stability: 2 - Unstable\n\nA stream is an abstract interface implemented by various objects in\nNode.  For example a request to an HTTP server is a stream, as is\nstdout. Streams are readable, writable, or both. All streams are\ninstances of [EventEmitter][]\n\nYou can load the Stream base classes by doing `require('stream')`.\nThere are base classes provided for Readable streams, Writable\nstreams, Duplex streams, and Transform streams.\n\n## Compatibility\n\nIn earlier versions of Node, the Readable stream interface was\nsimpler, but also less powerful and less useful.\n\n* Rather than waiting for you to call the `read()` method, `'data'`\n  events would start emitting immediately.  If you needed to do some\n  I/O to decide how to handle data, then you had to store the chunks\n  in some kind of buffer so that they would not be lost.\n* The `pause()` method was advisory, rather than guaranteed.  
This\n  meant that you still had to be prepared to receive `'data'` events\n  even when the stream was in a paused state.\n\nIn Node v0.10, the Readable class described below was added.  For\nbackwards compatibility with older Node programs, Readable streams\nswitch into \"old mode\" when a `'data'` event handler is added, or when\nthe `pause()` or `resume()` methods are called.  The effect is that,\neven if you are not using the new `read()` method and `'readable'`\nevent, you no longer have to worry about losing `'data'` chunks.\n\nMost programs will continue to function normally.  However, this\nintroduces an edge case in the following conditions:\n\n* No `'data'` event handler is added.\n* The `pause()` and `resume()` methods are never called.\n\nFor example, consider the following code:\n\n```javascript\n// WARNING!  BROKEN!\nnet.createServer(function(socket) {\n\n  // we add an 'end' method, but never consume the data\n  socket.on('end', function() {\n    // It will never get here.\n    socket.end('I got your message (but didnt read it)\\n');\n  });\n\n}).listen(1337);\n```\n\nIn versions of node prior to v0.10, the incoming message data would be\nsimply discarded.  
However, in Node v0.10 and beyond, the socket will\nremain paused forever.\n\nThe workaround in this situation is to call the `resume()` method to\ntrigger \"old mode\" behavior:\n\n```javascript\n// Workaround\nnet.createServer(function(socket) {\n\n  socket.on('end', function() {\n    socket.end('I got your message (but didnt read it)\\n');\n  });\n\n  // start the flow of data, discarding it.\n  socket.resume();\n\n}).listen(1337);\n```\n\nIn addition to new Readable streams switching into old-mode, pre-v0.10\nstyle streams can be wrapped in a Readable class using the `wrap()`\nmethod.\n\n## Class: stream.Readable\n\n<!--type=class-->\n\nA `Readable Stream` has the following methods, members, and events.\n\nNote that `stream.Readable` is an abstract class designed to be\nextended with an underlying implementation of the `_read(size)`\nmethod. (See below.)\n\n### new stream.Readable([options])\n\n* `options` {Object}\n  * `highWaterMark` {Number} The maximum number of bytes to store in\n    the internal buffer before ceasing to read from the underlying\n    resource.  Default=16kb\n  * `encoding` {String} If specified, then buffers will be decoded to\n    strings using the specified encoding.  Default=null\n  * `objectMode` {Boolean} Whether this stream should behave\n    as a stream of objects. 
Meaning that stream.read(n) returns\n    a single value instead of a Buffer of size n\n\nIn classes that extend the Readable class, make sure to call the\nconstructor so that the buffering settings can be properly\ninitialized.\n\n### readable.\\_read(size)\n\n* `size` {Number} Number of bytes to read asynchronously\n\nNote: **This function should NOT be called directly.**  It should be\nimplemented by child classes, and called by the internal Readable\nclass methods only.\n\nAll Readable stream implementations must provide a `_read` method\nto fetch data from the underlying resource.\n\nThis method is prefixed with an underscore because it is internal to\nthe class that defines it, and should not be called directly by user\nprograms.  However, you **are** expected to override this method in\nyour own extension classes.\n\nWhen data is available, put it into the read queue by calling\n`readable.push(chunk)`.  If `push` returns false, then you should stop\nreading.  When `_read` is called again, you should start pushing more\ndata.\n\nThe `size` argument is advisory.  Implementations where a \"read\" is a\nsingle call that returns data can use this to know how much data to\nfetch.  Implementations where that is not relevant, such as TCP or\nTLS, may ignore this argument, and simply provide data whenever it\nbecomes available.  There is no need, for example to \"wait\" until\n`size` bytes are available before calling `stream.push(chunk)`.\n\n### readable.push(chunk)\n\n* `chunk` {Buffer | null | String} Chunk of data to push into the read queue\n* return {Boolean} Whether or not more pushes should be performed\n\nNote: **This function should be called by Readable implementors, NOT\nby consumers of Readable subclasses.**  The `_read()` function will not\nbe called again until at least one `push(chunk)` call is made.  
If no\ndata is available, then you MAY call `push('')` (an empty string) to\nallow a future `_read` call, without adding any data to the queue.\n\nThe `Readable` class works by putting data into a read queue to be\npulled out later by calling the `read()` method when the `'readable'`\nevent fires.\n\nThe `push()` method will explicitly insert some data into the read\nqueue.  If it is called with `null` then it will signal the end of the\ndata.\n\nIn some cases, you may be wrapping a lower-level source which has some\nsort of pause/resume mechanism, and a data callback.  In those cases,\nyou could wrap the low-level source object by doing something like\nthis:\n\n```javascript\n// source is an object with readStop() and readStart() methods,\n// and an `ondata` member that gets called when it has data, and\n// an `onend` member that gets called when the data is over.\n\nvar stream = new Readable();\n\nsource.ondata = function(chunk) {\n  // if push() returns false, then we need to stop reading from source\n  if (!stream.push(chunk))\n    source.readStop();\n};\n\nsource.onend = function() {\n  stream.push(null);\n};\n\n// _read will be called when the stream wants to pull more data in\n// the advisory size argument is ignored in this case.\nstream._read = function(n) {\n  source.readStart();\n};\n```\n\n### readable.unshift(chunk)\n\n* `chunk` {Buffer | null | String} Chunk of data to unshift onto the read queue\n* return {Boolean} Whether or not more pushes should be performed\n\nThis is the corollary of `readable.push(chunk)`.  
Rather than putting\nthe data at the *end* of the read queue, it puts it at the *front* of\nthe read queue.\n\nThis is useful in certain use-cases where a stream is being consumed\nby a parser, which needs to \"un-consume\" some data that it has\noptimistically pulled out of the source.\n\n```javascript\n// A parser for a simple data protocol.\n// The \"header\" is a JSON object, followed by 2 \\n characters, and\n// then a message body.\n//\n// Note: This can be done more simply as a Transform stream.  See below.\n\nfunction SimpleProtocol(source, options) {\n  if (!(this instanceof SimpleProtocol))\n    return new SimpleProtocol(options);\n\n  Readable.call(this, options);\n  this._inBody = false;\n  this._sawFirstCr = false;\n\n  // source is a readable stream, such as a socket or file\n  this._source = source;\n\n  var self = this;\n  source.on('end', function() {\n    self.push(null);\n  });\n\n  // give it a kick whenever the source is readable\n  // read(0) will not consume any bytes\n  source.on('readable', function() {\n    self.read(0);\n  });\n\n  this._rawHeader = [];\n  this.header = null;\n}\n\nSimpleProtocol.prototype = Object.create(\n  Readable.prototype, { constructor: { value: SimpleProtocol }});\n\nSimpleProtocol.prototype._read = function(n) {\n  if (!this._inBody) {\n    var chunk = this._source.read();\n\n    // if the source doesn't have data, we don't have data yet.\n    if (chunk === null)\n      return this.push('');\n\n    // check if the chunk has a \\n\\n\n    var split = -1;\n    for (var i = 0; i < chunk.length; i++) {\n      if (chunk[i] === 10) { // '\\n'\n        if (this._sawFirstCr) {\n          split = i;\n          break;\n        } else {\n          this._sawFirstCr = true;\n        }\n      } else {\n        this._sawFirstCr = false;\n      }\n    }\n\n    if (split === -1) {\n      // still waiting for the \\n\\n\n      // stash the chunk, and try again.\n      this._rawHeader.push(chunk);\n      this.push('');\n    } else 
{\n      this._inBody = true;\n      var h = chunk.slice(0, split);\n      this._rawHeader.push(h);\n      var header = Buffer.concat(this._rawHeader).toString();\n      try {\n        this.header = JSON.parse(header);\n      } catch (er) {\n        this.emit('error', new Error('invalid simple protocol data'));\n        return;\n      }\n      // now, because we got some extra data, unshift the rest\n      // back into the read queue so that our consumer will see it.\n      var b = chunk.slice(split);\n      this.unshift(b);\n\n      // and let them know that we are done parsing the header.\n      this.emit('header', this.header);\n    }\n  } else {\n    // from there on, just provide the data to our consumer.\n    // careful not to push(null), since that would indicate EOF.\n    var chunk = this._source.read();\n    if (chunk) this.push(chunk);\n  }\n};\n\n// Usage:\nvar parser = new SimpleProtocol(source);\n// Now parser is a readable stream that will emit 'header'\n// with the parsed header data.\n```\n\n### readable.wrap(stream)\n\n* `stream` {Stream} An \"old style\" readable stream\n\nIf you are using an older Node library that emits `'data'` events and\nhas a `pause()` method that is advisory only, then you can use the\n`wrap()` method to create a Readable stream that uses the old stream\nas its data source.\n\nFor example:\n\n```javascript\nvar OldReader = require('./old-api-module.js').OldReader;\nvar oreader = new OldReader;\nvar Readable = require('stream').Readable;\nvar myReader = new Readable().wrap(oreader);\n\nmyReader.on('readable', function() {\n  myReader.read(); // etc.\n});\n```\n\n### Event: 'readable'\n\nWhen there is data ready to be consumed, this event will fire.\n\nWhen this event emits, call the `read()` method to consume the data.\n\n### Event: 'end'\n\nEmitted when the stream has received an EOF (FIN in TCP terminology).\nIndicates that no more `'data'` events will happen. 
If the stream is\nalso writable, it may be possible to continue writing.\n\n### Event: 'data'\n\nThe `'data'` event emits either a `Buffer` (by default) or a string if\n`setEncoding()` was used.\n\nNote that adding a `'data'` event listener will switch the Readable\nstream into \"old mode\", where data is emitted as soon as it is\navailable, rather than waiting for you to call `read()` to consume it.\n\n### Event: 'error'\n\nEmitted if there was an error receiving data.\n\n### Event: 'close'\n\nEmitted when the underlying resource (for example, the backing file\ndescriptor) has been closed. Not all streams will emit this.\n\n### readable.setEncoding(encoding)\n\nMakes the `'data'` event emit a string instead of a `Buffer`. `encoding`\ncan be `'utf8'`, `'utf16le'` (`'ucs2'`), `'ascii'`, or `'hex'`.\n\nThe encoding can also be set by specifying an `encoding` field to the\nconstructor.\n\n### readable.read([size])\n\n* `size` {Number | null} Optional number of bytes to read.\n* Return: {Buffer | String | null}\n\nNote: **This function SHOULD be called by Readable stream users.**\n\nCall this method to consume data once the `'readable'` event is\nemitted.\n\nThe `size` argument will set a minimum number of bytes that you are\ninterested in.  If not set, then the entire content of the internal\nbuffer is returned.\n\nIf there is no data to consume, or if there are fewer bytes in the\ninternal buffer than the `size` argument, then `null` is returned, and\na future `'readable'` event will be emitted when more is available.\n\nCalling `stream.read(0)` will always return `null`, and will trigger a\nrefresh of the internal buffer, but otherwise be a no-op.\n\n### readable.pipe(destination, [options])\n\n* `destination` {Writable Stream}\n* `options` {Object} Optional\n  * `end` {Boolean} Default=true\n\nConnects this readable stream to `destination` WriteStream. Incoming\ndata on this stream gets written to `destination`.  
Properly manages\nback-pressure so that a slow destination will not be overwhelmed by a\nfast readable stream.\n\nThis function returns the `destination` stream.\n\nFor example, emulating the Unix `cat` command:\n\n    process.stdin.pipe(process.stdout);\n\nBy default `end()` is called on the destination when the source stream\nemits `end`, so that `destination` is no longer writable. Pass `{ end:\nfalse }` as `options` to keep the destination stream open.\n\nThis keeps `writer` open so that \"Goodbye\" can be written at the\nend.\n\n    reader.pipe(writer, { end: false });\n    reader.on(\"end\", function() {\n      writer.end(\"Goodbye\\n\");\n    });\n\nNote that `process.stderr` and `process.stdout` are never closed until\nthe process exits, regardless of the specified options.\n\n### readable.unpipe([destination])\n\n* `destination` {Writable Stream} Optional\n\nUndo a previously established `pipe()`.  If no destination is\nprovided, then all previously established pipes are removed.\n\n### readable.pause()\n\nSwitches the readable stream into \"old mode\", where data is emitted\nusing a `'data'` event rather than being buffered for consumption via\nthe `read()` method.\n\nCeases the flow of data.  No `'data'` events are emitted while the\nstream is in a paused state.\n\n### readable.resume()\n\nSwitches the readable stream into \"old mode\", where data is emitted\nusing a `'data'` event rather than being buffered for consumption via\nthe `read()` method.\n\nResumes the incoming `'data'` events after a `pause()`.\n\n\n## Class: stream.Writable\n\n<!--type=class-->\n\nA `Writable` Stream has the following methods, members, and events.\n\nNote that `stream.Writable` is an abstract class designed to be\nextended with an underlying implementation of the\n`_write(chunk, encoding, cb)` method. (See below.)\n\n### new stream.Writable([options])\n\n* `options` {Object}\n  * `highWaterMark` {Number} Buffer level when `write()` starts\n    returning false. 
Default=16kb\n  * `decodeStrings` {Boolean} Whether or not to decode strings into\n    Buffers before passing them to `_write()`.  Default=true\n\nIn classes that extend the Writable class, make sure to call the\nconstructor so that the buffering settings can be properly\ninitialized.\n\n### writable.\\_write(chunk, encoding, callback)\n\n* `chunk` {Buffer | String} The chunk to be written.  Will always\n  be a buffer unless the `decodeStrings` option was set to `false`.\n* `encoding` {String} If the chunk is a string, then this is the\n  encoding type.  Ignored if `chunk` is a buffer.  Note that chunk will\n  **always** be a buffer unless the `decodeStrings` option is\n  explicitly set to `false`.\n* `callback` {Function} Call this function (optionally with an error\n  argument) when you are done processing the supplied chunk.\n\nAll Writable stream implementations must provide a `_write` method to\nsend data to the underlying resource.\n\nNote: **This function MUST NOT be called directly.**  It should be\nimplemented by child classes, and called by the internal Writable\nclass methods only.\n\nCall the callback using the standard `callback(error)` pattern to\nsignal that the write completed successfully or with an error.\n\nIf the `decodeStrings` flag is set to `false` in the constructor options, then\n`chunk` may be a string rather than a Buffer, and `encoding` will\nindicate the sort of string that it is.  This is to support\nimplementations that have optimized handling for certain string\ndata encodings.  If you do not explicitly set the `decodeStrings`\noption to `false`, then you can safely ignore the `encoding` argument,\nand assume that `chunk` will always be a Buffer.\n\nThis method is prefixed with an underscore because it is internal to\nthe class that defines it, and should not be called directly by user\nprograms.  
However, you **are** expected to override this method in\nyour own extension classes.\n\n\n### writable.write(chunk, [encoding], [callback])\n\n* `chunk` {Buffer | String} Data to be written\n* `encoding` {String} Optional.  If `chunk` is a string, then encoding\n  defaults to `'utf8'`\n* `callback` {Function} Optional.  Called when this chunk is\n  successfully written.\n* Returns {Boolean}\n\nWrites `chunk` to the stream.  Returns `true` if the data has been\nflushed to the underlying resource.  Returns `false` to indicate that\nthe buffer is full, and the data will be sent out in the future. The\n`'drain'` event will indicate when the buffer is empty again.\n\nThe specifics of when `write()` will return `false` are determined by\nthe `highWaterMark` option provided to the constructor.\n\n### writable.end([chunk], [encoding], [callback])\n\n* `chunk` {Buffer | String} Optional final data to be written\n* `encoding` {String} Optional.  If `chunk` is a string, then encoding\n  defaults to `'utf8'`\n* `callback` {Function} Optional.  Called when the final chunk is\n  successfully written.\n\nCall this method to signal the end of the data being written to the\nstream.\n\n### Event: 'drain'\n\nEmitted when the stream's write queue empties and it's safe to write\nwithout buffering again. Listen for it when `stream.write()` returns\n`false`.\n\n### Event: 'close'\n\nEmitted when the underlying resource (for example, the backing file\ndescriptor) has been closed. 
Not all streams will emit this.\n\n### Event: 'finish'\n\nWhen `end()` is called and there are no more chunks to write, this\nevent is emitted.\n\n### Event: 'pipe'\n\n* `source` {Readable Stream}\n\nEmitted when the stream is passed to a readable stream's pipe method.\n\n### Event: 'unpipe'\n\n* `source` {Readable Stream}\n\nEmitted when a previously established `pipe()` is removed using the\nsource Readable stream's `unpipe()` method.\n\n## Class: stream.Duplex\n\n<!--type=class-->\n\nA \"duplex\" stream is one that is both Readable and Writable, such as a\nTCP socket connection.\n\nNote that `stream.Duplex` is an abstract class designed to be\nextended with an underlying implementation of the `_read(size)`\nand `_write(chunk, encoding, callback)` methods as you would with a Readable or\nWritable stream class.\n\nSince JavaScript doesn't have multiple prototypal inheritance, this\nclass prototypally inherits from Readable, and then parasitically from\nWritable.  It is thus up to the user to implement both the low-level\n`_read(n)` method as well as the low-level `_write(chunk, encoding, cb)` method\non extension duplex classes.\n\n### new stream.Duplex(options)\n\n* `options` {Object} Passed to both Writable and Readable\n  constructors. Also has the following fields:\n  * `allowHalfOpen` {Boolean} Default=true.  If set to `false`, then\n    the stream will automatically end the readable side when the\n    writable side ends and vice versa.\n\nIn classes that extend the Duplex class, make sure to call the\nconstructor so that the buffering settings can be properly\ninitialized.\n\n## Class: stream.Transform\n\nA \"transform\" stream is a duplex stream where the output is causally\nconnected in some way to the input, such as a zlib stream or a crypto\nstream.\n\nThere is no requirement that the output be the same size as the input,\nthe same number of chunks, or arrive at the same time.  
For example, a\nHash stream will only ever have a single chunk of output which is\nprovided when the input is ended.  A zlib stream will produce output\neither much smaller or much larger than its input.\n\nRather than implement the `_read()` and `_write()` methods, Transform\nclasses must implement the `_transform()` method, and may optionally\nalso implement the `_flush()` method.  (See below.)\n\n### new stream.Transform([options])\n\n* `options` {Object} Passed to both Writable and Readable\n  constructors.\n\nIn classes that extend the Transform class, make sure to call the\nconstructor so that the buffering settings can be properly\ninitialized.\n\n### transform.\\_transform(chunk, encoding, callback)\n\n* `chunk` {Buffer | String} The chunk to be transformed.  Will always\n  be a buffer unless the `decodeStrings` option was set to `false`.\n* `encoding` {String} If the chunk is a string, then this is the\n  encoding type.  (Ignored if `chunk` is a buffer.)\n* `callback` {Function} Call this function (optionally with an error\n  argument) when you are done processing the supplied chunk.\n\nNote: **This function MUST NOT be called directly.**  It should be\nimplemented by child classes, and called by the internal Transform\nclass methods only.\n\nAll Transform stream implementations must provide a `_transform`\nmethod to accept input and produce output.\n\n`_transform` should do whatever has to be done in this specific\nTransform class, to handle the bytes being written, and pass them off\nto the readable portion of the interface.  Do asynchronous I/O,\nprocess things, and so on.\n\nCall `transform.push(outputChunk)` 0 or more times to generate output\nfrom this input chunk, depending on how much data you want to output\nas a result of this chunk.\n\nCall the callback function only when the current chunk is completely\nconsumed.  
Note that there may or may not be output as a result of any\nparticular input chunk.\n\nThis method is prefixed with an underscore because it is internal to\nthe class that defines it, and should not be called directly by user\nprograms.  However, you **are** expected to override this method in\nyour own extension classes.\n\n### transform.\\_flush(callback)\n\n* `callback` {Function} Call this function (optionally with an error\n  argument) when you are done flushing any remaining data.\n\nNote: **This function MUST NOT be called directly.**  It MAY be implemented\nby child classes, and if so, will be called by the internal Transform\nclass methods only.\n\nIn some cases, your transform operation may need to emit a bit more\ndata at the end of the stream.  For example, a `Zlib` compression\nstream will store up some internal state so that it can optimally\ncompress the output.  At the end, however, it needs to do the best it\ncan with what is left, so that the data will be complete.\n\nIn those cases, you can implement a `_flush` method, which will be\ncalled at the very end, after all the written data is consumed, but\nbefore emitting `end` to signal the end of the readable side.  Just\nlike with `_transform`, call `transform.push(chunk)` zero or more\ntimes, as appropriate, and call `callback` when the flush operation is\ncomplete.\n\nThis method is prefixed with an underscore because it is internal to\nthe class that defines it, and should not be called directly by user\nprograms.  
However, you **are** expected to override this method in\nyour own extension classes.\n\n### Example: `SimpleProtocol` parser\n\nThe example above of a simple protocol parser can be implemented much\nmore simply by using the higher level `Transform` stream class.\n\nIn this example, rather than providing the input as an argument, it\nwould be piped into the parser, which is a more idiomatic Node stream\napproach.\n\n```javascript\nfunction SimpleProtocol(options) {\n  if (!(this instanceof SimpleProtocol))\n    return new SimpleProtocol(options);\n\n  Transform.call(this, options);\n  this._inBody = false;\n  this._sawFirstCr = false;\n  this._rawHeader = [];\n  this.header = null;\n}\n\nSimpleProtocol.prototype = Object.create(\n  Transform.prototype, { constructor: { value: SimpleProtocol }});\n\nSimpleProtocol.prototype._transform = function(chunk, encoding, done) {\n  if (!this._inBody) {\n    // check if the chunk has a \\n\\n\n    var split = -1;\n    for (var i = 0; i < chunk.length; i++) {\n      if (chunk[i] === 10) { // '\\n'\n        if (this._sawFirstCr) {\n          split = i;\n          break;\n        } else {\n          this._sawFirstCr = true;\n        }\n      } else {\n        this._sawFirstCr = false;\n      }\n    }\n\n    if (split === -1) {\n      // still waiting for the \\n\\n\n      // stash the chunk, and try again.\n      this._rawHeader.push(chunk);\n    } else {\n      this._inBody = true;\n      var h = chunk.slice(0, split);\n      this._rawHeader.push(h);\n      var header = Buffer.concat(this._rawHeader).toString();\n      try {\n        this.header = JSON.parse(header);\n      } catch (er) {\n        this.emit('error', new Error('invalid simple protocol data'));\n        return;\n      }\n      // and let them know that we are done parsing the header.\n      this.emit('header', this.header);\n\n      // now, because we got some extra data, emit this first.\n      this.push(chunk.slice(split));\n    }\n  } else {\n    // from there on, just provide 
the data to our consumer as-is.\n    this.push(chunk);\n  }\n  done();\n};\n\nvar parser = new SimpleProtocol();\nsource.pipe(parser);\n\n// Now parser is a readable stream that will emit 'header'\n// with the parsed header data.\n```\n\n\n## Class: stream.PassThrough\n\nThis is a trivial implementation of a `Transform` stream that simply\npasses the input bytes across to the output.  Its purpose is mainly\nfor examples and testing, but there are occasionally use cases where\nit can come in handy.\n\n\n[EventEmitter]: events.html#events_class_events_eventemitter\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/readable-stream/issues"
-  },
-  "_id": "readable-stream@1.0.17",
-  "dist": {
-    "shasum": "cbc295fdf394dfa1225d225d02e6b6d0f409fd4b"
-  },
-  "_from": "readable-stream@1.0",
-  "_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-1.0.17.tgz"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/passthrough.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require("./lib/_stream_passthrough.js")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/readable.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-exports = module.exports = require('./lib/_stream_readable.js');
-exports.Readable = exports;
-exports.Writable = require('./lib/_stream_writable.js');
-exports.Duplex = require('./lib/_stream_duplex.js');
-exports.Transform = require('./lib/_stream_transform.js');
-exports.PassThrough = require('./lib/_stream_passthrough.js');
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/transform.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require("./lib/_stream_transform.js")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/writable.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports = require("./lib/_stream_writable.js")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/zlib.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,452 +0,0 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-
-var Transform = require('./lib/_stream_transform.js');
-
-var binding = process.binding('zlib');
-var util = require('util');
-var assert = require('assert').ok;
-
-// zlib doesn't provide these, so kludge them in following the same
-// const naming scheme zlib uses.
-binding.Z_MIN_WINDOWBITS = 8;
-binding.Z_MAX_WINDOWBITS = 15;
-binding.Z_DEFAULT_WINDOWBITS = 15;
-
-// fewer than 64 bytes per chunk is stupid.
-// technically it could work with as few as 8, but even 64 bytes
-// is absurdly low.  Usually a MB or more is best.
-binding.Z_MIN_CHUNK = 64;
-binding.Z_MAX_CHUNK = Infinity;
-binding.Z_DEFAULT_CHUNK = (16 * 1024);
-
-binding.Z_MIN_MEMLEVEL = 1;
-binding.Z_MAX_MEMLEVEL = 9;
-binding.Z_DEFAULT_MEMLEVEL = 8;
-
-binding.Z_MIN_LEVEL = -1;
-binding.Z_MAX_LEVEL = 9;
-binding.Z_DEFAULT_LEVEL = binding.Z_DEFAULT_COMPRESSION;
-
-// expose all the zlib constants
-Object.keys(binding).forEach(function(k) {
-  if (k.match(/^Z/)) exports[k] = binding[k];
-});
-
-// translation table for return codes.
-exports.codes = {
-  Z_OK: binding.Z_OK,
-  Z_STREAM_END: binding.Z_STREAM_END,
-  Z_NEED_DICT: binding.Z_NEED_DICT,
-  Z_ERRNO: binding.Z_ERRNO,
-  Z_STREAM_ERROR: binding.Z_STREAM_ERROR,
-  Z_DATA_ERROR: binding.Z_DATA_ERROR,
-  Z_MEM_ERROR: binding.Z_MEM_ERROR,
-  Z_BUF_ERROR: binding.Z_BUF_ERROR,
-  Z_VERSION_ERROR: binding.Z_VERSION_ERROR
-};
-
-Object.keys(exports.codes).forEach(function(k) {
-  exports.codes[exports.codes[k]] = k;
-});
-
-exports.Deflate = Deflate;
-exports.Inflate = Inflate;
-exports.Gzip = Gzip;
-exports.Gunzip = Gunzip;
-exports.DeflateRaw = DeflateRaw;
-exports.InflateRaw = InflateRaw;
-exports.Unzip = Unzip;
-
-exports.createDeflate = function(o) {
-  return new Deflate(o);
-};
-
-exports.createInflate = function(o) {
-  return new Inflate(o);
-};
-
-exports.createDeflateRaw = function(o) {
-  return new DeflateRaw(o);
-};
-
-exports.createInflateRaw = function(o) {
-  return new InflateRaw(o);
-};
-
-exports.createGzip = function(o) {
-  return new Gzip(o);
-};
-
-exports.createGunzip = function(o) {
-  return new Gunzip(o);
-};
-
-exports.createUnzip = function(o) {
-  return new Unzip(o);
-};
-
-
-// Convenience methods.
-// compress/decompress a string or buffer in one step.
-exports.deflate = function(buffer, callback) {
-  zlibBuffer(new Deflate(), buffer, callback);
-};
-
-exports.gzip = function(buffer, callback) {
-  zlibBuffer(new Gzip(), buffer, callback);
-};
-
-exports.deflateRaw = function(buffer, callback) {
-  zlibBuffer(new DeflateRaw(), buffer, callback);
-};
-
-exports.unzip = function(buffer, callback) {
-  zlibBuffer(new Unzip(), buffer, callback);
-};
-
-exports.inflate = function(buffer, callback) {
-  zlibBuffer(new Inflate(), buffer, callback);
-};
-
-exports.gunzip = function(buffer, callback) {
-  zlibBuffer(new Gunzip(), buffer, callback);
-};
-
-exports.inflateRaw = function(buffer, callback) {
-  zlibBuffer(new InflateRaw(), buffer, callback);
-};
-
-function zlibBuffer(engine, buffer, callback) {
-  var buffers = [];
-  var nread = 0;
-
-  engine.on('error', onError);
-  engine.on('end', onEnd);
-
-  engine.end(buffer);
-  flow();
-
-  function flow() {
-    var chunk;
-    while (null !== (chunk = engine.read())) {
-      buffers.push(chunk);
-      nread += chunk.length;
-    }
-    engine.once('readable', flow);
-  }
-
-  function onError(err) {
-    engine.removeListener('end', onEnd);
-    engine.removeListener('readable', flow);
-    callback(err);
-  }
-
-  function onEnd() {
-    var buf = Buffer.concat(buffers, nread);
-    buffers = [];
-    callback(null, buf);
-  }
-}
-
-
-// generic zlib
-// minimal 2-byte header
-function Deflate(opts) {
-  if (!(this instanceof Deflate)) return new Deflate(opts);
-  Zlib.call(this, opts, binding.DEFLATE);
-}
-
-function Inflate(opts) {
-  if (!(this instanceof Inflate)) return new Inflate(opts);
-  Zlib.call(this, opts, binding.INFLATE);
-}
-
-
-
-// gzip - bigger header, same deflate compression
-function Gzip(opts) {
-  if (!(this instanceof Gzip)) return new Gzip(opts);
-  Zlib.call(this, opts, binding.GZIP);
-}
-
-function Gunzip(opts) {
-  if (!(this instanceof Gunzip)) return new Gunzip(opts);
-  Zlib.call(this, opts, binding.GUNZIP);
-}
-
-
-
-// raw - no header
-function DeflateRaw(opts) {
-  if (!(this instanceof DeflateRaw)) return new DeflateRaw(opts);
-  Zlib.call(this, opts, binding.DEFLATERAW);
-}
-
-function InflateRaw(opts) {
-  if (!(this instanceof InflateRaw)) return new InflateRaw(opts);
-  Zlib.call(this, opts, binding.INFLATERAW);
-}
-
-
-// auto-detect header.
-function Unzip(opts) {
-  if (!(this instanceof Unzip)) return new Unzip(opts);
-  Zlib.call(this, opts, binding.UNZIP);
-}
-
-
-// the Zlib class they all inherit from
-// This thing manages the queue of requests, and returns
-// true or false if there is anything in the queue when
-// you call the .write() method.
-
-function Zlib(opts, mode) {
-  this._opts = opts = opts || {};
-  this._chunkSize = opts.chunkSize || exports.Z_DEFAULT_CHUNK;
-
-  Transform.call(this, opts);
-
-  // means a different thing there.
-  this._readableState.chunkSize = null;
-
-  if (opts.chunkSize) {
-    if (opts.chunkSize < exports.Z_MIN_CHUNK ||
-        opts.chunkSize > exports.Z_MAX_CHUNK) {
-      throw new Error('Invalid chunk size: ' + opts.chunkSize);
-    }
-  }
-
-  if (opts.windowBits) {
-    if (opts.windowBits < exports.Z_MIN_WINDOWBITS ||
-        opts.windowBits > exports.Z_MAX_WINDOWBITS) {
-      throw new Error('Invalid windowBits: ' + opts.windowBits);
-    }
-  }
-
-  if (opts.level) {
-    if (opts.level < exports.Z_MIN_LEVEL ||
-        opts.level > exports.Z_MAX_LEVEL) {
-      throw new Error('Invalid compression level: ' + opts.level);
-    }
-  }
-
-  if (opts.memLevel) {
-    if (opts.memLevel < exports.Z_MIN_MEMLEVEL ||
-        opts.memLevel > exports.Z_MAX_MEMLEVEL) {
-      throw new Error('Invalid memLevel: ' + opts.memLevel);
-    }
-  }
-
-  if (opts.strategy) {
-    if (opts.strategy != exports.Z_FILTERED &&
-        opts.strategy != exports.Z_HUFFMAN_ONLY &&
-        opts.strategy != exports.Z_RLE &&
-        opts.strategy != exports.Z_FIXED &&
-        opts.strategy != exports.Z_DEFAULT_STRATEGY) {
-      throw new Error('Invalid strategy: ' + opts.strategy);
-    }
-  }
-
-  if (opts.dictionary) {
-    if (!Buffer.isBuffer(opts.dictionary)) {
-      throw new Error('Invalid dictionary: it should be a Buffer instance');
-    }
-  }
-
-  this._binding = new binding.Zlib(mode);
-
-  var self = this;
-  this._hadError = false;
-  this._binding.onerror = function(message, errno) {
-    // there is no way to cleanly recover.
-    // continuing only obscures problems.
-    self._binding = null;
-    self._hadError = true;
-
-    var error = new Error(message);
-    error.errno = errno;
-    error.code = exports.codes[errno];
-    self.emit('error', error);
-  };
-
-  this._binding.init(opts.windowBits || exports.Z_DEFAULT_WINDOWBITS,
-                     opts.level || exports.Z_DEFAULT_COMPRESSION,
-                     opts.memLevel || exports.Z_DEFAULT_MEMLEVEL,
-                     opts.strategy || exports.Z_DEFAULT_STRATEGY,
-                     opts.dictionary);
-
-  this._buffer = new Buffer(this._chunkSize);
-  this._offset = 0;
-  this._closed = false;
-
-  this.once('end', this.close);
-}
-
-util.inherits(Zlib, Transform);
-
-Zlib.prototype.reset = function reset() {
-  return this._binding.reset();
-};
-
-Zlib.prototype._flush = function(output, callback) {
-  var rs = this._readableState;
-  var self = this;
-  this._transform(null, output, function(er) {
-    if (er)
-      return callback(er);
-
-    // now a weird thing happens... it could be that you called flush
-    // but everything had already actually been consumed, but it wasn't
-    // enough to get over the Readable class's lowWaterMark.
-    // In that case, we emit 'readable' now to make sure it's consumed.
-    if (rs.length &&
-        rs.length < rs.lowWaterMark &&
-        !rs.ended &&
-        rs.needReadable)
-      self.emit('readable');
-
-    callback();
-  });
-};
-
-Zlib.prototype.flush = function(callback) {
-  var ws = this._writableState;
-  var ts = this._transformState;
-
-  if (ws.writing) {
-    ws.needDrain = true;
-    var self = this;
-    this.once('drain', function() {
-      self._flush(ts.output, callback);
-    });
-    return;
-  }
-
-  this._flush(ts.output, callback || function() {});
-};
-
-Zlib.prototype.close = function(callback) {
-  if (callback)
-    process.nextTick(callback);
-
-  if (this._closed)
-    return;
-
-  this._closed = true;
-
-  this._binding.close();
-
-  var self = this;
-  process.nextTick(function() {
-    self.emit('close');
-  });
-};
-
-Zlib.prototype._transform = function(chunk, output, cb) {
-  var flushFlag;
-  var ws = this._writableState;
-  var ending = ws.ending || ws.ended;
-  var last = ending && (!chunk || ws.length === chunk.length);
-
-  if (chunk !== null && !Buffer.isBuffer(chunk))
-    return cb(new Error('invalid input'));
-
-  // If it's the last chunk, or a final flush, we use the Z_FINISH flush flag.
-  // If it's explicitly flushing at some other time, then we use
-  // Z_FULL_FLUSH. Otherwise, use Z_NO_FLUSH for maximum compression
-  // goodness.
-  if (last)
-    flushFlag = binding.Z_FINISH;
-  else if (chunk === null)
-    flushFlag = binding.Z_FULL_FLUSH;
-  else
-    flushFlag = binding.Z_NO_FLUSH;
-
-  var availInBefore = chunk && chunk.length;
-  var availOutBefore = this._chunkSize - this._offset;
-  var inOff = 0;
-
-  var req = this._binding.write(flushFlag,
-                                chunk, // in
-                                inOff, // in_off
-                                availInBefore, // in_len
-                                this._buffer, // out
-                                this._offset, //out_off
-                                availOutBefore); // out_len
-
-  req.buffer = chunk;
-  req.callback = callback;
-
-  var self = this;
-  function callback(availInAfter, availOutAfter, buffer) {
-    if (self._hadError)
-      return;
-
-    var have = availOutBefore - availOutAfter;
-    assert(have >= 0, 'have should not go down');
-
-    if (have > 0) {
-      var out = self._buffer.slice(self._offset, self._offset + have);
-      self._offset += have;
-      // serve some output to the consumer.
-      output(out);
-    }
-
-    // exhausted the output buffer, or used all the input create a new one.
-    if (availOutAfter === 0 || self._offset >= self._chunkSize) {
-      availOutBefore = self._chunkSize;
-      self._offset = 0;
-      self._buffer = new Buffer(self._chunkSize);
-    }
-
-    if (availOutAfter === 0) {
-      // Not actually done.  Need to reprocess.
-      // Also, update the availInBefore to the availInAfter value,
-      // so that if we have to hit it a third (fourth, etc.) time,
-      // it'll have the correct byte counts.
-      inOff += (availInBefore - availInAfter);
-      availInBefore = availInAfter;
-
-      var newReq = self._binding.write(flushFlag,
-                                       chunk,
-                                       inOff,
-                                       availInBefore,
-                                       self._buffer,
-                                       self._offset,
-                                       self._chunkSize);
-      newReq.callback = callback; // this same function
-      newReq.buffer = chunk;
-      return;
-    }
-
-    // finished with the chunk.
-    cb();
-  }
-};
-
-util.inherits(Deflate, Zlib);
-util.inherits(Inflate, Zlib);
-util.inherits(Gzip, Zlib);
-util.inherits(Gunzip, Zlib);
-util.inherits(DeflateRaw, Zlib);
-util.inherits(InflateRaw, Zlib);
-util.inherits(Unzip, Zlib);
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/sha/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,31 +0,0 @@
-{
-  "name": "sha",
-  "version": "1.2.3",
-  "description": "Check and get file hashes",
-  "scripts": {
-    "test": "mocha -R spec"
-  },
-  "repository": {
-    "type": "git",
-    "url": "https://github.com/ForbesLindesay/sha.git"
-  },
-  "license": "BSD",
-  "optionalDependencies": {
-    "graceful-fs": "2",
-    "readable-stream": "1.0"
-  },
-  "devDependencies": {
-    "mocha": "~1.9.0"
-  },
-  "readme": "# sha\r\n\r\nCheck and get file hashes (using any algorithm)\r\n\r\n[![Build Status](https://travis-ci.org/ForbesLindesay/sha.png?branch=master)](https://travis-ci.org/ForbesLindesay/sha)\r\n[![Dependency Status](https://gemnasium.com/ForbesLindesay/sha.png)](https://gemnasium.com/ForbesLindesay/sha)\r\n[![NPM version](https://badge.fury.io/js/sha.png)](http://badge.fury.io/js/sha)\r\n\r\n## Installation\r\n\r\n    $ npm install sha\r\n\r\n## API\r\n\r\n### check(fileName, expected, [options,] cb) / checkSync(filename, expected, [options])\r\n\r\nAsynchronously check that `fileName` has a \"hash\" of `expected`.  The callback will be called with either `null` or an error (indicating that they did not match).\r\n\r\nOptions:\r\n\r\n- algorithm: defaults to `sha1` and can be any of the algorithms supported by `crypto.createHash`\r\n\r\n### get(fileName, [options,] cb) / getSync(filename, [options])\r\n\r\nAsynchronously get the \"hash\" of `fileName`.  The callback will be called with an optional `error` object and the (lower cased) hex digest of the hash.\r\n\r\nOptions:\r\n\r\n- algorithm: defaults to `sha1` and can be any of the algorithms supported by `crypto.createHash`\r\n\r\n### stream(expected, [options])\r\n\r\nCheck the hash of a stream without ever buffering it.  This is a pass through stream so you can do things like:\r\n\r\n```js\r\nfs.createReadStream('src')\r\n  .pipe(sha.stream('expected'))\r\n  .pipe(fs.createWriteStream('dest'))\r\n```\r\n\r\n`dest` will be a complete copy of `src` and an error will be emitted if the hash did not match `'expected'`.\r\n\r\nOptions:\r\n\r\n- algorithm: defaults to `sha1` and can be any of the algorithms supported by `crypto.createHash`\r\n\r\n## License\r\n\r\nYou may use this software under the BSD or MIT.  Take your pick.  If you want me to release it under another license, open a pull request.",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/ForbesLindesay/sha/issues"
-  },
-  "dependencies": {
-    "graceful-fs": "2",
-    "readable-stream": "1.0"
-  },
-  "_id": "sha@1.2.3",
-  "_from": "sha@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,15 +0,0 @@
-The ISC License
-
-Copyright (c) Isaac Z. Schlueter
-
-Permission to use, copy, modify, and/or distribute this software for any
-purpose with or without fee is hereby granted, provided that the above
-copyright notice and this permission notice appear in all copies.
-
-THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
-WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
-MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
-ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
-WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
-ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR
-IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,143 +0,0 @@
-# Controlling Flow: callbacks are easy
-
-## What's actually hard?
-
-- Doing a bunch of things in a specific order.
-- Knowing when stuff is done.
-- Handling failures.
-- Breaking up functionality into parts (avoid nested inline callbacks)
-
-
-## Common Mistakes
-
-- Abandoning convention and consistency.
-- Putting all callbacks inline.
-- Using libraries without grokking them.
-- Trying to make async code look sync.
-
-## Define Conventions
-
-- Two kinds of functions: *actors* take action, *callbacks* get results.
-- Essentially the continuation pattern. Resulting code *looks* similar
-  to fibers, but is *much* simpler to implement.
-- Node works this way in the lowlevel APIs already, and it's very flexible.
-
-## Callbacks
-
-- Simple responders
-- Must always be prepared to handle errors, that's why it's the first argument.
-- Often inline anonymous, but not always.
-- Can trap and call other callbacks with modified data, or pass errors upwards.
-
-## Actors
-
-- Last argument is a callback.
-- If any error occurs, and can't be handled, pass it to the callback and return.
-- Must not throw. Return value ignored.
-- return x ==> return cb(null, x)
-- throw er ==> return cb(er)
-
-```javascript
-// return true if a path is either
-// a symlink or a directory.
-function isLinkOrDir (path, cb) {
-  fs.lstat(path, function (er, s) {
-    if (er) return cb(er)
-    return cb(null, s.isDirectory() || s.isSymbolicLink())
-  })
-}
-```
-
-# asyncMap
-
-## Usecases
-
-- I have a list of 10 files, and need to read all of them, and then continue when they're all done.
-- I have a dozen URLs, and need to fetch them all, and then continue when they're all done.
-- I have 4 connected users, and need to send a message to all of them, and then continue when that's done.
-- I have a list of n things, and I need to dosomething with all of them, in parallel, and get the results once they're all complete.
-
-
-## Solution
-
-```javascript
-var asyncMap = require("slide").asyncMap
-function writeFiles (files, what, cb) {
-  asyncMap(files, function (f, cb) {
-    fs.writeFile(f, what, cb)
-  }, cb)
-}
-writeFiles([my, file, list], "foo", cb)
-```
-
-# chain
-
-## Usecases
-
-- I have to do a bunch of things, in order. Get db credentials out of a file,
-  read the data from the db, write that data to another file.
-- If anything fails, do not continue.
-- I still have to provide an array of functions, which is a lot of boilerplate,
-  and a pita if your functions take args like
-
-```javascript
-function (cb) {
-  blah(a, b, c, cb)
-}
-```
-
-- Results are discarded, which is a bit lame.
-- No way to branch.
-
-## Solution
-
-- reduces boilerplate by converting an array of [fn, args] to an actor
-  that takes no arguments (except cb)
-- A bit like Function#bind, but tailored for our use-case.
-- bindActor(obj, "method", a, b, c)
-- bindActor(fn, a, b, c)
-- bindActor(obj, fn, a, b, c)
-- branching, skipping over falsey arguments
-
-```javascript
-chain([
-  doThing && [thing, a, b, c]
-, isFoo && [doFoo, "foo"]
-, subChain && [chain, [one, two]]
-], cb)
-```
-
-- tracking results: results are stored in an optional array passed as argument,
-  last result is always in results[results.length - 1].
-- treat chain.first and chain.last as placeholders for the first/last
-  result up until that point.
-
-
-## Non-trivial example
-
-- Read number files in a directory
-- Add the results together
-- Ping a web service with the result
-- Write the response to a file
-- Delete the number files
-
-```javascript
-var chain = require("slide").chain
-function myProgram (cb) {
-  var res = [], last = chain.last, first = chain.first
-  chain([
-    [fs, "readdir", "the-directory"]
-  , [readFiles, "the-directory", last]
-  , [sum, last]
-  , [ping, "POST", "example.com", 80, "/foo", last]
-  , [fs, "writeFile", "result.txt", last]
-  , [rmFiles, "./the-directory", first]
-  ], res, cb)
-}
-```
-
-# Conclusion: Convention Profits
-
-- Consistent API from top to bottom.
-- Sneak in at any point to inject functionality. Testable, reusable, ...
-- When ruby and python users whine, you can smile condescendingly.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/index.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,1 +0,0 @@
-module.exports=require("./lib/slide")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/lib/async-map-ordered.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,65 +0,0 @@
-
-throw new Error("TODO: Not yet implemented.")
-
-/*
-usage:
-
-Like asyncMap, but only can take a single cb, and guarantees
-the order of the results.
-*/
-
-module.exports = asyncMapOrdered
-
-function asyncMapOrdered (list, fn, cb_) {
-  if (typeof cb_ !== "function") throw new Error(
-    "No callback provided to asyncMapOrdered")
-
-  if (typeof fn !== "function") throw new Error(
-    "No map function provided to asyncMapOrdered")
-
-  if (list === undefined || list === null) return cb_(null, [])
-  if (!Array.isArray(list)) list = [list]
-  if (!list.length) return cb_(null, [])
-
-  var errState = null
-    , l = list.length
-    , a = l
-    , res = []
-    , resCount = 0
-    , maxArgLen = 0
-
-  function cb (index) { return function () {
-    if (errState) return
-    var er = arguments[0]
-    var argLen = arguments.length
-    maxArgLen = Math.max(maxArgLen, argLen)
-    res[index] = argLen === 1 ? [er] : Array.apply(null, arguments)
-
-    // see if any new things have been added.
-    if (list.length > l) {
-      var newList = list.slice(l)
-      a += (list.length - l)
-      var oldLen = l
-      l = list.length
-      process.nextTick(function () {
-        newList.forEach(function (ar, i) { fn(ar, cb(i + oldLen)) })
-      })
-    }
-
-    if (er || --a === 0) {
-      errState = er
-      cb_.apply(null, [errState].concat(flip(res, resCount, maxArgLen)))
-    }
-  }}
-  // expect the supplied cb function to be called
-  // "n" times for each thing in the array.
-  list.forEach(function (ar) {
-    steps.forEach(function (fn, i) { fn(ar, cb(i)) })
-  })
-}
-
-function flip (res, resCount, argLen) {
-  var flat = []
-  // res = [[er, x, y], [er, x1, y1], [er, x2, y2, z2]]
-  // return [[x, x1, x2], [y, y1, y2], [undefined, undefined, z2]]
-  
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/lib/async-map.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,56 +0,0 @@
-
-/*
-usage:
-
-// do something to a list of things
-asyncMap(myListOfStuff, function (thing, cb) { doSomething(thing.foo, cb) }, cb)
-// do more than one thing to each item
-asyncMap(list, fooFn, barFn, cb)
-
-*/
-
-module.exports = asyncMap
-
-function asyncMap () {
-  var steps = Array.prototype.slice.call(arguments)
-    , list = steps.shift() || []
-    , cb_ = steps.pop()
-  if (typeof cb_ !== "function") throw new Error(
-    "No callback provided to asyncMap")
-  if (!list) return cb_(null, [])
-  if (!Array.isArray(list)) list = [list]
-  var n = steps.length
-    , data = [] // 2d array
-    , errState = null
-    , l = list.length
-    , a = l * n
-  if (!a) return cb_(null, [])
-  function cb (er) {
-    if (errState) return
-    var argLen = arguments.length
-    for (var i = 1; i < argLen; i ++) if (arguments[i] !== undefined) {
-      data[i - 1] = (data[i - 1] || []).concat(arguments[i])
-    }
-    // see if any new things have been added.
-    if (list.length > l) {
-      var newList = list.slice(l)
-      a += (list.length - l) * n
-      l = list.length
-      process.nextTick(function () {
-        newList.forEach(function (ar) {
-          steps.forEach(function (fn) { fn(ar, cb) })
-        })
-      })
-    }
-
-    if (er || --a === 0) {
-      errState = er
-      cb_.apply(null, [errState].concat(data))
-    }
-  }
-  // expect the supplied cb function to be called
-  // "n" times for each thing in the array.
-  list.forEach(function (ar) {
-    steps.forEach(function (fn) { fn(ar, cb) })
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/lib/bind-actor.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,16 +0,0 @@
-module.exports = bindActor
-function bindActor () {
-  var args = 
-        Array.prototype.slice.call
-        (arguments) // jswtf.
-    , obj = null
-    , fn
-  if (typeof args[0] === "object") {
-    obj = args.shift()
-    fn = args.shift()
-    if (typeof fn === "string")
-      fn = obj[ fn ]
-  } else fn = args.shift()
-  return function (cb) {
-    fn.apply(obj, args.concat(cb)) }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/lib/chain.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,20 +0,0 @@
-module.exports = chain
-var bindActor = require("./bind-actor.js")
-chain.first = {} ; chain.last = {}
-function chain (things, cb) {
-  var res = []
-  ;(function LOOP (i, len) {
-    if (i >= len) return cb(null,res)
-    if (Array.isArray(things[i]))
-      things[i] = bindActor.apply(null,
-        things[i].map(function(i){
-          return (i===chain.first) ? res[0]
-           : (i===chain.last)
-             ? res[res.length - 1] : i }))
-    if (!things[i]) return LOOP(i + 1, len)
-    things[i](function (er, data) {
-      if (er) return cb(er, res)
-      if (data !== undefined) res = res.concat(data)
-      LOOP(i + 1, len)
-    })
-  })(0, things.length) }
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/lib/slide.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,3 +0,0 @@
-exports.asyncMap = require("./async-map")
-exports.bindActor = require("./bind-actor")
-exports.chain = require("./chain")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/slide/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-{
-  "name": "slide",
-  "version": "1.1.5",
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "contributors": [
-    {
-      "name": "S. Sriram",
-      "email": "ssriram@gmail.com",
-      "url": "http://www.565labs.com"
-    }
-  ],
-  "description": "A flow control lib small enough to fit on in a slide presentation. Derived live at Oak.JS",
-  "main": "./lib/slide.js",
-  "dependencies": {},
-  "devDependencies": {},
-  "engines": {
-    "node": "*"
-  },
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/slide-flow-control.git"
-  },
-  "license": "ISC",
-  "readme": "# Controlling Flow: callbacks are easy\n\n## What's actually hard?\n\n- Doing a bunch of things in a specific order.\n- Knowing when stuff is done.\n- Handling failures.\n- Breaking up functionality into parts (avoid nested inline callbacks)\n\n\n## Common Mistakes\n\n- Abandoning convention and consistency.\n- Putting all callbacks inline.\n- Using libraries without grokking them.\n- Trying to make async code look sync.\n\n## Define Conventions\n\n- Two kinds of functions: *actors* take action, *callbacks* get results.\n- Essentially the continuation pattern. Resulting code *looks* similar\n  to fibers, but is *much* simpler to implement.\n- Node works this way in the lowlevel APIs already, and it's very flexible.\n\n## Callbacks\n\n- Simple responders\n- Must always be prepared to handle errors, that's why it's the first argument.\n- Often inline anonymous, but not always.\n- Can trap and call other callbacks with modified data, or pass errors upwards.\n\n## Actors\n\n- Last argument is a callback.\n- If any error occurs, and can't be handled, pass it to the callback and return.\n- Must not throw. Return value ignored.\n- return x ==> return cb(null, x)\n- throw er ==> return cb(er)\n\n```javascript\n// return true if a path is either\n// a symlink or a directory.\nfunction isLinkOrDir (path, cb) {\n  fs.lstat(path, function (er, s) {\n    if (er) return cb(er)\n    return cb(null, s.isDirectory() || s.isSymbolicLink())\n  })\n}\n```\n\n# asyncMap\n\n## Usecases\n\n- I have a list of 10 files, and need to read all of them, and then continue when they're all done.\n- I have a dozen URLs, and need to fetch them all, and then continue when they're all done.\n- I have 4 connected users, and need to send a message to all of them, and then continue when that's done.\n- I have a list of n things, and I need to dosomething with all of them, in parallel, and get the results once they're all complete.\n\n\n## Solution\n\n```javascript\nvar asyncMap = require(\"slide\").asyncMap\nfunction writeFiles (files, what, cb) {\n  asyncMap(files, function (f, cb) {\n    fs.writeFile(f, what, cb)\n  }, cb)\n}\nwriteFiles([my, file, list], \"foo\", cb)\n```\n\n# chain\n\n## Usecases\n\n- I have to do a bunch of things, in order. Get db credentials out of a file,\n  read the data from the db, write that data to another file.\n- If anything fails, do not continue.\n- I still have to provide an array of functions, which is a lot of boilerplate,\n  and a pita if your functions take args like\n\n```javascript\nfunction (cb) {\n  blah(a, b, c, cb)\n}\n```\n\n- Results are discarded, which is a bit lame.\n- No way to branch.\n\n## Solution\n\n- reduces boilerplate by converting an array of [fn, args] to an actor\n  that takes no arguments (except cb)\n- A bit like Function#bind, but tailored for our use-case.\n- bindActor(obj, \"method\", a, b, c)\n- bindActor(fn, a, b, c)\n- bindActor(obj, fn, a, b, c)\n- branching, skipping over falsey arguments\n\n```javascript\nchain([\n  doThing && [thing, a, b, c]\n, isFoo && [doFoo, \"foo\"]\n, subChain && [chain, [one, two]]\n], cb)\n```\n\n- tracking results: results are stored in an optional array passed as argument,\n  last result is always in results[results.length - 1].\n- treat chain.first and chain.last as placeholders for the first/last\n  result up until that point.\n\n\n## Non-trivial example\n\n- Read number files in a directory\n- Add the results together\n- Ping a web service with the result\n- Write the response to a file\n- Delete the number files\n\n```javascript\nvar chain = require(\"slide\").chain\nfunction myProgram (cb) {\n  var res = [], last = chain.last, first = chain.first\n  chain([\n    [fs, \"readdir\", \"the-directory\"]\n  , [readFiles, \"the-directory\", last]\n  , [sum, last]\n  , [ping, \"POST\", \"example.com\", 80, \"/foo\", last]\n  , [fs, \"writeFile\", \"result.txt\", last]\n  , [rmFiles, \"./the-directory\", first]\n  ], res, cb)\n}\n```\n\n# Conclusion: Convention Profits\n\n- Consistent API from top to bottom.\n- Sneak in at any point to inject functionality. Testable, reusable, ...\n- When ruby and python users whine, you can smile condescendingly.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/slide-flow-control/issues"
-  },
-  "_id": "slide@1.1.5",
-  "_from": "slide@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/.npmignore	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-.*.swp
-node_modules
-examples/extract/
-test/tmp/
-test/fixtures/
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/.travis.yml	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,3 +0,0 @@
-language: node_js
-node_js:
-  - 0.6
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/LICENCE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-Copyright (c) Isaac Z. Schlueter
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE NETBSD FOUNDATION, INC. AND CONTRIBUTORS
-``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED
-TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE FOUNDATION OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
-INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
-CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
-ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
-POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,47 +0,0 @@
-# node-tar
-
-Tar for Node.js.
-
-## Goals of this project
-
-1. Be able to parse and reasonably extract the contents of any tar file
-   created by any program that creates tar files, period.
-
-        At least, this includes every version of:
-
-        * bsdtar
-        * gnutar
-        * solaris posix tar
-        * Joerg Schilling's star ("Schilly tar")
-
-2. Create tar files that can be extracted by any of the following tar programs:
-
-        * bsdtar/libarchive version 2.6.2
-        * gnutar 1.15 and above
-        * SunOS Posix tar
-        * Joerg Schilling's star ("Schilly tar")
-
-3. 100% test coverage.  Speed is important.  Correctness is slightly more important.
-
-4. Create the kind of tar interface that Node users would want to use.
-
-5. Satisfy npm's needs for a portable tar implementation with a JavaScript interface.
-
-6. No excuses.  No complaining.  No tolerance for failure.
-
-## But isn't there already a tar.js?
-
-Yes, there are a few.  This one is going to be better, and it will be
-fanatically maintained, because npm will depend on it.
-
-That's why I need to write it from scratch.  Creating and extracting
-tarballs is such a large part of what npm does, I simply can't have it
-be a black box any longer.
-
-## Didn't you have something already?  Where'd it go?
-
-It's in the "old" folder.  It's not functional.  Don't use it.
-
-It was a useful exploration to learn the issues involved, but like most
-software of any reasonable complexity, node-tar won't be useful until
-it's been written at least 3 times.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/examples/extracter.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,11 +0,0 @@
-var tar = require("../tar.js")
-  , fs = require("fs")
-
-fs.createReadStream(__dirname + "/../test/fixtures/c.tar")
-  .pipe(tar.Extract({ path: __dirname + "/extract" }))
-  .on("error", function (er) {
-    console.error("error here")
-  })
-  .on("end", function () {
-    console.error("done")
-  })
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/examples/reader.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,36 +0,0 @@
-var tar = require("../tar.js")
-  , fs = require("fs")
-
-fs.createReadStream(__dirname + "/../test/fixtures/c.tar")
-  .pipe(tar.Parse())
-  .on("extendedHeader", function (e) {
-    console.error("extended pax header", e.props)
-    e.on("end", function () {
-      console.error("extended pax fields:", e.fields)
-    })
-  })
-  .on("ignoredEntry", function (e) {
-    console.error("ignoredEntry?!?", e.props)
-  })
-  .on("longLinkpath", function (e) {
-    console.error("longLinkpath entry", e.props)
-    e.on("end", function () {
-      console.error("value=%j", e.body.toString())
-    })
-  })
-  .on("longPath", function (e) {
-    console.error("longPath entry", e.props)
-    e.on("end", function () {
-      console.error("value=%j", e.body.toString())
-    })
-  })
-  .on("entry", function (e) {
-    console.error("entry", e.props)
-    e.on("data", function (c) {
-      console.error("  >>>" + c.toString().replace(/\n/g, "\\n"))
-    })
-    e.on("end", function () {
-      console.error("  <<<EOF")
-    })
-  })
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/buffer-entry.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,30 +0,0 @@
-// just like the Entry class, but it buffers the contents
-//
-// XXX It would be good to set a maximum BufferEntry filesize,
-// since it eats up memory.  In normal operation,
-// these are only for long filenames or link names, which are
-// rarely very big.
-
-module.exports = BufferEntry
-
-var inherits = require("inherits")
-  , Entry = require("./entry.js")
-
-function BufferEntry () {
-  Entry.apply(this, arguments)
-  this._buffer = new Buffer(this.props.size)
-  this._offset = 0
-  this.body = ""
-  this.on("end", function () {
-    this.body = this._buffer.toString().slice(0, -1)
-  })
-}
-
-inherits(BufferEntry, Entry)
-
-// collect the bytes as they come in.
-BufferEntry.prototype.write = function (c) {
-  c.copy(this._buffer, this._offset)
-  this._offset += c.length
-  Entry.prototype.write.call(this, c)
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/entry-writer.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,169 +0,0 @@
-module.exports = EntryWriter
-
-var tar = require("../tar.js")
-  , TarHeader = require("./header.js")
-  , Entry = require("./entry.js")
-  , inherits = require("inherits")
-  , BlockStream = require("block-stream")
-  , ExtendedHeaderWriter
-  , Stream = require("stream").Stream
-  , EOF = {}
-
-inherits(EntryWriter, Stream)
-
-function EntryWriter (props) {
-  var me = this
-
-  if (!(me instanceof EntryWriter)) {
-    return new EntryWriter(props)
-  }
-
-  Stream.apply(this)
-
-  me.writable = true
-  me.readable = true
-
-  me._stream = new BlockStream(512)
-
-  me._stream.on("data", function (c) {
-    me.emit("data", c)
-  })
-
-  me._stream.on("drain", function () {
-    me.emit("drain")
-  })
-
-  me._stream.on("end", function () {
-    me.emit("end")
-    me.emit("close")
-  })
-
-  me.props = props
-  if (props.type === "Directory") {
-    props.size = 0
-  }
-  props.ustar = "ustar\0"
-  props.ustarver = "00"
-  me.path = props.path
-
-  me._buffer = []
-  me._didHeader = false
-  me._meta = false
-
-  me.on("pipe", function () {
-    me._process()
-  })
-}
-
-EntryWriter.prototype.write = function (c) {
-  // console.error(".. ew write")
-  if (this._ended) return this.emit("error", new Error("write after end"))
-  this._buffer.push(c)
-  this._process()
-  this._needDrain = this._buffer.length > 0
-  return !this._needDrain
-}
-
-EntryWriter.prototype.end = function (c) {
-  // console.error(".. ew end")
-  if (c) this._buffer.push(c)
-  this._buffer.push(EOF)
-  this._ended = true
-  this._process()
-  this._needDrain = this._buffer.length > 0
-}
-
-EntryWriter.prototype.pause = function () {
-  // console.error(".. ew pause")
-  this._paused = true
-  this.emit("pause")
-}
-
-EntryWriter.prototype.resume = function () {
-  // console.error(".. ew resume")
-  this._paused = false
-  this.emit("resume")
-  this._process()
-}
-
-EntryWriter.prototype.add = function (entry) {
-  // console.error(".. ew add")
-  if (!this.parent) return this.emit("error", new Error("no parent"))
-
-  // make sure that the _header and such is emitted, and clear out
-  // the _currentEntry link on the parent.
-  if (!this._ended) this.end()
-
-  return this.parent.add(entry)
-}
-
-EntryWriter.prototype._header = function () {
-  // console.error(".. ew header")
-  if (this._didHeader) return
-  this._didHeader = true
-
-  var headerBlock = TarHeader.encode(this.props)
-
-  if (this.props.needExtended && !this._meta) {
-    var me = this
-
-    ExtendedHeaderWriter = ExtendedHeaderWriter ||
-      require("./extended-header-writer.js")
-
-    ExtendedHeaderWriter(this.props)
-      .on("data", function (c) {
-        me.emit("data", c)
-      })
-      .on("error", function (er) {
-        me.emit("error", er)
-      })
-      .end()
-  }
-
-  // console.error(".. .. ew headerBlock emitting")
-  this.emit("data", headerBlock)
-  this.emit("header")
-}
-
-EntryWriter.prototype._process = function () {
-  // console.error(".. .. ew process")
-  if (!this._didHeader && !this._meta) {
-    this._header()
-  }
-
-  if (this._paused || this._processing) {
-    // console.error(".. .. .. paused=%j, processing=%j", this._paused, this._processing)
-    return
-  }
-
-  this._processing = true
-
-  var buf = this._buffer
-  for (var i = 0; i < buf.length; i ++) {
-    // console.error(".. .. .. i=%d", i)
-
-    var c = buf[i]
-
-    if (c === EOF) this._stream.end()
-    else this._stream.write(c)
-
-    if (this._paused) {
-      // console.error(".. .. .. paused mid-emission")
-      this._processing = false
-      if (i < buf.length) {
-        this._needDrain = true
-        this._buffer = buf.slice(i + 1)
-      }
-      return
-    }
-  }
-
-  // console.error(".. .. .. emitted")
-  this._buffer.length = 0
-  this._processing = false
-
-  // console.error(".. .. .. emitting drain")
-  this.emit("drain")
-}
-
-EntryWriter.prototype.destroy = function () {}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/entry.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,213 +0,0 @@
-// A passthrough read/write stream that sets its properties
-// based on a header, extendedHeader, and globalHeader
-//
-// Can be either a file system object of some sort, or
-// a pax/ustar metadata entry.
-
-module.exports = Entry
-
-var TarHeader = require("./header.js")
-  , tar = require("../tar")
-  , assert = require("assert").ok
-  , Stream = require("stream").Stream
-  , inherits = require("inherits")
-  , fstream = require("fstream").Abstract
-
-function Entry (header, extended, global) {
-  Stream.call(this)
-  this.readable = true
-  this.writable = true
-
-  this._needDrain = false
-  this._paused = false
-  this._reading = false
-  this._ending = false
-  this._ended = false
-  this._remaining = 0
-  this._queue = []
-  this._index = 0
-  this._queueLen = 0
-
-  this._read = this._read.bind(this)
-
-  this.props = {}
-  this._header = header
-  this._extended = extended || {}
-
-  // globals can change throughout the course of
-  // a file parse operation.  Freeze it at its current state.
-  this._global = {}
-  var me = this
-  Object.keys(global || {}).forEach(function (g) {
-    me._global[g] = global[g]
-  })
-
-  this._setProps()
-}
-
-inherits(Entry, Stream)
-
-Entry.prototype.write = function (c) {
-  if (this._ending) this.error("write() after end()", null, true)
-  if (this._remaining === 0) {
-    this.error("invalid bytes past eof")
-  }
-
-  // often we'll get a bunch of \0 at the end of the last write,
-  // since chunks will always be 512 bytes when reading a tarball.
-  if (c.length > this._remaining) {
-    c = c.slice(0, this._remaining)
-  }
-  this._remaining -= c.length
-
-  // put it on the stack.
-  var ql = this._queueLen
-  this._queue.push(c)
-  this._queueLen ++
-
-  this._read()
-
-  // either paused, or buffered
-  if (this._paused || ql > 0) {
-    this._needDrain = true
-    return false
-  }
-
-  return true
-}
-
-Entry.prototype.end = function (c) {
-  if (c) this.write(c)
-  this._ending = true
-  this._read()
-}
-
-Entry.prototype.pause = function () {
-  this._paused = true
-  this.emit("pause")
-}
-
-Entry.prototype.resume = function () {
-  // console.error("    Tar Entry resume", this.path)
-  this.emit("resume")
-  this._paused = false
-  this._read()
-  return this._queueLen - this._index > 1
-}
-
-  // This is bound to the instance
-Entry.prototype._read = function () {
-  // console.error("    Tar Entry _read", this.path)
-
-  if (this._paused || this._reading || this._ended) return
-
-  // set this flag so that event handlers don't inadvertently
-  // get multiple _read() calls running.
-  this._reading = true
-
-  // have any data to emit?
-  while (this._index < this._queueLen && !this._paused) {
-    var chunk = this._queue[this._index ++]
-    this.emit("data", chunk)
-  }
-
-  // check if we're drained
-  if (this._index >= this._queueLen) {
-    this._queue.length = this._queueLen = this._index = 0
-    if (this._needDrain) {
-      this._needDrain = false
-      this.emit("drain")
-    }
-    if (this._ending) {
-      this._ended = true
-      this.emit("end")
-    }
-  }
-
-  // if the queue gets too big, then pluck off whatever we can.
-  // this should be fairly rare.
-  var mql = this._maxQueueLen
-  if (this._queueLen > mql && this._index > 0) {
-    mql = Math.min(this._index, mql)
-    this._index -= mql
-    this._queueLen -= mql
-    this._queue = this._queue.slice(mql)
-  }
-
-  this._reading = false
-}
-
-Entry.prototype._setProps = function () {
-  // props = extended->global->header->{}
-  var header = this._header
-    , extended = this._extended
-    , global = this._global
-    , props = this.props
-
-  // first get the values from the normal header.
-  var fields = tar.fields
-  for (var f = 0; fields[f] !== null; f ++) {
-    var field = fields[f]
-      , val = header[field]
-    if (typeof val !== "undefined") props[field] = val
-  }
-
-  // next, the global header for this file.
-  // numeric values, etc, will have already been parsed.
-  ;[global, extended].forEach(function (p) {
-    Object.keys(p).forEach(function (f) {
-      if (typeof p[f] !== "undefined") props[f] = p[f]
-    })
-  })
-
-  // no nulls allowed in path or linkpath
-  ;["path", "linkpath"].forEach(function (p) {
-    if (props.hasOwnProperty(p)) {
-      props[p] = props[p].split("\0")[0]
-    }
-  })
-
-
-  // set date fields to be a proper date
-  ;["mtime", "ctime", "atime"].forEach(function (p) {
-    if (props.hasOwnProperty(p)) {
-      props[p] = new Date(props[p] * 1000)
-    }
-  })
-
-  // set the type so that we know what kind of file to create
-  var type
-  switch (tar.types[props.type]) {
-    case "OldFile":
-    case "ContiguousFile":
-      type = "File"
-      break
-
-    case "GNUDumpDir":
-      type = "Directory"
-      break
-
-    case undefined:
-      type = "Unknown"
-      break
-
-    case "Link":
-    case "SymbolicLink":
-    case "CharacterDevice":
-    case "BlockDevice":
-    case "Directory":
-    case "FIFO":
-    default:
-      type = tar.types[props.type]
-  }
-
-  this.type = type
-  this.path = props.path
-  this.size = props.size
-
-  // size is special, since it signals when the file needs to end.
-  this._remaining = props.size
-}
-
-Entry.prototype.warn = fstream.warn
-Entry.prototype.error = fstream.error
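The queue/pause/drain dance in the deleted Entry above follows a common classic-streams pattern. A minimal standalone sketch of the same backpressure logic (a hypothetical helper, not the fstream API):

```javascript
// Sketch of Entry's buffering: write() queues a chunk and tries to flush;
// flushing stops while paused and continues on resume(). The boolean
// returned by write() tells the producer to wait, as in Entry above.
function PausableQueue () {
  this.queue = []
  this.paused = false
  this.out = []          // stands in for "data" listeners
}

PausableQueue.prototype.write = function (chunk) {
  this.queue.push(chunk)
  this._read()
  return !this.paused    // false tells the writer to back off
}

PausableQueue.prototype.pause = function () { this.paused = true }

PausableQueue.prototype.resume = function () {
  this.paused = false
  this._read()
}

PausableQueue.prototype._read = function () {
  while (this.queue.length && !this.paused) this.out.push(this.queue.shift())
}
```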
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/extended-header-writer.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,191 +0,0 @@
-
-module.exports = ExtendedHeaderWriter
-
-var inherits = require("inherits")
-  , EntryWriter = require("./entry-writer.js")
-
-inherits(ExtendedHeaderWriter, EntryWriter)
-
-var tar = require("../tar.js")
-  , path = require("path")
-  , TarHeader = require("./header.js")
-
-// props is the props of the thing we need to write an
-// extended header for.
-// Don't be shy with it.  Just encode everything.
-function ExtendedHeaderWriter (props) {
-  // console.error(">> ehw ctor")
-  var me = this
-
-  if (!(me instanceof ExtendedHeaderWriter)) {
-    return new ExtendedHeaderWriter(props)
-  }
-
-  me.fields = props
-
-  var p =
-    { path : ("PaxHeader" + path.join("/", props.path || ""))
-             .replace(/\\/g, "/").substr(0, 100)
-    , mode : props.mode || 0666
-    , uid : props.uid || 0
-    , gid : props.gid || 0
-    , size : 0 // will be set later
-    , mtime : props.mtime || Date.now() / 1000
-    , type : "x"
-    , linkpath : ""
-    , ustar : "ustar\0"
-    , ustarver : "00"
-    , uname : props.uname || ""
-    , gname : props.gname || ""
-    , devmaj : props.devmaj || 0
-    , devmin : props.devmin || 0
-    }
-
-
-  EntryWriter.call(me, p)
-  // console.error(">> ehw props", me.props)
-  me.props = p
-
-  me._meta = true
-}
-
-ExtendedHeaderWriter.prototype.end = function () {
-  // console.error(">> ehw end")
-  var me = this
-
-  if (me._ended) return
-  me._ended = true
-
-  me._encodeFields()
-
-  if (me.props.size === 0) {
-    // nothing to write!
-    me._ready = true
-    me._stream.end()
-    return
-  }
-
-  me._stream.write(TarHeader.encode(me.props))
-  me.body.forEach(function (l) {
-    me._stream.write(l)
-  })
-  me._ready = true
-
-  // console.error(">> ehw _process calling end()", me.props)
-  this._stream.end()
-}
-
-ExtendedHeaderWriter.prototype._encodeFields = function () {
-  // console.error(">> ehw _encodeFields")
-  this.body = []
-  if (this.fields.prefix) {
-    this.fields.path = this.fields.prefix + "/" + this.fields.path
-    this.fields.prefix = ""
-  }
-  encodeFields(this.fields, "", this.body, this.fields.noProprietary)
-  var me = this
-  this.body.forEach(function (l) {
-    me.props.size += l.length
-  })
-}
-
-function encodeFields (fields, prefix, body, nop) {
-  // console.error(">> >> ehw encodeFields")
-  // "%d %s=%s\n", <length>, <keyword>, <value>
-  // The length is a decimal number, and includes itself and the \n
-  // Numeric values are decimal strings.
-
-  Object.keys(fields).forEach(function (k) {
-    var val = fields[k]
-      , numeric = tar.numeric[k]
-
-    if (prefix) k = prefix + "." + k
-
-    // already including NODETAR.type, don't need File=true also
-    if (k === fields.type && val === true) return
-
-    switch (k) {
-      // don't include anything that's always handled just fine
-      // in the normal header, or only meaningful in the context
-      // of nodetar
-      case "mode":
-      case "cksum":
-      case "ustar":
-      case "ustarver":
-      case "prefix":
-      case "basename":
-      case "dirname":
-      case "needExtended":
-      case "block":
-      case "filter":
-        return
-
-      case "rdev":
-        if (val === 0) return
-        break
-
-      case "nlink":
-      case "dev": // Truly a hero among men, Creator of Star!
-      case "ino": // Speak his name with reverent awe!  It is:
-        k = "SCHILY." + k
-        break
-
-      default: break
-    }
-
-    if (val && typeof val === "object" &&
-        !Buffer.isBuffer(val)) encodeFields(val, k, body, nop)
-    else if (val === null || val === undefined) return
-    else body.push.apply(body, encodeField(k, val, nop))
-  })
-
-  return body
-}
-
-function encodeField (k, v, nop) {
-  // lowercase keys must be valid, otherwise prefix with
-  // "NODETAR."
-  if (k.charAt(0) === k.charAt(0).toLowerCase()) {
-    var m = k.split(".")[0]
-    if (!tar.knownExtended[m]) k = "NODETAR." + k
-  }
-
-  // no proprietary
-  if (nop && k.charAt(0) !== k.charAt(0).toLowerCase()) {
-    return []
-  }
-
-  if (typeof v === "number") v = v.toString(10)
-
-  var s = new Buffer(" " + k + "=" + v + "\n")
-    , digits = Math.floor(Math.log(s.length) / Math.log(10)) + 1
-
-  // console.error("1 s=%j digits=%j s.length=%d", s.toString(), digits, s.length)
-
-  // if adding that many digits will make it go over that length,
-  // then add one to it. For example, if the string is:
-  // " foo=bar\n"
-  // then that's 9 characters.  With the "9", that bumps the length
-  // up to 10.  However, this is invalid:
-  // "10 foo=bar\n"
-  // but, since that's actually 11 characters, since 10 adds another
-  // character to the length, and the length includes the number
-  // itself.  In that case, just bump it up again.
-  if (s.length + digits >= Math.pow(10, digits)) digits += 1
-  // console.error("2 s=%j digits=%j s.length=%d", s.toString(), digits, s.length)
-
-  var len = digits + s.length
-  // console.error("3 s=%j digits=%j s.length=%d len=%d", s.toString(), digits, s.length, len)
-  var lenBuf = new Buffer("" + len)
-  if (lenBuf.length + s.length !== len) {
-    throw new Error("Bad length calculation\n"+
-                    "len="+len+"\n"+
-                    "lenBuf="+JSON.stringify(lenBuf.toString())+"\n"+
-                    "lenBuf.length="+lenBuf.length+"\n"+
-                    "digits="+digits+"\n"+
-                    "s="+JSON.stringify(s.toString())+"\n"+
-                    "s.length="+s.length)
-  }
-
-  return [lenBuf, s]
-}
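The self-referential length that `encodeField` above computes (the record's decimal length prefix counts its own digits) can be sketched as a tiny standalone helper; `encodePaxRecord` is a hypothetical name, not part of the deleted module:

```javascript
// Encode one pax record: "%d %s=%s\n", where the leading decimal length
// counts itself. Grow the digit count until it is self-consistent, which
// is exactly the bump described in the comment above.
function encodePaxRecord (key, value) {
  var body = " " + key + "=" + value + "\n"
  var digits = String(body.length).length
  // " foo=bar\n" is 9 bytes; "10 foo=bar\n" would be 11, so bump to 11.
  while (String(body.length + digits).length > digits) digits++
  return (body.length + digits) + body
}
```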
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/extended-header.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,140 +0,0 @@
-// An Entry consisting of:
-//
-// "%d %s=%s\n", <length>, <keyword>, <value>
-//
-// The length is a decimal number, and includes itself and the \n
-// \0 does not terminate anything.  Only the length terminates the string.
-// Numeric values are decimal strings.
-
-module.exports = ExtendedHeader
-
-var Entry = require("./entry.js")
-  , inherits = require("inherits")
-  , tar = require("../tar.js")
-  , numeric = tar.numeric
-  , keyTrans = { "SCHILY.dev": "dev"
-               , "SCHILY.ino": "ino"
-               , "SCHILY.nlink": "nlink" }
-
-function ExtendedHeader () {
-  Entry.apply(this, arguments)
-  this.on("data", this._parse)
-  this.fields = {}
-  this._position = 0
-  this._fieldPos = 0
-  this._state = SIZE
-  this._sizeBuf = []
-  this._keyBuf = []
-  this._valBuf = []
-  this._size = -1
-  this._key = ""
-}
-
-inherits(ExtendedHeader, Entry)
-ExtendedHeader.prototype._parse = parse
-
-var s = 0
-  , states = ExtendedHeader.states = {}
-  , SIZE = states.SIZE = s++
-  , KEY  = states.KEY  = s++
-  , VAL  = states.VAL  = s++
-  , ERR  = states.ERR  = s++
-
-Object.keys(states).forEach(function (s) {
-  states[states[s]] = states[s]
-})
-
-states[s] = null
-
-// char code values for comparison
-var _0 = "0".charCodeAt(0)
-  , _9 = "9".charCodeAt(0)
-  , point = ".".charCodeAt(0)
-  , a = "a".charCodeAt(0)
-  , Z = "Z".charCodeAt(0)
-  , a = "a".charCodeAt(0)
-  , z = "z".charCodeAt(0)
-  , space = " ".charCodeAt(0)
-  , eq = "=".charCodeAt(0)
-  , cr = "\n".charCodeAt(0)
-
-function parse (c) {
-  if (this._state === ERR) return
-
-  for ( var i = 0, l = c.length
-      ; i < l
-      ; this._position++, this._fieldPos++, i++) {
-    // console.error("top of loop, size="+this._size)
-
-    var b = c[i]
-
-    if (this._size >= 0 && this._fieldPos > this._size) {
-      error(this, "field exceeds length="+this._size)
-      return
-    }
-
-    switch (this._state) {
-      case ERR: return
-
-      case SIZE:
-        // console.error("parsing size, b=%d, rest=%j", b, c.slice(i).toString())
-        if (b === space) {
-          this._state = KEY
-          // this._fieldPos = this._sizeBuf.length
-          this._size = parseInt(new Buffer(this._sizeBuf).toString(), 10)
-          this._sizeBuf.length = 0
-          continue
-        }
-        if (b < _0 || b > _9) {
-          error(this, "expected [" + _0 + ".." + _9 + "], got " + b)
-          return
-        }
-        this._sizeBuf.push(b)
-        continue
-
-      case KEY:
-        // can be any char except =, not > size.
-        if (b === eq) {
-          this._state = VAL
-          this._key = new Buffer(this._keyBuf).toString()
-          if (keyTrans[this._key]) this._key = keyTrans[this._key]
-          this._keyBuf.length = 0
-          continue
-        }
-        this._keyBuf.push(b)
-        continue
-
-      case VAL:
-        // field must end with cr
-        if (this._fieldPos === this._size - 1) {
-          // console.error("finished with "+this._key)
-          if (b !== cr) {
-            error(this, "expected \\n at end of field")
-            return
-          }
-          var val = new Buffer(this._valBuf).toString()
-          if (numeric[this._key]) {
-            val = parseFloat(val)
-          }
-          this.fields[this._key] = val
-
-          this._valBuf.length = 0
-          this._state = SIZE
-          this._size = -1
-          this._fieldPos = -1
-          continue
-        }
-        this._valBuf.push(b)
-        continue
-    }
-  }
-}
-
-function error (me, msg) {
-  msg = "invalid header: " + msg
-      + "\nposition=" + me._position
-      + "\nfield position=" + me._fieldPos
-
-  me.error(msg)
-  me._state = ERR
-}
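The SIZE→KEY→VAL state machine above consumes records of the form `"%d %s=%s\n"` byte by byte; for a string already in memory the same parse can be sketched without the per-byte states (`decodePaxRecords` is a hypothetical helper):

```javascript
// Decode a run of pax records. Each record's leading decimal length counts
// the whole record, including the length digits and the trailing "\n".
function decodePaxRecords (str) {
  var fields = {}
  var pos = 0
  while (pos < str.length) {
    var sp = str.indexOf(" ", pos)
    var len = parseInt(str.slice(pos, sp), 10)
    var record = str.slice(pos, pos + len)   // "NN key=value\n"
    var eq = record.indexOf("=")
    fields[record.slice(sp - pos + 1, eq)] = record.slice(eq + 1, -1)
    pos += len
  }
  return fields
}
```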
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/extract.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,78 +0,0 @@
-// give it a tarball and a path, and it'll dump the contents
-
-module.exports = Extract
-
-var tar = require("../tar.js")
-  , fstream = require("fstream")
-  , inherits = require("inherits")
-  , path = require("path")
-
-function Extract (opts) {
-  if (!(this instanceof Extract)) return new Extract(opts)
-  tar.Parse.apply(this)
-
-  if (typeof opts !== "object") {
-    opts = { path: opts }
-  }
-
-  // have to dump into a directory
-  opts.type = "Directory"
-  opts.Directory = true
-
-  // better to drop in cwd? seems more standard.
-  opts.path = opts.path || path.resolve("node-tar-extract")
-
-  // similar to --strip or --strip-components
-  opts.strip = +opts.strip
-  if (!opts.strip || opts.strip <= 0) opts.strip = 0
-
-  this._fst = fstream.Writer(opts)
-
-  this.pause()
-  var me = this
-
-  // Hardlinks in tarballs are relative to the root
-  // of the tarball.  So, they need to be resolved against
-  // the target directory in order to be created properly.
-  me.on("entry", function (entry) {
-    // if there's a "strip" argument, then strip off that many
-    // path components.
-    if (opts.strip) {
-      var p = entry.path.split("/").slice(opts.strip).join("/")
-      entry.path = entry.props.path = p
-      if (entry.linkpath) {
-        var lp = entry.linkpath.split("/").slice(opts.strip).join("/")
-        entry.linkpath = entry.props.linkpath = lp
-      }
-    }
-    if (entry.type !== "Link") return
-    entry.linkpath = entry.props.linkpath =
-      path.join(opts.path, path.join("/", entry.props.linkpath))
-  })
-
-  this._fst.on("ready", function () {
-    me.pipe(me._fst, { end: false })
-    me.resume()
-  })
-
-  // this._fst.on("end", function () {
-  //   console.error("\nEEEE Extract End", me._fst.path)
-  // })
-
-  this._fst.on("close", function () {
-    // console.error("\nEEEE Extract End", me._fst.path)
-    me.emit("end")
-    me.emit("close")
-  })
-}
-
-inherits(Extract, tar.Parse)
-
-Extract.prototype._streamEnd = function () {
-  var me = this
-  if (!me._ended) me.error("unexpected eof")
-  me._fst.end()
-  // my .end() is coming later.
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/global-header-writer.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,14 +0,0 @@
-module.exports = GlobalHeaderWriter
-
-var ExtendedHeaderWriter = require("./extended-header-writer.js")
-  , inherits = require("inherits")
-
-inherits(GlobalHeaderWriter, ExtendedHeaderWriter)
-
-function GlobalHeaderWriter (props) {
-  if (!(this instanceof GlobalHeaderWriter)) {
-    return new GlobalHeaderWriter(props)
-  }
-  ExtendedHeaderWriter.call(this, props)
-  this.props.type = "g"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/header.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,385 +0,0 @@
-// parse a 512-byte header block to a data object, or vice-versa
-// If the data won't fit nicely in a simple header, then generate
-// the appropriate extended header file, and return that.
-
-module.exports = TarHeader
-
-var tar = require("../tar.js")
-  , fields = tar.fields
-  , fieldOffs = tar.fieldOffs
-  , fieldEnds = tar.fieldEnds
-  , fieldSize = tar.fieldSize
-  , numeric = tar.numeric
-  , assert = require("assert").ok
-  , space = " ".charCodeAt(0)
-  , slash = "/".charCodeAt(0)
-  , bslash = process.platform === "win32" ? "\\".charCodeAt(0) : null
-
-function TarHeader (block) {
-  if (!(this instanceof TarHeader)) return new TarHeader(block)
-  if (block) this.decode(block)
-}
-
-TarHeader.prototype =
-  { decode : decode
-  , encode: encode
-  , calcSum: calcSum
-  , checkSum: checkSum
-  }
-
-TarHeader.parseNumeric = parseNumeric
-TarHeader.encode = encode
-TarHeader.decode = decode
-
-// note that this will only do the normal ustar header, not any kind
-// of extended posix header file.  If something doesn't fit comfortably,
-// then it will set obj.needExtended = true, and set the block to
-// the closest approximation.
-function encode (obj) {
-  if (!obj && !(this instanceof TarHeader)) throw new Error(
-    "encode must be called on a TarHeader, or supplied an object")
-
-  obj = obj || this
-  var block = obj.block = new Buffer(512)
-
-  // if the object has a "prefix", then that's actually an extension of
-  // the path field.
-  if (obj.prefix) {
-    // console.error("%% header encoding, got a prefix", obj.prefix)
-    obj.path = obj.prefix + "/" + obj.path
-    // console.error("%% header encoding, prefixed path", obj.path)
-    obj.prefix = ""
-  }
-
-  obj.needExtended = false
-
-  if (obj.mode) {
-    if (typeof obj.mode === "string") obj.mode = parseInt(obj.mode, 8)
-    obj.mode = obj.mode & 0777
-  }
-
-  for (var f = 0; fields[f] !== null; f ++) {
-    var field = fields[f]
-      , off = fieldOffs[f]
-      , end = fieldEnds[f]
-      , ret
-
-    switch (field) {
-      case "cksum":
-        // special, done below, after all the others
-        break
-
-      case "prefix":
-        // special, this is an extension of the "path" field.
-        // console.error("%% header encoding, skip prefix later")
-        break
-
-      case "type":
-        // convert from long name to a single char.
-        var type = obj.type || "0"
-        if (type.length > 1) {
-          type = tar.types[obj.type]
-          if (!type) type = "0"
-        }
-        writeText(block, off, end, type)
-        break
-
-      case "path":
-        // uses the "prefix" field if > 100 bytes, but <= 255
-        var pathLen = Buffer.byteLength(obj.path)
-          , pathFSize = fieldSize[fields.path]
-          , prefFSize = fieldSize[fields.prefix]
-
-        // paths between 100 and 255 should use the prefix field.
-        // longer than 255
-        if (pathLen > pathFSize &&
-            pathLen <= pathFSize + prefFSize) {
-          // need to find a slash somewhere in the middle so that
-          // path and prefix both fit in their respective fields
-          var searchStart = pathLen - 1 - pathFSize
-            , searchEnd = prefFSize
-            , found = false
-            , pathBuf = new Buffer(obj.path)
-
-          for ( var s = searchStart
-              ; (s <= searchEnd)
-              ; s ++ ) {
-            if (pathBuf[s] === slash || pathBuf[s] === bslash) {
-              found = s
-              break
-            }
-          }
-
-          if (found !== false) {
-            var prefix = pathBuf.slice(0, found).toString("utf8")
-              , path = pathBuf.slice(found + 1).toString("utf8")
-
-            ret = writeText(block, off, end, path)
-            off = fieldOffs[fields.prefix]
-            end = fieldEnds[fields.prefix]
-            // console.error("%% header writing prefix", off, end, prefix)
-            ret = writeText(block, off, end, prefix) || ret
-            break
-          }
-        }
-
-        // paths less than 100 chars don't need a prefix
-        // and paths longer than 255 need an extended header and will fail
-        // on old implementations no matter what we do here.
-        // Null out the prefix, and fallthrough to default.
-        // console.error("%% header writing no prefix")
-        var poff = fieldOffs[fields.prefix]
-          , pend = fieldEnds[fields.prefix]
-        writeText(block, poff, pend, "")
-        // fallthrough
-
-      // all other fields are numeric or text
-      default:
-        ret = numeric[field]
-            ? writeNumeric(block, off, end, obj[field])
-            : writeText(block, off, end, obj[field] || "")
-        break
-    }
-    obj.needExtended = obj.needExtended || ret
-  }
-
-  var off = fieldOffs[fields.cksum]
-    , end = fieldEnds[fields.cksum]
-
-  writeNumeric(block, off, end, calcSum.call(this, block))
-
-  return block
-}
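The path/prefix search inside `encode()` above can be summarized standalone: a path longer than 100 bytes is split at a `/` so the tail fits the 100-byte `path` field and the head fits the 155-byte `prefix` field. A sketch with a hypothetical helper name, assuming ASCII paths so bytes equal characters:

```javascript
// Split an over-long path across ustar's path (100 B) and prefix (155 B)
// fields, as encode() does above. Returns null when no slash makes both
// halves fit, in which case a pax extended header is needed.
function splitUstarPath (p) {
  if (p.length <= 100) return { prefix: "", path: p }
  // the slash must leave <= 100 bytes after it and <= 155 before it
  for (var i = p.length - 101; i <= 155 && i < p.length; i++) {
    if (p.charAt(i) === "/") {
      return { prefix: p.slice(0, i), path: p.slice(i + 1) }
    }
  }
  return null
}
```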
-
-// if it's a negative number, or greater than will fit,
-// then use write256.
-var MAXNUM = { 12: 077777777777
-             , 11: 07777777777
-             , 8 : 07777777
-             , 7 : 0777777 }
-function writeNumeric (block, off, end, num) {
-  var writeLen = end - off
-    , maxNum = MAXNUM[writeLen] || 0
-
-  num = num || 0
-  // console.error("  numeric", num)
-
-  if (num instanceof Date ||
-      Object.prototype.toString.call(num) === "[object Date]") {
-    num = num.getTime() / 1000
-  }
-
-  if (num > maxNum || num < 0) {
-    write256(block, off, end, num)
-    // need an extended header if negative or too big.
-    return true
-  }
-
-  // god, tar is so annoying
-  // if the string is small enough, you should put a space
-  // between the octal string and the \0, but if it doesn't
-  // fit, then don't.
-  var numStr = Math.floor(num).toString(8)
-  if (num < MAXNUM[writeLen - 1]) numStr += " "
-
-  // pad with "0" chars
-  if (numStr.length < writeLen) {
-    numStr = (new Array(writeLen - numStr.length).join("0")) + numStr
-  }
-
-  if (numStr.length !== writeLen - 1) {
-    throw new Error("invalid length: " + JSON.stringify(numStr) + "\n" +
-                    "expected: "+writeLen)
-  }
-  block.write(numStr, off, writeLen, "utf8")
-  block[end - 1] = 0
-}
-
-function write256 (block, off, end, num) {
-  var buf = block.slice(off, end)
-  var positive = num >= 0
-  buf[0] = positive ? 0x80 : 0xFF
-
-  // get the number as a base-256 tuple
-  if (!positive) num *= -1
-  var tuple = []
-  do {
-    var n = num % 256
-    tuple.push(n)
-    num = (num - n) / 256
-  } while (num)
-
-  var bytes = tuple.length
-
-  var fill = buf.length - bytes
-  for (var i = 1; i < fill; i ++) {
-    buf[i] = positive ? 0 : 0xFF
-  }
-
-  // tuple is a base256 number, with [0] as the *least* significant byte
-  // if it's negative, then we need to flip all the bits once we hit the
-  // first non-zero bit.  The 2's-complement is (0x100 - n), and the 1's-
-  // complement is (0xFF - n).
-  var zero = true
-  for (i = bytes; i > 0; i --) {
-    var byte = tuple[bytes - i]
-    if (positive) buf[fill + i] = byte
-    else if (zero && byte === 0) buf[fill + i] = 0
-    else if (zero) {
-      zero = false
-      buf[fill + i] = 0x100 - byte
-    } else buf[fill + i] = 0xFF - byte
-  }
-}
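`write256` and `parse256` implement GNU tar's "base-256" binary extension for numbers too large (or negative) for an octal field. For the positive case the round trip can be sketched compactly; these are hypothetical helpers, and negatives use the 0xFF marker plus two's complement exactly as in the code above:

```javascript
// Positive base-256: a 0x80 marker byte, then the value big-endian in the
// remaining bytes. Used when a size or mtime overflows the octal field.
function writeBase256 (num, width) {
  var buf = new Array(width).fill(0)
  buf[0] = 0x80
  for (var i = width - 1; i > 0 && num > 0; i--) {
    buf[i] = num % 256
    num = Math.floor(num / 256)
  }
  return buf
}

function readBase256 (buf) {
  var sum = 0
  for (var i = 1; i < buf.length; i++) sum = sum * 256 + buf[i]
  return sum
}
```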
-
-function writeText (block, off, end, str) {
-  // strings are written as utf8, then padded with \0
-  var strLen = Buffer.byteLength(str)
-    , writeLen = Math.min(strLen, end - off)
-    // non-ascii fields need extended headers
-    // long fields get truncated
-    , needExtended = strLen !== str.length || strLen > writeLen
-
-  // write the string, and null-pad
-  if (writeLen > 0) block.write(str, off, writeLen, "utf8")
-  for (var i = off + writeLen; i < end; i ++) block[i] = 0
-
-  return needExtended
-}
-
-function calcSum (block) {
-  block = block || this.block
-  if (!block) throw new Error("Need block to checksum")
-  assert(Buffer.isBuffer(block) && block.length === 512)
-
-  // now figure out what it would be if the cksum was "        "
-  var sum = 0
-    , start = fieldOffs[fields.cksum]
-    , end = fieldEnds[fields.cksum]
-
-  for (var i = 0; i < fieldOffs[fields.cksum]; i ++) {
-    sum += block[i]
-  }
-
-  for (var i = start; i < end; i ++) {
-    sum += space
-  }
-
-  for (var i = end; i < 512; i ++) {
-    sum += block[i]
-  }
-
-  return sum
-}
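`calcSum` above implements the standard ustar checksum: all 512 header bytes summed with the 8-byte `cksum` field (offsets 148-155) treated as ASCII spaces. Hard-coding those offsets gives a standalone sketch (hypothetical helper, not the module's `fieldOffs`-driven version):

```javascript
// Sum the header block with the checksum field read as eight spaces, as
// calcSum does above via fieldOffs/fieldEnds.
function tarChecksum (block) {
  var sum = 0
  for (var i = 0; i < 512; i++) {
    sum += (i >= 148 && i < 156) ? 0x20 : block[i]
  }
  return sum
}
```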
-
-
-function checkSum (block) {
-  var sum = calcSum.call(this, block)
-  block = block || this.block
-
-  var cksum = block.slice(fieldOffs[fields.cksum], fieldEnds[fields.cksum])
-  cksum = parseNumeric(cksum)
-
-  return cksum === sum
-}
-
-function decode (block) {
-  block = block || this.block
-  assert(Buffer.isBuffer(block) && block.length === 512)
-
-  this.block = block
-  this.cksumValid = this.checkSum()
-
-  var prefix = null
-
-  // slice off each field.
-  for (var f = 0; fields[f] !== null; f ++) {
-    var field = fields[f]
-      , val = block.slice(fieldOffs[f], fieldEnds[f])
-
-    switch (field) {
-      case "ustar":
-        // if not ustar, then everything after that is just padding.
-        if (val.toString() !== "ustar\0") {
-          this.ustar = false
-          return
-        } else {
-          // console.error("ustar:", val, val.toString())
-          this.ustar = val.toString()
-        }
-        break
-
-      // prefix is special, since it might signal the xstar header
-      case "prefix":
-        var atime = parseNumeric(val.slice(131, 131 + 12))
-          , ctime = parseNumeric(val.slice(131 + 12, 131 + 12 + 12))
-        if ((val[130] === 0 || val[130] === space) &&
-            typeof atime === "number" &&
-            typeof ctime === "number" &&
-            val[131 + 12] === space &&
-            val[131 + 12 + 12] === space) {
-          this.atime = atime
-          this.ctime = ctime
-          val = val.slice(0, 130)
-        }
-        prefix = val.toString("utf8").replace(/\0+$/, "")
-        // console.error("%% header reading prefix", prefix)
-        break
-
-      // all other fields are null-padding text
-      // or a number.
-      default:
-        if (numeric[field]) {
-          this[field] = parseNumeric(val)
-        } else {
-          this[field] = val.toString("utf8").replace(/\0+$/, "")
-        }
-        break
-    }
-  }
-
-  // if we got a prefix, then prepend it to the path.
-  if (prefix) {
-    this.path = prefix + "/" + this.path
-    // console.error("%% header got a prefix", this.path)
-  }
-}
-
-function parse256 (buf) {
-  // first byte MUST be either 80 or FF
-  // 80 for positive, FF for 2's comp
-  var positive
-  if (buf[0] === 0x80) positive = true
-  else if (buf[0] === 0xFF) positive = false
-  else return null
-
-  // build up a base-256 tuple from the least sig to the highest
-  var zero = false
-    , tuple = []
-  for (var i = buf.length - 1; i > 0; i --) {
-    var byte = buf[i]
-    if (positive) tuple.push(byte)
-    else if (zero && byte === 0) tuple.push(0)
-    else if (zero) {
-      zero = false
-      tuple.push(0x100 - byte)
-    } else tuple.push(0xFF - byte)
-  }
-
-  for (var sum = 0, i = 0, l = tuple.length; i < l; i ++) {
-    sum += tuple[i] * Math.pow(256, i)
-  }
-
-  return positive ? sum : -1 * sum
-}
-
-function parseNumeric (f) {
-  if (f[0] & 0x80) return parse256(f)
-
-  var str = f.toString("utf8").split("\0")[0].trim()
-    , res = parseInt(str, 8)
-
-  return isNaN(res) ? null : res
-}
-
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/pack.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,231 +0,0 @@
-// pipe in an fstream, and it'll make a tarball.
-// key-value pair argument is global extended header props.
-
-module.exports = Pack
-
-var EntryWriter = require("./entry-writer.js")
-  , Stream = require("stream").Stream
-  , path = require("path")
-  , inherits = require("inherits")
-  , GlobalHeaderWriter = require("./global-header-writer.js")
-  , collect = require("fstream").collect
-  , eof = new Buffer(512)
-
-for (var i = 0; i < 512; i ++) eof[i] = 0
-
-inherits(Pack, Stream)
-
-function Pack (props) {
-  // console.error("-- p ctor")
-  var me = this
-  if (!(me instanceof Pack)) return new Pack(props)
-
-  if (props) me._noProprietary = props.noProprietary
-  else me._noProprietary = false
-
-  me._global = props
-
-  me.readable = true
-  me.writable = true
-  me._buffer = []
-  // console.error("-- -- set current to null in ctor")
-  me._currentEntry = null
-  me._processing = false
-
-  me._pipeRoot = null
-  me.on("pipe", function (src) {
-    if (src.root === me._pipeRoot) return
-    me._pipeRoot = src
-    src.on("end", function () {
-      me._pipeRoot = null
-    })
-    me.add(src)
-  })
-}
-
-Pack.prototype.addGlobal = function (props) {
-  // console.error("-- p addGlobal")
-  if (this._didGlobal) return
-  this._didGlobal = true
-
-  var me = this
-  GlobalHeaderWriter(props)
-    .on("data", function (c) {
-      me.emit("data", c)
-    })
-    .end()
-}
-
-Pack.prototype.add = function (stream) {
-  if (this._global && !this._didGlobal) this.addGlobal(this._global)
-
-  if (this._ended) return this.emit("error", new Error("add after end"))
-
-  collect(stream)
-  this._buffer.push(stream)
-  this._process()
-  this._needDrain = this._buffer.length > 0
-  return !this._needDrain
-}
-
-Pack.prototype.pause = function () {
-  this._paused = true
-  if (this._currentEntry) this._currentEntry.pause()
-  this.emit("pause")
-}
-
-Pack.prototype.resume = function () {
-  this._paused = false
-  if (this._currentEntry) this._currentEntry.resume()
-  this.emit("resume")
-  this._process()
-}
-
-Pack.prototype.end = function () {
-  this._ended = true
-  this._buffer.push(eof)
-  this._process()
-}
-
-Pack.prototype._process = function () {
-  var me = this
-  if (me._paused || me._processing) {
-    return
-  }
-
-  var entry = me._buffer.shift()
-
-  if (!entry) {
-    if (me._needDrain) {
-      me.emit("drain")
-    }
-    return
-  }
-
-  if (entry.ready === false) {
-    // console.error("-- entry is not ready", entry)
-    me._buffer.unshift(entry)
-    entry.on("ready", function () {
-      // console.error("-- -- ready!", entry)
-      me._process()
-    })
-    return
-  }
-
-  me._processing = true
-
-  if (entry === eof) {
-    // need 2 ending null blocks.
-    me.emit("data", eof)
-    me.emit("data", eof)
-    me.emit("end")
-    me.emit("close")
-    return
-  }
-
-  // Change the path to be relative to the root dir that was
-  // added to the tarball.
-  //
-  // XXX This should be more like how -C works, so you can
-  // explicitly set a root dir, and also explicitly set a pathname
-  // in the tarball to use.  That way we can skip a lot of extra
-  // work when resolving symlinks for bundled dependencies in npm.
-
-  var root = path.dirname((entry.root || entry).path)
-  var wprops = {}
-
-  Object.keys(entry.props || {}).forEach(function (k) {
-    wprops[k] = entry.props[k]
-  })
-
-  if (me._noProprietary) wprops.noProprietary = true
-
-  wprops.path = path.relative(root, entry.path || '')
-
-  // actually not a matter of opinion or taste.
-  if (process.platform === "win32") {
-    wprops.path = wprops.path.replace(/\\/g, "/")
-  }
-
-  if (!wprops.type)
-    wprops.type = 'Directory'
-
-  switch (wprops.type) {
-    // sockets not supported
-    case "Socket":
-      return
-
-    case "Directory":
-      wprops.path += "/"
-      wprops.size = 0
-      break
-
-    case "Link":
-      var lp = path.resolve(path.dirname(entry.path), entry.linkpath)
-      wprops.linkpath = path.relative(root, lp) || "."
-      wprops.size = 0
-      break
-
-    case "SymbolicLink":
-      var lp = path.resolve(path.dirname(entry.path), entry.linkpath)
-      wprops.linkpath = path.relative(path.dirname(entry.path), lp) || "."
-      wprops.size = 0
-      break
-  }
-
-  // console.error("-- new writer", wprops)
-  // if (!wprops.type) {
-  //   // console.error("-- no type?", entry.constructor.name, entry)
-  // }
-
-  // console.error("-- -- set current to new writer", wprops.path)
-  var writer = me._currentEntry = EntryWriter(wprops)
-
-  writer.parent = me
-
-  // writer.on("end", function () {
-  //   // console.error("-- -- writer end", writer.path)
-  // })
-
-  writer.on("data", function (c) {
-    me.emit("data", c)
-  })
-
-  writer.on("header", function () {
-    Buffer.prototype.toJSON = function () {
-      return this.toString().split(/\0/).join(".")
-    }
-    // console.error("-- -- writer header %j", writer.props)
-    if (writer.props.size === 0) nextEntry()
-  })
-  writer.on("close", nextEntry)
-
-  var ended = false
-  function nextEntry () {
-    if (ended) return
-    ended = true
-
-    // console.error("-- -- writer close", writer.path)
-    // console.error("-- -- set current to null", wprops.path)
-    me._currentEntry = null
-    me._processing = false
-    me._process()
-  }
-
-  writer.on("error", function (er) {
-    // console.error("-- -- writer error", writer.path)
-    me.emit("error", er)
-  })
-
-  // if it's the root, then there's no need to add its entries,
-  // or data, since they'll be added directly.
-  if (entry === me._pipeRoot) {
-    // console.error("-- is the root, don't auto-add")
-    writer.add = null
-  }
-
-  entry.pipe(writer)
-}
-
-Pack.prototype.destroy = function () {}
-Pack.prototype.write = function () {}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/lib/parse.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,270 +0,0 @@
-
-// A writable stream.
-// It emits "entry" events, which provide a readable stream that has
-// header info attached.
-
-module.exports = Parse.create = Parse
-
-var stream = require("stream")
-  , Stream = stream.Stream
-  , BlockStream = require("block-stream")
-  , tar = require("../tar.js")
-  , TarHeader = require("./header.js")
-  , Entry = require("./entry.js")
-  , BufferEntry = require("./buffer-entry.js")
-  , ExtendedHeader = require("./extended-header.js")
-  , assert = require("assert").ok
-  , inherits = require("inherits")
-  , fstream = require("fstream")
-
-// Reading a tar is a lot like reading a directory.
-// However, we're actually not going to run the ctor,
-// since it does a stat and various other stuff.
-// This inheritance gives us the pause/resume/pipe
-// behavior that is desired.
-inherits(Parse, fstream.Reader)
-
-function Parse () {
-  var me = this
-  if (!(me instanceof Parse)) return new Parse()
-
-  // Don't apply the fstream.Reader ctor here,
-  // because we don't want to stat/etc; we just
-  // want to get the entry/add logic from .pipe()
-  Stream.apply(me)
-
-  me.writable = true
-  me.readable = true
-  me._stream = new BlockStream(512)
-  me.position = 0
-
-  me._stream.on("error", function (e) {
-    me.emit("error", e)
-  })
-
-  me._stream.on("data", function (c) {
-    me._process(c)
-  })
-
-  me._stream.on("end", function () {
-    me._streamEnd()
-  })
-
-  me._stream.on("drain", function () {
-    me.emit("drain")
-  })
-}
-
-// overridden in Extract class, since it needs to
-// wait for its DirWriter part to finish before
-// emitting "end"
-Parse.prototype._streamEnd = function () {
-  var me = this
-  if (!me._ended) me.error("unexpected eof")
-  me.emit("end")
-}
-
-// a tar reader is actually a filter, not just a readable stream.
-// So, you should pipe a tarball stream into it, and it needs these
-// write/end methods to do that.
-Parse.prototype.write = function (c) {
-  if (this._ended) {
-    // gnutar puts a LOT of nulls at the end.
-    // you can keep writing these things forever.
-    // Just ignore them.
-    for (var i = 0, l = c.length; i < l; i ++) {
-      if (c[i] !== 0) return this.error("write() after end()")
-    }
-    return
-  }
-  return this._stream.write(c)
-}
-
-Parse.prototype.end = function (c) {
-  this._ended = true
-  return this._stream.end(c)
-}
-
-// don't need to do anything, since we're just
-// proxying the data up from the _stream.
-// Just need to override the parent's "Not Implemented"
-// error-thrower.
-Parse.prototype._read = function () {}
-
-Parse.prototype._process = function (c) {
-  assert(c && c.length === 512, "block size should be 512")
-
-  // one of three cases.
-  // 1. A new header
-  // 2. A part of a file/extended header
-  // 3. One of two or more EOF null blocks
-
-  if (this._entry) {
-    var entry = this._entry
-    entry.write(c)
-    if (entry._remaining === 0) {
-      entry.end()
-      this._entry = null
-    }
-  } else {
-    // either zeroes or a header
-    var zero = true
-    for (var i = 0; i < 512 && zero; i ++) {
-      zero = c[i] === 0
-    }
-
-    // eof is *at least* 2 blocks of nulls, and then the end of the
-    // file.  you can put blocks of nulls between entries anywhere,
-    // so appending one tarball to another is technically valid.
-    // ending without the eof null blocks is not allowed, however.
-    if (zero) {
-      this._ended = this._eofStarted
-      this._eofStarted = true
-    } else {
-      this._ended = this._eofStarted = false
-      this._startEntry(c)
-    }
-
-  }
-
-  this.position += 512
-}
-
-// take a header chunk, start the right kind of entry.
-Parse.prototype._startEntry = function (c) {
-  var header = new TarHeader(c)
-    , self = this
-    , entry
-    , ev
-    , EntryType
-    , onend
-    , meta = false
-
-  if (null === header.size || !header.cksumValid) {
-    var e = new Error("invalid tar file")
-    e.header = header
-    e.tar_file_offset = this.position
-    e.tar_block = this.position / 512
-    this.emit("error", e)
-  }
-
-  switch (tar.types[header.type]) {
-    case "File":
-    case "OldFile":
-    case "Link":
-    case "SymbolicLink":
-    case "CharacterDevice":
-    case "BlockDevice":
-    case "Directory":
-    case "FIFO":
-    case "ContiguousFile":
-    case "GNUDumpDir":
-      // start a file.
-      // pass in any extended headers
-      // These are the entry types consumers are typically most interested in.
-      EntryType = Entry
-      ev = "entry"
-      break
-
-    case "GlobalExtendedHeader":
-      // extended headers that apply to the rest of the tarball
-      EntryType = ExtendedHeader
-      onend = function () {
-        self._global = self._global || {}
-        Object.keys(entry.fields).forEach(function (k) {
-          self._global[k] = entry.fields[k]
-        })
-      }
-      ev = "globalExtendedHeader"
-      meta = true
-      break
-
-    case "ExtendedHeader":
-    case "OldExtendedHeader":
-      // extended headers that apply to the next entry
-      EntryType = ExtendedHeader
-      onend = function () {
-        self._extended = entry.fields
-      }
-      ev = "extendedHeader"
-      meta = true
-      break
-
-    case "NextFileHasLongLinkpath":
-      // set linkpath=<contents> in extended header
-      EntryType = BufferEntry
-      onend = function () {
-        self._extended = self._extended || {}
-        self._extended.linkpath = entry.body
-      }
-      ev = "longLinkpath"
-      meta = true
-      break
-
-    case "NextFileHasLongPath":
-    case "OldGnuLongPath":
-      // set path=<contents> in file-extended header
-      EntryType = BufferEntry
-      onend = function () {
-        self._extended = self._extended || {}
-        self._extended.path = entry.body
-      }
-      ev = "longPath"
-      meta = true
-      break
-
-    default:
-      // all the rest we skip, but still set the _entry
-      // member, so that we can skip over their data appropriately.
-      // emit an event to say that this is an ignored entry type?
-      EntryType = Entry
-      ev = "ignoredEntry"
-      break
-  }
-
-  var global, extended
-  if (meta) {
-    global = extended = null
-  } else {
-    global = this._global
-    extended = this._extended
-
-    // extendedHeader only applies to one entry, so once we start
-    // an entry, it's over.
-    this._extended = null
-  }
-  entry = new EntryType(header, extended, global)
-  entry.meta = meta
-
-  // only proxy data events of normal files.
-  if (!meta) {
-    entry.on("data", function (c) {
-      self.emit("data", c)
-    })
-  }
-
-  if (onend) entry.on("end", onend)
-
-  this._entry = entry
-  var me = this
-
-  entry.on("pause", function () {
-    me.pause()
-  })
-
-  entry.on("resume", function () {
-    me.resume()
-  })
-
-  if (this.listeners("*").length) {
-    this.emit("*", ev, entry)
-  }
-
-  this.emit(ev, entry)
-
-  // Zero-byte entry.  End immediately.
-  if (entry.props.size === 0) {
-    entry.end()
-    this._entry = null
-  }
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "tar",
-  "description": "tar for node",
-  "version": "0.1.18",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/node-tar.git"
-  },
-  "main": "tar.js",
-  "scripts": {
-    "test": "tap test/*.js"
-  },
-  "dependencies": {
-    "inherits": "2",
-    "block-stream": "*",
-    "fstream": "~0.1.8"
-  },
-  "devDependencies": {
-    "tap": "0.x",
-    "rimraf": "1.x"
-  },
-  "license": "BSD",
-  "readme": "# node-tar\n\nTar for Node.js.\n\n## Goals of this project\n\n1. Be able to parse and reasonably extract the contents of any tar file\n   created by any program that creates tar files, period.\n\n        At least, this includes every version of:\n\n        * bsdtar\n        * gnutar\n        * solaris posix tar\n        * Joerg Schilling's star (\"Schilly tar\")\n\n2. Create tar files that can be extracted by any of the following tar programs:\n\n        * bsdtar/libarchive version 2.6.2\n        * gnutar 1.15 and above\n        * SunOS Posix tar\n        * Joerg Schilling's star (\"Schilly tar\")\n\n3. 100% test coverage.  Speed is important.  Correctness is slightly more important.\n\n4. Create the kind of tar interface that Node users would want to use.\n\n5. Satisfy npm's needs for a portable tar implementation with a JavaScript interface.\n\n6. No excuses.  No complaining.  No tolerance for failure.\n\n## But isn't there already a tar.js?\n\nYes, there are a few.  This one is going to be better, and it will be\nfanatically maintained, because npm will depend on it.\n\nThat's why I need to write it from scratch.  Creating and extracting\ntarballs is such a large part of what npm does, I simply can't have it\nbe a black box any longer.\n\n## Didn't you have something already?  Where'd it go?\n\nIt's in the \"old\" folder.  It's not functional.  Don't use it.\n\nIt was a useful exploration to learn the issues involved, but like most\nsoftware of any reasonable complexity, node-tar won't be useful until\nit's been written at least 3 times.\n",
-  "readmeFilename": "README.md",
-  "bugs": {
-    "url": "https://github.com/isaacs/node-tar/issues"
-  },
-  "_id": "tar@0.1.18",
-  "_from": "tar@latest"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/tar/tar.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,173 +0,0 @@
-// field paths that every tar file must have.
-// header is padded to 512 bytes.
-var f = 0
-  , fields = {}
-  , path = fields.path = f++
-  , mode = fields.mode = f++
-  , uid = fields.uid = f++
-  , gid = fields.gid = f++
-  , size = fields.size = f++
-  , mtime = fields.mtime = f++
-  , cksum = fields.cksum = f++
-  , type = fields.type = f++
-  , linkpath = fields.linkpath = f++
-  , headerSize = 512
-  , blockSize = 512
-  , fieldSize = []
-
-fieldSize[path] = 100
-fieldSize[mode] = 8
-fieldSize[uid] = 8
-fieldSize[gid] = 8
-fieldSize[size] = 12
-fieldSize[mtime] = 12
-fieldSize[cksum] = 8
-fieldSize[type] = 1
-fieldSize[linkpath] = 100
-
-// "ustar\0" may introduce another bunch of headers.
-// these are optional, and will be nulled out if not present.
-
-var ustar = fields.ustar = f++
-  , ustarver = fields.ustarver = f++
-  , uname = fields.uname = f++
-  , gname = fields.gname = f++
-  , devmaj = fields.devmaj = f++
-  , devmin = fields.devmin = f++
-  , prefix = fields.prefix = f++
-  , fill = fields.fill = f++
-
-// terminate fields.
-fields[f] = null
-
-fieldSize[ustar] = 6
-fieldSize[ustarver] = 2
-fieldSize[uname] = 32
-fieldSize[gname] = 32
-fieldSize[devmaj] = 8
-fieldSize[devmin] = 8
-fieldSize[prefix] = 155
-fieldSize[fill] = 12
-
-// nb: prefix field may in fact be 130 bytes of prefix,
-// a null char, 12 bytes for atime, 12 bytes for ctime.
-//
-// To recognize this format:
-// 1. prefix[130] === ' ' or '\0'
-// 2. atime and ctime are octal numeric values
-// 3. atime and ctime have ' ' in their last byte
-
-var fieldEnds = {}
-  , fieldOffs = {}
-  , fe = 0
-for (var i = 0; i < f; i ++) {
-  fieldOffs[i] = fe
-  fieldEnds[i] = (fe += fieldSize[i])
-}
-
-// build a translation table of field paths.
-Object.keys(fields).forEach(function (f) {
-  if (fields[f] !== null) fields[fields[f]] = f
-})
-
-// different values of the 'type' field
-// paths match the values of Stats.isX() functions, where appropriate
-var types =
-  { 0: "File"
-  , "\0": "OldFile" // like 0
-  , "": "OldFile"
-  , 1: "Link"
-  , 2: "SymbolicLink"
-  , 3: "CharacterDevice"
-  , 4: "BlockDevice"
-  , 5: "Directory"
-  , 6: "FIFO"
-  , 7: "ContiguousFile" // like 0
-  // posix headers
-  , g: "GlobalExtendedHeader" // k=v for the rest of the archive
-  , x: "ExtendedHeader" // k=v for the next file
-  // vendor-specific stuff
-  , A: "SolarisACL" // skip
-  , D: "GNUDumpDir" // like 5, but with data, which should be skipped
-  , I: "Inode" // metadata only, skip
-  , K: "NextFileHasLongLinkpath" // data = link path of next file
-  , L: "NextFileHasLongPath" // data = path of next file
-  , M: "ContinuationFile" // skip
-  , N: "OldGnuLongPath" // like L
-  , S: "SparseFile" // skip
-  , V: "TapeVolumeHeader" // skip
-  , X: "OldExtendedHeader" // like x
-  }
-
-Object.keys(types).forEach(function (t) {
-  types[types[t]] = types[types[t]] || t
-})
-
-// values for the mode field
-var modes =
-  { suid: 04000 // set uid on extraction
-  , sgid: 02000 // set gid on extraction
-  , svtx: 01000 // set restricted deletion flag on dirs on extraction
-  , uread:  0400
-  , uwrite: 0200
-  , uexec:  0100
-  , gread:  040
-  , gwrite: 020
-  , gexec:  010
-  , oread:  4
-  , owrite: 2
-  , oexec:  1
-  , all: 07777
-  }
-
-var numeric =
-  { mode: true
-  , uid: true
-  , gid: true
-  , size: true
-  , mtime: true
-  , devmaj: true
-  , devmin: true
-  , cksum: true
-  , atime: true
-  , ctime: true
-  , dev: true
-  , ino: true
-  , nlink: true
-  }
-
-Object.keys(modes).forEach(function (t) {
-  modes[modes[t]] = modes[modes[t]] || t
-})
-
-var knownExtended =
-  { atime: true
-  , charset: true
-  , comment: true
-  , ctime: true
-  , gid: true
-  , gname: true
-  , linkpath: true
-  , mtime: true
-  , path: true
-  , realtime: true
-  , security: true
-  , size: true
-  , uid: true
-  , uname: true }
-
-
-exports.fields = fields
-exports.fieldSize = fieldSize
-exports.fieldOffs = fieldOffs
-exports.fieldEnds = fieldEnds
-exports.types = types
-exports.modes = modes
-exports.numeric = numeric
-exports.headerSize = headerSize
-exports.blockSize = blockSize
-exports.knownExtended = knownExtended
-
-exports.Pack = require("./lib/pack.js")
-exports.Parse = require("./lib/parse.js")
-exports.Extract = require("./lib/extract.js")
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/uid-number/LICENCE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-Copyright (c) Isaac Z. Schlueter
-All rights reserved.
-
-The BSD License
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions
-are met:
-1. Redistributions of source code must retain the above copyright
-   notice, this list of conditions and the following disclaimer.
-2. Redistributions in binary form must reproduce the above copyright
-   notice, this list of conditions and the following disclaimer in the
-   documentation and/or other materials provided with the distribution.
-
-THIS SOFTWARE IS PROVIDED BY THE NETBSD FOUNDATION, INC. AND CONTRIBUTORS
-``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED
-TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE FOUNDATION OR CONTRIBUTORS
-BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
-INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
-CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
-ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
-POSSIBILITY OF SUCH DAMAGE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/uid-number/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,17 +0,0 @@
-Use this module to convert a username/groupname to a uid/gid number.
-
-Usage:
-
-```
-npm install uid-number
-```
-
-Then, in your node program:
-
-```javascript
-var uidNumber = require("uid-number")
-uidNumber("isaacs", function (er, uid, gid) {
-  // gid is null because we didn't ask for a group name
-  // uid === 24561 because that's my number.
-})
-```
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/uid-number/get-uid-gid.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-if (module !== require.main) {
-  throw new Error("This file should not be loaded with require()")
-}
-
-if (!process.getuid || !process.getgid) {
-  throw new Error("this file should not be called without uid/gid support")
-}
-
-var argv = process.argv.slice(2)
-  , user = argv[0] || process.getuid()
-  , group = argv[1] || process.getgid()
-
-if (!isNaN(user)) user = +user
-if (!isNaN(group)) group = +group
-
-console.error([user, group])
-
-try {
-  process.setgid(group)
-  process.setuid(user)
-  console.log(JSON.stringify({uid:+process.getuid(), gid:+process.getgid()}))
-} catch (ex) {
-  console.log(JSON.stringify({error:ex.message,errno:ex.errno}))
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/uid-number/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,35 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me/"
-  },
-  "name": "uid-number",
-  "description": "Convert a username/group name to a uid/gid number",
-  "version": "0.0.3",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/uid-number.git"
-  },
-  "main": "uid-number.js",
-  "dependencies": {},
-  "devDependencies": {},
-  "optionalDependencies": {},
-  "engines": {
-    "node": "*"
-  },
-  "license": "BSD",
-  "_npmUser": {
-    "name": "isaacs",
-    "email": "i@izs.me"
-  },
-  "_id": "uid-number@0.0.3",
-  "_engineSupported": true,
-  "_npmVersion": "1.1.23",
-  "_nodeVersion": "v0.7.10-pre",
-  "_defaultsLoaded": true,
-  "dist": {
-    "shasum": "cefb0fa138d8d8098da71a40a0d04a8327d6e1cc"
-  },
-  "_from": "../uid-number"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/uid-number/uid-number.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,54 +0,0 @@
-module.exports = uidNumber
-
-// This module calls into get-uid-gid.js, which sets the
-// uid and gid to the supplied argument, in order to find out their
-// numeric value.  This can't be done in the main node process,
-// because otherwise node would be running as that user from this
-// point on.
-
-var child_process = require("child_process")
-  , path = require("path")
-  , uidSupport = process.getuid && process.setuid
-  , uidCache = {}
-  , gidCache = {}
-
-function uidNumber (uid, gid, cb) {
-  if (!uidSupport) return cb()
-  if (typeof cb !== "function") cb = gid, gid = null
-  if (typeof cb !== "function") cb = uid, uid = null
-  if (gid == null) gid = process.getgid()
-  if (uid == null) uid = process.getuid()
-  if (!isNaN(gid)) gid = gidCache[gid] = +gid
-  if (!isNaN(uid)) uid = uidCache[uid] = +uid
-
-  if (uidCache.hasOwnProperty(uid)) uid = uidCache[uid]
-  if (gidCache.hasOwnProperty(gid)) gid = gidCache[gid]
-
-  if (typeof gid === "number" && typeof uid === "number") {
-    return process.nextTick(cb.bind(null, null, uid, gid))
-  }
-
-  var getter = require.resolve("./get-uid-gid.js")
-
-  child_process.execFile( process.execPath
-                        , [getter, uid, gid]
-                        , function (er, out, err) {
-    if (er) return cb(new Error("could not get uid/gid\n" + err))
-    try {
-      out = JSON.parse(out+"")
-    } catch (ex) {
-      return cb(ex)
-    }
-
-    if (out.error) {
-      var er = new Error(out.error)
-      er.errno = out.errno
-      return cb(er)
-    }
-
-    if (isNaN(out.uid) || isNaN(out.gid)) return cb(new Error(
-      "Could not get uid/gid: "+JSON.stringify(out)))
-
-    cb(null, uidCache[uid] = +out.uid, gidCache[gid] = +out.gid)
-  })
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/which/LICENSE	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,23 +0,0 @@
-Copyright 2009, 2010, 2011 Isaac Z. Schlueter.
-All rights reserved.
-
-Permission is hereby granted, free of charge, to any person
-obtaining a copy of this software and associated documentation
-files (the "Software"), to deal in the Software without
-restriction, including without limitation the rights to use,
-copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the
-Software is furnished to do so, subject to the following
-conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
-NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
-HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
-WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
-OTHER DEALINGS IN THE SOFTWARE.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/which/README.md	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,5 +0,0 @@
-The "which" util from npm's guts.
-
-Finds the first instance of a specified executable in the PATH
-environment variable.  Does not cache the results, so `hash -r` is not
-needed when the PATH changes.
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/which/bin/which	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,14 +0,0 @@
-#!/usr/bin/env node
-var which = require("../")
-if (process.argv.length < 3) {
-  console.error("Usage: which <thing>")
-  process.exit(1)
-}
-
-which(process.argv[2], function (er, thing) {
-  if (er) {
-    console.error(er.message)
-    process.exit(er.errno || 127)
-  }
-  console.log(thing)
-})
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/which/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,34 +0,0 @@
-{
-  "author": {
-    "name": "Isaac Z. Schlueter",
-    "email": "i@izs.me",
-    "url": "http://blog.izs.me"
-  },
-  "name": "which",
-  "description": "Like which(1) unix command. Find the first instance of an executable in the PATH.",
-  "version": "1.0.5",
-  "repository": {
-    "type": "git",
-    "url": "git://github.com/isaacs/node-which.git"
-  },
-  "main": "which.js",
-  "bin": {
-    "which": "./bin/which"
-  },
-  "engines": {
-    "node": "*"
-  },
-  "dependencies": {},
-  "devDependencies": {},
-  "_npmUser": {
-    "name": "isaacs",
-    "email": "i@izs.me"
-  },
-  "_id": "which@1.0.5",
-  "optionalDependencies": {},
-  "_engineSupported": true,
-  "_npmVersion": "1.1.2",
-  "_nodeVersion": "v0.7.6-pre",
-  "_defaultsLoaded": true,
-  "_from": "which@1"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/node_modules/which/which.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,104 +0,0 @@
-module.exports = which
-which.sync = whichSync
-
-var path = require("path")
-  , fs
-  , COLON = process.platform === "win32" ? ";" : ":"
-  , isExe
-
-try {
-  fs = require("graceful-fs")
-} catch (ex) {
-  fs = require("fs")
-}
-
-if (process.platform == "win32") {
-  // On windows, there is no good way to check that a file is executable
-  isExe = function isExe () { return true }
-} else {
-  isExe = function isExe (mod, uid, gid) {
-    //console.error(mod, uid, gid);
-    //console.error("isExe?", (mod & 0111).toString(8))
-    var ret = (mod & 0001)
-        || (mod & 0010) && process.getgid && gid === process.getgid()
-        || (mod & 0100) && process.getuid && uid === process.getuid()
-    //console.error("isExe?", ret)
-    return ret
-  }
-}
-
-
-
-function which (cmd, cb) {
-  if (isAbsolute(cmd)) return cb(null, cmd)
-  var pathEnv = (process.env.PATH || "").split(COLON)
-    , pathExt = [""]
-  if (process.platform === "win32") {
-    pathEnv.push(process.cwd())
-    pathExt = (process.env.PATHEXT || ".EXE").split(COLON)
-    if (cmd.indexOf(".") !== -1) pathExt.unshift("")
-  }
-  //console.error("pathEnv", pathEnv)
-  ;(function F (i, l) {
-    if (i === l) return cb(new Error("not found: "+cmd))
-    var p = path.resolve(pathEnv[i], cmd)
-    ;(function E (ii, ll) {
-      if (ii === ll) return F(i + 1, l)
-      var ext = pathExt[ii]
-      //console.error(p + ext)
-      fs.stat(p + ext, function (er, stat) {
-        if (!er &&
-            stat &&
-            stat.isFile() &&
-            isExe(stat.mode, stat.uid, stat.gid)) {
-          //console.error("yes, exe!", p + ext)
-          return cb(null, p + ext)
-        }
-        return E(ii + 1, ll)
-      })
-    })(0, pathExt.length)
-  })(0, pathEnv.length)
-}
-
-function whichSync (cmd) {
-  if (isAbsolute(cmd)) return cmd
-  var pathEnv = (process.env.PATH || "").split(COLON)
-    , pathExt = [""]
-  if (process.platform === "win32") {
-    pathEnv.push(process.cwd())
-    pathExt = (process.env.PATHEXT || ".EXE").split(COLON)
-    if (cmd.indexOf(".") !== -1) pathExt.unshift("")
-  }
-  for (var i = 0, l = pathEnv.length; i < l; i ++) {
-    var p = path.join(pathEnv[i], cmd)
-    for (var j = 0, ll = pathExt.length; j < ll; j ++) {
-      var cur = p + pathExt[j]
-      var stat
-      try { stat = fs.statSync(cur) } catch (ex) {}
-      if (stat &&
-          stat.isFile() &&
-          isExe(stat.mode, stat.uid, stat.gid)) return cur
-    }
-  }
-  throw new Error("not found: "+cmd)
-}
-
-var isAbsolute = process.platform === "win32" ? absWin : absUnix
-
-function absWin (p) {
-  if (absUnix(p)) return true
-  // pull off the device/UNC bit from a windows path.
-  // from node's lib/path.js
-  var splitDeviceRe =
-        /^([a-zA-Z]:|[\\\/]{2}[^\\\/]+[\\\/][^\\\/]+)?([\\\/])?/
-    , result = splitDeviceRe.exec(p)
-    , device = result[1] || ''
-    , isUnc = device && device.charAt(1) !== ':'
-    , isAbsolute = !!result[2] || isUnc // UNC paths are always absolute
-
-  return isAbsolute
-}
-
-function absUnix (p) {
-  return p.charAt(0) === "/" || p === ""
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/package.json	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,143 +0,0 @@
-{
-  "version": "1.3.14",
-  "name": "npm",
-  "publishConfig": {
-    "proprietary-attribs": false
-  },
-  "description": "A package manager for node",
-  "keywords": [
-    "package manager",
-    "modules",
-    "install",
-    "package.json"
-  ],
-  "preferGlobal": true,
-  "config": {
-    "publishtest": false
-  },
-  "homepage": "https://npmjs.org/doc/",
-  "author": "Isaac Z. Schlueter <i@izs.me> (http://blog.izs.me)",
-  "repository": {
-    "type": "git",
-    "url": "https://github.com/isaacs/npm"
-  },
-  "bugs": {
-    "email": "npm-@googlegroups.com",
-    "url": "http://github.com/isaacs/npm/issues"
-  },
-  "directories": {
-    "doc": "./doc",
-    "man": "./man",
-    "lib": "./lib",
-    "bin": "./bin"
-  },
-  "main": "./lib/npm.js",
-  "bin": "./bin/npm-cli.js",
-  "dependencies": {
-    "semver": "~2.2.1",
-    "ini": "~1.1.0",
-    "slide": "~1.1.5",
-    "abbrev": "~1.0.4",
-    "graceful-fs": "~2.0.0",
-    "minimatch": "~0.2.12",
-    "nopt": "~2.1.2",
-    "rimraf": "~2.2.0",
-    "request": "~2.27.0",
-    "which": "1",
-    "tar": "~0.1.18",
-    "fstream": "~0.1.23",
-    "block-stream": "0.0.7",
-    "mkdirp": "~0.3.5",
-    "read": "~1.0.4",
-    "lru-cache": "~2.3.1",
-    "node-gyp": "~0.11.0",
-    "fstream-npm": "~0.1.6",
-    "uid-number": "0",
-    "archy": "0",
-    "chownr": "0",
-    "npmlog": "0.0.6",
-    "ansi": "~0.2.1",
-    "npm-registry-client": "~0.2.29",
-    "read-package-json": "~1.1.4",
-    "read-installed": "~0.2.2",
-    "glob": "~3.2.6",
-    "init-package-json": "0.0.11",
-    "osenv": "0",
-    "lockfile": "~0.4.0",
-    "retry": "~0.6.0",
-    "once": "~1.3.0",
-    "npmconf": "~0.1.5",
-    "opener": "~1.3.0",
-    "chmodr": "~0.1.0",
-    "cmd-shim": "~1.1.1",
-    "sha": "~1.2.1",
-    "editor": "0.0.5",
-    "child-process-close": "~0.1.1",
-    "npm-user-validate": "0.0.3",
-    "github-url-from-git": "1.1.1",
-    "github-url-from-username-repo": "0.0.2"
-  },
-  "bundleDependencies": [
-    "semver",
-    "ini",
-    "slide",
-    "abbrev",
-    "graceful-fs",
-    "minimatch",
-    "nopt",
-    "rimraf",
-    "request",
-    "which",
-    "tar",
-    "fstream",
-    "block-stream",
-    "inherits",
-    "mkdirp",
-    "read",
-    "lru-cache",
-    "node-gyp",
-    "fstream-npm",
-    "uid-number",
-    "archy",
-    "chownr",
-    "npmlog",
-    "ansi",
-    "npm-registry-client",
-    "read-package-json",
-    "read-installed",
-    "glob",
-    "init-package-json",
-    "osenv",
-    "lockfile",
-    "retry",
-    "once",
-    "npmconf",
-    "opener",
-    "chmodr",
-    "cmd-shim",
-    "sha",
-    "child-process-close",
-    "editor",
-    "npm-user-validate",
-    "github-url-from-git",
-    "github-url-from-username-repo",
-    "normalize-package-data"
-  ],
-  "devDependencies": {
-    "ronn": "~0.3.6",
-    "tap": "~0.4.0",
-    "npm-registry-mock": "~0.5.3"
-  },
-  "engines": {
-    "node": ">=0.6",
-    "npm": "1"
-  },
-  "scripts": {
-    "test": "node ./test/run.js && tap test/tap/*.js",
-    "tap": "tap test/tap/*.js",
-    "prepublish": "node bin/npm-cli.js prune --prefix=. --no-global && rm -rf test/*/*/node_modules && make -j4 doc",
-    "dumpconf": "env | grep npm | sort | uniq",
-    "echo": "node bin/npm-cli.js"
-  },
-  "license": "Artistic-2.0"
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/scripts/clean-old.sh	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,165 +0,0 @@
-#!/bin/bash
-
-# look for old 0.x cruft, and get rid of it.
-# Should already be sitting in the npm folder.
-
-# This doesn't have to be quite as cross-platform as install.sh.
-# There are some bash-isms, because maintaining *two*
-# fully-portable posix/bourne sh scripts is too much for
-# one project with a sane maintainer.
-
-# If readlink isn't available, then this is just too tricky.
-# However, greadlink is fine, so Solaris can join the party, too.
-readlink="readlink"
-which $readlink >/dev/null 2>/dev/null
-if [ $? -ne 0 ]; then
-  readlink="greadlink"
-  which $readlink >/dev/null 2>/dev/null
-  if [ $? -ne 0 ]; then
-    echo "Can't find the readlink or greadlink command. Aborting."
-    exit 1
-  fi
-fi
-
-if [ "x$npm_config_prefix" != "x" ]; then
-  PREFIXES=$npm_config_prefix
-else
-  node="$NODE"
-  if [ "x$node" = "x" ]; then
-    node=`which node`
-  fi
-  if [ "x$node" = "x" ]; then
-    echo "Can't find node to determine prefix. Aborting."
-    exit 1
-  fi
-
-
-  PREFIX=`dirname $node`
-  PREFIX=`dirname $PREFIX`
-  echo "cleanup prefix=$PREFIX"
-  PREFIXES=$PREFIX
-
-  altprefix=`"$node" -e process.installPrefix`
-  if [ "x$altprefix" != "x" ] && [ "x$altprefix" != "x$PREFIX" ]; then
-    echo "altprefix=$altprefix"
-    PREFIXES="$PREFIX $altprefix"
-  fi
-fi
-
-# now prefix is where npm would be rooted by default
-# go hunting.
-
-packages=
-for prefix in $PREFIXES; do
-  packages="$packages
-    "`ls "$prefix"/lib/node/.npm 2>/dev/null | grep -v .cache`
-done
-
-packages=`echo $packages`
-
-filelist=()
-fid=0
-
-for prefix in $PREFIXES; do
-  # remove any links into the .npm dir, or links to
-  # version-named shims/symlinks.
-  for folder in share/man bin lib/node; do
-    find $prefix/$folder -type l | while read file; do
-      target=`$readlink $file | grep '/\.npm/'`
-      if [ "x$target" != "x" ]; then
-        # found one!
-        filelist[$fid]="$file"
-        let 'fid++'
-        # also remove any symlinks to this file.
-        base=`basename "$file"`
-        base=`echo "$base" | awk -F@ '{print $1}'`
-        if [ "x$base" != "x" ]; then
-          find "`dirname $file`" -type l -name "$base"'*' \
-          | while read l; do
-              target=`$readlink "$l" | grep "$base"`
-              if [ "x$target" != "x" ]; then
-                filelist[$fid]="$l"
-                let 'fid++'
-              fi
-            done
-        fi
-      fi
-    done
-
-    # Scour for shim files.  These are relics of 0.2 npm installs.
-    # note: grep -r is not portable.
-    find $prefix/$folder -type f \
-      | xargs grep -sl '// generated by npm' \
-      | while read file; do
-          filelist[$fid]="$file"
-          let 'fid++'
-        done
-  done
-
-  # now remove the package modules, and the .npm folder itself.
-  if [ "x$packages" != "x" ]; then
-    for pkg in $packages; do
-      filelist[$fid]="$prefix/lib/node/$pkg"
-      let 'fid++'
-      for i in $prefix/lib/node/$pkg\@*; do
-        filelist[$fid]="$i"
-        let 'fid++'
-      done
-    done
-  fi
-
-  for folder in lib/node/.npm lib/npm share/npm; do
-    if [ -d $prefix/$folder ]; then
-      filelist[$fid]="$prefix/$folder"
-      let 'fid++'
-    fi
-  done
-done
-
-# now actually clean, but only if there's anything TO clean
-if [ "${#filelist[@]}" -gt 0 ]; then
-  echo ""
-  echo "This script will find and eliminate any shims, symbolic"
-  echo "links, and other cruft that was installed by npm 0.x."
-  echo ""
-
-  if [ "x$packages" != "x" ]; then
-    echo "The following packages appear to have been installed with"
-    echo "an old version of npm, and will be removed forcibly:"
-    for pkg in $packages; do
-      echo "    $pkg"
-    done
-    echo "Make a note of these. You may want to install them"
-    echo "with npm 1.0 when this process is completed."
-    echo ""
-  fi
-
-  OK=
-  if [ "x$1" = "x-y" ]; then
-    OK="yes"
-  fi
-
-  while [ "$OK" != "y" ] && [ "$OK" != "yes" ] && [ "$OK" != "no" ]; do
-    echo "Is this OK?"
-    echo "  enter 'yes' or 'no'"
-    echo "  or 'show' to see a list of files "
-    read OK
-    if [ "x$OK" = "xshow" ] || [ "x$OK" = "xs" ]; then
-      for i in "${filelist[@]}"; do
-        echo "$i"
-      done
-    fi
-  done
-  if [ "$OK" = "no" ]; then
-    echo "Aborting"
-    exit 1
-  fi
-  for i in "${filelist[@]}"; do
-    rm -rf "$i"
-  done
-fi
-
-echo ""
-echo 'All clean!'
-
-exit 0
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/scripts/doc-build.sh	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,84 +0,0 @@
-#!/usr/bin/env bash
-
-if [[ $DEBUG != "" ]]; then
-  set -x
-fi
-set -o errexit
-set -o pipefail
-
-if ! [ -x node_modules/.bin/ronn ]; then
-  ps=0
-  if [ -f .building_ronn ]; then
-    pid=$(cat .building_ronn)
-    ps=$(ps -p $pid | grep $pid | wc -l) || true
-  fi
-
-  if [ -f .building_ronn ] && [ $ps != 0 ]; then
-    while [ -f .building_ronn ]; do
-      sleep 1
-    done
-  else
-    # a race to see which make process will be the one to install ronn
-    echo $$ > .building_ronn
-    sleep 1
-    if [ $(cat .building_ronn) == $$ ]; then
-      make node_modules/.bin/ronn
-      rm .building_ronn
-    else
-      while [ -f .building_ronn ]; do
-        sleep 1
-      done
-    fi
-  fi
-fi
-
-src=$1
-dest=$2
-name=$(basename ${src%.*})
-date=$(date -u +'%Y-%m-%d %H:%M:%S')
-version=$(node cli.js -v)
-
-mkdir -p $(dirname $dest)
-
-case $dest in
-  *.[1357])
-    ./node_modules/.bin/ronn --roff $src \
-    | sed "s|@VERSION@|$version|g" \
-    | perl -pi -e 's/(npm\\-)?([^\(]*)\(1\)/npm help \2/g' \
-    | perl -pi -e 's/(npm\\-)?([^\(]*)\(([57])\)/npm help \3 \2/g' \
-    | perl -pi -e 's/(npm\\-)?([^\(]*)\(3\)/npm apihelp \2/g' \
-    | perl -pi -e 's/npm\(1\)/npm help npm/g' \
-    | perl -pi -e 's/npm\(3\)/npm apihelp npm/g' \
-    > $dest
-    exit $?
-    ;;
-  *.html)
-    (cat html/dochead.html && \
-     ./node_modules/.bin/ronn -f $src &&
-     cat html/docfoot.html)\
-    | sed "s|@NAME@|$name|g" \
-    | sed "s|@DATE@|$date|g" \
-    | sed "s|@VERSION@|$version|g" \
-    | perl -pi -e 's/<h1>([^\(]*\([0-9]\)) -- (.*?)<\/h1>/<h1>\1<\/h1> <p>\2<\/p>/g' \
-    | perl -pi -e 's/npm-npm/npm/g' \
-    | perl -pi -e 's/([^"-])(npm-)?README(\(1\))?/\1<a href="..\/..\/doc\/README.html">README<\/a>/g' \
-    | perl -pi -e 's/<title><a href="[^"]+README.html">README<\/a><\/title>/<title>README<\/title>/g' \
-    | perl -pi -e 's/([^"-])([^\(> ]+)(\(1\))/\1<a href="..\/cli\/\2.html">\2\3<\/a>/g' \
-    | perl -pi -e 's/([^"-])([^\(> ]+)(\(3\))/\1<a href="..\/api\/\2.html">\2\3<\/a>/g' \
-    | perl -pi -e 's/([^"-])([^\(> ]+)(\(5\))/\1<a href="..\/files\/\2.html">\2\3<\/a>/g' \
-    | perl -pi -e 's/([^"-])([^\(> ]+)(\(7\))/\1<a href="..\/misc\/\2.html">\2\3<\/a>/g' \
-    | perl -pi -e 's/\([1357]\)<\/a><\/h1>/<\/a><\/h1>/g' \
-    | (if [ $(basename $(dirname $dest)) == "doc" ]; then
-        perl -pi -e 's/ href="\.\.\// href="/g'
-      else
-        cat
-      fi) \
-    > $dest \
-    && cat html/docfoot-script.html >> $dest
-    exit $?
-    ;;
-  *)
-    echo "Invalid destination type: $dest" >&2
-    exit 1
-    ;;
-esac
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/scripts/index-build.js	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,62 +0,0 @@
-#!/usr/bin/env node
-var fs = require("fs")
-  , path = require("path")
-  , root = path.resolve(__dirname, "..")
-  , glob = require("glob")
-  , conversion = { "cli": 1, "api": 3, "files": 5, "misc": 7 }
-
-glob(root + "/{README.md,doc/*/*.md}", function (er, files) {
-  if (er)
-    throw er
-  output(files.map(function (f) {
-    var b = path.basename(f)
-    if (b === "README.md")
-      return [0, b]
-    if (b === "index.md")
-      return null
-    var s = conversion[path.basename(path.dirname(f))]
-    return [s, f]
-  }).filter(function (f) {
-    return f
-  }).sort(function (a, b) {
-    return (a[0] === b[0])
-           ? ( path.basename(a[1]) === "npm.md" ? -1
-             : path.basename(b[1]) === "npm.md" ? 1
-             : a[1] > b[1] ? 1 : -1 )
-           : a[0] - b[0]
-  }))
-})
-
-return
-
-function output (files) {
-  console.log(
-    "npm-index(7) -- Index of all npm documentation\n" +
-    "==============================================\n")
-
-  writeLines(files, 0)
-  writeLines(files, 1, "Command Line Documentation")
-  writeLines(files, 3, "API Documentation")
-  writeLines(files, 5, "Files")
-  writeLines(files, 7, "Misc")
-}
-
-function writeLines (files, sxn, heading) {
-  if (heading)
-    console.log("# %s\n", heading)
-  files.filter(function (f) {
-    return f[0] === sxn
-  }).forEach(writeLine)
-}
-
-
-function writeLine (sd) {
-  var sxn = sd[0] || 1
-    , doc = sd[1]
-    , d = path.basename(doc, ".md")
-
-  var content = fs.readFileSync(doc, "utf8").split("\n")[0].split("-- ")[1]
-
-  console.log("## %s(%d)\n", d, sxn)
-  console.log(content + "\n")
-}
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/scripts/install.sh	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,308 +0,0 @@
-#!/bin/sh
-
-# A word about this shell script:
-#
-# It must work everywhere, including on systems that lack
-# a /bin/bash, map 'sh' to ksh, ksh97, bash, ash, or zsh,
-# and potentially have either a posix shell or bourne
-# shell living at /bin/sh.
-#
-# See this helpful document on writing portable shell scripts:
-# http://www.gnu.org/s/hello/manual/autoconf/Portable-Shell.html
-#
-# The only shell it won't ever work on is cmd.exe.
-
-if [ "x$0" = "xsh" ]; then
-  # run as curl | sh
-  # on some systems, you can just do cat>npm-install.sh
-  # which is a bit cuter.  But on others, &1 is already closed,
-  # so catting to another script file won't do anything.
-  curl -s https://npmjs.org/install.sh > npm-install-$$.sh
-  sh npm-install-$$.sh
-  ret=$?
-  rm npm-install-$$.sh
-  exit $ret
-fi
-
-# See what "npm_config_*" things there are in the env,
-# and make them permanent.
-# If this fails, it's not such a big deal.
-configures="`env | grep 'npm_config_' | sed -e 's|^npm_config_||g'`"
-
-npm_config_loglevel="error"
-if [ "x$npm_debug" = "x" ]; then
-  (exit 0)
-else
-  echo "Running in debug mode."
-  echo "Note that this requires bash or zsh."
-  set -o xtrace
-  set -o pipefail
-  npm_config_loglevel="verbose"
-fi
-export npm_config_loglevel
-
-# make sure that node exists
-node=`which node 2>&1`
-ret=$?
-if [ $ret -eq 0 ] && [ -x "$node" ]; then
-  (exit 0)
-else
-  echo "npm cannot be installed without nodejs." >&2
-  echo "Install node first, and then try again." >&2
-  echo "" >&2
-  echo "Maybe node is installed, but not in the PATH?" >&2
-  echo "Note that running as sudo can change envs." >&2
-  echo ""
-  echo "PATH=$PATH" >&2
-  exit $ret
-fi
-
-# set the temp dir
-TMP="${TMPDIR}"
-if [ "x$TMP" = "x" ]; then
-  TMP="/tmp"
-fi
-TMP="${TMP}/npm.$$"
-rm -rf "$TMP" || true
-mkdir "$TMP"
-if [ $? -ne 0 ]; then
-  echo "failed to mkdir $TMP" >&2
-  exit 1
-fi
-
-BACK="$PWD"
-
-ret=0
-tar="${TAR}"
-if [ -z "$tar" ]; then
-  tar="${npm_config_tar}"
-fi
-if [ -z "$tar" ]; then
-  tar=`which tar 2>&1`
-  ret=$?
-fi
-
-if [ $ret -eq 0 ] && [ -x "$tar" ]; then
-  echo "tar=$tar"
-  echo "version:"
-  $tar --version
-  ret=$?
-fi
-
-if [ $ret -eq 0 ]; then
-  (exit 0)
-else
-  echo "No suitable tar program found."
-  exit 1
-fi
-
-
-
-# Try to find a suitable make
-# If the MAKE environment var is set, use that.
-# otherwise, try to find gmake, and then make.
-# If no make is found, then just execute the necessary commands.
-
-# XXX For some reason, make is building all the docs every time.  This
-# is an annoying source of bugs. Figure out why this happens.
-MAKE=NOMAKE
-
-if [ "x$MAKE" = "x" ]; then
-  make=`which gmake 2>&1`
-  if [ $? -eq 0 ] && [ -x $make ]; then
-    (exit 0)
-  else
-    make=`which make 2>&1`
-    if [ $? -eq 0 ] && [ -x $make ]; then
-      (exit 0)
-    else
-      make=NOMAKE
-    fi
-  fi
-else
-  make="$MAKE"
-fi
-
-if [ -x "$make" ]; then
-  (exit 0)
-else
-  # echo "Installing without make. This may fail." >&2
-  make=NOMAKE
-fi
-
-# If there's no bash, then don't even try to clean
-if [ -x "/bin/bash" ]; then
-  (exit 0)
-else
-  clean="no"
-fi
-
-node_version=`"$node" --version 2>&1`
-ret=$?
-if [ $ret -ne 0 ]; then
-  echo "You need node to run this program." >&2
-  echo "node --version reports: $node_version" >&2
-  echo "with exit code = $ret" >&2
-  echo "Please install node before continuing." >&2
-  exit $ret
-fi
-
-t="${npm_install}"
-if [ -z "$t" ]; then
-  # switch based on node version.
-  # note that we can only use strict sh-compatible patterns here.
-  case $node_version in
-    0.[0123].* | v0.[0123].*)
-      echo "You are using an outdated and unsupported version of" >&2
-      echo "node ($node_version).  Please update node and try again." >&2
-      exit 99
-      ;;
-    v0.[45].* | 0.[45].*)
-      echo "install npm@1.0"
-      t=1.0
-      ;;
-    v0.[678].* | 0.[678].*)
-      echo "install npm@1.1"
-      t=1.1
-      ;;
-    *)
-      echo "install npm@latest"
-      t="latest"
-      ;;
-  esac
-fi
-
-# the npmca cert
-cacert='
------BEGIN CERTIFICATE-----
-MIIChzCCAfACCQDauvz/KHp8ejANBgkqhkiG9w0BAQUFADCBhzELMAkGA1UEBhMC
-VVMxCzAJBgNVBAgTAkNBMRAwDgYDVQQHEwdPYWtsYW5kMQwwCgYDVQQKEwNucG0x
-IjAgBgNVBAsTGW5wbSBDZXJ0aWZpY2F0ZSBBdXRob3JpdHkxDjAMBgNVBAMTBW5w
-bUNBMRcwFQYJKoZIhvcNAQkBFghpQGl6cy5tZTAeFw0xMTA5MDUwMTQ3MTdaFw0y
-MTA5MDIwMTQ3MTdaMIGHMQswCQYDVQQGEwJVUzELMAkGA1UECBMCQ0ExEDAOBgNV
-BAcTB09ha2xhbmQxDDAKBgNVBAoTA25wbTEiMCAGA1UECxMZbnBtIENlcnRpZmlj
-YXRlIEF1dGhvcml0eTEOMAwGA1UEAxMFbnBtQ0ExFzAVBgkqhkiG9w0BCQEWCGlA
-aXpzLm1lMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDLI4tIqPpRW+ACw9GE
-OgBlJZwK5f8nnKCLK629Pv5yJpQKs3DENExAyOgDcyaF0HD0zk8zTp+ZsLaNdKOz
-Gn2U181KGprGKAXP6DU6ByOJDWmTlY6+Ad1laYT0m64fERSpHw/hjD3D+iX4aMOl
-y0HdbT5m1ZGh6SJz3ZqxavhHLQIDAQABMA0GCSqGSIb3DQEBBQUAA4GBAC4ySDbC
-l7W1WpLmtLGEQ/yuMLUf6Jy/vr+CRp4h+UzL+IQpCv8FfxsYE7dhf/bmWTEupBkv
-yNL18lipt2jSvR3v6oAHAReotvdjqhxddpe5Holns6EQd1/xEZ7sB1YhQKJtvUrl
-ZNufy1Jf1r0ldEGeA+0ISck7s+xSh9rQD2Op
------END CERTIFICATE-----
-'
-
-echo "$cacert" > "$TMP/cafile.crt"
-cacert="$TMP/cafile.crt"
-
-# need to echo "" after, because Posix sed doesn't treat EOF
-# as an implied end of line.
-url=`(curl -SsL --cacert "$cacert" https://registry.npmjs.org/npm/$t; echo "") \
-     | sed -e 's/^.*tarball":"//' \
-     | sed -e 's/".*$//'`
-
-ret=$?
-if [ "x$url" = "x" ]; then
-  ret=125
-  # try without the -e arg to sed.
-  url=`(curl -SsL --cacert "$cacert" https://registry.npmjs.org/npm/$t; echo "") \
-       | sed 's/^.*tarball":"//' \
-       | sed 's/".*$//'`
-  ret=$?
-  if [ "x$url" = "x" ]; then
-    ret=125
-  fi
-fi
-if [ $ret -ne 0 ]; then
-  echo "Failed to get tarball url for npm/$t" >&2
-  exit $ret
-fi
-
-
-echo "fetching: $url" >&2
-
-cd "$TMP" \
-  && curl -SsL --cacert "$cacert" "$url" \
-     | $tar -xzf - \
-  && rm "$cacert" \
-  && cd "$TMP"/* \
-  && (req=`"$node" bin/read-package-json.js package.json engines.node`
-      if [ -d node_modules ]; then
-        "$node" node_modules/semver/bin/semver -v "$node_version" -r "$req"
-        ret=$?
-      else
-        "$node" bin/semver.js -v "$node_version" -r "$req"
-        ret=$?
-      fi
-      if [ $ret -ne 0 ]; then
-        echo "You need node $req to run this program." >&2
-        echo "node --version reports: $node_version" >&2
-        echo "Please upgrade node before continuing." >&2
-        exit $ret
-      fi) \
-  && (ver=`"$node" bin/read-package-json.js package.json version`
-      isnpm10=0
-      if [ $ret -eq 0 ]; then
-        req=`"$node" bin/read-package-json.js package.json engines.node`
-        if [ -d node_modules ]; then
-          if "$node" node_modules/semver/bin/semver -v "$ver" -r "1"
-          then
-            isnpm10=1
-          fi
-        else
-          if "$node" bin/semver -v "$ver" -r ">=1.0"; then
-            isnpm10=1
-          fi
-        fi
-      fi
-
-      ret=0
-      if [ $isnpm10 -eq 1 ] && [ -f "scripts/clean-old.sh" ]; then
-        if [ "x$skipclean" = "x" ]; then
-          (exit 0)
-        else
-          clean=no
-        fi
-        if [ "x$clean" = "xno" ] \
-            || [ "x$clean" = "xn" ]; then
-          echo "Skipping 0.x cruft clean" >&2
-          ret=0
-        elif [ "x$clean" = "xy" ] || [ "x$clean" = "xyes" ]; then
-          NODE="$node" /bin/bash "scripts/clean-old.sh" "-y"
-          ret=$?
-        else
-          NODE="$node" /bin/bash "scripts/clean-old.sh" </dev/tty
-          ret=$?
-        fi
-      fi
-
-      if [ $ret -ne 0 ]; then
-        echo "Aborted 0.x cleanup.  Exiting." >&2
-        exit $ret
-      fi) \
-  && (if [ "x$configures" = "x" ]; then
-        (exit 0)
-      else
-        echo "./configure "$configures
-        echo "$configures" > npmrc
-      fi) \
-  && (if [ "$make" = "NOMAKE" ]; then
-        (exit 0)
-      elif "$make" uninstall install; then
-        (exit 0)
-      else
-        make="NOMAKE"
-      fi
-      if [ "$make" = "NOMAKE" ]; then
-        "$node" cli.js rm npm -gf
-        "$node" cli.js install -gf
-      fi) \
-  && cd "$BACK" \
-  && rm -rf "$TMP" \
-  && echo "It worked"
-
-ret=$?
-if [ $ret -ne 0 ]; then
-  echo "It failed" >&2
-fi
-exit $ret
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/scripts/release.sh	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,36 +0,0 @@
-#!/bin/bash
-
-# script for creating a zip and tarball for inclusion in node
-
-unset CDPATH
-
-set -e
-
-rm -rf release *.tgz || true
-mkdir release
-npm pack --loglevel error >/dev/null
-mv *.tgz release
-cd release
-tar xzf *.tgz
-
-mkdir node_modules
-mv package node_modules/npm
-
-# make the zip for windows users
-cp node_modules/npm/bin/*.cmd .
-zipname=npm-$(npm -v).zip
-zip -q -9 -r -X "$zipname" *.cmd node_modules
-
-# make the tar for node's deps
-cd node_modules
-tarname=npm-$(npm -v).tgz
-tar czf "$tarname" npm
-
-cd ..
-mv "node_modules/$tarname" .
-
-rm -rf *.cmd
-rm -rf node_modules
-
-echo "release/$tarname"
-echo "release/$zipname"
--- a/node/node-v0.10.22-linux-x86/lib/node_modules/npm/scripts/relocate.sh	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,26 +0,0 @@
-#!/bin/bash
-
-# Change the cli shebang to point at the specified node
-# Useful for when the program is moved around after install.
-# Also used by the default 'make install' in node to point
-# npm at the newly installed node, rather than the first one
-# in the PATH, which would be the default otherwise.
-
-# bash /path/to/npm/scripts/relocate.sh $nodepath
-# If $nodepath is blank, then it'll use /usr/bin/env
-
-dir="$(dirname "$(dirname "$0")")"
-cli="$dir"/bin/npm-cli.js
-tmp="$cli".tmp
-
-node="$1"
-if [ "x$node" = "x" ]; then
-  node="/usr/bin/env node"
-fi
-node="#!$node"
-
-sed -e 1d "$cli" > "$tmp"
-echo "$node" > "$cli"
-cat "$tmp" >> "$cli"
-rm "$tmp"
-chmod ogu+x $cli
--- a/node/node-v0.10.22-linux-x86/share/man/man1/node.1	Fri Nov 22 15:38:32 2013 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,448 +0,0 @@
-.TH NODE.JS "1" "2010" "" ""
-
-
-.SH "NAME"
-node \- Server-side JavaScript
-
-.SH SYNOPSIS
-
-
-.B node
-[
-.B \-v
-]
-[
-.B \-\-debug
-|
-.B \-\-debug-brk
-]
-[
-.B \-\-v8-options
-]
-.br
-     [
-.B \-e
-.I command
-|
-.I script.js
-]
-[
-.I arguments
-]
-
-Execute without arguments to start the REPL.
-
-
-.SH DESCRIPTION
-
-Node is a set of libraries for JavaScript which allows
-it to be used outside of the browser. It is primarily
-focused on creating simple, easy-to-build network clients
-and servers.
-
-
-.SH OPTIONS
-
-  -v, --version          print node's version
-
-  -e, --eval script      evaluate script
-
-  -p, --print            print result of --eval
-
-  -i, --interactive      always enter the REPL even if stdin
-                         does not appear to be a terminal
-
-  --no-deprecation       silence deprecation warnings
-
-  --trace-deprecation    show stack traces on deprecations
-
-  --throw-deprecation    throw errors on deprecations
-
-  --v8-options           print v8 command line options
-
-  --max-stack-size=val   set max v8 stack size (bytes)
-
-
-.SH ENVIRONMENT VARIABLES
-
-.IP NODE_PATH
-\':\'\-separated list of directories prefixed to the module search path.
-
-.IP NODE_MODULE_CONTEXTS
-If set to 1 then modules will load in their own global contexts.
-
-.IP NODE_DISABLE_COLORS
-If set to 1 then colors will not be used in the REPL.
-
-.SH V8 OPTIONS
-
-  --use_strict (enforce strict mode)
-        type: bool  default: false
-  --es5_readonly (activate correct semantics for inheriting readonliness)
-        type: bool  default: false
-  --es52_globals (activate new semantics for global var declarations)
-        type: bool  default: false
-  --harmony_typeof (enable harmony semantics for typeof)
-        type: bool  default: false
-  --harmony_scoping (enable harmony block scoping)
-        type: bool  default: false
-  --harmony_modules (enable harmony modules (implies block scoping))
-        type: bool  default: false
-  --harmony_proxies (enable harmony proxies)
-        type: bool  default: false
-  --harmony_collections (enable harmony collections (sets, maps, and weak maps))
-        type: bool  default: false
-  --harmony (enable all harmony features (except typeof))
-        type: bool  default: false
-  --packed_arrays (optimizes arrays that have no holes)
-        type: bool  default: false
-  --smi_only_arrays (tracks arrays with only smi values)
-        type: bool  default: true
-  --clever_optimizations (Optimize object size, Array shift, DOM strings and string +)
-        type: bool  default: true
-  --unbox_double_arrays (automatically unbox arrays of doubles)
-        type: bool  default: true
-  --string_slices (use string slices)
-        type: bool  default: true
-  --crankshaft (use crankshaft)
-        type: bool  default: true
-  --hydrogen_filter (optimization filter)
-        type: string  default: 
-  --use_range (use hydrogen range analysis)
-        type: bool  default: true
-  --eliminate_dead_phis (eliminate dead phis)
-        type: bool  default: true
-  --use_gvn (use hydrogen global value numbering)
-        type: bool  default: true
-  --use_canonicalizing (use hydrogen instruction canonicalizing)
-        type: bool  default: true
-  --use_inlining (use function inlining)
-        type: bool  default: true
-  --max_inlined_source_size (maximum source size in bytes considered for a single inlining)
-        type: int  default: 600
-  --max_inlined_nodes (maximum number of AST nodes considered for a single inlining)
-        type: int  default: 196
-  --max_inlined_nodes_cumulative (maximum cumulative number of AST nodes considered for inlining)
-        type: int  default: 196
-  --loop_invariant_code_motion (loop invariant code motion)
-        type: bool  default: true
-  --collect_megamorphic_maps_from_stub_cache (crankshaft harvests type feedback from stub cache)
-        type: bool  default: true
-  --hydrogen_stats (print statistics for hydrogen)
-        type: bool  default: false
-  --trace_hydrogen (trace generated hydrogen to file)
-        type: bool  default: false
-  --trace_phase (trace generated IR for specified phases)
-        type: string  default: Z
-  --trace_inlining (trace inlining decisions)
-        type: bool  default: false
-  --trace_alloc (trace register allocator)
-        type: bool  default: false
-  --trace_all_uses (trace all use positions)
-        type: bool  default: false
-  --trace_range (trace range analysis)
-        type: bool  default: false
-  --trace_gvn (trace global value numbering)
-        type: bool  default: false
-  --trace_representation (trace representation types)
-        type: bool  default: false
-  --stress_pointer_maps (pointer map for every instruction)
-        type: bool  default: false
-  --stress_environments (environment for every instruction)
-        type: bool  default: false
-  --deopt_every_n_times (deoptimize every n times a deopt point is passed)
-        type: int  default: 0
-  --trap_on_deopt (put a break point before deoptimizing)
-        type: bool  default: false
-  --deoptimize_uncommon_cases (deoptimize uncommon cases)
-        type: bool  default: true
-  --polymorphic_inlining (polymorphic inlining)
-        type: bool  default: true
-  --use_osr (use on-stack replacement)
-        type: bool  default: true
-  --array_bounds_checks_elimination (perform array bounds checks elimination)
-        type: bool  default: false
-  --array_index_dehoisting (perform array index dehoisting)
-        type: bool  default: false
-  --trace_osr (trace on-stack replacement)
-        type: bool  default: false
-  --stress_runs (number of stress runs)
-        type: int  default: 0
-  --optimize_closures (optimize closures)
-        type: bool  default: true
-  --inline_construct (inline constructor calls)
-        type: bool  default: true
-  --inline_arguments (inline functions with arguments object)
-        type: bool  default: true
-  --loop_weight (loop weight for representation inference)
-        type: int  default: 1
-  --optimize_for_in (optimize functions containing for-in loops)
-        type: bool  default: true
-  --experimental_profiler (enable all profiler experiments)
-        type: bool  default: true
-  --watch_ic_patching (profiler considers IC stability)
-        type: bool  default: false
-  --frame_count (number of stack frames inspected by the profiler)
-        type: int  default: 1
-  --self_optimization (primitive functions trigger their own optimization)
-        type: bool  default: false
-  --direct_self_opt (call recompile stub directly when self-optimizing)
-        type: bool  default: false
-  --retry_self_opt (re-try self-optimization if it failed)
-        type: bool  default: false
-  --count_based_interrupts (trigger profiler ticks based on counting instead of timing)
-        type: bool  default: false
-  --interrupt_at_exit (insert an interrupt check at function exit)
-        type: bool  default: false
-  --weighted_back_edges (weight back edges by jump distance for interrupt triggering)
-        type: bool  default: false
-  --interrupt_budget (execution budget before interrupt is triggered)
-        type: int  default: 5900
-  --type_info_threshold (percentage of ICs that must have type info to allow optimization)
-        type: int  default: 15
-  --self_opt_count (call count before self-optimization)
-        type: int  default: 130
-  --trace_opt_verbose (extra verbose compilation tracing)
-        type: bool  default: false
-  --debug_code (generate extra code (assertions) for debugging)
-        type: bool  default: false
-  --code_comments (emit comments in code disassembly)
-        type: bool  default: false
-  --enable_sse2 (enable use of SSE2 instructions if available)
-        type: bool  default: true
-  --enable_sse3 (enable use of SSE3 instructions if available)
-        type: bool  default: true
-  --enable_sse4_1 (enable use of SSE4.1 instructions if available)
-        type: bool  default: true
-  --enable_cmov (enable use of CMOV instruction if available)
-        type: bool  default: true
-  --enable_rdtsc (enable use of RDTSC instruction if available)
-        type: bool  default: true
-  --enable_sahf (enable use of SAHF instruction if available (X64 only))
-        type: bool  default: true
-  --enable_vfp3 (enable use of VFP3 instructions if available - this implies enabling ARMv7 instructions (ARM only))
-        type: bool  default: true
-  --enable_armv7 (enable use of ARMv7 instructions if available (ARM only))
-        type: bool  default: true
-  --enable_fpu (enable use of MIPS FPU instructions if available (MIPS only))
-        type: bool  default: true
-  --expose_natives_as (expose natives in global object)
-        type: string  default: NULL
-  --expose_debug_as (expose debug in global object)
-        type: string  default: NULL
-  --expose_gc (expose gc extension)
-        type: bool  default: false
-  --expose_externalize_string (expose externalize string extension)
-        type: bool  default: false
-  --stack_trace_limit (number of stack frames to capture)
-        type: int  default: 10
-  --builtins_in_stack_traces (show built-in functions in stack traces)
-        type: bool  default: false
-  --disable_native_files (disable builtin natives files)
-        type: bool  default: false
-  --inline_new (use fast inline allocation)
-        type: bool  default: true
-  --stack_trace_on_abort (print a stack trace if an assertion failure occurs)
-        type: bool  default: true
-  --trace (trace function calls)
-        type: bool  default: false
-  --mask_constants_with_cookie (use random jit cookie to mask large constants)
-        type: bool  default: true
-  --lazy (use lazy compilation)
-        type: bool  default: true
-  --trace_opt (trace lazy optimization)
-        type: bool  default: false
-  --trace_opt_stats (trace lazy optimization statistics)
-        type: bool  default: false
-  --opt (use adaptive optimizations)
-        type: bool  default: true
-  --always_opt (always try to optimize functions)
-        type: bool  default: false
-  --prepare_always_opt (prepare for turning on always opt)
-        type: bool  default: false
-  --trace_deopt (trace deoptimization)
-        type: bool  default: false
-  --min_preparse_length (minimum length for automatic enable preparsing)
-        type: int  default: 1024
-  --always_full_compiler (try to use the dedicated run-once backend for all code)
-        type: bool  default: false
-  --trace_bailout (print reasons for falling back to using the classic V8 backend)
-        type: bool  default: false
-  --compilation_cache (enable compilation cache)
-        type: bool  default: true
-  --cache_prototype_transitions (cache prototype transitions)
-        type: bool  default: true
-  --trace_debug_json (trace debugging JSON request/response)
-        type: bool  default: false
-  --debugger_auto_break (automatically set the debug break flag when debugger commands are in the queue)
-        type: bool  default: true
-  --enable_liveedit (enable liveedit experimental feature)
-        type: bool  default: true
-  --break_on_abort (always cause a debug break before aborting)
-        type: bool  default: true
-  --stack_size (default size of stack region v8 is allowed to use (in kBytes))
-        type: int  default: 984
-  --max_stack_trace_source_length (maximum length of function source code printed in a stack trace.)
-        type: int  default: 300
-  --always_inline_smi_code (always inline smi code in non-opt code)
-        type: bool  default: false
-  --max_new_space_size (max size of the new generation (in kBytes))
-        type: int  default: 0
-  --max_old_space_size (max size of the old generation (in Mbytes))
-        type: int  default: 0
-  --max_executable_size (max size of executable memory (in Mbytes))
-        type: int  default: 0
-  --gc_global (always perform global GCs)
-        type: bool  default: false
-  --gc_interval (garbage collect after <n> allocations)
-        type: int  default: -1
-  --trace_gc (print one trace line following each garbage collection)
-        type: bool  default: false
-  --trace_gc_nvp (print one detailed trace line in name=value format after each garbage collection)
-        type: bool  default: false
-  --print_cumulative_gc_stat (print cumulative GC statistics in name=value format on exit)
-        type: bool  default: false
-  --trace_gc_verbose (print more details following each garbage collection)
-        type: bool  default: false
-  --trace_fragmentation (report fragmentation for old pointer and data pages)
-        type: bool  default: false
-  --collect_maps (garbage collect maps from which no objects can be reached)
-        type: bool  default: true
-  --flush_code (flush code that we expect not to use again before full gc)
-        type: bool  default: true
-  --incremental_marking (use incremental marking)
-        type: bool  default: true
-  --incremental_marking_steps (do incremental marking steps)
-        type: bool  default: true
-  --trace_incremental_marking (trace progress of the incremental marking)
-        type: bool  default: false
-  --use_idle_notification (Use idle notification to reduce memory footprint.)
-        type: bool  default: true
-  --send_idle_notification (Send idle notification between stress runs.)
-        type: bool  default: false
-  --use_ic (use inline caching)
-        type: bool  default: true
-  --native_code_counters (generate extra code for manipulating stats counters)
-        type: bool  default: false
-  --always_compact (Perform compaction on every full GC)
-        type: bool  default: false
-  --lazy_sweeping (Use lazy sweeping for old pointer and data spaces)
-        type: bool  default: true
-  --never_compact (Never perform compaction on full GC - testing only)
-        type: bool  default: false
-  --compact_code_space (Compact code space on full non-incremental collections)
-        type: bool  default: true
-  --cleanup_code_caches_at_gc (Flush inline caches prior to mark compact collection and flush code caches in maps during mark compact cycle.)
-        type: bool  default: true
-  --random_seed (Default seed for initializing random generator (0, the default, means to use system random).)
-        type: int  default: 0
-  --use_verbose_printer (allows verbose printing)
-        type: bool  default: true
-  --allow_natives_syntax (allow natives syntax)
-        type: bool  default: false
-  --trace_sim (Trace simulator execution)
-        type: bool  default: false
-  --check_icache (Check icache flushes in ARM and MIPS simulator)
-        type: bool  default: false
-  --stop_sim_at (Simulator stop after x number of instructions)
-        type: int  default: 0
-  --sim_stack_alignment (Stack alignment in bytes in simulator (4 or 8, 8 is default))
-        type: int  default: 8
-  --trace_exception (print stack trace when throwing exceptions)
-        type: bool  default: false
-  --preallocate_message_memory (preallocate some memory to build stack traces.)
-        type: bool  default: false
-  --randomize_hashes (randomize hashes to avoid predictable hash collisions (with snapshots this option cannot override the baked-in seed))
-        type: bool  default: true
-  --hash_seed (Fixed seed to use to hash property keys (0 means random)(with snapshots this option cannot override the baked-in seed))
-        type: int  default: 0
-  --preemption (activate a 100ms timer that switches between V8 threads)
-        type: bool  default: false
-  --regexp_optimization (generate optimized regexp code)
-        type: bool  default: true
-  --testing_bool_flag (testing_bool_flag)
-        type: bool  default: true
-  --testing_int_flag (testing_int_flag)
-        type: int  default: 13
-  --testing_float_flag (float-flag)
-        type: float  default: 2.500000
-  --testing_string_flag (string-flag)
-        type: string  default: Hello, world!
-  --testing_prng_seed (Seed used for threading test randomness)
-        type: int  default: 42
-  --testing_serialization_file (file in which to serialize heap)
-        type: string  default: /tmp/serdes
-  --help (Print usage message, including flags, on console)
-        type: bool  default: true
-  --dump_counters (Dump counters on exit)
-        type: bool  default: false
-  --debugger (Enable JavaScript debugger)
-        type: bool  default: false
-  --remote_debugger (Connect JavaScript debugger to the debugger agent in another process)
-        type: bool  default: false
-  --debugger_agent (Enable debugger agent)
-        type: bool  default: false
-  --debugger_port (Port to use for remote debugging)
-        type: int  default: 5858
-  --map_counters (Map counters to a file)
-        type: string  default: 
-  --js_arguments (Pass all remaining arguments to the script. Alias for "--".)
-        type: arguments  default: 
-  --debug_compile_events (Enable debugger compile events)
-        type: bool  default: true
-  --debug_script_collected_events (Enable debugger script collected events)
-        type: bool  default: true
-  --gdbjit (enable GDBJIT interface (disables compacting GC))
-        type: bool  default: false
-  --gdbjit_full (enable GDBJIT interface for all code objects)
-        type: bool  default: false
-  --gdbjit_dump (dump elf objects with debug info to disk)
-        type: bool  default: false
-  --gdbjit_dump_filter (dump only objects containing this substring)
-        type: string  default: 
-  --force_marking_deque_overflows (force overflows of marking deque by reducing its size to 64 words)
-        type: bool  default: false
-  --stress_compaction (stress the GC compactor to flush out bugs (implies --force_marking_deque_overflows))
-        type: bool  default: false
-  --log (Minimal logging (no API, code, GC, suspect, or handles samples).)
-        type: bool  default: false
-  --log_all (Log all events to the log file.)
-        type: bool  default: false
-  --log_runtime (Activate runtime system %Log call.)
-        type: bool  default: false
-  --log_api (Log API events to the log file.)
-        type: bool  default: false
-  --log_code (Log code events to the log file without profiling.)
-        type: bool  default: false
-  --log_gc (Log heap samples on garbage collection for the hp2ps tool.)
-        type: bool  default: false
-  --log_handles (Log global handle events.)
-        type: bool  default: false
-  --log_snapshot_positions (log positions of (de)serialized objects in the snapshot.)
-        type: bool  default: false
-  --log_suspect (Log suspect operations.)
-        type: bool  default: false
-  --prof (Log statistical profiling information (implies --log-code).)
-        type: bool  default: false
-  --prof_auto (Used with --prof, starts profiling automatically)
-        type: bool  default: true
-  --prof_lazy (Used with --prof, only does sampling and logging when profiler is active (implies --noprof_auto).)
-        type: bool  default: false
-  --prof_browser_mode (Used with --prof, turns on browser-compatible mode for profiling.)
-        type: bool  default: true
-  --log_regexp (Log regular expression execution.)
-        type: bool  default: false
-  --sliding_state_window (Update sliding state window counters.)
-        type: bool  default: false
-  --logfile (Specify the name of the log file.)
-        type: string  default: v8.log
-  --ll_prof (Enable low-level linux profiler.)
-        type: bool  default: false
-
-.SH RESOURCES AND DOCUMENTATION
-
-See the website for documentation: http://nodejs.org/
-
-Mailing list: http://groups.google.com/group/nodejs
-
-IRC: irc.freenode.net #node.js