S3 bucket for Python packages
Create a new S3 bucket (optionally add IP-address restrictions and other policies/permissions)
Example website endpoint: mybucket.s3-website-us-east-1.amazonaws.com (bucket permissions here block all access except from 11.220.157.102 and 130.100.28.243)
The website endpoint is a "virtual host" and does not support HTTPS; to use HTTPS (security!) use the path-style REST endpoint instead: https://s3.amazonaws.com/mybucket/index.html
sudo apt-get install python python-setuptools curl
alternatively, bootstrap setuptools by hand: download http://peak.telecommunity.com/dist/ez_setup.py and run sudo python ez_setup.py
curl https://raw.github.com/pypa/pip/master/contrib/get-pip.py | sudo python   # sudo must wrap python, not curl
sudo pip install pip2pi # https://github.com/wolever/pip2pi
Create (or update) a package in the bucket:
Create a directory where you will be downloading the packages and creating an index, e.g. /tmp/pypi-packages
Ensure your pypi-packages directory has a copy of the existing index and packages from mybucket.s3-website-us-east-1.amazonaws.com
pip2tgz /tmp/pypi-packages simplejson
dir2pi /tmp/pypi-packages (dir2pi builds a simple/ package index covering every .tar.gz in the directory)
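The simple/ layout that dir2pi produces can be sketched by hand; this is only an illustration of the directory structure (the simplejson-3.3.0 filename is a stand-in — real tarballs come from pip2tgz):

```shell
# Hand-built sketch of the layout dir2pi generates for one tarball.
PKGDIR=/tmp/pypi-packages
mkdir -p "$PKGDIR/simple/simplejson"
touch "$PKGDIR/simplejson-3.3.0.tar.gz"  # stand-in for a real pip2tgz download
# dir2pi links each tarball into simple/<package>/
ln -sf ../../simplejson-3.3.0.tar.gz "$PKGDIR/simple/simplejson/simplejson-3.3.0.tar.gz"
# and writes index.html files so pip's --index-url can browse the tree
echo '<a href="simplejson/">simplejson</a>' > "$PKGDIR/simple/index.html"
echo '<a href="simplejson-3.3.0.tar.gz">simplejson-3.3.0.tar.gz</a>' \
    > "$PKGDIR/simple/simplejson/index.html"
```

This whole tree (tarballs plus simple/) is what gets copied to the bucket.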
Copy the new simplejson tarball and the regenerated simple/ index up to the bucket
pip install --index-url=https://s3.amazonaws.com/mybucket/python/simple simplejson
append --upgrade to force reinstallation even when the package is already installed
test by choosing a package you haven't uploaded; pip should fail with "Could not find any downloads that satisfy the requirement pytest"
One further automation: a Chef run with the myservice::python_packages recipe installs packages from the index URL
include_recipe "python::#{node['python']['install_method']}"
include_recipe "python::pip"
include_recipe "python::virtualenv"

%w{ boto pytest redis simplejson }.each do |pkg|
  python_pip pkg do
    options "--index-url=https://s3.amazonaws.com/mybucket/python/simple"
    action :install
  end
end
to rebuild the index, remove the simple/ directory so only the .tar.gz files remain, then rerun dir2pi
S3Fox creates python_$folder$ placeholder keys, and somehow the dir2pi symlinks still work once uploaded
python_pip "supervisor" do
  version "3.0b2"
  options "--index-url=http://mybucket.s3-website-us-east-1.amazonaws.com/python/simple"
  action :install
end
knife cookbook upload python_packages
not using easy_install because it has no uninstall command, and building eggs is more complex than the tarball flow above
easy_install -H None -f https://s3.amazonaws.com/mybucket/python redis
To uninstall an egg you must rm -rf the egg itself (it may be a directory) and delete the matching line from site-packages/easy-install.pth
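The manual egg removal can be sketched like this (paths, package, and version are placeholders, and a throwaway directory stands in for the real site-packages):

```shell
# Set up a fake site-packages so the removal steps can be shown end to end.
SITE=/tmp/demo-site-packages     # stands in for e.g. /usr/lib/python2.7/site-packages
mkdir -p "$SITE/redis-2.7.2-py2.7.egg"
printf './redis-2.7.2-py2.7.egg\n./simplejson-3.3.0-py2.7.egg\n' > "$SITE/easy-install.pth"

# 1) remove the egg itself (it may be a directory, hence rm -rf)
rm -rf "$SITE/redis-2.7.2-py2.7.egg"
# 2) drop the matching line from easy-install.pth, leaving other entries intact
grep -v 'redis-2.7.2' "$SITE/easy-install.pth" > "$SITE/easy-install.pth.new"
mv "$SITE/easy-install.pth.new" "$SITE/easy-install.pth"
```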
sudo apt-get install s3cmd
~/.s3cfg:
[default]
access_key = AKIAIRT2PMYKEY
secret_key = UFZ/KLpCIPMYSECRETKEY
use_https = True
s3cmd ls
s3cmd info s3://mybucket
s3cmd ls s3://mybucket
s3cmd setacl --acl-public --recursive s3://mybucket   # sets the bucket and everything inside public
s3cmd setacl --acl-private --recursive s3://mybucket  # sets the bucket and everything inside private
s3cmd put /tmp/mystuff --recursive s3://mybucket      # uploads a local directory and all contents
s3cmd get s3://BUCKET/OBJECT LOCAL_FILE
s3cmd cp s3://BUCKET1/OBJECT1 s3://BUCKET2[/OBJECT2]
s3cmd mv s3://BUCKET1/OBJECT1 s3://BUCKET2[/OBJECT2]
s3cmd du [s3://BUCKET[/PREFIX]]  # disk usage
s3cmd mb s3://newbucketname      # make bucket
s3cmd rb s3://newbucketname      # remove bucket (must be empty)
s3cmd sync s3://mybucket/python /home/ubuntu/mypypi/python/  # pull the index down locally
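The whole update cycle above (sync down, pip2tgz, rebuild index, push up) can be wrapped in one function; this is a sketch that assumes s3cmd is configured and pip2pi is installed, and uses the example bucket name from these notes:

```shell
# Update the S3 PyPI index with one new package, e.g.: update_s3_pypi simplejson
update_s3_pypi() {
    pkg="$1"
    workdir=/tmp/pypi-packages
    mkdir -p "$workdir"
    # pull down the existing tarballs so dir2pi rebuilds a complete index
    s3cmd sync s3://mybucket/python/ "$workdir/"
    # download the new package (and its dependencies) as .tar.gz files
    pip2tgz "$workdir" "$pkg"
    # remove the old simple/ directory and regenerate it from the tarballs
    rm -rf "$workdir/simple"
    dir2pi "$workdir"
    # push everything back to the bucket, publicly readable
    s3cmd put --recursive --acl-public "$workdir/" s3://mybucket/python/
}
```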