Commit 2893567b · Tim O'Donnell authored
This is an attempt to have a single Docker image that can be used as a base for cloud runs (with mhcflurry-cloud) and also, eventually, as a way for users to experiment with mhcflurry. The plan is to have this built automatically at https://hub.docker.com/r/hammerlab/mhcflurry/

 * supports CPU and, in theory, GPU (not tested, though)
 * supports Python 2 and 3
 * putting the Dockerfile in the root of our repo lets us copy the current checkout of the mhcflurry repo into the image instead of pulling it from GitHub, so it works with branches and unreleased versions
 * by default it runs a Python 3 Jupyter notebook on port 8888 with mhcflurry and some convenience packages installed
 * does not try to train any models or run tests; models will eventually be downloaded from Google Cloud Storage once we have that working
 * it's unfortunately pretty hefty, around 2 GB

Also: removed the pin on keras<1.0 in requirements.txt
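A Dockerfile along the lines described above might look roughly like the sketch below. The actual Dockerfile is not shown in this commit, so the base image, package list, and install steps here are assumptions for illustration only; only the root-of-repo COPY, dual Python support, and the default Jupyter-on-8888 command come from the commit message.

```dockerfile
# Hypothetical sketch -- not the commit's actual Dockerfile.
# Base image and package choices are assumptions.
FROM ubuntu:16.04

# System dependencies for building scientific Python packages (assumed)
RUN apt-get update && apt-get install -y \
    python python-pip python3 python3-pip build-essential \
    && rm -rf /var/lib/apt/lists/*

# Copy the current checkout into the image instead of pulling from
# GitHub, so branches and unreleased versions work
COPY . /mhcflurry
WORKDIR /mhcflurry

# Install mhcflurry plus convenience packages under both Pythons
RUN pip install . jupyter && pip3 install . jupyter

# By default, serve a Python 3 Jupyter notebook on port 8888
EXPOSE 8888
CMD ["jupyter", "notebook", "--ip=0.0.0.0", "--port=8888", "--no-browser"]
```

With a Dockerfile like this in the repo root, `docker build -t hammerlab/mhcflurry .` followed by `docker run -p 8888:8888 hammerlab/mhcflurry` would build the image from the current checkout and expose the notebook locally.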