I am trying to run a Django management command from cron. I am using virtualenv to keep my project sandboxed.
I have seen examples here and elsewhere that show running management commands from within virtualenvs like:
0 3 * * * source /home/user/project/env/bin/activate && /home/user/project/manage.py command arg
However, even though syslog shows an entry when the task should have started, this task never actually runs (the log file for the script is empty). If I run the line manually from the shell, it works as expected.
The only way I can currently get the command to run via cron is to break the commands up and put them in a dumb bash wrapper script:
#!/bin/sh
source /home/user/project/env/bin/activate
cd /home/user/project/
./manage.py command arg
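The crontab entry then just calls that wrapper (a sketch; the wrapper's path and file name are assumptions, not from the question):
0 3 * * * /home/user/project/run_command.sh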
EDIT:
ars came up with a working combination of commands:
0 3 * * * cd /home/user/project && /home/user/project/env/bin/python /home/user/project/manage.py command arg
At least in my case, invoking the activate script for the virtualenv did nothing. This works, so on with the show.
Run env to dump the environment variables your interactive shell has, and export them all in a bash wrapper script that you call from the crontab.
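A sketch of what such a wrapper might look like (the variable values and paths below are placeholders, not taken from the question):
#!/bin/bash
# re-create the environment that cron does not inherit from your login shell
export PATH=/home/user/project/env/bin:/usr/bin:/bin
export DJANGO_SETTINGS_MODULE=project.settings
cd /home/user/project && ./manage.py command arg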
You should be able to do this by using the python in your virtual environment:
/home/my/virtual/bin/python /home/my/project/manage.py command arg
EDIT: If your Django project isn't in the PYTHONPATH, then you'll need to switch to the right directory:
cd /home/my/project && /home/my/virtual/bin/python ...
You can also try to log the failure from cron:
cd /home/my/project && /home/my/virtual/bin/python /home/my/project/manage.py > /tmp/cronlog.txt 2>&1
Another thing to try is to point manage.py itself at the virtualenv's python by changing the shebang at the very top of the script:
#!/home/my/virtual/bin/python
Running source from a cronfile won't work because cron uses /bin/sh as its default shell, which doesn't support source. You need to set the SHELL environment variable to /bin/bash:
SHELL=/bin/bash
*/10 * * * * root source /path/to/virtualenv/bin/activate && /path/to/build/manage.py some_command > /dev/null
It's tricky to spot why this fails, as /var/log/syslog doesn't log the error details. Best to alias yourself to root so you get emailed any cron errors. Simply add yourself to /etc/aliases and run sendmail -bi.
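For example, an alias entry like the following forwards root's mail (the address is a placeholder); sendmail -bi then rebuilds the alias database:
root: you@example.com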
More info here: https://codeinthehole.com/tips/running-django-cronjobs-within-a-virtualenv/ (the article originally lived at http://codeinthehole.com/archives/43-Running-django-cronjobs-within-a-virtualenv.html and has moved to that new URL).
Alternatively, the dot command works in plain sh without changing SHELL: . /path/to/virtualenv/bin/activate
Note the root field in the example above: in a system crontab, the first column after the timing fields is the user the cronjob runs as (here root, the most privileged user).
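For comparison, the same job in a per-user crontab (edited with crontab -e) has no user column; this is only a sketch reusing the paths from the example above:
SHELL=/bin/bash
*/10 * * * * source /path/to/virtualenv/bin/activate && /path/to/build/manage.py some_command > /dev/null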
Don't look any further:
0 3 * * * /usr/bin/env bash -c 'cd /home/user/project && source /home/user/project/env/bin/activate && ./manage.py command arg' > /dev/null 2>&1
Generic approach:
* * * * * /usr/bin/env bash -c 'YOUR_COMMAND_HERE' > /dev/null 2>&1
The beauty of this is that you DO NOT need to change the SHELL variable for crontab from sh to bash.
bash -c is unnecessary. Just use the . command instead of source. You won't need to change the SHELL variable in crontab, or wrap your command with bash -c. See stackoverflow.com/questions/3287038/cron-and-virtualenv/…
cd /home/user/project && . /home/user/project/env/bin/activate && ./manage.py command arg
The only correct way to run Python cron jobs when using a virtualenv is to activate the environment and then execute the environment's python to run your code.
One way to do this is to use virtualenv's activate_this in your Python script; see: http://virtualenv.readthedocs.org/en/latest/userguide.html#using-virtualenv-without-bin-python
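A minimal sketch of the activate_this approach, assuming the virtualenv lives at /path/to/env (activate_this.py is shipped by the virtualenv package, not by the standard-library venv module):
activate_this = '/path/to/env/bin/activate_this.py'
# running activate_this.py adjusts sys.path so imports resolve inside the virtualenv
exec(open(activate_this).read(), {'__file__': activate_this})
# from here on, packages installed in the virtualenv can be imported as usual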
Another solution is echoing the complete command, including activating the environment, and piping it into /bin/bash. Consider this for your /etc/crontab:
* * * * * root echo 'source /env/bin/activate; python /your/script' | /bin/bash
Use the . command instead of source. See stackoverflow.com/questions/3287038/cron-and-virtualenv/…
I am sorry for yet another answer, but I checked the existing ones and there is a simpler and neater way.
Long story short
Use the python binary of your venv in your cron :
0 3 * * * /home/user/project/env/bin/python /home/user/project/manage.py
Long story
We activate a virtual environment when we want the current shell to use that environment's Python configuration (its binaries and modules).
That is useful for interactive work: you can execute multiple Python commands in the current shell without referencing the venv's full python path each time.
But what value does activation add inside a cron job, or even a bash script? Some answers suggest using bash rather than sh, or defining a wrapper to call the Python code. Why should we bother with any of that?
I repeat, just do this:
0 3 * * * /home/user/project/env/bin/python /home/user/project/manage.py
The documentation confirms that :
You don’t specifically need to activate an environment; activation just prepends the virtual environment’s binary directory to your path, so that “python” invokes the virtual environment’s Python interpreter and you can run installed scripts without having to use their full path. However, all scripts installed in a virtual environment should be runnable without activating it, and run with the virtual environment’s Python automatically.
Rather than mucking around with virtualenv-specific shebangs, just prepend PATH onto the crontab.
From an activated virtualenv, run these three commands and python scripts should just work:
$ echo "PATH=$PATH" > myserver.cron
$ crontab -l >> myserver.cron
$ crontab myserver.cron
The crontab's first line should now look like this:
PATH=/home/me/virtualenv/bin:/usr/bin:/bin: # [etc...]
The best solution for me was to both use the python binary in the venv bin/ directory and set the python path to include the venv modules directory.
man python mentions modifying the module search path in the shell via $PYTHONPATH or in Python via sys.path.
Other answers mention ideas for doing this using the shell. From python, adding the following lines to my script allows me to successfully run it directly from cron.
import sys
sys.path.insert(0, '/path/to/venv/lib/python3.3/site-packages')
Here's how it looks in an interactive session --
Python 3.3.2+ (default, Feb 28 2014, 00:52:16)
[GCC 4.8.1] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import sys
>>> sys.path
['', '/usr/lib/python3.3', '/usr/lib/python3.3/plat-x86_64-linux-gnu', '/usr/lib/python3.3/lib-dynload']
>>> import requests
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named 'requests'
>>> sys.path.insert(0,'/path/to/venv/modules/');
>>> import requests
>>>
I'd like to add this because I spent some time solving the issue and did not find an answer here covering the use of variables in cron together with a virtualenv. So maybe it'll help someone.
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
DIR_SMTH="cd /smth"
VENV=". venv/bin/activate"
CMD="some_python_bin do_something"
# m h dom mon dow command
0 * * * * $DIR_SMTH && $VENV && $CMD -k2 some_target >> /tmp/crontest.log 2>&1
It did not work well when it was configured like
DIR_SMTH="cd /smth && . venv/bin/activate"
Thanks @davidwinterbottom, @reed-sandberg and @mkb for giving the right direction. The accepted answer actually works fine until your Python code needs to run a script that has to invoke another binary from the venv/bin directory.
This is a simple way that keeps the crontab command very similar to regular command (tested in Ubuntu 18.04). Some key notes to keep in mind:
You can use the . command instead of source. (crontab uses sh by default, not bash, so it doesn't have source.)
~ and $variables are expanded in crontab commands. (It's only crontab environment statements that don't do variable expansion.)
Here are examples if you have a file ~/myproject/main.py:
* * * * * cd ~/myproject && . .venv/bin/activate && python main.py > /tmp/out1 2>&1
You could also directly call the specific python in the venv directory; then you don't need to call activate.
* * * * * ~/myproject/.venv/bin/python ~/myproject/main.py > /tmp/out2 2>&1
The downside of that is you would need to specify the project path twice, which makes maintenance trickier. To avoid that, you could use a shell variable so you only specify the project path once:
* * * * * project_dir=~/myproject ; $project_dir/.venv/bin/python $project_dir/main.py > /tmp/out3 2>&1
If you're using a Conda virtual environment and your Python script contains the shebang #!/usr/bin/env python, the following works:
* * * * * cd /home/user/project && /home/user/anaconda3/envs/envname/bin/python script.py 2>&1
Additionally, if you want to capture any outputs in your script (e.g. print, errors, etc) you can use the following:
* * * * * cd /home/user/project && /home/user/anaconda3/envs/envname/bin/python script.py >> /home/user/folder/script_name.log 2>&1
Python script:
from datetime import datetime
import sys
import boto  # importing a venv-only package checks whether the virtualenv is being picked up

param1 = sys.argv[1]  # parameter passed from the cron command
with open('appendtxt.txt', 'a') as myFile:
    myFile.write('\nAccessed on ' + param1 + str(datetime.now()))
Cron command:
*/1 * * * * cd /Workspace/testcron/ && /Workspace/testcron/venvcron/bin/python3 /Workspace/testcron/testcronwithparam.py param
In the above command:
*/1 * * * * - execute every minute
cd /Workspace/testcron/ - change to the directory of the Python script
/Workspace/testcron/venvcron/bin/python3 - the virtualenv's python
/Workspace/testcron/testcronwithparam.py - path of the script file
param - the parameter
I've added the following script as manage.sh inside my Django project. It sources the virtualenv and then runs the manage.py script with whatever arguments you pass to it. In general it makes it very easy to run commands inside the virtualenv (cron, systemd units, basically anywhere):
#! /bin/bash
# this is a convenience script that first sources the venv (assumed to be in
# ../venv) and then executes manage.py with whatever arguments you supply the
# script with. this is useful if you need to execute the manage.py from
# somewhere where the venv isn't sourced (e.g. system scripts)
# get the script's location
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
# source venv <- UPDATE THE PATH HERE WITH YOUR VENV's PATH
source "$DIR/../venv/bin/activate"
# run manage.py script
"$DIR/manage.py" "$@"
Then in your cron entry you can just run:
0 3 * * * /home/user/project/manage.sh command arg
Just remember that you need to make the manage.sh script executable.
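For example (adjust the path to wherever manage.sh lives):
chmod +x /home/user/project/manage.sh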
Since a cron job executes in its own minimal sh environment, here's what I do to run Python scripts in a virtual environment:
* * * * * . ~/.bash_profile; . ~/path/to/venv/bin/activate; python ~/path/to/script.py
(Note: if . ~/.bash_profile doesn't work for you, then try . ~/.bashrc or . ~/.profile, depending on how your server is set up.)
This loads your bash shell environment and then activates your Python virtual environment, essentially leaving you with the same setup you tested your scripts in.
No need to define environment variables in crontab and no need to modify your existing scripts.
I had the same issue and spent a lot of time solving that. None of the solutions here helped me, so I'm sharing what worked for me:
Create a new file, pick_name.sh, inside your project directory. Inside pick_name.sh, write and save the following lines:
#!/bin/bash
source /YOUR_VIRTUAL_ENV_PATH/bin/activate
export PYTHONPATH="${PYTHONPATH}:/PATH_TO_CUSTOM_MODULE_YOU_CREATED**OPTIONAL**"
export PYTHONPATH="${PYTHONPATH}:/PATH_TO_ANOTHER_CUSTOM_MODULE_YOU_CREATED**OPTIONAL**"
cd /PATH_TO_DIR_STORING_FILE_NAME.PY
python file_name.py
Go to /var/spool/cron/crontabs (or wherever your cron management files sit) and open the 'root' file. Add these lines to it:
# m h dom mon dow command
* * * * * /PATH_TO_DIR_WHERE_PICK_NAME.SH_SITS/pick_name.sh >> /YOUR_PROJECT_DIR/cron_output.txt 2>&1
Notes:
The crontab entry above runs the pick_name.sh file. In this example it runs every minute, so make sure you change the schedule according to your needs. It writes all logs to a log file called cron_output.txt; there's no need to create the file beforehand, it will be created automatically.
Make sure to replace all paths (I wrote them in capital letters) to your paths.
You can change file names, if so, make sure to change it in all appearances in my instructions to avoid errors.
If you want another .py file to be run by cron, add it to the pick_name.sh file, not to the cron entry. Simply duplicate the command lines of pick_name.sh for the additional file, but without the #!/bin/bash part (see the sketch after these notes). Then, every time cron runs pick_name.sh, it will run all the files you specified inside it.
Make sure to restart cron after changes, it could have saved me a lot of debugging time, use this command:
systemctl restart cron
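As a sketch of the note about running multiple files, pick_name.sh could be extended like this (all paths and names are placeholders in the same style as above; only the cd and python lines are repeated for the second script):
#!/bin/bash
source /YOUR_VIRTUAL_ENV_PATH/bin/activate
cd /PATH_TO_DIR_STORING_FILE_NAME.PY
python file_name.py
cd /PATH_TO_DIR_STORING_SECOND_FILE.PY
python second_file.py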
This is a solution that has worked well for me.
source /root/miniconda3/etc/profile.d/conda.sh && \
conda activate <your_env> && \
python <your_application> &
I am using miniconda with Conda version 4.7.12 on Ubuntu 18.04.3 LTS.
I am able to place the above inside a script and run it via crontab as well without any trouble.
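For instance, if those lines are saved in a script, the crontab entry could look like this (the script path, schedule and log path here are assumptions):
0 3 * * * /bin/bash /home/user/run_app.sh >> /home/user/run_app.log 2>&1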
This will also work on crontab -e
* */5 * * * cd /home/project && sudo /home/project/venv/bin/python scripte.py
I had this same issue: I had written a custom Django command to check whether GeoDjango position coordinates fall inside GeoDjango polygons, and had trouble automating the task. However, using this command with crontab worked for me:
* * * * * /home/project/locations/locations.sh >> /var/log/locations.log 2>&1
Did you try replacing ~ with the full path? (You probably did, just making sure ...)