Step 0: Create a SAS program for testing purposes:
%put Program: /home/trb/sas/run_batch.sas;
%put Execution Mode: Batch via cron;
%put Job Start Time: &sysdate9 &systime;
%put SLC Version: SLC &sysver;
%put User ID: &sysuserid;
%put Process ID: &sysjobid;
%put System Day: &sysday;
%put Operating Sys: &sysscp;
%put Site Name: &syssite;
%put - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ;
/* Your processing code here */
/* Final status reporting */
%put - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ;
%put Job End Time: &sysdate9 &systime;
%put Final SYSCC: &syscc;
%put Final SYSERR: &syserr;
%if &syserr > 0 %then %do;
  %put ERROR: Job completed with errors. SYSERR=&syserr;
  %let syscc = 8;
%end;
%else %do;
  %put NOTE: Job completed successfully;
%end;
%put Job Return Code: &syscc;
/* Exit with appropriate return code */
%if &syscc > 0 %then %do;
  %abort return &syscc;
%end;
Step 1: Test Your Batch Command
Before scheduling, make sure your command works correctly from the command line. As a best practice, include complete path names since the execution environment will have minimal context. We tested with:
/opt/altair/slc/2026/bin/wps ~trb/sas_source/run_batch.sas \
    -log ~trb/sas_logs/run_batch.log \
    -print ~trb/sas_lst/run_batch.lst \
    -config /opt/altair/slc/2026/altairslc.cfg
We ran this command from the user’s home directory, executing a program called run_batch.sas containing the statements in Step 0 above. The log output goes to the path:
/home/trb/sas_logs/run_batch.log
And we included the path to the SLC configuration file. In our case:
/opt/altair/slc/2026/altairslc.cfg
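Once the command runs cleanly, it is worth confirming how the job finished: the %abort return &syscc statement in the Step 0 program surfaces the return code as the process exit status, which you can inspect in the shell. A quick sketch, using the same paths as the test command above:

```shell
# Run the job, then inspect the process exit status.
/opt/altair/slc/2026/bin/wps ~trb/sas_source/run_batch.sas \
    -log ~trb/sas_logs/run_batch.log \
    -print ~trb/sas_lst/run_batch.lst \
    -config /opt/altair/slc/2026/altairslc.cfg
echo "SLC exit status: $?"   # 0 on success; non-zero when %abort return fired
```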
Any number of command-line options can be ‘stacked’ on the SLC command line.
In addition, if an altairslcenv.sh file is found in the SLC home directory, it is executed automatically as part of SLC command-line processing. It commonly contains environment variables needed for database connections when the SLC process is initiated. See our example altairslcenv.sh in Appendix A.
Common SLC Command-Line Command Options:

| Syntax | Purpose |
|---|---|
| -log filename | Log file output location |
| -lst filename | Listing output file location |
| -config filename | Configuration file location |
| -sysin filename (alternative to the positional parameter, which is always first) | SAS language program to be executed |
| -work directory | WORK library location |
| -set macro=value | Set macro variables |
| -nosyntaxcheck | Skip syntax check |
| -nodate | Suppress date in listing |
| -linesize n | Set line size |
| -pagesize n | Set page size |
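As an illustration of stacking, several of these options can appear in one invocation. This is a sketch; the JOBDATE macro variable is a hypothetical example of -set usage, not something the Step 0 program requires:

```shell
# Stack multiple options on a single command line.
# -set JOBDATE=... makes &JOBDATE available to the program as a macro variable.
/opt/altair/slc/2026/bin/wps /home/trb/sas_source/run_batch.sas \
    -log /home/trb/sas_logs/run_batch.log \
    -linesize 132 -pagesize 60 -nodate \
    -set JOBDATE=2025-08-13
```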
Step 2: Create a Shell Script (Recommended)
It's best practice to wrap your command in a shell script for cron jobs:
Create the script file
nano /home/trb/run_sas_job.sh
Add this content to the run_sas_job.sh script:
#!/bin/bash
cd /home/trb
# Create directories if they don't exist
mkdir -p /home/trb/sas_logs
mkdir -p /home/trb/sas_lst
echo "Job started at $(date)" >> /home/trb/cron_job.log
# Run the SAS job with full paths (backslashes continue the command across lines)
/opt/altair/slc/2026/bin/wps /home/trb/sas_source/run_batch.sas \
    -log /home/trb/sas_logs/run_batch.log \
    -print /home/trb/sas_lst/run_batch.lst \
    -config /opt/altair/slc/2026/altairslc.cfg
RETURN_CODE=$?
echo "Job completed at $(date) with return code: $RETURN_CODE" >> /home/trb/cron_job.log
exit $RETURN_CODE
Make this file executable:
chmod +x /home/trb/run_sas_job.sh
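Before handing the script to cron, run it once by hand and confirm both the exit status and the log entries it writes. A quick sketch, using the same paths as the script above:

```shell
# Execute the wrapper and capture its return code.
/home/trb/run_sas_job.sh; rc=$?
echo "wrapper exit status: $rc"
# Confirm the start/finish lines were appended.
tail -n 5 /home/trb/cron_job.log
```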
Step 3: Open Crontab for Editing
crontab -e
Step 4: Add Your Cron Job Entry
The cron format is: minute hour day_of_month month day_of_week command
Here are some examples:
Run daily at 2:30 AM:
30 2 * * * /home/trb/run_sas_job.sh
Run every Monday at 9:00 AM:
0 9 * * 1 /home/trb/run_sas_job.sh
Run on the 1st of every month at 6:00 AM:
0 6 1 * * /home/trb/run_sas_job.sh
Run Monday through Friday at 8:30 AM:
30 8 * * 1-5 /home/trb/run_sas_job.sh
Step 5: Add Logging and Error Handling
For better monitoring, modify your cron entry to capture output:
30 2 * * * /home/trb/run_sas_job.sh >> /home/trb/cron_output.log 2>&1
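If you would rather keep one log per run than a single growing file, a date stamp in the file name works. Note that % is a special character in crontab entries and must be escaped as \%:

```shell
# One output log per run; \% escapes the crontab % character.
30 2 * * * /home/trb/run_sas_job.sh >> /home/trb/cron_output_$(date +\%Y\%m\%d).log 2>&1
```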
Or enhance your shell script with more robust logging:
#!/bin/bash
cd /home/trb
echo "Job started at $(date)" >> /home/trb/cron_job.log
/opt/altair/slc/2026/bin/wps input_program.sas \
    -log /home/trb/log >> /home/trb/cron_job.log 2>&1
echo "Job completed at $(date)" >> /home/trb/cron_job.log
echo "---" >> /home/trb/cron_job.log
Step 6: Save and Exit
After adding your cron entry, save and exit the editor. The cron daemon will automatically pick up the changes.
Step 7: Verify Your Cron Job
Check that your cron job was added:
crontab -l
Important Tips:
- Use full paths in cron jobs since the environment is minimal
- Set environment variables if needed at the top of your crontab:
PATH=/usr/local/bin:/usr/bin:/bin
- Test your script manually before scheduling
- Check system logs for cron execution: grep CRON /var/log/syslog
- Crontab time entries must be ‘synchronized’ with the server’s time zone value. This can be checked with:
timedatectl status
For our test environment, the output is:
Local time: Wed 2025-08-13 15:52:01 PDT
Universal time: Wed 2025-08-13 22:52:01 UTC
RTC time: Wed 2025-08-13 22:52:01
Time zone: America/Los_Angeles (PDT, -0700)
System clock synchronized: yes
NTP service: active
RTC in local TZ: no
In our case, all crontab entries should be set based on the PDT time zone.
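Some cron implementations (cronie on Red Hat-family systems, for example) also accept a CRON_TZ variable that overrides the zone for subsequent entries; check man 5 crontab on your server before relying on it. A sketch:

```shell
# Entries after this line are interpreted in Pacific time,
# regardless of the system zone (cronie and compatible crons only).
CRON_TZ=America/Los_Angeles
30 2 * * * /home/trb/run_sas_job.sh
```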
Cron Time Format Quick Reference:
* * * * * command
│ │ │ │ │
│ │ │ │ └─ Day of week (0-7, Sunday = 0 or 7)
│ │ │ └─── Month (1-12)
│ │ └───── Day of month (1-31)
│ └─────── Hour (0-23)
└───────── Minute (0-59)
Appendix A: altairslcenv.sh
#!/bin/bash
# Updated SLC BigQuery environment setup (based on Altair support config)
# Clean up existing LD_LIBRARY_PATH to prevent duplicates
export LD_LIBRARY_PATH=""
# Core SLC paths (lib and lib64 don't exist; libraries are in bin)
export LD_LIBRARY_PATH="/opt/altair/slc/2026/bin"
# Add system library paths
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/usr/lib/x86_64-linux-gnu:/usr/lib64:/usr/local/lib"
# Add Simba BigQuery driver path
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/opt/simba/googlebigqueryodbc/lib"
# Add Microsoft SQL Server ODBC driver path
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/opt/microsoft/msodbcsql17/lib64"
# ODBC configuration - use SLC's etc directory
export ODBCINI=/opt/altair/slc/2026/etc/odbc.ini
export ODBCSYSINI=/opt/altair/slc/2026/etc
export ODBCINSTINI=/opt/altair/slc/2026/etc/odbcinst.ini
# Simba-specific configuration (use the proper variable name)
export SIMBAGOOGLEBIGQUERYODBCINI=/opt/altair/slc/2026/etc/simba.googlebigqueryodbc.ini
# Google Cloud credentials
export GOOGLE_APPLICATION_CREDENTIALS=/home/trb/.gcp/bigquery-service-account.json
# SSL certificates
export SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt
export SSL_CERT_DIR=/etc/ssl/certs
export CURL_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt
export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt
# PATH
export PATH="/opt/altair/slc/2026/bin:$PATH"
echo "SLC BigQuery Environment Variables:"
echo " PATH: $PATH"
echo " ODBCINI: $ODBCINI"
echo " ODBCSYSINI: $ODBCSYSINI"
echo " ODBCINSTINI: $ODBCINSTINI"
echo " SIMBAGOOGLEBIGQUERYODBCINI: $SIMBAGOOGLEBIGQUERYODBCINI"
echo " LD_LIBRARY_PATH: $LD_LIBRARY_PATH"
echo " GOOGLE_APPLICATION_CREDENTIALS: $GOOGLE_APPLICATION_CREDENTIALS"
echo ""
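To check the appendix script by hand, source it in an interactive shell and confirm the variables landed. A sketch; source the file (rather than executing it) so the exports persist in your current shell:

```shell
# Source the environment file in the current shell...
. /opt/altair/slc/2026/altairslcenv.sh
# ...then verify the exports took effect.
env | grep -E '^(ODBCINI|ODBCSYSINI|LD_LIBRARY_PATH)='
```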