Session and Environment Control
Verify the active Python interpreter version:
python3 --version
Manage persistent terminal sessions using GNU Screen:
# List active sessions
screen -list
# Initialize a named session for background tasks
screen -S model_training
# Reattach to a detached session
screen -r model_training
To detach without terminating processes, press Ctrl+A followed by D.
Terminate unresponsive processes forcefully (SIGKILL cannot be caught or ignored, so try SIGTERM first):
kill -SIGKILL <PID>
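A common pattern is to locate the PID by name, request a graceful shutdown, and escalate only if needed. A minimal sketch, assuming a hypothetical process name `train.py`:

```shell
# Find the PID by process name (the name is a placeholder), ask politely
# with SIGTERM, then escalate to SIGKILL only if the process survives.
pid=$(pgrep -f "train.py" | head -n 1)
if [ -n "$pid" ]; then
    kill -TERM "$pid"
    sleep 5
    # kill -0 sends no signal; it only checks that the process still exists
    kill -0 "$pid" 2>/dev/null && kill -KILL "$pid"
fi
```

SIGTERM gives the process a chance to flush buffers and release locks; SIGKILL does not.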
Modify shell environment variables and apply changes immediately:
vim ~/.bashrc
# Press 'i' to insert, 'Esc' to exit insert mode, then type ':wq' to save and quit
source ~/.bashrc
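The typical change is appending an `export` line; a minimal sketch, where the `DATA_DIR` variable and its path are purely illustrative:

```shell
# Append an environment variable to ~/.bashrc (name and path are illustrative),
# then reload the file so the current shell picks it up immediately.
echo 'export DATA_DIR="$HOME/datasets"' >> ~/.bashrc
source ~/.bashrc
echo "$DATA_DIR"
```

Single quotes keep `$HOME` unexpanded in the file, so it is resolved each time the shell starts.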
Package and Dependency Management
Handle Python libraries via pip:
pip3 --version
python3 -m pip install --upgrade pip
pip3 install pandas==2.0.3 --index-url https://pypi.org/simple/
pip3 cache purge
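Pinned versions are usually recorded in a requirements.txt so installs are reproducible; the packages and versions below are illustrative:

```text
# requirements.txt — packages and pins shown are illustrative
pandas==2.0.3
numpy==1.24.4
scikit-learn==1.3.0
```

Install the whole set with `pip3 install -r requirements.txt`, or capture the current environment with `pip3 freeze > requirements.txt`.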
Manage environments and packages via Conda:
conda --version
conda update -n base conda
conda install scikit-learn
conda clean -a
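Conda environments are often declared in an environment.yml file rather than built command by command; the name and pins below are illustrative:

```text
# environment.yml — environment name and pins are illustrative
name: ml_env
channels:
  - defaults
dependencies:
  - python=3.11
  - scikit-learn
```

Create it with `conda env create -f environment.yml` and activate it with `conda activate ml_env`.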
Hardware and Resource Monitoring
Inspect NVIDIA GPU utilization, memory allocation, and thermal status:
nvidia-smi --query-gpu=name,memory.used,memory.total,utilization.gpu --format=csv
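The CSV output lends itself to post-processing with awk. A minimal sketch, using a hard-coded sample line (the GPU name and numbers are illustrative) in place of live `--format=csv,noheader` output:

```shell
# Compute GPU memory usage as a percentage from the CSV query fields.
# The echoed sample line stands in for real nvidia-smi output.
echo "NVIDIA A100, 4096 MiB, 40960 MiB, 35 %" |
awk -F', ' '{
    split($2, used, " "); split($3, total, " ")
    printf "%s: %.1f%% memory used\n", $1, 100 * used[1] / total[1]
}'
```

For continuous monitoring, `watch -n 2 nvidia-smi` refreshes the full report every two seconds.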
Retrieve operating system release details:
cat /etc/os-release
lsb_release -d
Analyze storage allocation and block devices:
df -hT
lsblk -o NAME,SIZE,ROTA,FSTYPE
du -sh /opt/workspace/datasets/
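To find which subdirectories consume the most space, du output can be sorted; `sort -h` understands the human-readable suffixes (K, M, G) that `du -h` emits. The path is illustrative:

```shell
# Rank subdirectories of the current directory by size, largest first.
du -sh ./*/ 2>/dev/null | sort -hr | head -n 10
```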
Monitor CPU architecture, load, and memory consumption:
vmstat 2 5
top -d 2
getconf LONG_BIT
free -h
lscpu | grep -E "^CPU\(|Model name"
awk '/MemTotal|MemAvailable|SwapTotal/ {printf "%-15s %6.2f GB\n", $1, $2/1024/1024}' /proc/meminfo
File System and Directory Operations
Extract archives and manipulate directories:
unzip archive_data.zip
mkdir -p experiments/run_04
cp -r src/ backups/src_latest/
mv deprecated_script.py archived_script.py
rm -i temporary_log.txt
# Recursive force-delete removes everything without prompting; verify the path
rm -rf /tmp/ml_cache/
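The `cp -r` backup pattern above is often extended with a timestamp so successive backups never overwrite each other. A self-contained sketch (the tree is built in a temp directory purely for illustration):

```shell
# Build a small source tree, then copy it into a timestamped backup
# directory; each run produces a uniquely named destination.
work=$(mktemp -d)
mkdir -p "$work/src"
echo "print('ok')" > "$work/src/main.py"
stamp=$(date +%Y%m%d_%H%M%S)
mkdir -p "$work/backups/src_$stamp"
cp -r "$work/src/." "$work/backups/src_$stamp/"
ls "$work/backups/src_$stamp"
```

The trailing `/.` on the source copies the directory's contents rather than nesting `src` inside the destination.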
Verify network connectivity, create or update files, and inspect file types and shell history:
ping -c 4 1.1.1.1
touch pipeline_runner.sh
file weights_v2.bin
history | grep "train"
Advanced Search and Archive Handling
Locate files based on name patterns or size thresholds:
find /data -type f -name "*.csv"
find /var/log -type f -size +50M
ls -la | grep "\.json$"
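find predicates can be combined in a single invocation; the directory, size threshold, and 7-day window below are illustrative:

```shell
# CSV files larger than 1 MiB that were modified within the last 7 days.
find . -type f -name "*.csv" -size +1M -mtime -7
```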
Search file contents efficiently across directories:
find . -type f -name "*.py" -exec grep -l "import tensorflow" {} +
grep -n "RuntimeError" app.log system.log
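grep can also recurse on its own, avoiding the find/-exec pairing; `--include` restricts matches by filename pattern. A self-contained sketch, with sample files created purely for illustration:

```shell
# Create sample files, then search recursively while restricting
# matches to Python sources via --include.
dir=$(mktemp -d)
echo "import tensorflow as tf" > "$dir/model.py"
echo "import tensorflow" > "$dir/notes.txt"
grep -rn --include="*.py" "import tensorflow" "$dir"
```

Only `model.py` matches; `notes.txt` is filtered out before its contents are searched.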
Compress, inspect, and extract tarball archives:
tar -czvf project_backup.tar.gz /opt/workspace/src/
tar -tzvf project_backup.tar.gz
tar -xzvf project_backup.tar.gz -C /restore/destination/
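tar's `--exclude` option keeps caches and build artifacts out of an archive. A self-contained sketch (the `__pycache__` pattern and the sample tree are illustrative):

```shell
# Build a tree containing a cache directory, archive it while excluding
# the cache, then list the archive to confirm what was stored.
work=$(mktemp -d)
mkdir -p "$work/src/__pycache__"
echo "code" > "$work/src/main.py"
echo "bytecode" > "$work/src/__pycache__/main.cpython-311.pyc"
tar -czf "$work/backup.tar.gz" --exclude="__pycache__" -C "$work" src
tar -tzf "$work/backup.tar.gz"
```

`-C` switches into the temp directory first, so the archive stores the relative path `src/` rather than the absolute temp path.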