Working with Websites

Some tools I like for working with the web and networks.


  • curl (great for testing HTTP requests)
  • aria2c (great for faster downloads via multiple threads)
  • httrack (mirrors websites but really slow)
  • youtube-dl (download or stream media)
  • rtmpdump (or just use youtube-dl)
  • vnstat (checking network usage)
  • nmap (local subnet IP searching)
  • ipscan (simple Java IP scanner)
  • iperf (custom link speed test)
  • iodine (handy VPN over DNS for restricted networks)
  • sshuttle (tunnel network traffic over SSH)
  • wireguard (do I need to mention this? fast, secure VPN protocol)
  • Postman (for testing REST APIs without memorizing cURL)
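A couple of curl invocations I reach for when testing (example.com and the JSON body below are placeholders); the file:// line is just a network-free way to see curl run:

```shell
# curl speaks more protocols than HTTP; file:// makes a network-free demo
tmp=$(mktemp)
echo "payload" > "$tmp"
curl -s "file://$tmp"    # prints: payload

# Typical HTTP testing (placeholder URL):
#   curl -i https://example.com/                                     # include response headers
#   curl -s -o /dev/null -w '%{http_code}\n' https://example.com/    # status code only
#   curl -X POST -H 'Content-Type: application/json' -d '{"a":1}' https://example.com/api
```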


Mirror all the files off of a website, preserving the full folder structure:

wget -m -p -E -k -K -np https://example.com/

(Replace example.com with the target site. -m mirrors recursively, -p grabs page requisites like images and CSS, -E adds .html extensions where needed, -k converts links for local browsing, -K keeps the original files alongside the converted ones, and -np stops wget from climbing into the parent directory.)

On Windows, you can drag a URL out of the browser to save a shortcut to a website. On Linux, Ctrl+S and drag-and-drop just save the page's HTML.

Well, it turns out you can make .desktop files that just link to websites. Here's the janky script to easily make desktop files.

echo "[Desktop Entry]" > "$1.desktop"
echo "Encoding=UTF-8" >> "$1.desktop"
echo "Name=$1" >> "$1.desktop"
echo "Type=Link" >> "$1.desktop"
echo "URL=$2" >> "$1.desktop"
echo "Icon=text-html" >> "$1.desktop"

Parameter 1 is the name of the shortcut and file, parameter 2 is the actual URL. Change icon and other params as you see fit.

Save as a bash function or as a script. Usage:

bash make-link.sh "TonyWiki" "https://example.com/"

(make-link.sh and the URL are placeholders: use whatever filename you saved the script as, and the real target URL.)
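The function form mentioned above might look something like this (weblink is an arbitrary name I made up, and the URL is a placeholder):

```shell
# Create a .desktop web shortcut: weblink <name> <url>
weblink() {
    local f="$1.desktop"
    # '>' on the first line truncates any stale copy of the file
    echo "[Desktop Entry]"  > "$f"
    echo "Encoding=UTF-8"  >> "$f"
    echo "Name=$1"         >> "$f"
    echo "Type=Link"       >> "$f"
    echo "URL=$2"          >> "$f"
    echo "Icon=text-html"  >> "$f"
}

weblink "TonyWiki" "https://example.com/"
```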
Some handy ways to move files between systems:

  • SFTP
  • SCP
  • Rsync
  • SSHuttle (VPN over SSH)
  • NetCat (just yeet stuff across systems)
  • Rclone (for other cloud storage providers)
  • S3FS (for when I need more disk space through B2)
scripts/working_with_websites.txt · Last modified: 2022-06-12 04:14 by Tony